Tuesday, December 06, 2005

Handicapping SEO for 2006

Getting large numbers of likely-to-buy customers at low prices using search engine optimization techniques is becoming harder each day. Harvesting clicks with a high propensity to buy requires increasingly sophisticated strategies and dynamic, real-time tactics.

The market is flooded with keyword buyers, ranging from the most sophisticated SEO mavens managing thousands of keywords and variations with proprietary software to complete newbies with a land-rush mentality, eager to get in on the “ground floor” of the newest, best-hyped medium. The result is higher prices, lower clickthrough rates, less stable top rankings, and efforts by the smart-money players to find ways to separate clickers from buyers.

According to the Performics division of DoubleClick (www.doubleclick.com/us/knowledge_central/documents/RESEARCH/dc-search-0511.pdf), the average cost per click increased from $27 to $30 from July to September 2005. Similarly, keyword costs jumped from an average of $20 to $26 in the same period. This reflects, in the opinion of a Performics search strategist, “the fact that campaigns are getting bigger and growing across the board.” More marketers are buying more keywords and their variations across more sites, bidding up prices and squeezing the available inventory.

Yet there might be a few bright spots: Fathom Online (www.fathomonline.com) reports that keyword prices declined 11 percent between November 2004 and November 2005. But this data reflects only generic terms (e.g. “shoes”), which most savvy marketers have already abandoned as too expensive and too broad in favor of multi-word, customer-segmented phrases (e.g. “girls black patent leather party pumps”). B2B marketers have made a similar migration, abandoning the inflated prices of broad terms like “CRM” in favor of narrower technical phrases like “marketing automation and resource management tools.”
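The move to segmented phrases is easy to mechanize. As a sketch, a marketer could build long-tail candidates by combining attribute lists; the attribute lists and function name below are hypothetical illustrations, not any vendor's tool:

```python
from itertools import product

# Hypothetical attribute lists a retailer might maintain for one product line.
audiences = ["girls", "womens"]
colors = ["black", "red"]
materials = ["patent leather", "suede"]
styles = ["party pumps", "ballet flats"]

def long_tail_phrases(*attribute_lists):
    """Combine attribute lists into multi-word, customer-segmented phrases."""
    return [" ".join(combo) for combo in product(*attribute_lists)]

phrases = long_tail_phrases(audiences, colors, materials, styles)
print(len(phrases))   # 2 * 2 * 2 * 2 = 16 candidate phrases
print(phrases[0])     # girls black patent leather party pumps
```

Even short attribute lists multiply into hundreds of narrow, cheap-to-buy phrases, which is exactly why campaigns are "getting bigger across the board."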

Search firms, eager to take in this bonanza, are making more inventory available as quickly as they can. Google separated AdWords from contextual ads, increasing the number of, and easing access to, sellable clicks and giving marketers some openings to test new tactics. Look for other leading search engines to follow suit as they slice the salami thinner and thinner in the quest for more inventory to sell.

Then look for new opportunities to place targeted ads on blogs, RSS feeds, podcasts and web video. Search will remain hot for the foreseeable future and search engines will do everything they can to maximize selling opportunities and to encourage higher bidding.

Just to keep it interesting, Google also re-jiggered its ranking algorithms, which makes it harder to reach the top positions and harder to keep them, since on Google the highest bid does not necessarily yield the top ranking. Others are likely to follow suit, because the harder it is to rank number one, the more cash is thrown at the problem. Performics observed that the proportion of keywords that maintain the top rank for an entire month is steadily declining, so you can bet that marketers are already husbanding dollars and planning A and B campaigns to ensure they rank above the fold, at the top of the right-hand paid column, or in the beloved blue bar.

Two ideas are shaping the next wave in SEO experimentation and the race for competitive advantage. The first notion is to make clicks more personalized by tracking where prospects click, determining who they are and dynamically serving content or unique landing pages designed to improve the chances that they will convert to buyers. The idea is to match profiles, stored in cookies, to inbound clicks and use these profiles to trigger customized messages or offers.

This is the central idea behind Amazon’s recommendation engine, which has been widely accepted by consumers without much outcry about privacy or Big Brotherism, and has been baked into its A9 search product. MSN is developing this targeting capability based on the millions of profiles it has amassed through Hotmail, MSN Mail and other MSN features and services, though we haven’t seen any real-life case studies yet.

In theory, persistent MSN cookies, corresponding to customer segments, can trigger select messages aimed at distinct segments, maybe as narrow as segments of one, which should result in more buys per click. In practice, this kind of micro-targeting will reduce buyers’ remorse and holds out the promise of better efficiencies for well-crafted SEO campaigns.

A variation would be for individual sites to use cookies to track repeat visitors and use stored data to trigger unique messages. So if you’ve looked around on a particular site, then come back through a search engine, the site would recognize you as a repeat visitor, know what you looked at previously, and serve up content designed to continue the conversation. The cookie trigger would zero in on what the site infers you are interested in and engage you in a way likely to either sell you something or convince you to identify yourself.

In theory this multi-channel tracking might mark you as a “hot” lead and/or allow marketers to find several ways to identify or engage you in the course of a complex (B2B) or high-value sales cycle. The gating factor is the availability of low-cost technology to cookie, track, and dynamically serve content based on a matrix of business rules and inference patterns.
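The "matrix of business rules" mechanism can be sketched in a few lines. Assuming a returning visitor's prior page views and visit count have been read back from a cookie, rules fire in priority order; every segment name, rule, and content identifier here is hypothetical:

```python
# Minimal sketch: match a returning visitor's cookie-derived profile against
# a small matrix of business rules to pick targeted content. All rules and
# content identifiers are illustrative assumptions, not a real vendor's API.

RULES = [
    # (predicate over the visitor profile, content to serve), in priority order
    (lambda p: "crm-pricing" in p["pages_viewed"], "pricing-comparison-offer"),
    (lambda p: p["visits"] >= 3,                   "talk-to-sales-landing"),
    (lambda p: True,                               "generic-welcome-page"),  # fallback
]

def content_for(profile):
    """Return the first piece of content whose rule matches the profile."""
    for predicate, content_id in RULES:
        if predicate(profile):
            return content_id

# A repeat visitor who previously browsed CRM pricing pages:
profile = {"pages_viewed": ["crm-pricing", "features"], "visits": 2}
print(content_for(profile))  # pricing-comparison-offer
```

Ordering the rules from most to least specific is what lets the site "continue the conversation" rather than restart it with a generic page.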

The second idea that could impact search effectiveness is clustering the results to surface relevant results faster and thereby accelerate the awareness-consideration-purchase cycle among clickers. Grokker (www.grokker.com) and Clusty (www.clusty.com) use graphic devices to group and organize the results of a search engine inquiry. In theory a clustering engine acts as a surrogate for the person conducting a search. It gets you to the information you want faster, makes you happier and speeds you along toward buying.
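The grouping step can be illustrated with a deliberately naive term-based approach; real engines like Clusty analyze full snippets with far more sophistication, and the stopword list and function below are toy assumptions:

```python
# Toy clustering of search-result titles by their first meaningful term.
# Real clustering engines are far more sophisticated; this only shows the idea.
from collections import defaultdict

STOPWORDS = {"the", "a", "for", "and", "of", "in"}

def cluster_by_term(results):
    """Group result titles under their first non-stopword term."""
    clusters = defaultdict(list)
    for title in results:
        words = [w for w in title.lower().split() if w not in STOPWORDS]
        key = words[0] if words else "misc"
        clusters[key].append(title)
    return dict(clusters)

results = [
    "Jaguar cars for sale",
    "Jaguar habitat and diet",
    "Apple iPod review",
]
print(cluster_by_term(results))
# {'jaguar': ['Jaguar cars for sale', 'Jaguar habitat and diet'],
#  'apple': ['Apple iPod review']}
```

Even this crude version separates the car shopper from the wildlife researcher, which is the whole pitch: get the searcher to the right bucket faster.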

In practice the clusters are still fairly big buckets of content that may or may not correspond to any individual’s interests and intentions. The technology still cannot read and sort everything with equal clarity, so some of the items in the clusters often shouldn’t be there. There is no data with which to even guess at the value of this tool in moving people through a buying cycle, and clustering search engines remain tertiary players in today’s SEO market.

Nonetheless, look for personalization and clustering to be two highly touted SEO tactics in 2006 as marketers seek efficiencies and advantages, and as agencies and search engines try to keep the merry-go-round turning and the dollars flowing.

