SEO mavens find scraping switches in Bing
Microsoft last week released guidance for webmasters and search-engine optimization strategists, explaining what Bing's up to in the background. A look at the white paper revealed some smart thinking about where viewers' eyes move on a page of search results, as well as a much more determined "scraping" effort than Google currently makes.
Betanews asked Dan Rosenbaum, a New York-based SEO expert, to walk with us through the white paper and note any features or telling omissions that caught his eye. Overall, he says, "the big changes from Google are in the user interface, and they look pretty interesting and effective. [Microsoft] took the standard heat map of a search engine results page and said, 'How can we make that useful?'"
There is currently a trend among search engines (translation: at Google) to derive search results snippets from the content that's right there on the page, rather than from whatever the site put in its meta description tag. That makes for meatier, more relevant snippets, but some sites that optimized for a different search sensibility -- sites that the search engines interpret as simply long, content-free lists of products, for instance -- are finding the change dismaying.
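The mechanics behind that shift are easy to sketch. Below is a minimal illustration in Python, using only the standard library's html.parser, of the two approaches: an older-style snippet taken from the meta description versus a newer-style snippet pulled from the visible page text. The class and function names and the 150-character cutoff are assumptions made for the example; neither Bing nor Google documents its snippet logic this way.

```python
from html.parser import HTMLParser

class SnippetExtractor(HTMLParser):
    """Collects a page's meta description and its visible body text."""

    def __init__(self):
        super().__init__()
        self.meta_description = ""
        self.body_text = []
        self._in_body = False
        self._skip_depth = 0  # nesting depth inside <script> or <style>

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = (attrs.get("content") or "").strip()
        elif tag == "body":
            self._in_body = True
        elif tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag == "body":
            self._in_body = False
        elif tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if self._in_body and not self._skip_depth and data.strip():
            self.body_text.append(data.strip())

def snippet_from_meta(page, max_len=150):
    # Older behavior: show whatever the site wrote in its meta description.
    return page.meta_description[:max_len]

def snippet_from_content(page, max_len=150):
    # Newer behavior: quote the page itself, falling back to the meta
    # description only if the page has no visible text worth quoting.
    text = " ".join(page.body_text) or page.meta_description
    return text[:max_len]

page_html = (
    '<html><head><meta name="description" content="Buy widgets now!"></head>'
    "<body><h1>Acme widgets</h1>"
    "<p>Hand-machined brass widgets, shipped within 48 hours.</p></body></html>"
)
page = SnippetExtractor()
page.feed(page_html)
print(snippet_from_meta(page))     # Buy widgets now!
print(snippet_from_content(page))  # Acme widgets Hand-machined brass widgets, shipped within 48 hours.
```

The only point of the sketch is the fallback order in snippet_from_content: the page's own words win, and the meta description is consulted last -- which is roughly the behavioral shift the white paper and Rosenbaum are describing.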
Going by what's revealed in the white paper, Rosenbaum says that Microsoft is taking an even more aggressive approach to scraping sites for content. "So," he says, "sites that are upset that search engines are giving readers and prospects 'all the good stuff' will not be made happier by Bing."
But the change is a win for readers, he adds, and SEO folk have known for some time that it would be. He cites a recent essay at FutureNow by Jeff Sexton comparing "long" (content-rich, substantiated) and "short" (minimalist, pared down) copy, which says baldly that "content-rich sites typically out-convert minimalist designs because they more completely answer the prospects' questions." Bing and Google have both changed their search underpinnings to reflect that; optimizers may find the change "SEO-hostile," as Rosenbaum phrases it, but there's little they can do but live with it.
We both noticed that the language of the white paper was very human-agency-oriented -- "the Bing team chooses," to take but one example. "I'm sure they're trying to automate as fast as they can," Rosenbaum said, "but there wasn't a word about how the Bing team chooses. Based on what? Volume of search? Number of backlinks? Number of clicks from other favored sites? Anchor text? They don't say."
Rosenbaum says that though there's a heavy implication that both the algorithms and the crawlers are essentially unchanged from earlier MSN iterations, "they don't say anything about the inferences drawn by the software at home base." There's no direct discussion of the algorithm, nothing about inbound links; the paper sticks strictly to matters of on-page SEO (that is, what's actually on the page being parsed by the crawler).
"Bing: New Features Relevant to Webmasters" is available in PDF or XPS form from Microsoft.