Is Google Measuring Traffic Patterns?

You’re probably aware that Google now has many different ways to collect data about websites, visitor behavior, and general traffic patterns. Since I’m always one to look at where search engines, and particularly Google, might be heading, I thought I’d speculate a bit on how Google might be incorporating some of the traffic pattern data they’re gathering, and how they might apply it to their algorithm or business model in general.

I came across a forum thread at Webmaster World not long ago entitled Google algo moves away from links, towards traffic patterns, which is what originally got me thinking about the subject. Coincidentally, a thread started at Webmaster-Talk around the same time brought up the article Jagger, Google Analytics, and the Future of Search & SEO, which also talks a little about how Google might be using the traffic data they collect through the Analytics program in their algorithm. Both the thread and the article are interesting reads, and they’ve had me thinking for the last couple of weeks about how all this data might be used at Google now and in the future.

Ways Google Can Collect Data

Before getting to the speculation, let’s think a little about some of the ways Google can collect data. There’s the obvious data they collect with their spiders about web pages, and they see how often people do or don’t click on links in the search results pages. Google also spiders sites with Mediabot, the AdSense robot, on any site that displays AdSense ads. Google knows who bids on keywords through AdWords, and how much they bid. And with Analytics, they can now collect information about traffic patterns on any site that signs up for the program.

They can also collect information about individual users through Gmail, Desktop Search, Personalized Search, cookies, Web Accelerator, and the Google Toolbar, among other services. Take a look at the Google sitemap and notice the different privacy policies to see some of what they collect.

Given how often Google’s site is used and how many people are using at least some of their tools and services, that’s quite a lot of information. None of this is meant to imply that we should all be paranoid about Google. Microsoft certainly collects its own share of information, as does Yahoo, and I’m not suggesting any of them has less than reputable motives for collecting it. It’s just to note that search engines are capable of learning a lot about how we search and how we view websites and web pages. While the speculation here applies to Google, it can equally be applied to the other major search engines.

Traffic Patterns As A Factor In Search Results

So what might Google do with all that data about traffic patterns? One possibility is they might use it to tweak the results displayed for a given query. It probably wouldn’t be a complete overhaul of the algorithm, but they might use traffic patterns as a ranking factor. Imagine a simple scenario where Google displays web pages for a query, but discovers that people clicking on the top result generally click back immediately. They then click on result #2, only to stay on that site and never return to the search results. Might Google not decide that page #2 was really more relevant to the query than page #1, and reverse their order in the results pages over time?
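
To make the idea concrete, here’s a minimal sketch of how such a click-back signal might be applied, assuming aggregated click logs were available. The function names, the data shapes, and the 60% threshold are all my own inventions for illustration; nothing here reflects how Google actually works.

```python
# Hypothetical sketch only: demote results that searchers immediately
# "click back" from. All names and thresholds are invented.

def reorder_by_clickback(results, click_logs, threshold=0.6):
    """Reorder results so pages with high immediate click-back rates sink.

    results    -- URLs in their current ranked order
    click_logs -- dict mapping URL to a (clicks, immediate_returns) tuple
    threshold  -- click-back rate above which a result is demoted
    """
    def clickback_rate(url):
        clicks, returns = click_logs.get(url, (0, 0))
        return returns / clicks if clicks else 0.0

    # sorted() is stable: ties keep their original order, so the current
    # ranking is only disturbed where the click data demands it.
    return sorted(results, key=lambda url: clickback_rate(url) > threshold)

ranked = ["example.com/page1", "example.com/page2"]
logs = {"example.com/page1": (1000, 900),  # 90% click straight back
        "example.com/page2": (800, 80)}    # most visitors stay
print(reorder_by_clickback(ranked, logs))  # page2 now outranks page1
```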

The simple example doesn’t even require much of the traffic data Google can collect. With the toolbar, and maybe Analytics, they could also tell how long you stayed on each page, as well as how many other pages on the site you visited. They would even know if, when, and how often you went back to the page and the site. The behavior of one or two people couldn’t really say much about a site’s relevance to a query, but the behavior of millions might. Might Google, knowing which sites keep visitors on their pages the longest, come up with some kind of quality rank that becomes part of the algorithm alongside PageRank and TrustRank?
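
If something like that quality rank existed, it might simply be blended with the link-based scores. Here’s one way such a blend could look; the weights, the log scaling, and every name below are assumptions of mine, not anything Google has published.

```python
import math

# Hypothetical sketch: fold an engagement-based "quality rank" into a
# combined score alongside link-based scores. Weights and the log
# scaling are assumptions for illustration only.

def quality_rank(avg_dwell_seconds, pages_per_visit):
    """Crude engagement score from aggregate visitor behavior."""
    # Log-scale dwell time so very long visits don't dwarf everything else.
    return math.log1p(avg_dwell_seconds) * pages_per_visit

def combined_score(page_rank, trust_rank, q_rank,
                   w_pr=0.5, w_tr=0.3, w_qr=0.2):
    """Weighted blend of link-based and engagement-based signals."""
    return w_pr * page_rank + w_tr * trust_rank + w_qr * q_rank

score = combined_score(page_rank=6.2, trust_rank=4.8,
                       q_rank=quality_rank(95, pages_per_visit=3.1))
print(round(score, 2))
```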

Traffic Patterns For Spam Removal

Another possibility might be to determine that some sites which rank well get very little traffic, or that people spend very little time on the site. Perhaps a given site has a large number of links pointing to it that Google knows about through its spiders, but those links are seldom if ever clicked, as determined by visitor traffic patterns. Might Google surmise that some of those links were of the hidden variety, or were just placed on some very unrelated sites? The links might be discounted in weight or even set off a red flag about the sites in question.
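
Discounting could be as simple as scaling a link’s weight by its observed click-through rate. The sketch below is again purely hypothetical; the floor value and the “suspicious” cutoff are numbers I made up to illustrate the idea.

```python
# Hypothetical sketch: scale a link's weight by its observed click-through
# rate. A link the crawler sees but humans never follow contributes less.
# All names and numbers here are invented.

def adjusted_link_weight(base_weight, impressions, clicks,
                         floor=0.1, suspicious_ctr=0.0005):
    if impressions == 0:
        return base_weight           # no traffic data; leave the link alone
    ctr = clicks / impressions
    if ctr < suspicious_ctr:
        # Possibly hidden, or placed on a wholly unrelated page:
        # keep only a fraction of its original weight.
        return base_weight * floor
    return base_weight

print(adjusted_link_weight(1.0, impressions=50_000, clicks=3))    # 0.1
print(adjusted_link_weight(1.0, impressions=50_000, clicks=400))  # 1.0
```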

Spiders and robots can often be fooled into thinking a spam site is a legitimate site. People will usually spot the offending page and site more easily. Why not make use of traffic patterns to find some of these spammy sites and remove them from the index? When Google comes across patterns where visitors nearly always click away from a site as soon as they get there, couldn’t they determine that those sites are of little quality and should lose position in future rankings?

Traffic Patterns To Downgrade Less Relevant Results

Let’s take the example of a site that sells hammers. Not only do they sell hammers, but the general consensus is that their hammer is clearly the best on the market, and maybe even the best ever manufactured. Because of the quality of their hammer, a lot of other sites link to theirs. Sites link to their home page, their products page, and their articles about hammers and other tools. They have a quality site and a quality product, and with all the links they receive they naturally show up on the first page of many related search queries.

Now suppose that site adds an article about how useless screwdrivers are. They even make the claim that their hammer makes all screwdrivers obsolete. The article still generates many backlinks due to the sheer lunacy of what they’re saying, and the site itself links to it often through its navigation and other internal cross linking between pages. Add to all that linking the fact that the page itself is somewhat optimized.

The screwdriver article will probably appear in search results for queries containing the word screwdriver, though it’s probably fair to say it’s not the page people were looking for when they performed the search. People click through since the page is in the top 3 of the results, but come back very quickly as soon as they realize the page has nothing to do with long-lasting screwdrivers, or whatever their search happened to be. Seeing this, perhaps Google would downgrade the page for screwdriver searches while ensuring the pages on the site about hammers stay prominent in search results. The screwdriver page isn’t spam, and neither is the hammer site itself. Still, the page might not be as relevant to some queries as it at first appears to the algorithm.
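
The key detail is that the demotion would be per query, not per page. Here’s a rough sketch of what that bookkeeping might look like, with a made-up stats store, a 10-second cutoff, and a minimum sample size, all invented for illustration.

```python
from collections import defaultdict

# Hypothetical sketch: track dwell time per (query, page) pair so a page
# can be demoted for one query while keeping its standing for others.

dwell_stats = defaultdict(list)  # (query, url) -> dwell times in seconds

def record_visit(query, url, dwell_seconds):
    dwell_stats[(query, url)].append(dwell_seconds)

def query_penalty(query, url, short_visit=10, min_samples=100):
    """Fraction of visits from this query that abandoned the page quickly."""
    visits = dwell_stats[(query, url)]
    if len(visits) < min_samples:   # a few visitors say little; thousands might
        return 0.0
    return sum(1 for d in visits if d < short_visit) / len(visits)

# The screwdriver article would accumulate a penalty for "screwdriver"
# queries while its penalty for "hammer" queries stays at zero.
```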

Other Possible Ways To Use Traffic Patterns

What other ways might Google use traffic patterns? I mentioned personalized search earlier; perhaps that’s all traffic patterns would be about. Similar to the way Amazon recommends products based on what other people who bought products like yours also purchased, Google might present pages for your queries that others have apparently found useful for the same or similar queries. Maybe people who use the plural of a certain keyword tend to spend more time on one site, while people who use the singular of the same keyword spend more time on another. Could the order of the results be skewed accordingly in future searches for either the plural or the singular of that keyword?

Another possibility might be to use traffic patterns to learn something about sites and their relative worth when it comes to advertising. It might be difficult for Google to determine certain levels of relevance, such as between a site about driving gloves and one about automobiles, but traffic patterns might indicate that people who view one site typically view the other as well, adding a measure of relevance between them. That information could be used to present different AdSense ads than might otherwise be served, improving click-through and revenue for the ads on each site.
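
One simple way to measure that kind of audience overlap would be Jaccard similarity over each site’s set of visitors. Again, this is my own illustration of the idea, not a documented AdSense mechanism.

```python
# Hypothetical sketch: infer relatedness between two sites from how much
# their audiences overlap, using Jaccard similarity over visitor sets.

def covisitation_similarity(visitors_a, visitors_b):
    """Jaccard similarity: shared visitors / all visitors of either site."""
    union = visitors_a | visitors_b
    return len(visitors_a & visitors_b) / len(union) if union else 0.0

gloves = {"u1", "u2", "u3", "u7"}
autos  = {"u2", "u3", "u7", "u9", "u12"}
print(covisitation_similarity(gloves, autos))  # 0.5 -> audiences overlap heavily
```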

The relevance determined by traffic patterns might also be used to modify the link weight from one site to another. The algorithm might have trouble determining relevance that traffic patterns can show more easily. While Google appears to be getting better at determining relevance between web pages, they’re still no match for human beings, and what one person considers relevant another might not. Traffic patterns, though, might reveal more about relevance by adding visitor behavior into the equation.

One thing is clear. Google and the other search engines are able to collect a lot of data that can reveal web traffic patterns. With all that data, it would be unrealistic to think it’s not being used. Isn’t Google’s mission to organize the world’s information and make it universally accessible and useful? Aren’t traffic patterns another form of information?

With all the information they can collect about traffic patterns, it’s reasonable to assume they’re using it. And given that traffic pattern data is a little more difficult (though not impossible) to manipulate than on-page information or even linking patterns, it’s likely traffic patterns will be incorporated as some part of the ranking algorithm. How they might be incorporated only Google and the other search engines can say, but there’s a good chance the traffic patterns of visitors while on your site will affect where your web pages rank in the future. Maybe building a usable site and providing quality content for your visitors is important after all.

