Google Web Accelerator


It appears that Google has reached the limit of what it can achieve with its current systems for ranking websites. Some well-known and seemingly basic "black-hat" techniques, such as hidden text, still go undetected, and it's time for Google to move on to a new ranking method. Most users currently don't mind Google observing them, and now Google is watching its users directly to bring them the best results possible.

Monitoring their users

This is bound to cause a lot of controversy: no one likes the idea of being spied on. With the thousands of spyware, adware and other usage-tracking programs that exist today, people are becoming aware of this kind of monitoring and are taking steps to protect themselves from it.

In 2001, Google started tracking its users with the launch of the Google Toolbar. With the advanced features enabled (activating the PageRank display), Google can, and does, log each and every site a user visits. This can reveal further information, such as the amount of time spent on a particular page, where you went next and where you came from. Using this data, Google could analyse whether a website has actual value to the user. For example, if a website ranks highly for many keywords and someone views the site and clicks back almost immediately, this could be a clear sign that the site had no useful information. One user alone might not affect the site's ranking in any noticeable way, but what if this trend was identical for almost every user who visits the site? Wouldn't this be a clear sign that the site is in fact not relevant and perhaps shouldn't rank highly for these keywords, or should even be penalised in some way by Google?
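To make the "clicks back almost immediately" idea concrete, here is a minimal sketch of how per-site dwell times could be turned into a bounce rate. The data format, the 5-second cutoff and the function name are all hypothetical stand-ins for whatever a toolbar might actually log, not anything Google has documented:

```python
from collections import defaultdict

def bounce_rate(visits):
    """Fraction of visits to each site that lasted under a threshold.

    `visits` is a list of (site, seconds_on_page) pairs -- a purely
    hypothetical stand-in for toolbar log data.
    """
    THRESHOLD = 5  # seconds; an arbitrary cutoff for "clicked back immediately"
    totals = defaultdict(int)
    bounces = defaultdict(int)
    for site, seconds in visits:
        totals[site] += 1
        if seconds < THRESHOLD:
            bounces[site] += 1
    return {site: bounces[site] / totals[site] for site in totals}

log = [("example.com", 2), ("example.com", 3), ("example.com", 40),
       ("other.com", 120), ("other.com", 95)]
print(bounce_rate(log))  # example.com bounces on 2 of 3 visits
```

A site where nearly every visit falls under the threshold would stand out in exactly the way described above.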

A new technology

But the information from the Google Toolbar is somewhat limited. On 4th May 2005, however, Google launched a new tool called Web Accelerator. Being a proxy, it can potentially monitor absolutely everything transmitted between you and the Internet. Google claims the Web Accelerator can speed up a broadband internet connection. It achieves this by prefetching content (i.e. downloading further pages from a website while you are most likely still reading the current one) and by compressing files. This has caused major debate and fury among webmasters for many reasons, including skewed statistics reports, PPC advertising concerns and cookie problems, to name just a few.
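The prefetching step can be sketched in a few lines: extract the links from the page a user is reading, then fetch each one into a cache ahead of time. This is only an illustration of the concept; the class and function names are my own, and the fetcher is injected so the sketch stays offline (a real proxy would issue HTTP requests here):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href targets from anchor tags on the current page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def prefetch(page_html, fetch):
    """Fetch every linked page ahead of time; return a URL -> body cache."""
    parser = LinkExtractor()
    parser.feed(page_html)
    return {url: fetch(url) for url in parser.links}

page = '<p><a href="/news">news</a> and <a href="/about">about</a></p>'
cache = prefetch(page, lambda url: "body of " + url)
print(sorted(cache))  # ['/about', '/news']
```

The side effect relevant to this article is obvious from the sketch: the proxy sees every page and every link, whether or not the user ever clicks it.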

Changing the way they rank websites

Ultimately, the data collected from Web Accelerator adds some very useful information that Google can feed into its ranking algorithm. I believe the main reasons for its existence are:

  1. Changing the importance of links - On any given page, links that are rarely clicked could be given less weight than those clicked regularly. This could greatly help in detecting hidden links, and it could also reduce the weight of bought links and link exchanges, which usually don't attract much human traffic.
  2. Preventing cloaking - Since Google will know the contents of a site (stored in its proxy), it can compare them against its cached copy. If a site consistently serves a different version of the same page, a red flag could be raised.
  3. Clicks from search engines - Google already tracks this (using random samples) on its own results pages, as does pretty much every other major search engine. But now it can potentially monitor click-throughs from other search engines such as MSN and Yahoo to analyse the competition.
  4. User activity - For each user: how often do they view the site, how many pages do they visit, how long do they spend on each page, and do they return regularly? Did they land on the page via a search engine, a bookmark, or by typing the address directly into the address bar?
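Point 1 above can be sketched as a simple reweighting: scale each outbound link by its observed click count, with a little smoothing so never-clicked links don't drop to zero outright. This is a hypothetical formula of my own invention, not Google's actual algorithm:

```python
def link_weights(link_clicks, smoothing=1.0):
    """Scale each outbound link's weight by how often real users click it.

    `link_clicks` maps a link URL to an observed click count.
    `smoothing` keeps never-clicked (possibly hidden or bought) links
    from being zeroed out entirely. Purely illustrative.
    """
    total = sum(link_clicks.values()) + smoothing * len(link_clicks)
    return {url: (clicks + smoothing) / total
            for url, clicks in link_clicks.items()}

# Hypothetical page: one popular link, one occasional link, one hidden link.
page = {"/popular": 98, "/sometimes": 10, "/hidden-link": 0}
print(link_weights(page))
```

Under any scheme of this shape, a hidden link that no human ever clicks ends up carrying a small fraction of the weight of a genuinely used one, which is exactly the effect described in point 1.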

What now?

As of 12th May, Google has halted all downloads of Web Accelerator due to a security breach. I doubt they will pull the plug on the entire product; more likely they are working on a patch as we speak.

It looks as if they couldn't find any way to greatly reduce the amount of spam pages currently in their index using Google's own systems alone - it seems they will be relying more and more on users to judge whether a site is spam or relevant.

In the future, will we see the same people who were once willing to purchase expensive links buying services that simulate the browsing patterns of humans visiting their websites?
