Without a doubt you’ll have noticed Google’s recent moves in the search space to focus publishers on speed. It makes commercial sense for them to encourage faster websites: lower bandwidth consumption and quicker, easier indexing of the web translate directly into savings of millions of dollars. I thought it would be interesting to take a look at exactly what they have said and done to date about improving website speed, the tools and information they have given us, and what impact those decisions are having on their overall strategy going forward.
6th May 2005 – Google announce Web Accelerator, a client application which prefetches pages and delivers them through Google servers dedicated to serving accelerator traffic. The project was later discontinued on January 20, 2008 following numerous bug reports.
28th April 2009 – A Matt Cutts video confirms that speed isn’t yet a factor in the algorithm, but could potentially become one, quoting Larry Page’s desire to make the web ‘like a magazine’.
23rd June 2009 – Google’s research blog publishes findings from experiments run on the SERPs, showing just how much of an impact speed has on users.
10th August 2009 – Google announce a new architecture inside the Plex – dubbed the Caffeine update – which concentrates on indexing more of the web, faster.
12th November 2009 – The Chrome and Google Research teams publish details of a potential new protocol, ‘SPDY’, for delivering information faster over the web.
13th November 2009 – Webpronews had the scoop on a video interview with Matt Cutts at PubCon 2009, a webmaster and publisher conference in Las Vegas. Matt mentioned that the Caffeine update, which had been in testing, would begin rolling out in January 2010 – and gave the first mention of site speed potentially becoming a ranking factor.
2nd December 2009 – Google roll out a new ‘Site performance’ tool as a Labs feature in Webmaster Tools, allowing those interested to begin tracking latency.
3rd December 2009 – Google announce their own public DNS service, designed to speed up DNS lookups across the web.
9th April 2010 – Google make it official that site speed is now used in the algorithm, affecting a small number of queries, but still sending webmasters into a frenzy.
8th June 2010 – Google Caffeine completes its rollout across all datacentres.
30th September 2010 – Google announce a new image format for the web, ‘WebP’, which offers a smaller, faster-loading alternative to JPEGs.
3rd November 2010 – Google release mod_pagespeed, an Apache module which helps manage website speed server-side, performing optimisations on the fly before content is served.
17th March 2011 – Google tweak the AdSense code, lowering the time taken to deliver ads by rendering them in iframes (allowing ad requests to occur in the background).
31st March 2011 – Page Speed Online becomes available across all browsers, including mobile devices, catering for the growth of the mobile web.
4th May 2011 – Google Analytics begins tracking page speed, provided an extra piece of code is added to the site, giving insight into which pages on your website cause significant browser (and Googlebot) load.
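On that last entry: at the time, opting in meant adding a single extra call to the standard asynchronous tracking snippet. A minimal sketch, assuming the classic `_gaq`-based ga.js async code (the `UA-XXXXX-X` property ID is a placeholder):

```javascript
// Classic Google Analytics async snippet with page speed sampling enabled.
// Commands are queued on the _gaq array and processed once ga.js loads.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']); // placeholder property ID
_gaq.push(['_trackPageLoadTime']);        // opts this page into speed sampling
_gaq.push(['_trackPageview']);
```

The `_trackPageLoadTime` call sits between `_setAccount` and `_trackPageview`; without it, no load-time data is sampled for the page.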
If that wasn’t enough evidence of Google’s focus on speed, have a quick look at this graph. It’s a timeline of content coming onto the web relating to the term ‘google speed’. Clearly the internal emphasis has grown significantly in recent years, as has the size of the web – 1 trillion URLs according to Google, and growing. Not all of these are added to Google’s index: some are excluded as spam, but most are left out simply because of the cost of indexing.
Where is it all headed?
Speed is one part of the story. Matt Cutts has said time and time again: build a site for users, and Google will deem it important. Site speed is good for users, so it stands to reason that Google have begun to use it as a ranking factor.
But what comes after speed?
In my opinion, we should all begin to look at engagement as the next potential ranking factor in Google’s algorithm, if it isn’t one already. Whilst this is a particularly difficult metric to examine and analyse, it is the next logical step in Google’s game of chess. We’ve already seen them using social engagement. Once site speed has been cleaned up, they can begin to more accurately examine engagement as a ranking metric: time on site, average pages per visit, time spent per page. All factors which in principle reflect a ‘sticky’, quality site – and ‘quality information’ whose measurement will no longer be skewed purely by the speed at which it is delivered, provided their guidelines have been followed.
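To make those metrics concrete, here is a rough sketch – my own illustration, not anything Google have published – of how time on site and pages per visit might be derived from a single visitor session of timestamped pageviews:

```javascript
// Illustrative only: derive simple engagement metrics from one visitor session.
// Each hit is a pageview with a timestamp in milliseconds.
function engagementMetrics(hits) {
  var sorted = hits.slice().sort(function (a, b) { return a.time - b.time; });
  var timeOnSite = sorted.length > 1
    ? sorted[sorted.length - 1].time - sorted[0].time
    : 0; // a single-page visit registers no measurable duration (a "bounce")
  return {
    pagesPerVisit: sorted.length,
    timeOnSiteSeconds: timeOnSite / 1000,
    // Only the gaps between hits are measurable, hence length - 1.
    avgTimePerPageSeconds: sorted.length > 1
      ? (timeOnSite / 1000) / (sorted.length - 1)
      : 0
  };
}

// Example: a three-page visit lasting 90 seconds in total.
var m = engagementMetrics([
  { page: '/', time: 0 },
  { page: '/article', time: 30000 },
  { page: '/contact', time: 90000 }
]);
// m.pagesPerVisit → 3, m.timeOnSiteSeconds → 90, m.avgTimePerPageSeconds → 45
```

Note how the single-page case falls straight out: a bounce contributes zero measurable time, which is exactly why high bounce rates drag these engagement figures down.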
If – and that’s a massive ‘if’ – Google do manage to get their hands on the data they need to accurately determine the level of engagement happening on your site, you can expect the impact of social sharing to be significantly downplayed in the Google algorithm. If you have experienced social traffic in any way, shape or form, you will already know that the quality is normally rubbish: high bounce rates and low engagement. That’s not where Google want to position themselves.
For e-commerce, it’s already been shown that social visitors rarely convert or interact with your content. They come, they share, they leave – and bounce rates go through the roof. If that is the behaviour of visitors from these platforms, is it really an endorsement of quality for Google to weight so heavily? Sure, it’s a link that counts for something, but it’s certainly not as strong a signal as your average hyperlink.
The move to site engagement over social engagement makes sense, especially in a world where Google are rapidly losing the social battle to Facebook. Twitter, in many ways, isn’t half the threat to Google that Facebook is: Twitter don’t have the data on individual web users that makes targeted advertising on the Facebook platform so powerful.
Further evidence of this move towards engagement can be seen in the shake-out from the recent Panda update. Google published a blog post giving general guidelines on what they deem, algorithmically, to be a good site:
“Are the articles short, unsubstantial” – the shorter the article, the less time spent reading it.
“Is this the sort of page you’d want to bookmark, share with a friend, or recommend?” – a hint at the role social is playing, but also – engaged visitors share more content than others.
“Are the topics driven by genuine interests of readers of the site…” – repeat visits would provide insight into how well an article is received; FeedBurner already provides publishers with that information.
Overall, the adage of building a quality site with quality content that generates links still holds a lot of weight. But with the data Google now have access to, in my opinion it makes complete sense to place as much, if not more, emphasis on retaining the visitors you already receive – using techniques to make your site stickier and more responsive, and your visitors more engaged. My next scheduled post will take a look at some of the techniques I’ve learned that help improve those metrics. Subscribe so you don’t miss out on it.