Google Search Algorithm Update: Freshness
Google Updates Search Algorithm to Include Freshness Factor
So I thought it would be rude not to mention the recent update to Google’s algorithm, the significance of which I’m still weighing up. On Friday (04.11.11) Google announced that they had amended their search ranking algorithm in a way that would affect approximately 35% of search results.
When you consider that almost 80% of online search goes through Google (according to some reports), that 35% represents a huge number of results.
So what exactly has changed? Well, Google are calling it ‘freshness’. Basically, they’ve amended their algorithm so that it differentiates more finely and more intelligently between old and more recent or “fresh” results.
The idea is that someone searching for the results of the world’s greatest football club’s* last game wouldn’t want to be greeted by information on the 1960-61 League and FA Cup Double when they typed ‘Spurs results’ into Google. They’d want details of the latest game (a 3-1 win over Fulham away if you must know), or at the very least the results from the current season or cup competitions. It’s precisely this kind of recent (and presumably more relevant) result that Google is aiming to provide more often.
Many of you will be wondering at this point – so what? I must admit, when I first heard about the update last week I had to do a double take – hasn’t Google already achieved this? Search for anything and you can adjust the time parameters at the side of the SERPs, specifying anything from the past hour to the past year. This lets you filter out older results or indeed restrict results to years gone by.
Well the difference with this latest update is that Google have tried to imbue their search algorithm with the ability to discern how important ‘freshness’ is to any given search term. That’s because freshness doesn’t always equal relevance. Many searches are aimed at gathering information on historical events, such as the keywords ‘what team has won a major trophy in all of the last six decades?’
So in theory, when you make a search, the Google search engine should now be more able to make a good decision regarding whether you want the very latest pages in the results or are less bothered by freshness.
With this in mind then what does all this mean for us in SEO? Well, it’s early days yet, but many people in the industry are worried that this will lead to an increase in spam. The argument is that businesses will simply take to creating low quality blogs with duplicate content which is simply designed to be published according to a schedule, with the express aim of appearing in search due to its ‘freshness’.
Whilst this is a danger, I don’t think that it will lead to a flood of low quality sites ranking well in search. Don’t forget that this is an update, not a complete overhaul – Google will not have forgotten the lessons learned and fixes implemented in its Panda update. In the early days especially, I anticipate that the effect of freshness on rankings will be downplayed; it will simply be another factor thrown into the mix to decide where to rank your site.
Having said that, I think there are a couple of points to make regarding how to make sure your sites are ‘freshness friendly’.
The first one is something we’ve been harping on about here at Creare since time began – regularly updated content. If your site hasn’t been updated since Spurs became the only non-league team to win the FA Cup back in 1901 then it’s not going to benefit much from Google’s freshness update. However, get yourself a blog, start publishing good content relevant to your site on a regular basis and Google could realise that you’re providing good quality content at the time when users are searching for it.
The second point is a little more technical, but according to WebProNews, a lot of webmasters are all but rushing to integrate timestamps into their XML sitemaps. This little additional tag could help show Google that your content is recent and fresh, therefore helping it to rank better in search results.
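For reference, the relevant tag in the sitemaps.org protocol is `<lastmod>`. A minimal sitemap entry with a timestamp might look like this – the URL and date below are placeholders, not real pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Placeholder URL for illustration -->
    <loc>http://www.example.com/blog/latest-post/</loc>
    <!-- W3C Datetime format; signals when the page last changed -->
    <lastmod>2011-11-04</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

The date can be as coarse as `YYYY-MM-DD` or include a full time and timezone offset; keeping it accurate matters more than keeping it precise.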
Finally, with these new changes it’s important to make sure that your site is being crawled quickly, easily and regularly. If Google can’t index your content quickly, or at all, then it’s going to be out of date in terms of freshness by the time they’re finally able to crawl it. Drop by Webmaster Tools for crawl diagnostics, improve your internal linking and update your site regularly to make sure Google drops by often enough to pick up your pages hot off the press.
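One quick, low-effort step along these lines is advertising your sitemap from robots.txt, so crawlers can discover new URLs without relying on links alone. A minimal sketch (the domain is a placeholder):

```
# robots.txt at the site root – example.com is a placeholder domain
User-agent: *
Disallow:

# Point crawlers at the sitemap so fresh URLs are picked up promptly
Sitemap: http://www.example.com/sitemap.xml
```

An empty Disallow line leaves the whole site open to crawling; the Sitemap directive works for Google and most other major crawlers.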
Ironically, only time will tell how much of an effect this update will have. I anticipate that the end result will be an improvement in the quality of search results for the end user, and it will probably mean a little more work for us in SEO!
*This is not necessarily the view of the management.