In our last post we looked at how some of Google’s algorithm updates affected the SEO world and changed the way web content is written. The story continues as we look at some of the more significant updates that Google put in place in order to improve the quality of their search results.
Further changes to Google’s algorithms have been put in place over the years, some offering small refinements or tweaks to previous updates, others bringing in completely new features or approaches. Some of the most significant updates include:
January 2005 – The ‘Nofollow’ Update
With links becoming something to be carefully sourced rather than just bought in bulk from a link network, webmasters were increasingly looking for a way to distance themselves from links that may look ‘spammy’ to Google. For example, the ‘Designed by Website Agency X’ links that are commonly found in website footers became a concern, as there could be thousands of these on just a single website, all pointing to the homepage of the designer’s website – very typical of what Google terms “poor quality links”. Google therefore introduced an HTML link attribute, ‘nofollow’, which can be applied to footer links just like these. In essence, the nofollow attribute says to Google, “please don’t infer any SEO value from these links, for good or bad”. This way, webmasters have much more control over what kind of links are being published on other websites: it’s critical, from an SEO point of view, that Google sees a “natural profile” of links to a website. Having too many links from low-quality or directory websites is a strong indicator that a website has been buying links, a practice that Google is highly critical of and that will negatively affect SEO. Conversely, it’s impossible to avoid these kinds of links 100% of the time, particularly for web design companies, which is why Google recommends a natural profile of links. Any good website will naturally have a mixed profile of links, and that’s what Google’s looking for.
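In practice, nofollow is applied as a `rel` attribute on the link itself. A minimal sketch of the footer-credit scenario described above (the agency name and URL here are placeholders, not real sites):

```html
<!-- A typical footer credit link, marked with rel="nofollow"
     so that Google passes no ranking signal through it.
     "Website Agency X" and the URL are illustrative placeholders. -->
<footer>
  <p>
    Designed by
    <a href="https://example-agency.com" rel="nofollow">Website Agency X</a>
  </p>
</footer>
```

Without the `rel="nofollow"` attribute, the same link repeated across thousands of footer pages could look like part of a paid link scheme; with it, Google simply ignores the link for ranking purposes.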
June 2005 – Google XML Sitemaps Update
XML sitemaps are a crucial part of SEO, but that’s only been the case since they were introduced to Google’s ‘Webmaster Tools’ software in 2005. In order to provide the best and most comprehensive search results, it’s crucial that Google gets to see the full picture of each website it indexes. While crawling the site using scripted bots usually gives a fair impression of the website’s pages and layout, an XML sitemap goes one step further and provides Google with a complete map of a website’s pages, including the page structure and hierarchy. By setting up an XML sitemap and submitting it to Google, webmasters can directly inform Google of which pages are the most important, and also let Google know each time a page is created or updated. From Google’s point of view, this helps keep search results accurate and up to date – as with most aspects of SEO, XML sitemaps are underpinned by Google’s aim of providing the best search results to its users.
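The format itself is simple. A minimal sketch of a sitemap following the sitemaps.org protocol is shown below – the URLs and dates are placeholders, and the optional `<lastmod>`, `<changefreq>` and `<priority>` elements are the hints a webmaster can use to tell Google when a page changed and how important it is relative to the site’s other pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- A minimal XML sitemap (sitemaps.org protocol).
     example.com, the paths and the dates are placeholders. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-06-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
    <lastmod>2024-05-15</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.5</priority>
  </url>
</urlset>
```

Once the file is published (typically at the site root, e.g. `/sitemap.xml`), it can be submitted through Google’s webmaster tooling so Google is notified of new and updated pages.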
June 2005 – Google Personalised Search Update
“Google Personalised Search is a personalised search feature of Google Search, introduced in 2004. All searches on Google Search are associated with a browser cookie record. Then, when a user performs a search, the search results are not only based on the relevance of each web page to the search term, but also on which websites the user (or someone else using the same browser) visited through previous search results. This provides a more personalised experience that can increase the relevance of the search results for the particular user. Such filtering may also have some side effects, such as creating a filter bubble.
Changes in Google’s search algorithm in later years put less importance on user data, which means the impact of personalised search on search results is limited. Acting on criticism, Google has also made it possible to turn off the feature.”
October 2005 – Local Search/Maps Update
With Google’s ‘local’ search results being so ubiquitous these days, it’s hard to imagine search results pages without them, yet until October 2005 that was exactly the case. Google’s ‘Local Business Centre’ had already been launched earlier in the year, but this update was the first time that information from Google Maps was integrated into the local data.
By introducing ‘local’ search data and integrating it with its Google Maps results, Google began a strong push into the ‘offline’ search market, a tactic that helped redefine what search engines were used for. With Apple’s iPhone just two years away, Google was perfectly placed to take advantage of and influence the cultural shift towards ‘on the go’ search results that we now take for granted.
August 2008 – ‘Google Suggest’ Update
The ‘Google Suggest’ feature was introduced in August 2008 as a way to speed up and improve search queries. Instead of waiting for searchers to type in their whole search query, Google began showing a list of ‘suggested searches’ underneath the main search query as it was typed, a feature that is still in place to this day. Google went on to use this technology to power its ‘Google Instant’ search results, where the search results would update in real-time as the searcher types out their query – this was later rolled back in July 2017 as Google sought to bring its desktop search interface more into line with its mobile interface.
Where did Google go from here? Check back soon for part six of this story.