In our last blog we looked at how Google’s ‘Vince’ update changed the way Google treated brands, favouring larger, more established brands over smaller businesses, and how ‘real-time search’ burst onto the scene, significantly increasing the importance of social media marketing and search engine optimisation. In this edition we will look at how this was expanded, and how Google cracked down on poor-quality sites.
December 2010 – Google Social Signals Update
Google and Microsoft confirmed that they would include social signals from platforms such as Twitter and Facebook in their ranking algorithms for organic search results. Social signals measured the engagement users had with a particular page or social update, often treating that engagement as a citation or positive endorsement of the brand, in much the same way as a good-quality backlink.
The social authority of users on the included social media channels was used to weight organic search listings. Tweets and Facebook shares from prolific influencers or well-known voices started to boost the prominence and weighting Google gave the brands in question. The more people, especially ‘important’ people, talked positively about your brand, the clearer the social signal and the greater the benefit to your organic search results.
February / March 2011 – Google Farmer / Panda Update
One of the biggest updates Google had released up to this point was the Panda update, which focused on the quality of the websites included in search results. By assigning pages a quality classification and cracking down on low-quality websites that weren’t adhering to Google’s Quality Guidelines, Panda affected roughly 12% of search queries.
Due to its size and impact, Panda was rolled out gradually across the world, starting in America in February 2011 and reaching Europe the following month. Panda’s purpose was specifically to refine and improve the quality of the search results being returned. By rewarding websites housing unique and compelling content, Google aimed to answer customers’ queries more effectively whilst responding to the growing number of complaints it had received about ‘content farms’ – sites that produce a high volume of low-quality, keyword-stuffed content to secure high placement on search engine results pages.
Google’s Panda update hit the ‘article marketing’ approach hard: SEO practitioners relying on this high-volume, low-quality tactic to build links and rankings had to change their entire approach. To mimic human quality judgements, Google released a list of 23 questions that guided how the algorithm separated good websites from bad. These were intended to drill into the depth and integrity of an article or page, looking at the originality and authority of the author as well as the quality and presentation of the content.
For those website owners who woke post-Panda to find they had lost significant rankings, the fix was clear yet challenging: improve the quality and uniqueness of the content on their website. Whether that meant rewriting content across pages, removing low-quality pages, combining similar pages or restructuring the website, developers and marketers punished by Panda had some changes to make!
Catch the next blog in this series soon where we will be looking at some of the behind-the-scenes updates Google made to improve search engine readability and user privacy.