In April 2012, Google rolled out an extensive algorithm update. Code-named Google Penguin, the update targeted websites that violated Google’s Webmaster Guidelines in an effort to boost their page rankings. Websites participating in link schemes and using SEO techniques such as content duplication and keyword stuffing were identified by Penguin and had their page rankings dropped. Penguin was part of Google’s ongoing effort to serve its users the most relevant, high-quality, content-rich websites. Although Google has yet to announce when its next big algorithm update will be unleashed on unsuspecting webmasters, speculation about what changes the search engine will make has set the technology world abuzz.
Merchant Quality
Speaking at SXSW, senior Google engineer Matt Cutts hinted that the next update might focus on the quality of the merchants retrieved in its search results. Google currently collects information about e-commerce stores via its Google Trusted Stores programme, but the new algorithm could factor in other ratings and rankings to assess the quality of online merchants. This could be good news for consumers looking for trusted websites to purchase from, and it may make products and services easier to find. However, small niche businesses could suffer from such an update if larger retailers are given preference in the search results.
It’s All About the Content
Google strives to create the best user experience for browsing the web, and the service it provides relies on the quality of the websites its search results retrieve. Before Google refined its algorithms, dubious webmasters could repeat keywords and duplicate content to boost their rankings. When Google Panda (Penguin’s predecessor) hit in 2011, it specifically targeted websites with thin content; nearly 12 per cent of all website rankings were affected. Penguin made similar tweaks to Google’s algorithms by targeting low-quality content, so it wouldn’t be surprising if the next big update had similar ramifications. Content strategy is more important now than ever, and webmasters must maintain a steady stream of quality content if they want to prove the worth of their website to Google.
Social Media Tidy Up
Link participation and referral schemes have long been labelled ‘black hat’ SEO techniques, but it seems some websites are employing similar tactics with social media. The number of ‘Likes’ and ‘Followers’ a website has should reflect its quality and indicate how active its community is and how much content it generates. However, while social media can help boost a website’s traffic, webmasters who artificially inflate this sort of social activity may find themselves on the wrong end of a Google algorithm update. Google could start weighing disproportionate social media activity against the actual worth and content of a website and adjust its page ranking accordingly.
Link Profiles
In October 2012, Google launched a disavow links tool. When ranking a website, Google checks what kind of links it receives from other sites. These incoming links shape the perceived quality of a website, and link schemes and paid links violate Google’s Webmaster Guidelines. Google treats such low-quality links as spammy and artificial; they reflect badly on the website they point to, ultimately hurting its ranking and link profile. Using the disavow tool, webmasters can flag low-quality third-party links and ask Google not to take them into consideration when ranking their website. As Google has made link profiles a focus in this way, the next algorithm update could step up the search engine’s efforts to weed out spammy backlinkers.
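For readers unfamiliar with the tool, disavow requests are submitted as a plain-text file uploaded through Google’s Webmaster Tools. A minimal sketch of the format might look like the following (all domains and URLs here are placeholders, not real offenders):

```text
# Disavow file sketch - lines beginning with # are comments.
# Ask Google to ignore every link coming from an entire domain:
domain:spammy-directory.example.com
domain:paid-links.example.net
# Or disavow individual pages that link to the site:
http://low-quality-blog.example.org/comment-spam-page.html
```

Domain-level entries use the domain: prefix, while individual offending pages are listed as full URLs, one per line.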
Poorly Formatted Websites
A website’s user experience isn’t defined by its content alone. Users must be able to navigate a website intuitively to find the content they want. Google already penalises websites with broken links, so it may turn its attention to those that are poorly optimised and suffer from slow performance. Websites that take too long to load, have buggy code or contain invalid page elements could be the focus of the next algorithm update. And with the rise of smartphone and tablet browsing, Google may also take a look at the coding of websites that offer mobile-optimised navigation.
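Mobile-optimised navigation of the kind mentioned above typically starts with a responsive page setup. A minimal sketch is shown below; the class name and 600px breakpoint are illustrative assumptions, not a Google requirement:

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <!-- Tell mobile browsers to render at device width rather than a zoomed-out desktop view -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    /* Mobile-first: navigation links stack vertically by default */
    .nav a { display: block; }
    /* On wider screens, lay the navigation out horizontally */
    @media (min-width: 600px) {
      .nav { display: flex; }
    }
  </style>
</head>
<body>
  <nav class="nav">
    <a href="/">Home</a>
    <a href="/products">Products</a>
  </nav>
</body>
</html>
```

The viewport meta tag and a media query are the two pieces that let one page serve both desktop and mobile visitors without separate mobile markup.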