What started in 1998 as a tool built by two friends at Stanford to help web users navigate the vast sea of information online turned out to be one of the most successful search engines in history. While Google is not the first search engine that ever existed (that distinction goes to Archie, which came eight years earlier), it is undoubtedly the most prominent search engine to date. In a day, Google receives and answers over one billion queries from people in all corners of the world. To be more specific, questions come from people in 181 countries and are asked in an astonishing 146 languages. What’s more fascinating is that 15% of daily searches are entirely original and have never been encountered before.

So how does Google handle billions of searches in a day? Plain and simple: through technology. Google and other search engines design and build computing programs, popularly known as algorithms. These programs are responsible for handling search requests on a daily basis, regardless of their substantial volume and breadth.

Since the creation of the Google search engine, the folks at Google have constantly come out with new algorithms or updated versions of old ones to enhance the efficiency of their search engine. While these are primarily designed to improve user experience, business owners and marketing companies also rely on them to market their products and services, as more and more users are using Google to do product research.

Many of the algorithm changes involved redesigning the search result interface, penalizing websites and “black hat” techniques, and issuing warnings regarding future penalties and subsequent actions on a variety of SEO issues.

So before the year ends, we look back at 2013 and see what new changes and improvements Google came up with.
We will then determine whether they made a significant impact on how people use the Google search engine, or whether they only made things more complicated for both consumers and the entrepreneurs who invest a lot of time to achieve strong web visibility in search engine results pages (SERPs).
The Year That Was: 2013 and Google Algorithms
January – Google started the year (January 24, 2013) with its 24th Panda refresh. Panda, which launched in February 2011, was created to reinforce the importance of quality website content in achieving high rankings in Google search results. According to Google, the refresh affected 1.2% of English queries, though the company did not specify exactly how it would affect searches. A Googler on Twitter posted a link as background on the refresh, which basically talks about helping users find high-quality sites and lowering the rankings of those with poor content.
March – After a one-month hiatus, Google rolled out the 25th Panda refresh, which again reiterated the increasing importance of quality content. From a business perspective, it is important for entrepreneurs and online marketing firms to distance themselves from automated backlinking tools. Not only is it essential for content to be fresh and original, but it is also important to create it on a regular basis.
Google also penalized a major link network called SAPE Links, which resulted in tremendous ranking downgrades. This also applies to websites that are connected to, or are clients of, this link network. Google has continuously warned businesses that use paid links that they can be penalized for doing so.
It was also during this month that Google announced its plan to integrate the Panda algorithm into its overall algorithm updates. This means the refreshes will be applied in real time as opposed to being pushed out manually, and will not be as noticeable to website owners and SEO companies since they will be rolled out gradually.
May – After another one-month break from updates, Google gave webmasters a one-two punch by releasing the Phantom and Penguin 2.0 updates. Branded by some as Panteguin, the pair affected websites with questionable link profiles containing too many paid links. What makes Penguin 2.0 much more complex and destructive to those with unnatural links is that Penguin 1.0 only considered homepage links, while 2.0 goes deeper into the site and checks any, if not all, pages.
Penguin is heavily link-focused, while Phantom was more concerned with content and resembled the Panda algorithm more than Penguin. Affected sites had similar issues, ranging from affiliate content and duplicate content to low-quality content. Some also relied on heavy cross-linking with exact-match anchor text. What’s interesting is that Panda also targeted websites with the aforementioned characteristics.
June – Webmasters and SEO companies prepared for the Panda Dance, which meant that updates would be pushed out monthly but rolled out over a span of ten days or so. The term Panda Dance stems from the old Google Dance, wherein search engine results “danced” around monthly whenever Google made changes to its algorithm.
August – This marks the release of the Hummingbird update, the most significant algorithmic shift in over a decade. It focused on making searches friendlier to conversational queries, as opposed to the traditional keyword search where users type in key phrases to find the information they need.
September – To make things even more aligned with the Hummingbird update, Google removed the keyword data that was still available in Google Analytics. Anyone who tries to access that data is greeted only with “(not provided).”
October – Penguin 2.1 hit the dancefloor, targeting a wide spectrum of websites. The most effective way to cope with the update was to remove any unnatural links; those that could not be removed manually had to be disavowed.
November – While there wasn’t any formal announcement of an algorithm update, The Moz Blog reported a temperature spike on November 14 in MozCast, which usually signifies an update. Key observations were that (1) the spike was a one-day change, and (2) those penalized were mostly small, low-quality sites whose rankings in SERPs were replaced by big brands that benefited from the update.
December – Yet again, Google denied a December 17 algorithm update despite the obvious changes shown by various tracking tools online. There were signs of serious changes in Google search results as shown by MozCast, SERP Metrics, Algoroo, SERPs.com, and more. Thankfully, there wasn’t much buzz in discussion areas, nor any complaints about abrupt drops in site traffic or shifts in rankings.
Perhaps the most definite change this month was the series of link networks (and the smaller websites connected to them) that were taken down by Google, including Anglo Rank and Backlinks.com. Businesses that have paid links need to either get rid of these bad links or keep a list of where they are placed, so they can be disavowed when needed.
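For reference, the file accepted by Google’s Disavow Links tool in Webmaster Tools is a plain text file with one entry per line: a `domain:` entry disavows every link from that domain, a bare URL disavows a single linking page, and lines beginning with `#` are comments. A minimal sketch (the domains and URL below are placeholders, not real link networks):

```
# Paid link networks we could not get removed manually
domain:example-link-network.com
domain:cheap-backlinks.example

# A single paid link on one page
http://blog.example.net/paid-review-post.html
```

Keeping a running file like this makes it straightforward to upload an updated version whenever Google takes action on another network.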
The Round-Up: Were Google’s Algorithm Updates in 2013 Good or Bad for Businesses?
While 2012 saw a great deal of collateral damage on the part of websites, Google’s fixes continued in 2013, but to a much lesser extent. Google laid out the rules and applied them, and websites with poor content and paid links took a direct hit. In hindsight, Google is not trying to make enemies of the millions of businesses with websites. In fact, its algorithms simply stress that websites need to focus on regularly creating unique, fresh content and on avoiding paid link services to retain or improve rankings on SERPs.
As for the Hummingbird update, the emphasis on natural language is in line with pushing Google’s latest technologies forward. Businesses that want their sites to rank should consult SEO experts to find the best web marketing strategies for keeping up with the new update. This means less focus on keyword data and more time spent looking at conversions and traffic.
If you observe Google’s way of doing things, it really does seem that they aren’t pushing out all these changes and updates to punish and hurt websites. Instead, they are doing their best to refine and improve the whole search process.