Much to the chagrin of many businesses, Google constantly tweaks its search algorithm in an attempt to provide users with the most pertinent information as quickly as possible. Each change can shake up search engine rankings, leaving affected businesses disgruntled. Any business with a website needs to understand the latest updates and how they affect those rankings, because failing to keep up with the Google algorithm likely means masses of lost traffic.
The Panda 4.0 Update
The Panda 4.0 update rolled out in May of 2014 with one specific purpose: to streamline the existing algorithm and make it harder for websites with poor or thin content to rank highly in the SERPs (Search Engine Results Pages). Google has made regular updates to the Panda algorithm in the past, but in this major release it reworked some of the most important aspects of the code.
The Consequences of Becoming Penalized
Again, the whole point is to clean up the web for Google’s users. Spammy sites, such as content mills that churn out links, horribly watered-down content, and duplicated content, were given lower rankings as a result of this new release. However, even top legitimate brands and businesses were affected by the powerful changes to the search engine. For example, in May of 2014 eBay took a massive blow as a result of the Panda update, with losses estimated at a staggering 80% of its organic search traffic, including long-tail keywords. This is not the first time something like this has occurred, though. In January of 2014, Expedia reported losing 25% of its organic search rankings after getting penalized by Google. However, despite so much upheaval of long-standing high rankings, Google did not share many of the technical, down-and-dirty details of this update.
Bing Implements Similar Algorithm
Not surprisingly, Google’s search engine competitor, Microsoft Bing, has mirrored this update in its own search engine. Bing, however, seems to have done a much better job of communicating at least some of the mechanics of how its code determines a piece of content’s quality. Bing reported that it uses at least three metrics: authority, utility, and presentation.
- Authority: Is the content trustworthy and credible?
- Utility: How useful is the content and how detailed is it?
- Presentation: Is the content easy to find and easily accessible or is it tucked away in some corner of a website?
While we do not know for certain, we might assume that the Google update uses similar metrics in its evaluation of website quality.