The Quick Summary
The big data revolution has arrived, and those who forgo the tools to analyze what works will continue to fall behind. When SEOs highlight past algorithm changes (itself a function of chasing the algorithm), they help stop rumors about what you should, incorrectly, be doing to your website. Furthermore, given that the user experience on your website (local) may differ from what works at Google's (global) scale, shouldn't you know what works best for your own users?
Google asks you to do more than just build a valuable site, for example by asking you to clean up links you never purchased, and it is worth remembering that Google is not perfect: it has had to ask SEOs for help in noting when its algorithms break. As such, it's your duty, as an advanced SEO, to chase the algorithm in order to provide the best SEO marketing for your business.
Welcome to the Data Revolution
Image credit: SiSense
Nearly every industry is undergoing a data revolution, even comedy, whether by building a comedic robot or by mining data on a comedian's own audience to choose jokes suited to that audience's demographics. When stock-market firms are investing in econometrics to predict human behavior and when the market will rise or fall, Google's algorithm is easy by comparison, and figuring it out as best you can should be part of your SEO repertoire.
Recall what's been researched and written in Freakonomics or Guns, Germs, and Steel. The value of data and of making connections only increases over time, and at the speed at which technology changes, the longer you refuse to chase algorithms, the further you will fall behind the data-driven marketers your competitors are using to win in the SERPs (the 2012 Romney vs. Obama campaign is a similar example of why using big data properly matters).
Algorithm History is Chasing the Algorithm
Image credit: screenshot of Moz.com's algorithm history
When you try to figure out what happened, you are, in effect, chasing the algorithm. When you monitor your rankings, read surveys of what other SEOs think matters, look at past Google patents, or pay attention to correlation studies, these are all a function of wondering about Google’s algorithm and how you should improve your site for the search engine and for your users.
Knowing whether an update occurred on a specific day and knowing whether an H1 tag matters are the same kind of activity. The difference is the degree to which you are trying to understand the algorithm, and whether you are looking into the past or at the present. To claim that it is wrong to chase the algorithm while still caring about when Panda or Penguin hit smacks of hypocrisy.
Rumors Run Rampant, Stop Them Quickly
Unless you shy away from the SEO industry entirely, you'll know that rumors run rampant, and nothing makes businesses distrust SEOs faster than non-SEOs who believe every piece of hype about what will improve a website. Not knowing how to do SEO and what actually works is why some have come to view SEO with distrust; it has nothing to do with white-hat or black-hat tactics.
Thus, it's your job to know whether to tell a client to spend a lot of money inserting an H1 tag or to leave it alone. It's your job to tell them whether the 100 links in their footer will be a problem for their internal linking or whether Google ignores them completely.
Additionally, it's your job to stay up to date with the newest tactics, tips, and tricks so you don't end up recommending to your clients how Google's nofollow used to work rather than how it currently works. That means both paying attention to the experts, who often have to chase the algorithm to find this out, and running those tests yourself to stay ahead of your competition.
Google’s UX and Your UX May Differ
Google's goal is to create the best search engine user experience for its global users. Your goal is to create the best local user experience for your users. And sometimes, those two goals conflict. There's a reason why sites used to be built in Flash (yes, it was considered user-friendly) and why today you still find sites like Amazon cloaking their filter navigation. There will continue to be areas where what's best for SEO and what's best for the UX conflict, and knowing how much of an impact each choice has on either side is essential for revenue optimization.
Compromises often have to be made with other teams, so wouldn't you like the upper hand of knowing that a non-SEO-friendly navigation will cost the business more money than the UX conversion-rate lift will gain? Or, when it's recommended to push all relevant content below the fold in order to promote a value proposition, wouldn't you like to be right there with exact numbers on how much that will lower your rankings across the board, in effect lowering revenue by a set amount? That's the benefit of chasing the algorithm by testing those changes.
In fact, if it's perfectly okay to test within PPC because it all leads to a better user experience, then how is testing within SEO anything but the same benefit to users? By trying to figure out what Google's algorithm values as the best user experience, you are intrinsically trying to create the best user experience, period.
Want to know why PPC generally gets more spend than SEO? No one likes to invest heavily in a marketing channel that says, "Well, it may have been X, Y, or Z that helped." So stop guessing and start testing!
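As a sketch of what "testing" can mean in practice, here is a minimal before/after significance check on average rank positions. The page set, numbers, and two-week window are entirely hypothetical; the point is that even a simple permutation test can suggest whether a change plausibly moved rankings or whether you are looking at noise.

```python
import random
import statistics

# Hypothetical daily average rank positions (lower = better) for a group
# of test pages, two weeks before and two weeks after an on-page change.
before = [8.2, 7.9, 8.4, 8.1, 8.0, 8.3, 7.8, 8.2, 8.1, 8.0, 8.3, 8.1, 7.9, 8.2]
after  = [7.1, 7.4, 7.0, 7.3, 7.2, 6.9, 7.2, 7.1, 7.3, 7.0, 7.2, 7.1, 7.4, 7.0]

def permutation_test(a, b, trials=10_000, seed=42):
    """Two-sided permutation test on the difference of means."""
    rng = random.Random(seed)
    observed = abs(statistics.mean(a) - statistics.mean(b))
    pooled = a + b
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        diff = abs(statistics.mean(pooled[:len(a)]) -
                   statistics.mean(pooled[len(a):]))
        if diff >= observed:
            hits += 1
    return hits / trials

p = permutation_test(before, after)
print(f"Mean rank before: {statistics.mean(before):.2f}")
print(f"Mean rank after:  {statistics.mean(after):.2f}")
# A small p-value suggests the ranking shift is unlikely to be noise.
print(f"Permutation p-value: {p:.4f}")
```

In a real test you would also control for seasonality and algorithm updates hitting during the window, but the principle stands: measure, don't guess.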
Even Google Asks You to Do More
Image credit: Feedthebot
Let's take the view from another angle: if you were to create the best user experience possible and never focus on the search engine (i.e., build the website for users, not robots), you would never need to implement the following:
- nofollow tags (only used by search engines)
- canonical tags (only used by search engines)
- cleaning up bad links that you never sought but were penalized for anyway, and now have to clean up to show otherwise (this has nothing to do with building a valuable site for users)
These implementations would not be needed if Google (the robot) could figure everything out on its own; yet since Google's bot is built by humans, it has limitations.
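For reference, the two tags from the list above look like this in a page's HTML; the URLs here are placeholders:

```html
<!-- nofollow: asks search engines not to pass link equity through this link -->
<a href="https://example.com/untrusted-page" rel="nofollow">Example link</a>

<!-- canonical (in <head>): tells search engines which URL is the preferred version -->
<link rel="canonical" href="https://example.com/preferred-page" />
```

Neither tag does anything for a human visitor; both exist purely to help the crawler interpret your site correctly.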
Google is Not Perfect
Image credit: TheStreet
Lately, Google has preferred to use machine-learning (pattern-analysis) algorithms, Panda and Penguin being two examples. The thing is, pattern matching will always produce false positives or false negatives; the goal is to keep them within an acceptable margin of error. It's why Google has asked for help and rolled back some of the effect Panda had on sites: too many false positives, where good sites got hit incorrectly.
Want to avoid becoming a false positive or at least quickly reverse the damage? Then you need to pay attention to the algorithm and understand why you might have gotten hit. Not chasing the algorithm and showing Google when it’s wrong means your business will suffer needlessly. Stay on top of what is going on and you can be quick to bounce back and never take for granted what Google believes is the best user experience.
None of this is to say you should focus exclusively on chasing the algorithm, nor build exclusively for bots. The point is to be diligent in your SEO and to use the resources you have effectively to improve the bottom line of your business for the long term. If you have the resources to test and chase the algorithm, then you should do so to stay ahead of the competition and to build the best user experience possible, just as you would for PPC.