There’s nothing quite like a healthy dose of BS on a Monday morning to get those blogging juices flowing. In related news, Matt Cutts updated his blog post about paid links on Saturday. Search Engine Land and Search Engine Journal have already provided quality overviews of the update, but once again, I felt compelled to weigh in on the issue.
Matt’s update was written in a question-and-answer format, and despite the softballs he chose to address, there were still some areas of inconsistency. Most notably in this question and supposed answer:
Q: Google’s quality guidelines say “Make sites for users, not search engines.” Put that in context for me; how does that interact with buying links?
A: If someone is buying text links to try to rank higher on search engines, they’re already doing something intended more for search engines than for users. If you finish that guideline, you’ll see that it’s talking about doing radically different things for engines versus users (for example, cloaking or creating doorway pages). It would be a misinterpretation of that guideline to think “Okay, I can only do things for users, I can never do things for search engines. Therefore I can buy text links, but not in a way that doesn’t affect search engines.” That same philosophy would mean that you wouldn’t create a robots.txt file (users don’t check those), never make any meta tags (users don’t see meta tags), never create an XML sitemap file (users wouldn’t know about them), and wouldn’t create web pages that validate (users wouldn’t notice). Yet these are all great practices to do. So if you want to buy links, I’d buy them for users/traffic, not for PageRank/search engines.
I’m not sure whether Cutts was trying to get a laugh out of webmasters, but I just couldn’t help myself. You see, if you actually DO finish reading the guideline, you’ll find this:
Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you’d feel comfortable explaining what you’ve done to a website that competes with you. Another useful test is to ask, “Does this help my users? Would I do this if search engines didn’t exist?”
The emphasis is, of course, mine, but the contradiction remains. Cutts and Google are essentially telling us to only do things for them (the search engines) when they need us to (robots.txt telling their bots where to go, nofollow telling them which links to ignore, sitemaps making sure their bots don’t miss anything). In my humble opinion, the guidelines should be completely rewritten to better fit with Google’s new policies. For example, the section I quoted above would be much more accurate if it read:
Avoid tricks that we haven’t approved that could possibly be interpreted as intending to improve search engine rankings. A good rule of thumb is “If in doubt, defer to Matt Cutts”. Another useful test is to ask, “Does this make Google’s job/life/business easier in any way? By doing this am I helping the search engines (specifically Google) make more money by appearing to have higher quality results?”
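For anyone who hasn’t touched these files, here’s roughly what those engine-only mechanisms look like in practice. These are minimal sketches of the standard formats (example.com is just a placeholder):

```
# robots.txt — tells crawlers which paths to skip; no human visitor ever requests this
User-agent: *
Disallow: /private/

<!-- A nofollowed link — tells engines not to count the link, invisible to readers -->
<a href="http://example.com/" rel="nofollow">a paid link</a>

<!-- sitemap.xml — points crawlers at pages their bots might otherwise miss -->
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://example.com/some-page/</loc></url>
</urlset>
```

Every one of these exists purely for the benefit of search engines, which is exactly the point.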
We’re also given our (seemingly weekly) dose of Google brand FUD, just in case the intimidation has worn off a bit:
The second interesting thing about these links is that our current approach to paid links worked quite well in this case. Our existing algorithms had already discounted these links without any people involved. However, our manual spamfighters had detected these links as well.
Basically, “Even though we’re begging you to help us, we still know everything and don’t you ever forget it.” Behold, the beauty of Google doublespeak.