SEO myths and mysteries debunked

August 24, 2018

Despite the abundance of SEO myths and mysteries, we all know the aim of the game – we want our sites and pages to rank in the top SERPs for our niches and keywords. And yes, we are all gaming the system. Whether you consider yourself white hat, black hat or you think hats just don’t really suit you, we’re all playing the game. We’re constantly tweaking, adjusting and updating our sites to rank higher, and no matter which way you slice it, that’s a manipulation of the system.

Technically, all attempts at SEO are black hat. The Google Webmaster guidelines say: “Avoid tricks intended to improve search engine rankings.” But that’s exactly what SEO is.

So now that we’ve muddied the waters and got you nicely confused about your SEO moral compass, let’s go for it, hell for leather, and dig into the top 10 things we’ve always believed and been told about SEO that might not actually be true…

Ready to debunk some SEO myths and solve some mysteries? Let’s go.

Quick disclaimer: most of these myths seem to relate directly to Google, but they’re still handy for other search engines too.

SEO Myth 1: We know exactly what search engine ranking factors are

Wouldn’t this be nice? We’d love to know every single ranking factor, precisely and perfectly – it would make our jobs so much easier. But life isn’t that simple. So no matter what you think, you definitely do not know them all.

Sure, SEO software companies are always claiming that correlation studies have told us exactly how the algorithm works. But actually, we’re making assumptions and coming to conclusions from the tests that we do and we can’t test everything. Of course, testing our approaches is really important and it’s the only way we can be successful, but we should never follow the results of these tests blindly, or we’ll find ourselves in a mess sooner rather than later.

Remember that you don’t know everything. Engage critically with the results of every test you do. Don’t get lazy or sloppy.

SEO Myth 2: Links are the most important ranking factor

Following on from this very neatly (if I do say so myself) is the idea that the ranking factors can be prioritised or weighted, making any one more important than the others. This would be super useful too. If there were a set few ranking factors that would make much more of a difference than anything else, then we would all have prescribed ways to absolutely smash our SEO.

Yet again, not so simple.

All ranking factors have their part to play. Links are important, but so is content and structure and technical elements. Unfortunately, you’ve got to give every factor some love to give yourself the best chances. No slacking!

SEO Myth 3: Guest blogging is against Google’s ToS

You could be forgiven for believing this, if for no other reason than Google’s ToS are convoluted, confusing and sometimes plain stupid. Plus, Google’s employees don’t seem to be able to agree on this, with Matt Cutts and John Mueller tying themselves in all kinds of Twitter knots on the subject.

But, if you think guest blogging is entirely a no-no, you would be mythtaken. (I couldn’t help myself).

The long and short of it is that guest blogging has a lot of benefits. We all know about the authority, links and exposure that guest blogging can provide. So, as with most things in SEO, do it for quality rather than quantity (or at least cover your tracks) and you shouldn’t be penalised.

SEO Myth 4: There’s no point putting noindex directives in your robots.txt file

You might believe that Google ignores noindex directives in your robots.txt file, and that you need to use meta noindex tags on the specific page if you don’t want it to rank. Well, that’s probably because Google’s John Mueller told us all not to rely on robots.txt.

However, testing has shown that using robots.txt does work. What’s more, robots.txt is cleaner than having multiple confusing directives and you can use it to noindex groups of URLs all at once, rather than having to do them all individually. So that’s handy.
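As a sketch, here’s what the two approaches look like side by side (the URL paths are illustrative, and note that Noindex in robots.txt is an unofficial directive):

```text
# robots.txt — noindexing whole groups of URLs in one place
User-agent: *
Noindex: /tag/
Noindex: /internal-search/

# versus the per-page alternative, repeated in every page's <head>:
# <meta name="robots" content="noindex">
```

One file, a handful of lines, versus editing the template or head section of every individual page — that’s the cleanliness argument in a nutshell.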

SEO Myth 5: JS-injected canonical tags don’t work

Ok, so injecting canonical tags isn’t the most efficient approach, because it can take a while for JS-injected target URLs to be picked up by the search engine, but tests have shown that they do work eventually.
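To make that concrete, here’s a minimal sketch of what “JS-injected” means in practice. The helper name canonicalTag and the example URL are ours, purely for illustration:

```javascript
// Build a canonical <link> tag as a markup string (illustrative helper).
function canonicalTag(targetUrl) {
  return `<link rel="canonical" href="${targetUrl}">`;
}

// In a browser, the same idea via the DOM API: create the element and
// append it to <head>, where the search engine's renderer can pick it up.
if (typeof document !== 'undefined') {
  const link = document.createElement('link');
  link.rel = 'canonical';
  link.href = 'https://example.com/preferred-page'; // assumed target URL
  document.head.appendChild(link);
}
```

Because the tag only exists after the JavaScript runs, the search engine has to render the page before it sees it — which is exactly why the pick-up is slower than with a canonical tag served in the raw HTML.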

Yet again, it pays to look at the supposed rules more closely, pull them apart and test the crap out of them.

SEO Myth 6: You can use the robots.txt crawl-delay directive effectively

This used to be true, but times have changed. Now that modern servers are capable of handling huge amounts of traffic, this directive has become outdated, so Google will ignore it. There are other ways of adjusting the rate at which Google crawls your site, so play around with the site settings in Google Search Console.

Having said all of that, Bing still pays attention to the crawl-delay directive, so, go figure.
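If you do want to throttle Bing, the directive looks like this (the 10-second value is just an example):

```text
# robots.txt — ignored by Googlebot, but honoured by Bing
User-agent: bingbot
Crawl-delay: 10
```

The value is the number of seconds Bingbot should wait between requests; Google’s crawl rate has to be managed through Search Console instead.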

SEO Myth 7: You can manipulate crawls with sitemap priority and frequency

Yet again, wouldn’t this be great? We would love for search engines to crawl our most important pages more than any other pages on our site – it would make so much sense and would be such a great way of targeting all our SEO efforts and making them all worth it.

Keep dreaming.

Apparently, Google ignore sitemap priority and have their own logic that decides how frequently a page is crawled. Of course they do.
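For reference, these are the sitemap fields in question — the ones the protocol defines but Google reportedly pays no attention to (the URL is illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/important-page</loc>
    <!-- the hint fields Google is said to ignore: -->
    <priority>1.0</priority>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```

The loc and lastmod elements are still worth getting right; it’s the priority and changefreq hints that don’t buy you the crawl behaviour you might hope for.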

SEO Myth 8: It’s a bad idea to put hreflang attributes in anchor links

So, according to Google, they don’t particularly care if we put hreflang attributes on our anchor elements. But testing has shown that some hreflang attributes are picked up if they’re put on anchors. Emphasis on the word some here.

Basically, it’s not an entirely bad idea, but it’s also not a golden ticket to success. Play around with it, see if it works for you. Are you noticing a pattern yet?
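For clarity, here are the two placements being compared (example.com URLs are illustrative):

```html
<!-- The standard placement: hreflang on <link> elements in the <head> -->
<link rel="alternate" hreflang="en-gb" href="https://example.com/uk/">
<link rel="alternate" hreflang="de" href="https://example.com/de/">

<!-- The placement under discussion: hreflang on an anchor in the body -->
<a href="https://example.com/de/" hreflang="de">Deutsch</a>
```

The head-based link elements are the documented, reliable route; the anchor version is the one that tests suggest is only sometimes picked up.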

SEO Myth 9: Cookies are a super important factor in crawls

I’m going to keep this short and sweet: Google only cares about cookies if your content won’t work without them, otherwise it’s just white noise.

Next!

SEO Myth 10: Googlebot crawls HTTP/2

It doesn’t yet. Currently, Googlebot still only crawls over HTTP/1.1. So that’s worth bearing in mind.

TL;DR?

Basically, Google like to lead us on a merry dance, telling us contradictory stuff and then backpedalling as fast as they can. We should take it all with a pinch of salt, because at the end of the day the best option we have is to continue our rigorous testing and critical engagement with data and results.

But hey, it wouldn’t be fun if it was easy!

Want more insights like this? Of course you do, get yourself to UnGagged in Las Vegas this November. Buy your tickets now!
