Surprising SEO Tactics That Google Doesn’t Support
As a marketing professional or site owner, it can be difficult to perfect your SEO strategy. The rules seem to change constantly, and there is always a new algorithm update to contend with. The good news is that the fundamentals of SEO rarely change.
Everyone seems to have a list of "dos and don'ts" for every Google algorithm update, and in most cases those recommendations will work. However, there are some tactics that Google, and every other prominent search engine, no longer supports. Here are some of them.
Crawl Delay in Robots.txt
Specifying a crawl delay in your robots.txt file used to be the norm for many webmasters. The Crawl-delay directive let them set the number of seconds crawl-bots should wait between page requests. It seems logical that search engines would continue to support this directive, but unfortunately it no longer works that way. Today's servers handle massive volumes of traffic, and Googlebot simply ignores the directive.
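For reference, the directive the paragraph above describes looks like this in a robots.txt file; the syntax is standard, but the ten-second value is just an illustration, and Googlebot will not honor it:

```
User-agent: *
Crawl-delay: 10
```

Some other crawlers, such as Bingbot, have historically respected this directive, which is why it still appears in many robots.txt files.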
It is still possible to change the rate at which your site gets crawled through the Site Settings tab in Google Search Console, which offers a range of crawl rates based on your specific server. It's not ideal, but it still gives you a way to influence crawl speed.
Language Tags
If you have ever applied language tags, then you know there are many different places to put them: as meta tags, in an HTTP response header, or as a lang attribute on an HTML tag, among others. However, not everyone uses them consistently, so Google ignores them altogether. If you have been inserting language tags for Google's benefit, stop; it is a waste of your time. Google determines a page's language from the text itself, using a proprietary system that looks at the page, the context, and the content.
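These are the three common placements described above; the French values here are purely illustrative, and Google derives the language from the visible text rather than from any of these declarations:

```html
<!-- 1. A lang attribute on the root element -->
<html lang="fr">
<head>
  <!-- 2. A meta tag mirroring the HTTP header -->
  <meta http-equiv="Content-Language" content="fr">
  <title>Exemple</title>
</head>
<!-- 3. The server can also send an HTTP response header:
     Content-Language: fr -->
<body>Bonjour</body>
</html>
```

Note that language declarations like these can still matter for accessibility tools and browsers, even if Google ignores them for language detection.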
Priority and Frequency
Sitemap priority and change frequency were part of Google's sitemaps initiative, which allowed webmasters to assign each URL on a site a priority value, from 0.0 to 1.0, signaling how important that URL was relative to others and, in turn, how often it should be crawled. Unfortunately, most webmasters simply set every URL to the highest priority, which made the entire system useless. Google no longer considers the priority values in sitemaps and instead uses its own set of signals to determine how frequently it should crawl each page.
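In sitemap XML, the hints in question look like the following (the URL and values are illustrative); Google now ignores both the priority and changefreq elements, though the loc element is still used:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <!-- Both hints below are ignored by Google today -->
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```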
Cookies on a Site
Googlebot generally crawls statelessly and does not carry cookies from one page load to the next, so content that only appears when a cookie is set may never be seen or indexed. Relying on cookies to deliver important content is therefore another tactic the search engines won't support.
Search engines continually develop new ways, in the form of algorithm updates, to surface the best websites for their visitors. Ultimately, they want to make sure they are valuing a site on what actually matters. As a result, some tactics that SEO gurus and agencies still rely on no longer matter. So, instead of wasting time on methods that don't work, you can now focus on those that do.