Monica Piccininno, Director of Operations

Surprising SEO Tactics That Google Doesn’t Support

As a marketing professional or a site owner, it can be difficult to perfect your SEO strategy. The rules seem to change constantly, and there are always new algorithm updates or releases to contend with. The good news is that the fundamental elements of SEO never change.

Everyone seems to have a list of "do's and don'ts" concerning every Google algorithm update. And in most cases, these recommendations will work. However, there are some tactics that Google and every other prominent search engine won't support. Here are some of them.

Crawl-Delay Commands

Specifying a crawl delay in your robots.txt file used to be the norm for many webmasters. The command told crawl bots how many seconds to wait between page requests. It seems logical that search engines would continue to support it, but that is no longer the case. Today's servers handle massive volumes of traffic, so Google's crawl bots simply ignore the directive.
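For illustration, a robots.txt entry using the now-ignored directive might look like this (the 10-second value is a made-up example):

```text
User-agent: *
# Googlebot ignores this directive; some other crawlers may still honor it.
Crawl-delay: 10
```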

It is still possible to change the rate at which your site gets crawled through the Site Settings tab in Google Search Console, which offers a range of crawl rates based on your specific server. It's not ideal, but it still gives you some control over crawl speed.

Language Tags

If you have ever applied language tags, you know there are many places to put them: as meta tags, in a request header, or as a lang attribute on an HTML tag, among others. However, not everyone uses them consistently, and as a result, Google ignores them altogether. If you have been inserting language tags for Google's benefit, stop; it is a waste of your time. Google determines a page's language from the text itself, using a proprietary system that looks at the page, its context, and its content.
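As a reference, these are the kinds of placements the section above describes; all of them are ignored by Google for language detection (the language values shown are examples):

```html
<!-- As a meta tag in the document head: -->
<meta http-equiv="content-language" content="en-US">

<!-- As a lang attribute on an HTML element: -->
<html lang="en">

<!-- Or sent by the server as an HTTP response header:
     Content-Language: en-US -->
```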

Priority and Frequency

Sitemap priority and frequency were a Google initiative that allowed webmasters to assign each URL in a sitemap a priority value, from 0.0 to 1.0, meant to determine how often that URL got crawled relative to others on the site. Unfortunately, most webmasters simply set every URL to the highest priority, which made the entire system useless. Google no longer considers priority values in sitemaps and instead uses its own set of signals to determine how frequently it should crawl each page.
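A sitemap entry carrying these now-ignored values looks like this (the URL is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <!-- Google no longer considers the next two values when crawling: -->
    <priority>1.0</priority>
    <changefreq>daily</changefreq>
  </url>
</urlset>
```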

Cookies on a Site

Google does not use cookies when crawling a site; it wants an objective, stateless view similar to what a first-time visitor would see, since cookies can change the content delivered. The only exception to this rule is when the content on a page will not work at all without cookies. Although it seems like Google should support this approach, along with many others, it doesn't.
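A minimal sketch of why this matters: if a server varies its markup based on a cookie, a stateless crawler only ever sees the cookie-less branch. The handler and the "variant" cookie name here are hypothetical, standing in for any server-side personalization logic.

```python
def render_page(cookies: dict) -> str:
    """Return page HTML that varies with the visitor's cookie state."""
    if cookies.get("variant") == "personalized":
        # Content only returning visitors (with the cookie) ever receive.
        return "<p>Welcome back! Here are your saved items.</p>"
    # A stateless crawl sends no cookies, so it only ever sees this branch.
    return "<p>Browse our catalog.</p>"

# A returning visitor with the cookie set sees personalized content...
visitor_html = render_page({"variant": "personalized"})
# ...while a cookie-less crawler gets the default markup every time.
crawler_html = render_page({})
```

Anything that appears only in the personalized branch is effectively invisible to the crawl.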

Search engines continually develop new ways (i.e., algorithms) to determine the best websites for their visitors. Ultimately, they want to make sure they are evaluating a site based on what matters. As a result, some tactics that SEO gurus and companies use no longer matter. So, instead of wasting time on methods that don't work, you can now focus on those that do.
