
What to do when your website attracts an abundance of links from domain directories in unwanted ccTLD locations

In my opinion, the question has two aspects: the first involves examining the problem from the perspective of artificially created, low-quality spam backlinks, and the second from the perspective of unwanted or low-quality traffic.

Now, referring to the first aspect:

According to the latest information from Google, in most cases you should do nothing. In their words, they are pretty good at identifying spam links and would not even need you to disavow them, as such links are simply ignored…

In my own practice, as part of requested backlink hygiene, I have uploaded a disavow list on a few occasions because of large numbers of spammy, low-quality links pointing to client websites. None of those clients has ever received a manual action related to such links, nor have I noticed any measurable impact from the disavowals.

As a priority approach, the following plan can be followed:

  • Check for Manual Actions
  • Evaluate the Quality of Links
  • Disavow Low-Quality Links (see the sketch after this list)
  • Contact Directory Owners (I have done this only once, around ten years ago)
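
To make the evaluation and disavow steps concrete, here is a minimal sketch. It assumes you have exported a backlink report (from Search Console or a backlink tool) into a CSV with a hypothetical source_domain column, and that the spam markers are placeholders you would replace with the findings of your own audit; the domain-level "domain:" syntax in the output is the format the disavow tool accepts.

```python
import csv

# Hypothetical markers of low-quality directory domains; replace with your own audit findings.
SPAM_TLDS = {".xyz", ".top", ".click"}                 # example TLDs you consider suspicious
SPAM_KEYWORDS = ("directory", "weblist", "toplinks")   # example substrings seen in spammy hosts

def looks_spammy(domain: str) -> bool:
    """Very rough heuristic -- a real audit should still review each domain manually."""
    domain = domain.lower().strip()
    return domain.endswith(tuple(SPAM_TLDS)) or any(k in domain for k in SPAM_KEYWORDS)

def build_disavow(backlinks_csv: str, disavow_path: str = "disavow.txt") -> None:
    flagged = set()
    with open(backlinks_csv, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            domain = row.get("source_domain", "")      # hypothetical column name
            if domain and looks_spammy(domain):
                flagged.add(domain.lower().strip())

    with open(disavow_path, "w", encoding="utf-8") as out:
        out.write("# Domains flagged during backlink audit\n")
        for domain in sorted(flagged):
            out.write(f"domain:{domain}\n")            # domain-level disavow line

if __name__ == "__main__":
    build_disavow("backlinks_export.csv")
```

The script only prepares the file; the resulting disavow.txt is still uploaded manually through the disavow links tool in Search Console.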

In my opinion, the second aspect is more interesting, as there are various reasons and solutions to consider, and they should be approached with great care.

Some of the reasons for not wanting this traffic could be that our services do not cover that area, that the traffic overloads our servers, or that there are legal considerations. Additionally, we may run separate websites that handle that region differently, offering different prices and deals, for example. At different points in my career I have had to handle such situations, and not always in line with Google's recommended best practices.

On one occasion, a top-drawer, five-star spa and wellness resort had a very specific requirement: to serve different content to users from different parts of the world, including different prices and offers.
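
As a rough illustration of that kind of requirement, here is a minimal sketch, assuming a Flask application and a hypothetical lookup_country() helper backed by whichever IP-geolocation database or API you use; the countries, currencies, and prices are invented placeholders, not the resort's actual offers.

```python
from flask import Flask, request

app = Flask(__name__)

# Hypothetical regional offers; real data would come from a CMS or pricing service.
OFFERS = {
    "GB": {"currency": "GBP", "spa_day": 180},
    "DE": {"currency": "EUR", "spa_day": 200},
    "US": {"currency": "USD", "spa_day": 220},   # default market
}

def lookup_country(ip: str) -> str:
    """Placeholder for a real IP-geolocation lookup (MaxMind, ipinfo, etc.)."""
    return "US"

@app.route("/offers")
def offers():
    country = lookup_country(request.remote_addr)
    offer = OFFERS.get(country, OFFERS["US"])    # fall back to the default market
    return f"Spa day from {offer['spa_day']} {offer['currency']}"
```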

On other occasions, when such traffic is simply undesired, IP-based blocking can be used, returning a 403 error response, even without any user-friendly error page.
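
A minimal sketch of that kind of blanket block, again assuming Flask and the same style of hypothetical geolocation helper; the blocked country codes are placeholders.

```python
from flask import Flask, request, abort

app = Flask(__name__)

BLOCKED_COUNTRIES = {"XX", "YY"}   # placeholder ISO country codes you do not serve

def lookup_country(ip: str) -> str:
    """Placeholder for a real IP-geolocation lookup."""
    return "US"

@app.before_request
def block_unwanted_regions():
    # Runs before every request; a bare 403 is returned with no friendly page.
    if lookup_country(request.remote_addr) in BLOCKED_COUNTRIES:
        abort(403)

@app.route("/")
def index():
    return "Welcome"
```

In production this check often sits in front of the application (firewall, CDN, or the web server itself) rather than in application code, but the logic is the same.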

Another example: a few years ago, I was working on channeling specific traffic to specific places for a large multinational. At that time they were detecting the user's country and redirecting to subdomains created to cover the needs of each location. A similar subdomain approach can be seen on Indeed.com. Other companies use country code top-level domains (ccTLDs) and redirect to those. On the other hand, the latest trend, and also the better SEO practice, is moving away from subdomains toward proper localized folder structures, as GoDaddy did. A common implementation uses an up-to-date, paid IP-geolocation data service and, whenever the location cannot be clearly identified or the lookup fails, always serves the US version as the default response.
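
A minimal sketch of that redirect logic, assuming Flask and a hypothetical paid geolocation helper; the folder names are invented, and any lookup failure or unknown country falls back to the US version, as described above.

```python
from flask import Flask, request, redirect

app = Flask(__name__)

# Hypothetical mapping of countries to localized folders (the more SEO-friendly layout).
LOCALE_FOLDERS = {"GB": "/uk/", "DE": "/de/", "FR": "/fr/"}
DEFAULT_FOLDER = "/us/"   # served whenever the country is unknown or the lookup errors

def lookup_country(ip: str) -> str:
    """Placeholder for a call to a paid IP-geolocation service."""
    raise NotImplementedError

@app.route("/")
def route_by_country():
    try:
        country = lookup_country(request.remote_addr)
    except Exception:
        country = None                      # lookup error -> default market
    folder = LOCALE_FOLDERS.get(country, DEFAULT_FOLDER)
    return redirect(folder, code=302)
```

Whether the target is a subdomain, a ccTLD, or a localized folder, the fallback behaviour is the same: anything that cannot be confidently located gets the US version.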

In cases of server overload not caused by bots, server-side throttling is typically the solution, rate-limiting not only real users but also API calls. For overload caused by bots, other measures should be implemented, and the effective ones do not rely on robots.txt restrictions but on programmatic blocking.
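
As an illustration of server-side throttling, here is a minimal in-memory rate-limiter sketch in Flask; the limits are arbitrary, and a production setup would normally enforce this at the web server, load balancer, or API gateway and use a shared store such as Redis rather than process memory.

```python
import time
from collections import defaultdict, deque
from flask import Flask, request, abort

app = Flask(__name__)

WINDOW_SECONDS = 60
MAX_REQUESTS = 120            # arbitrary per-IP budget, applied to pages and API calls alike
_hits = defaultdict(deque)    # per-process only; use Redis or similar in production

@app.before_request
def throttle():
    now = time.time()
    bucket = _hits[request.remote_addr]
    while bucket and now - bucket[0] > WINDOW_SECONDS:
        bucket.popleft()              # drop requests outside the sliding window
    if len(bucket) >= MAX_REQUESTS:
        abort(429)                    # Too Many Requests
    bucket.append(now)

@app.route("/api/data")
def api_data():
    return {"ok": True}
```

Bots that ignore robots.txt end up being handled by this same kind of programmatic layer, typically keyed on IP, user agent, or behaviour rather than on a crawl directive they never read.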

In conclusion, it should be highlighted that holistic research and analysis should be done before taking any action, as even seemingly simple, no-development solutions may have a significant, difficult-to-correct negative impact!


Marin Popov

Marin Popov – SEO Consultant with over 15 years of experience in the digital marketing industry. SEO Expert with exceptional analytical skills for interpreting data and making strategic decisions. Proven track record of delivering exceptional results for clients across diverse industries.

