High: This Hint is very important, and definitely warrants attention.
Issue: This Hint represents an error or problem that needs to be fixed.

Mismatched nofollow directives in HTML and header

This means that the URL in question has follow/nofollow directives in both the HTML and the HTTP header, and the two directives do not match.

Why is this important?

One location uses 'follow' and the other uses 'nofollow'. The net result is that the page will be treated as 'nofollow', even if the intention was for its links to be followed. Google clearly states that when multiple directives conflict, it will select the most restrictive option (and the other search engines most likely follow suit).

It is considered best practice to specify robots directives only once for any given URL. Specifying them in multiple places makes the configuration more open to error, as is the case in this scenario.

You can avoid this type of conflict by specifying robots directives in a single location.
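To illustrate the "most restrictive wins" behaviour described above, here is a minimal Python sketch (not Google's actual implementation) of how conflicting follow/nofollow values are typically resolved: any 'nofollow' overrides 'follow'.

def effective_follow_directive(meta_content: str, x_robots_header: str) -> str:
    """Return 'nofollow' if either source specifies it, otherwise 'follow'."""
    for value in (meta_content, x_robots_header):
        tokens = value.replace(" ", "").lower().split(",")
        if "nofollow" in tokens:
            return "nofollow"
    return "follow"

# Meta says 'noindex,follow', header says 'noindex,nofollow':
print(effective_follow_directive("noindex,follow", "noindex,nofollow"))  # -> nofollow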

What does the Hint check?

This Hint will trigger for any internal URL which contains mismatched follow/nofollow directives in the HTML and in the HTTP header.

Examples that trigger this Hint

The Hint would trigger for any URL that had both of the following:

Meta nofollow in the <head>,

<!doctype html>
<html lang="en">
<head>
<title>example</title>
<meta name="robots" content="noindex,nofollow">
...
</head>
<body>...</body>
</html>

AND a follow directive in the HTTP header:

HTTP/... 200 OK
...
X-Robots-Tag: noindex,follow

Similarly, the Hint would also trigger for the inverse:

Meta follow in the <head>,

<!doctype html>
<html lang="en">
<head>
<title>example</title>
<meta name="robots" content="noindex,follow">
...
</head>
<body>...</body>
</html>

AND a nofollow directive in the HTTP header:

HTTP/... 200 OK
...
X-Robots-Tag: noindex,nofollow
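The sketch below shows the kind of comparison the Hint performs for a single URL. It uses the 'requests' and 'beautifulsoup4' packages as an assumed toolset for illustration; it is not Sitebulb's own implementation.

import requests
from bs4 import BeautifulSoup

def has_mismatched_follow_directives(url: str) -> bool:
    response = requests.get(url, timeout=10)

    # Follow/nofollow from the X-Robots-Tag HTTP header, if present.
    header_value = response.headers.get("X-Robots-Tag", "").lower()

    # Follow/nofollow from the robots meta tag in the <head>, if present.
    soup = BeautifulSoup(response.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    meta_value = (meta.get("content") or "").lower() if meta else ""

    def follow_state(value: str):
        tokens = [token.strip() for token in value.split(",")]
        if "nofollow" in tokens:
            return "nofollow"
        if "follow" in tokens:
            return "follow"
        return None  # directive not specified in this location

    header_state = follow_state(header_value)
    meta_state = follow_state(meta_value)

    # A mismatch exists only when both locations specify the directive and they differ.
    return header_state is not None and meta_state is not None and header_state != meta_state

if __name__ == "__main__":
    print(has_mismatched_follow_directives("https://example.com/"))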

How do you resolve this issue?

This is a scenario where the robots directives conflict, so the outcome may not be what was intended. To resolve the issue, you first need to determine which directive is 'correct', then fix the incorrect one so that the two match.

Following on from this, to help avoid such issues in the future, it would also be prudent to adjust the page template so that it uses only one method of setting robots directives.

To do this you may need developer help, as you will need to adjust page templates, plugins or HTTP headers, removing the duplication so that robots directives are defined only once.
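As a minimal sketch of the "single location" approach, the example below uses Flask (an assumed stack, purely for illustration) to set the directive only via the X-Robots-Tag header; the page template would then contain no <meta name="robots"> tag at all.

from flask import Flask, make_response

app = Flask(__name__)

@app.route("/example-page")
def example_page():
    html = ("<!doctype html><html lang=\"en\"><head><title>example</title></head>"
            "<body>...</body></html>")
    resp = make_response(html)
    # Single source of truth for robots directives on this URL.
    resp.headers["X-Robots-Tag"] = "noindex,follow"
    return resp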
