Noindex in HTML and HTTP header
This means that the URL in question has a noindex directive in both the HTML and in the HTTP header.
Why is this important?
It is considered best practice to specify robots directives only once for any given URL, because specifying them multiple times leaves the configuration more open to human error.
In this scenario, you have a noindex specified in the X-Robots-Tag in the HTTP header, and also a meta noindex in the HTML <head>.
At this point, there would be nothing 'wrong', as all the robots directives are in agreement. However, if in the future you wanted to change the page from noindex to index, and went into your page template or plugin configuration to change the meta noindex, you might completely forget to also change the HTTP header.
The net result would be that the page remains 'noindex', even though you thought you had changed it. Google clearly states that when multiple directives conflict, it will select the most restrictive option (and it is likely that the other search engines follow suit).
You can avoid such catastrophic futures by only specifying robots directives in one location.
What does the Hint check?
This Hint will trigger for any internal URL which contains noindex directives in both the HTML and in the HTTP header.
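The check described above can be sketched in Python. This is not Sitebulb's actual implementation, just an illustrative sketch: it parses the HTML for a meta robots noindex and inspects the response headers for an X-Robots-Tag containing noindex, reporting when both are present.

```python
from html.parser import HTMLParser


class RobotsMetaParser(HTMLParser):
    """Collects the content values of any <meta name="robots"> tags."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "meta" and attr_map.get("name", "").lower() == "robots":
            self.directives.append(attr_map.get("content", "").lower())


def has_double_noindex(html_body, headers):
    """Return True when noindex appears in both the HTML and the HTTP header.

    Note: for simplicity this looks up the header by the exact key
    'X-Robots-Tag'; real HTTP header lookups are case-insensitive.
    """
    parser = RobotsMetaParser()
    parser.feed(html_body)
    meta_noindex = any("noindex" in d for d in parser.directives)
    header_noindex = "noindex" in headers.get("X-Robots-Tag", "").lower()
    return meta_noindex and header_noindex
```

For example, a page served with `X-Robots-Tag: noindex` whose HTML also contains `<meta name="robots" content="noindex">` would return True, and would therefore trigger this Hint.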
Examples that trigger this Hint
The Hint would trigger for any URL that had both of the following:
Meta noindex in the <head>,
AND in the HTTP header:
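Concretely, the two directives would look something like this. First, the meta noindex in the <head>:

```html
<meta name="robots" content="noindex">
```

And the equivalent directive in the HTTP response header:

```
HTTP/1.1 200 OK
X-Robots-Tag: noindex
```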
Why is this Hint marked 'Advisory'?
In Sitebulb, this Hint is Advisory, as it does not represent a current issue; rather, it reflects a potential future one.
Implementing robots directives multiple times is usually not deliberate, so Sitebulb flags it so you can remove the potential for future damage. To do this you may need developer help, as you will need to adjust page templates, plugins or HTTP headers, removing the duplication so that robots directives are defined only once.