Multiple noindex directives
This means that the URL in question has a noindex directive in multiple locations (e.g. in the HTML and in the HTTP header).
Why is this important?
It is considered best practice to only specify robots directives once on any given URL. This is because doing it multiple times makes the configuration more open to human error.
Imagine you add an SEO plugin to your site which allows you to set robots directives, and you decide to set a particular page as noindex. At a later date, you add another plugin which also allows you to set robots directives, and again you remember to set this page as noindex.
At this point, there would be nothing 'wrong', as all the robots directives are in agreement. However, suppose that in the future you wanted to change the page from noindex to index, and went into one plugin's configuration to set this up. You may have completely forgotten that the other plugin is also setting robots directives, so you would end up with one plugin still setting noindex while the other has had it removed.
The net result would be that the page would remain noindex, even though you thought you had changed it. Google clearly states that where multiple directives conflict, it will select the most restrictive option (and it is likely that other search engines follow suit).
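As a hypothetical illustration of this failure mode (the plugin names and values are examples, not from a real site), the page's HTML might be updated via one plugin while the HTTP response is still controlled by the other:

```html
<!-- Set by the plugin you updated -->
<meta name="robots" content="index">
```

```http
HTTP/1.1 200 OK
X-Robots-Tag: noindex
```

Because the most restrictive directive wins, the page would remain noindex despite the change in the HTML.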
You can avoid such outcomes by only specifying robots directives once.
What does the Hint check?
This Hint will trigger for any internal URL which contains noindex directives more than once (either in the HTML or in the HTTP header).
Examples that trigger this Hint
The Hint would trigger for any URL that had either of the following:
Meta noindex multiple times in the <head>:
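For example (an illustrative snippet, not taken from a live site — duplication like this often comes from two plugins or templates each adding their own tag):

```html
<head>
  <title>Example Page</title>
  <meta name="robots" content="noindex">
  <!-- ...further down the same <head>, added separately -->
  <meta name="robots" content="noindex">
</head>
```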
OR meta noindex in the <head>:
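An illustrative snippet:

```html
<head>
  <meta name="robots" content="noindex">
</head>
```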
AND in the HTTP header:
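The same directive delivered via the X-Robots-Tag response header (example response, status line and values illustrative):

```http
HTTP/1.1 200 OK
X-Robots-Tag: noindex
```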
Why is this Hint marked 'Potential Issue'?
This Hint is a 'Potential Issue', which means that it is unlikely to be affecting the site at the moment, but should be investigated as it could cause issues in the future.
Robots directives implemented multiple times are usually not deliberate, so Sitebulb flags this so you can remove the potential for future damage. To do this you may need developer help, as you will need to adjust page templates, plugins or HTTP headers, removing the duplication so that robots directives are only defined once.