Ensure that search engines get the right indexation signals

Gain clarity on exactly which URLs are indexable and which are not, so you can make sure that only the right pages end up in the index.

Get to the bottom of any indexation issue

Sitebulb’s Indexation Report will help you untangle even the most complex indexation setups, giving you a clear understanding of anything that is going wrong. Whether it’s an over-zealous robots.txt file, conflicting noindex rules, or misplaced canonical tags, Sitebulb will alert you to any configuration issues.
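
For illustration, the decision involved can be sketched in a few lines of Python. The function and signal names below are hypothetical, not Sitebulb's actual implementation:

```python
# Illustrative only: a simplified version of the kind of decision an
# indexability check makes. All names here are hypothetical.
def is_indexable(status_code, allowed_by_robots_txt, meta_robots,
                 canonical_url, page_url):
    """Return (indexable, reason) for a single URL."""
    if status_code != 200:
        return False, "non-200 status code ({})".format(status_code)
    if not allowed_by_robots_txt:
        return False, "disallowed by robots.txt"
    if "noindex" in meta_robots.lower():
        return False, "noindex directive"
    if canonical_url and canonical_url != page_url:
        return False, "canonicalised to {}".format(canonical_url)
    return True, "indexable"

# A page that is crawlable but carries a noindex directive:
print(is_indexable(200, True, "noindex, follow",
                   "https://example.com/", "https://example.com/"))
# -> (False, 'noindex directive')
```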

See how robots.txt impacts any URL

Robots.txt files can cause major crawling and indexing issues, as wayward disallow rules can prevent big chunks of a website from being crawled at all. Sitebulb will tell you every single URL that is affected by the robots.txt file, and even pick out the specific rule that was triggered.
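
If you want to reproduce the basic allow/block check yourself, Python's standard library includes a robots.txt parser (the example.com URLs below are placeholders). Unlike Sitebulb, though, it won't tell you which rule matched:

```python
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://example.com/robots.txt")
parser.read()  # fetch and parse the live robots.txt

for url in ("https://example.com/", "https://example.com/private/page"):
    verdict = "crawlable" if parser.can_fetch("*", url) else "blocked"
    print(url, "->", verdict)
```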

Quickly identify canonical tag issues

Sitebulb will allow you to quickly check any pages with canonical tags that are not self-referential, so you can double-check that they point to the right URLs.

Further, it will check for canonical configuration issues, such as duplicate declarations or malformed URLs, and identify inconsistencies caused by compound robots rules, like canonical tags that point at noindexed URLs.
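
As a rough illustration of these checks, here is a standard-library Python sketch that extracts canonical tags and flags pages where the canonical is duplicated or points elsewhere. A real crawler would also normalise URLs before comparing:

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collect the href of every <link rel="canonical"> in a page."""
    def __init__(self):
        super().__init__()
        self.canonicals = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and (attrs.get("rel") or "").lower() == "canonical":
            self.canonicals.append(attrs.get("href"))

def check_canonical(page_url, html):
    finder = CanonicalFinder()
    finder.feed(html)
    if len(finder.canonicals) > 1:
        return "duplicate canonical declarations"
    if finder.canonicals and finder.canonicals[0] != page_url:
        return "canonical points elsewhere: {}".format(finder.canonicals[0])
    return "self-referential (or no canonical tag)"

html = '<head><link rel="canonical" href="https://example.com/other"></head>'
print(check_canonical("https://example.com/page", html))
# -> canonical points elsewhere: https://example.com/other
```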

Avoid potential problems from duplicate robots declarations

Robots directives can be specified in three different locations: in the HTML <head>, in the HTTP header, and in the robots.txt file. This can lead to multiple directives for a single URL, and potentially to conflicting directives. These types of inconsistencies are typically very hard to identify manually, and can cause massive problems if left unchecked.

Sitebulb will pull all these rules together and automatically check them all, picking out the specific issue and all URLs affected.
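
Conceptually, the check looks something like the sketch below, which merges directives from the HTTP X-Robots-Tag header and the meta robots tag, then flags contradictions. It is a simplified illustration, not Sitebulb's code:

```python
# Illustrative only: merge directives from two sources and flag conflicts.
# A real crawler would also fold in robots.txt and handle precedence.
def find_conflicts(x_robots_tag_header, meta_robots_tag):
    directives = set()
    for source in (x_robots_tag_header, meta_robots_tag):
        if source:
            directives.update(d.strip().lower() for d in source.split(","))
    conflicts = []
    for a, b in (("index", "noindex"), ("follow", "nofollow")):
        if a in directives and b in directives:
            conflicts.append("{} vs {}".format(a, b))
    return conflicts

# The HTTP header says noindex/nofollow while the page itself says the opposite:
print(find_conflicts("noindex, nofollow", "index, follow"))
# -> ['index vs noindex', 'follow vs nofollow']
```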

Sitebulb launches Summer 2017

To celebrate the launch, we're giving away 10 free lifetime licenses. Sign up now to get early access and a chance to win one.