In order to crawl certain websites, you may need to adjust some of the default robots settings, which you can do via the Advanced Settings.
To get to Advanced Settings, scroll to the bottom of the main Audit setup page and click the grey Advanced Settings button.
The Robots section is under Crawler -> Robots:
By default, the Sitebulb crawler will respect robots directives, but you can override this by unticking the 'Respect Robots Directives' box.
This will reveal three further options, allowing you to control more precisely which robots directives are respected.
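For context, 'respecting robots directives' simply means checking each URL against the site's robots.txt before fetching it. The short Python sketch below illustrates that general idea using the standard library's robots.txt parser; it is not Sitebulb's own code, and the site, URL and user agent string are placeholders.

from urllib.robotparser import RobotFileParser

# Hypothetical site and URL, purely for illustration.
site = "https://example.com"
url = site + "/private/report.html"

# Fetch and parse the site's robots.txt.
robots = RobotFileParser()
robots.set_url(site + "/robots.txt")
robots.read()

# A crawler that respects robots directives checks each URL
# against robots.txt (for its own user agent) before fetching it.
if robots.can_fetch("Sitebulb", url):
    print("Allowed to crawl:", url)
else:
    print("Blocked by robots.txt:", url)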
Additionally, this section allows you to specify the user agent and set up a virtual robots.txt file.
By default, Sitebulb will crawl using the Sitebulb user agent, but you can change this by selecting a different one from the dropdown, which contains a number of preset options.
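The user agent can also matter for robots handling, because robots.txt directives are grouped by user agent. As an illustration (with made-up paths), a site might give one set of rules to all crawlers and a stricter set to a named crawler:

User-agent: *
Disallow: /checkout/

User-agent: Sitebulb
Disallow: /checkout/
Disallow: /staging/

A crawler identifying itself as 'Sitebulb' would match the second group, while any other user agent would fall back to the wildcard group.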
The virtual robots.txt option allows you to override the website's live robots.txt file and instead use a 'virtual robots.txt' file.
To use it, click the green button 'Fetch Current Robots.txt', which will populate the box above with the current robots.txt directives.
Then just delete or adjust the existing directives, or add new lines underneath. Sitebulb will follow these directives instead of the original ones.
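For example (with hypothetical paths), suppose the fetched robots.txt contains:

User-agent: *
Disallow: /checkout/
Disallow: /blog/

You could delete the /blog/ line so that section gets crawled, and add a new rule to keep the audit out of a staging area:

User-agent: *
Disallow: /checkout/
Disallow: /staging/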