
Subdomain exclusion #80

While crawling subdomains is great, there are some subdomains I'm not interested in.

E.g. jobs.site.com, while I would be interested in blog.site.com.

Is it possible to exclude specific subdomains from a crawl? If not, it's a feature that would be good to have, as it would also improve crawl budget efficiency.

Thanks,
Nathan

9 months ago

There is no way to exclude subdomains now, Nathan. But we will definitely add it. Thanks!

9 months ago

Pretty sure this is now implemented by unchecking the box in the crawl settings?

7 months ago

@Nathan Horgan yes, you can use the checkbox to exclude subdomains. But in that case you will exclude all subdomains. If you want to exclude only specific ones, you can edit the robots.txt settings: disallow specific subdomains there and Sitecheckerbot won't crawl them. Note that editing robots.txt rules in Sitechecker won't impact the real robots.txt file placed on your server. These rules apply only to crawling inside Sitechecker.
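As a rough sketch of what such rules could look like (the exact syntax Sitechecker's virtual robots.txt editor accepts may differ, and the bot name here is an assumption): a standard robots.txt applies only to the host it is served from, so "excluding a subdomain" amounts to a block-everything rule scoped to that subdomain's file.

```
# Hypothetical virtual robots.txt rules for jobs.site.com —
# a standard robots.txt only governs the host serving it, so
# blocking an entire subdomain means disallowing everything there.
User-agent: SitecheckerBot
Disallow: /
```

With a rule like this in place for jobs.site.com (and no such rule for blog.site.com), the crawler would skip the jobs subdomain while still crawling the blog, without touching the live robots.txt on the server.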

7 months ago