
Subdomain exclusion #80

While crawling subdomains is great, there are some subdomains I'm not interested in.

e.g. jobs.site.com, while I would be interested in blog.site.com.

Is it possible to exclude specific subdomains from a crawl? If not, it's a feature that would be good to have, as it would also improve crawl budget efficiency.

Thanks,
Nathan

3 years ago

There is no way to exclude subdomains right now, Nathan, but we will definitely add it. Thanks!

3 years ago

Pretty sure this is now implemented by unchecking the box in the crawl settings?

3 years ago

@Nathan Horgan yes, you can use the checkbox to exclude subdomains, but that excludes all of them. If you want to exclude only specific ones, you can edit the robots.txt settings: disallow the specific subdomains there and Sitecheckerbot won't crawl them. Note that editing the robots.txt rules in Sitechecker won't affect the real robots.txt file on your server; these rules apply only to crawling inside Sitechecker.
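For reference, in the standard robots.txt protocol each host serves its own file, so a crawler is kept off an entire subdomain by a disallow-all file on that subdomain. A minimal sketch of the equivalent rules (the exact syntax Sitechecker's virtual robots.txt editor accepts may differ):

    # rules for https://jobs.site.com/robots.txt (or the virtual copy kept inside Sitechecker)
    User-agent: *
    Disallow: /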

3 years ago

Now you can exclude specific subdomains from crawling using the advanced 'exclude' and 'include' rules.

2 months ago
Changed the status to Completed
2 months ago