Just block the robots: add a robots.txt that prohibits all search engine crawling:
User-agent: *
Disallow: /
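Note that robots.txt must sit at the site root (e.g. https://example.com/robots.txt), or crawlers will not find it.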
For snapshots that have already been indexed, applying to delete or update the snapshot only helps once your content has actually been deleted or changed. The robots rules are slow to take effect, but you can push the update yourself in Baidu Webmaster Tools: Baidu Webmaster Platform.
First submit a snapshot deletion application to Baidu. Then remember to add a robots.txt to the test environment to prevent crawling, or reject spider requests directly in nginx.
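For the nginx approach, here is a minimal sketch, assuming the spiders identify themselves by User-Agent (the host name test.example.com and the UA list are illustrative; adjust both to your setup). The map block flags known spider User-Agents, and the server returns 403 to them while also serving a disallow-everything robots.txt:

# The map directive must live in the http {} block of nginx.conf.
map $http_user_agent $is_spider {
    default          0;
    ~*baiduspider    1;   # Baidu
    ~*googlebot      1;   # Google
    ~*bingbot        1;   # Bing
}

server {
    listen 80;
    server_name test.example.com;   # hypothetical test-environment host

    # Serve a disallow-everything robots.txt straight from nginx.
    location = /robots.txt {
        default_type text/plain;
        return 200 "User-agent: *\nDisallow: /\n";
    }

    # Return 403 Forbidden to any request identified as a spider.
    if ($is_spider) {
        return 403;
    }

    location / {
        # ... normal config for the test environment
    }
}

Keep in mind that matching by User-Agent only stops well-behaved crawlers; it is enough to keep a test environment out of search indexes, but it is not a security measure.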