These repetitive requests can worsen pageload issues by putting additional load on the server. There may also be instances in which crawlers/spiders converge on a page that is erroring out (e.g., returning 502s). Since search indexing is desirable for most sites, tread carefully to avoid wreaking havoc on a site's SEO. Some legitimate bots/crawlers/proxies (such as BingBot or AdsBot-Google) will identify themselves in their User-Agent string.
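One way to see which self-identifying bots are hammering an erroring page is to scan the access log for 502 responses and tally the User-Agent strings. Below is a minimal sketch assuming a combined-log-format access log; the sample log lines and the `agents_hitting_502` helper are hypothetical, purely for illustration.

```python
import re
from collections import Counter

# Hypothetical sample lines in Apache/Nginx combined log format:
# the numeric status code is followed by the response size, the
# quoted referrer, and finally the quoted User-Agent string.
LOG_LINES = [
    '203.0.113.5 - - [10/Oct/2023:13:55:36 +0000] "GET /page HTTP/1.1" 502 0 "-" '
    '"Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"',
    '203.0.113.9 - - [10/Oct/2023:13:55:40 +0000] "GET /page HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0"',
    '203.0.113.5 - - [10/Oct/2023:13:55:44 +0000] "GET /page HTTP/1.1" 502 0 "-" '
    '"Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)"',
]

# Capture the 3-digit status code and the final quoted field (the User-Agent).
PATTERN = re.compile(r'" (\d{3}) \d+ "[^"]*" "([^"]*)"$')

def agents_hitting_502(lines):
    """Count User-Agents that received 502 responses."""
    counts = Counter()
    for line in lines:
        m = PATTERN.search(line)
        if m and m.group(1) == "502":
            counts[m.group(2)] += 1
    return counts

print(agents_hitting_502(LOG_LINES))
```

If a well-known crawler dominates the 502 counts, that points toward fixing the underlying error (or rate-limiting via `robots.txt` crawl hints) rather than blocking the bot outright, which could hurt indexing.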