My server usage limit is being exceeded because Googlebot is crawling every second. I am running a multistore setup with 7 stores: 3 main websites and 4 reserved for later use.
Is there any way I can limit crawling to once a day? Or can I block Google from crawling the other 4 stores? Please guide.
We've found this occurring on multiple client sites, and it has become much worse recently. As long as the crawler is well behaved, adding `Crawl-delay: 10` to your robots.txt file limits it to one request every 10 seconds.
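As a rough sketch, the robots.txt below shows both halves of this: a throttled file for a live store, and a blanket `Disallow` for one of the four unused stores so well-behaved bots skip it entirely. This assumes each store domain serves its own robots.txt, which in an OpenCart multistore may need a per-store rewrite rule or extension. One caveat: Google documents that Googlebot ignores `Crawl-delay`, so for Google specifically you have to reduce the crawl rate in Search Console instead.

```
# robots.txt for a live store: allow crawling, but ask polite
# bots to wait 10 seconds between requests
User-agent: *
Crawl-delay: 10
```

```
# robots.txt for one of the four unused stores (a separate file
# served on that store's own domain): block all crawling outright
User-agent: *
Disallow: /
```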
It's also a good idea to block many of the dodgy bots and crawlers, such as SemrushBot and similar SEO scrapers, via the `.htaccess` file; a sketch of the blocking rule follows below. A good starting point is this list.
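As a minimal illustration (the bot names here are common examples, not a vetted list; extend it from whichever list you use), a mod_rewrite rule in `.htaccess` can return 403 Forbidden to matching User-Agents:

```apache
# Block aggressive SEO crawlers by User-Agent.
# The names below are illustrative examples only.
<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} (SemrushBot|AhrefsBot|MJ12bot|DotBot|BLEXBot) [NC]
    RewriteRule .* - [F,L]
</IfModule>
```

Unlike robots.txt, this works even against crawlers that ignore your rules, since the server refuses the request itself.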