There’s been lots of discussion in the last few weeks about the activity of MSNbot. I’ve noticed a massive increase in how frequently this bot visits my sites, often spidering every single page every day of the week.
I have no problem with MSNbot (or Googlebot & Slurp) spidering my content. I agree with Mikkel deMib Svendsen’s comments on WebmasterRadio that there is a symbiotic relationship at work: we allow search engines to spider and index our content in return for traffic and leads. The problem with MSNbot’s hyperactivity is that no matter how often it spiders a page, those pages never seem to get added to the MSN Search index!
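One way to rein the bot in is a Crawl-delay rule in robots.txt. Here is a minimal sketch; the 864-second value is my assumption, chosen because 86,400 seconds in a day ÷ 864 works out to about 100 requests per day:

```
# robots.txt — throttle MSNbot only; other bots are unaffected
User-agent: msnbot
Crawl-delay: 864
```

Note that Crawl-delay is a non-standard directive: MSNbot and Slurp honour it, but Googlebot ignores it.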
The Crawl-delay value is specified in seconds – a setting of 864 seconds throttles MSNbot to roughly 100 requests per day, which will certainly help conserve bandwidth on larger sites. However, I and others still want to know why, even after so much bot activity, pages do not get added to the MSN Search index!