You should see my server logs... omfg, they're full of crap like that. I don't care so much about those; that's what Debian and Apache are for. But the scumbags who work for the RIAA, with their web crawlers that disobey or ignore robots.txt and hammer away at a site as fast as they can... I have a nifty, slow-loading little infinite bot-trap black hole for them, and the worst ones get filtered at the firewall instead. I'm looking at a dynamic robots.txt though; I saw a site with an example in Perl, and it's pretty sweet. Ahh, here it is... a little outdated, but it gives me a place to start: http://www.leekillough.com/robots.html
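For what it's worth, the combo described above — a slow-loading infinite trap for crawlers that ignore robots.txt, plus a robots.txt generated on the fly depending on who's asking — might look something like this. This is just a rough sketch in Python rather than the Perl from that link, and the bot names and the `/trap/` path are made-up placeholders, not anything from the linked example:

```python
import time

# Hypothetical User-Agent substrings to single out; a real deployment
# would build this list from observed abuse in the server logs.
BAD_BOTS = ["WebCopier", "SiteSnagger", "EvilCrawler"]

def robots_txt(user_agent):
    """Build a robots.txt body dynamically based on the requesting bot.

    Well-behaved crawlers are told to stay out of the trap; known-bad
    ones get a blanket Disallow (which they'll ignore anyway, so it
    mostly serves as a marker in the logs before they fall in).
    """
    ua = user_agent.lower()
    if any(bad.lower() in ua for bad in BAD_BOTS):
        return "User-agent: *\nDisallow: /\n"
    return "User-agent: *\nDisallow: /trap/\n"

def trap_page(path, delay=2.0, sleep=time.sleep):
    """Generate one page of the infinite bot trap.

    Every page stalls before responding (the slow-loading part) and
    links to two deeper pages that only exist when requested, so a
    crawler that ignores robots.txt can descend forever.
    """
    sleep(delay)  # throttle: each page costs the crawler `delay` seconds
    base = path.rstrip("/")
    depth = base.count("/")
    links = "".join(f'<a href="{base}/{i}/">more</a>\n' for i in range(2))
    return f"<html><body><!-- depth {depth} -->\n{links}</body></html>"
```

The `sleep` parameter is injectable only so the generator can be exercised without actually waiting; wired into a real handler you'd leave the default delay in place, since the whole point is making each request expensive for the crawler and cheap for you.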