Before switching my site to a CMS (content management system), I used a hidden link and a PHP script to detect badly behaved robots and email scrapers. At first I played it safe and didn't automatically block anything that hit the hidden page, and in the end I decided to stick with that approach: receive the email alerts only, and manually add blocks to '.htaccess' when needed. Now that I've switched to a CMS, Drupal at the moment, I wondered how I might continue to use this bot trap system.
An open-source content management system. http://www.drupal.org
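The manual blocking step amounts to maintaining a small deny list in '.htaccess'. A minimal sketch of what such a fragment might look like under Apache 2.2-style access control (the IP addresses are placeholders from the documentation ranges, not ones from my actual list):

```apache
# Deny requests from addresses caught by the hidden-link bot trap.
# The addresses below are hypothetical examples.
<Limit GET POST>
  Order Allow,Deny
  Allow from all
  Deny from 192.0.2.10
  Deny from 198.51.100.0/24
</Limit>
```

Each new alert email means deciding whether the visitor was genuinely misbehaving before adding another `Deny from` line by hand.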