I'm not a big fan of bot traps.
The usual reasons for using bot traps are to reduce bandwidth usage, to stop content scrapers and to stop spammers.
In my opinion it’s not architecturally optimal to try to solve these problems with an IP filter at the PHP level. The payoff from running the filter on every page request just isn’t high enough.
Bandwidth doesn't cost anything if you have a good hosting partner. However, if you run a very dynamic site, heavy bot traffic may cost you in performance. The big bandwidth thieves are usually search engine bots. It's easy to pick out bots from search engines you have no interest in and filter them out. Just do it at the webserver (Apache) or firewall level, not in PHP.
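As a sketch of what that webserver-level filtering looks like, here is a minimal mod_rewrite block that refuses requests by user agent. The bot names are placeholders, not real crawler names; substitute the user-agent strings you actually see in your access logs.

```apache
# Refuse requests from search engine bots you have no interest in.
# "SomeBot" and "AnotherBot" are placeholder names -- replace them
# with the user-agent strings showing up in your access logs.
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (SomeBot|AnotherBot) [NC]
RewriteRule .* - [F,L]
</IfModule>
```

Because this runs in Apache, the request is rejected with a 403 before PHP is ever invoked.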
When it comes to content scraping I don’t care what kind of filtering you have in place. If it is publicly available it can be scraped.
When it comes to blog spam, the Akismet spam filter does a really good job on WordPress installations. However, if your website is getting hammered by spam bots, you can stop most of them by denying access to no-referrer comment POSTs in .htaccess, at the webserver level.
# DENY ACCESS TO NO-REFERRER COMMENT POSTS
<IfModule mod_rewrite.c>
RewriteEngine On
# Match only comment-form submissions...
RewriteCond %{REQUEST_METHOD} POST
RewriteCond %{REQUEST_URI} wp-comments-post\.php [NC]
# ...where the referrer is not this site, or the user agent is empty.
RewriteCond %{HTTP_REFERER} !mywebsite\. [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^$
# Reject with 403 Forbidden.
RewriteRule .* - [F,L]
</IfModule>
Last but not least, there is the issue of page load time and caching. You cannot use IP filtering at the PHP level and page caching at the same time: a full-page cache serves stored HTML without ever invoking PHP, so your filter simply never runs for cached requests.
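To see why, consider a minimal sketch of such a PHP-level IP filter (the blocked addresses below are made-up documentation examples, not a real blocklist):

```php
<?php
// Sketch of an IP filter run at the top of every PHP request.
// Example addresses only -- not a real blocklist.
$blocked = ['203.0.113.5', '198.51.100.7'];

if (in_array($_SERVER['REMOTE_ADDR'], $blocked, true)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
```

With a full-page cache in front (for example, cached pages written to disk and served directly by Apache), a cached hit never reaches this code at all, so the check is silently skipped for exactly the requests the cache handles.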