Telling Computers and Humans Apart in web forms without visible CAPTCHAs
Many websites use CAPTCHAs of one form or another to try to tell the difference between humans and computers interacting with web forms. As the BBC have recently commented, these can be cumbersome to implement and annoying to users.
Automated ways to distinguish humans from bots submitting data via a web page are relatively easy to build once you understand how each interacts with a web server.
The basic things that almost all spambots have in common are that they:
- Don't support session cookies, and
- Try to submit hyperlinks in one form or another.
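As a rough illustration, the two checks above can be combined in a few lines of server-side code. This is a minimal sketch under stated assumptions, not the exact system described here: `looks_like_spambot`, `has_session_cookie` and `fields` are hypothetical names, and a real implementation would route link-containing posts to moderation rather than rejecting them outright.

```python
import re

# Hypothetical hyperlink detector: matches http(s) URLs and bare "www." links.
URL_PATTERN = re.compile(r"https?://|www\.", re.IGNORECASE)

def looks_like_spambot(has_session_cookie: bool, fields: dict) -> bool:
    """Flag a form submission using the two signals above.

    `has_session_cookie` would come from checking the POST request for the
    session cookie set when the form page was served; `fields` is the
    submitted form data as a name -> value mapping.
    """
    # Signal 1: the submission arrived without the session cookie,
    # so the client never fetched (or refused cookies from) the form page.
    if not has_session_cookie:
        return True
    # Signal 2: any submitted field contains a hyperlink.
    return any(URL_PATTERN.search(value) for value in fields.values())
```

In use, a submission that fails the cookie check can be dropped silently, while one that merely contains a link can be held for moderation, since legitimate visitors do occasionally post URLs.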
After very close to three years of running such a system on various websites that I look after, it has yet to be cracked by any automated bot, and it has produced no false positives (apart from requiring moderation for a very small number of posts where people submitted legitimate URLs). The only human (non-automated) spammer who has made it through to moderation posts from a single IP address in Russia and regularly tries to include a link to a fake finance blog, so again something that's very easy to block with the right back end in place.