Web Form Security Practices

We have had a lot of attacks on our web forms here at SVC. I am the paperless applications developer, so the task of keeping them safe and secure falls into my daily operations. An unsecured form can let an attacker do many things to ruin someone’s day, such as sending unsolicited emails to all the staff and students. Some of the common practices to keep the “bots” from breaking into a web form are:

CAPTCHA
This technique presents the user with a series of image “keys” that they must “unlock” before they can submit the web form. This is a good way to slow down the bots, but since they don’t get tired and can hammer away at the form many times each second, they can eventually bypass this technique. Another problem with CAPTCHA, and a very important one, is the user experience. Some CAPTCHAs are impossible to read and take several tries for a normal user to get right. This can really dampen the user experience and create a situation where the user will just leave instead of taking the time to keep trying to unlock the CAPTCHA keys. As stated by the W3C, “This type of visual and textual verification comes at a huge price to users who are blind, visually impaired or dyslexic. Naturally, this image has no text equivalent accompanying it, as that would make it a giveaway to computerized systems. In many cases, these systems make it impossible for users with certain disabilities to create accounts, write comments, or make purchases on these sites, that is, CAPTCHAs fail to properly recognize users with disabilities as human.”

So, what else can I do to help keep the bots from submitting forms?

1: Try to make sure that the form is submitted from the form page, because the bots usually submit directly to the form action. Put some kind of trigger in the form submit logic that looks for a referring address or a session variable.

In classic ASP we have the Request.ServerVariables collection, which can give us some important information. If the form data has not come from the form page, we know it’s a bot, and we can secretly send a security alert to the admin while not letting the bot know it’s been busted.
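As a rough sketch of the “quietly alert the admin” part, a small helper sub like the one below could sit in the submit page and be called whenever one of the checks described below fails. The addresses are placeholders, and it assumes the CDO.Message component can hand mail off to the local SMTP service.

<%
' Hypothetical helper: quietly tell the admin about a suspicious post.
' The addresses are placeholders; this assumes CDO can use the local SMTP service.
Sub SendAdminAlert(details)
    Dim msg
    Set msg = Server.CreateObject("CDO.Message")
    msg.From = "forms@example.edu"
    msg.To = "webadmin@example.edu"
    msg.Subject = "Suspicious form submission"
    msg.TextBody = details & vbCrLf & "IP: " & Request.ServerVariables("REMOTE_ADDR")
    msg.Send
    Set msg = Nothing
End Sub
%>

After calling it, the capture page can just Response.Redirect to the normal thank-you page, so the bot sees nothing different from a successful submission.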

Request.ServerVariables("URL"), SCRIPT_NAME, and PATH_INFO

should all contain the form’s address, and should show that the form was submitted from the proper location.
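As a quick sanity check (diagnostic output only, to be removed later), dropping this at the top of the form page shows what each variable actually holds on my server:

<%
' Diagnostic only: display what the current request reports about its own location.
Response.Write "URL: " & Request.ServerVariables("URL") & "<br>"
Response.Write "SCRIPT_NAME: " & Request.ServerVariables("SCRIPT_NAME") & "<br>"
Response.Write "PATH_INFO: " & Request.ServerVariables("PATH_INFO") & "<br>"
%>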

Request.ServerVariables("SERVER_NAME")

should be checked for the proper domain, indicating whether the form was properly posted from my server. I could create an include to run before each form submission that checks that the request is coming from my server.
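A minimal sketch of that include, assuming a file name of checkorigin.asp and a made-up domain and thank-you page, could look like this:

<%
' checkorigin.asp - hypothetical include; the domain and redirect target are placeholders.
Const EXPECTED_DOMAIN = "www.example.edu"

If LCase(Request.ServerVariables("SERVER_NAME")) <> EXPECTED_DOMAIN Then
    ' The request did not come in under our domain, so skip processing,
    ' but send the caller to the normal thank-you page so nothing looks wrong.
    Response.Redirect "thanks.asp"
End If
%>

The form submit page would then pull it in with a server-side include, something like <!--#include file="checkorigin.asp" -->, before any of the capture logic runs.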

These values could be put into hidden fields and submitted with the form, then checked for validity before the form is processed. A session variable might also be a good way to track them from form submit to data capture.
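For example, the form page could stash the values in hidden inputs like this (capture.asp, the field names, and the form path are made up for illustration):

<form method="post" action="capture.asp">
    <!-- Hidden fields carry the form page's own address along with the post -->
    <input type="hidden" name="src_url" value="<%= Request.ServerVariables("URL") %>">
    <input type="hidden" name="src_script" value="<%= Request.ServerVariables("SCRIPT_NAME") %>">
    <!-- ...the rest of the form fields go here... -->
    <input type="submit" value="Submit">
</form>

Then the capture page would compare those values against the path the form is supposed to live at before doing anything with the data:

<%
' capture.asp - reject posts whose hidden values don't match the real form page.
Const FORM_PAGE = "/forms/contact.asp"   ' placeholder path to the form page

If Request.Form("src_url") <> FORM_PAGE Or Request.Form("src_script") <> FORM_PAGE Then
    ' Missing or altered hidden fields: likely a direct post from a bot.
    Response.Redirect "thanks.asp"   ' placeholder thank-you page
End If
%>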

In the form capture logic, we could look for Request.ServerVariables("HTTP_REFERER") to ensure that the data has come from our form page.
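A sketch of that check (the form path is an assumption, and the referer header can be forged or stripped, so this is a speed bump rather than a wall):

<%
' Referer check: cheap, but not bulletproof - some browsers and proxies strip it.
Dim referer
referer = LCase(Request.ServerVariables("HTTP_REFERER"))

If InStr(referer, "/forms/contact.asp") = 0 Then
    ' No referer, or it points somewhere other than our form page.
    Response.Redirect "thanks.asp"   ' placeholder thank-you page
End If
%>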

So, let’s try some of this junk and see if it helps out.
First, let’s set a session variable called "formuser" with a value of Request.ServerVariables("URL") when the user accesses the form, and then check for that session in the form submit logic. If there is no session, then the data did not come from our form.
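A minimal sketch of that, with made-up page names (contact.asp for the form, capture.asp for the submit logic), might look like this. On the form page:

<%
' contact.asp (form page): mark this visitor as having actually loaded the form.
Session("formuser") = Request.ServerVariables("URL")
%>

And in the submit logic:

<%
' capture.asp: only process the post if the session mark from the form page exists.
If Len(Session("formuser")) = 0 Then
    ' No session mark - the poster never loaded the form page.
    SendAdminAlert "Form posted without a formuser session."   ' hypothetical helper from above
    Response.Redirect "thanks.asp"                              ' placeholder thank-you page
Else
    ' Looks legitimate: clear the mark so it can't be reused, then carry on.
    Session("formuser") = ""
    ' ...normal form processing goes here...
End If
%>

One thing to keep in mind with this approach: classic ASP sessions ride on a session cookie, so a real user with cookies turned off would be treated like a bot.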