A while back I suggested a method of using timestamps to filter out at
least some automatic form postings. Now that I have tried it for
about 10 months, I thought it might be useful to report back.
Briefly, the current time is encoded in a hidden form field when the
page containing it is served. The script that processes the form
checks the (new) current time against that in the form and rejects the
submission if it is either too fast or too slow. Unless the user
is super fast they see no effect at all. There are no accessibility
issues unless one sets the maximum permitted time too low. I currently
allow submissions from 5 seconds up to an hour after the form was
sent. Results suggest that this upper limit can safely be increased.
Even if a user is "caught", in either case one can re-present the
form. In the case of a slow submission one would regenerate the time
stamp[1] and the user need only hit submit again.
Of course, the time stamp must be protected so that tampering can be
detected, although no examples of altered or missing timestamps showed
up in this test (which is hardly surprising -- why would a bot alter
some mysterious hidden field?).
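The timing check described above can be sketched in Python roughly as
follows; the function names are illustrative assumptions, not the actual
code (the 5-second and one-hour bounds are the ones reported here):

```python
# Sketch of the timestamp window check: embed the serve time in a hidden
# field, then classify the submission by its elapsed time.
import time

MIN_SECONDS = 5          # faster than this looks like a bot
MAX_SECONDS = 60 * 60    # older than an hour is rejected as stale

def stamp_form() -> str:
    """Value to put in the hidden field when the page is served."""
    return str(int(time.time()))

def check_timing(field_value: str) -> str:
    """Classify a submission as 'ok', 'too_fast' or 'too_slow'."""
    elapsed = int(time.time()) - int(field_value)
    if elapsed < MIN_SECONDS:
        return "too_fast"
    if elapsed > MAX_SECONDS:
        return "too_slow"
    return "ok"
```

On a too-fast or too-slow result one would re-present the form with a
fresh stamp rather than discard the submission.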
The method does seem to work well. The people seeing the submissions
report very little "spam" and there have been no complaints from users
being inconvenienced. I have not been able to study the submissions
that got through to count the failures, so all I have to go on is the
reports of "very little" spam and counts of blocked submissions.
Of the 1369 submissions in 42 weeks, 518 were blocked by this method:
159 for being too fast and a surprising 359 for being too
slow -- way too slow. Only two of these were at all close to the one
hour cut-off time. I imagine many bots queue up the forms for
submission later. It seems they also keep trying without requesting
the form again, since there are increasing time values later in the
test period.
The 5 second minimum looks about right. The spread of fast-reply
rejections was:
0s: 16
1s: 72
2s: 38
3s: 26
4s: 7
I can supply code if anyone else wants to try it.
[1] The ideal re-presented timestamp would be exactly 5 seconds (the
minimum submission time) old. A submit would then be accepted no
matter how fast the user was.
--
Ben.
On Fri, 09 May 2008 03:33:13 +0100, Ben Bacarisse
<be********@bsb.me.uk> wrote:
>A while back I suggested a method of using timestamps to filter out at least some automatic form postings.
Thanks for this, Ben. Very interesting and thorough research.
--
Locate your Mobile phone: <http://www.bizorg.co.uk/news.html>
Great gifts: <http://www.ThisBritain.com/ASOS_popup.html>
Ben Bacarisse <be********@bsb.me.uk> wrote in
<87************@bsb.me.uk>:
A while back I suggested a method of using timestamps to
filter out at least some automatic form postings. Now
that I have tried it for about 10 months, I thought it
might useful to report back.
That's a brilliant idea, thanks a lot for your research. I'm
going to adopt this method for filtering out the
spambots... unless you consider this your IP and are going
to patent this. But that would be very, very evil of
you. :-)
--
I'm not dead, just pinin' for the fnords.
"Ben Bacarisse" <be********@bsb.me.uk> wrote ...
>Of the 1369 submissions in 42 weeks 518 were blocked by this method.
<snip>
What does the system do about a failed form?
Does the bounce page say why?
--
Andrew
seo2seo.com
sick-site-syndrome.com
UK Residents:
STOP THE "10p Tax Ripoff"
Sign the petition to stop the government stealing from the
very poorest tell your friends about this petition: http://petitions.pm.gov.uk/10penceband/
"Andrew Heenan" <an*****@heenan.net> writes:
"Ben Bacarisse" <be********@bsb.me.uk> wrote ...
>Of the 1369 submissions in 42 weeks 518 were blocked by this method.
<snip>
>What does the system do about a failed form?
>Does the bounce page say why?
For a "fast" or a "slow" submission, the page gives the user a message
and re-presents the form (with the user-provided data, of course) and
asks that the user re-submit. When the time-stamp is missing or
corrupt (i.e. the hashed version does not match the plain text
version) I also re-present the form (with a general-sounding failure
message) though the logs show that this has not yet actually happened.
--
Ben.
Pavel Lepin <p.*****@ctncorp.com> writes:
Ben Bacarisse <be********@bsb.me.uk> wrote in
<87************@bsb.me.uk>:
>A while back I suggested a method of using timestamps to filter out at least some automatic form postings. Now that I have tried it for about 10 months, I thought it might useful to report back.
That's a brilliant idea, thanks a lot for your research. I'm
going to adopt this method for filtering out the
spambots... unless you consider this your IP and are going
to patent this. But that would be very, very evil of
you. :-)
... and would be inconsistent with reporting it here. It is simple to
implement and almost entirely invisible to users so, although it is
only partially effective, I hope it does get used. That was the point
of the post.
Thank you for your kind words.
--
Ben.
Gazing into my crystal ball I observed Ben Bacarisse
<be********@bsb.me.uk> writing in news:87************@bsb.me.uk:
>A while back I suggested a method of using timestamps to filter out at
>least some automatic form postings.
<snip>
You could enhance it by placing the time into a db, and upon submission,
compare the value in the db. Generate a unique identifier as a hidden
field, and compare that to the one in the db with the time submitted.
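This DB variant might be sketched as follows, assuming a SQLite table;
all names here (`form_tokens`, `issue_token`, `check_token`) are
illustrative, not from any actual implementation:

```python
# Sketch of the suggested variant: record (token, issue time) server-side
# when the form is served, then look the token up on submission and check
# its age against the same 5-second-to-1-hour window.
import sqlite3
import secrets
import time

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE form_tokens (token TEXT PRIMARY KEY, issued INTEGER)")

def issue_token() -> str:
    """Generate the unique identifier for the hidden field and record it."""
    token = secrets.token_hex(16)
    db.execute("INSERT INTO form_tokens VALUES (?, ?)",
               (token, int(time.time())))
    return token

def check_token(token: str, min_s: int = 5, max_s: int = 3600) -> bool:
    """True iff the token exists and its age falls inside the window."""
    row = db.execute("SELECT issued FROM form_tokens WHERE token = ?",
                     (token,)).fetchone()
    if row is None:
        return False
    age = int(time.time()) - row[0]
    return min_s <= age <= max_s
```

The server-side lookup also makes each token single-use if it is deleted
after a successful check, which the hashed-field approach cannot do.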
--
Adrienne Boswell at Home
Arbpen Web Site Design Services http://www.cavalcade-of-coding.info
Please respond to the group so others can share
"Ben Bacarisse" <be********@bsb.me.uk> wrote in message
news:87************@bsb.me.uk...
>A while back I suggested a method of using timestamps to filter out at
>least some automatic form postings.
<snip>
Fascinating, Ben. This is an area where I have an active interest, so I may
borrow your idea (no need for code, but the concept is priceless). I know
it's not a complete answer to spam, but it all helps.
One thing I would say is that I wouldn't advertise the idea. Once spammers
catch on to it, it shouldn't take them much effort to get round it. That
said, most spammers seem to be absolute idiots, so the idea may be sound for
many years to come.
--
Brian Cryer www.cryer.co.uk/brian
Adrienne Boswell wrote:
>Gazing into my crystal ball I observed Ben Bacarisse
><be********@bsb.me.uk> writing in news:87************@bsb.me.uk:
>>A while back I suggested a method of using timestamps to filter out at
>>least some automatic form postings.
><snip>
>You could enhance it by placing the time into a db, and upon submission,
>compare the value in the db. Generate a unique identifier as a hidden
>field, and compare that to the one in the db with the time submitted.
Or, better yet, in the session.
--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp. js*******@attglobal.net
==================
Adrienne Boswell <ar****@yahoo.com> writes:
Gazing into my crystal ball I observed Ben Bacarisse
<be********@bsb.me.uk> writing in news:87************@bsb.me.uk:
>A while back I suggested a method of using timestamps to filter out at least some automatic form postings.
<snip>
>Briefly, the current time is encoded in a hidden form field when the page containing it is served. The script that processes the form checks the (new) current time against that in the form and rejects the submission if it is either too fast or too slow.
<snip>
>Of course, the time stamp must be protected so that tampering could be detected, although no examples of altered or missing timestamps showed up in this test (which it hardly surprising, why would a bot alter some mysterious hidden field?).
<snip>
>You could enhance it by placing the time into a db, and upon submission,
>compare the value in the db. Generate a unique identifier as a hidden
>field, and compare that to the one in the db with the time
>submitted.
I am not sure that would add anything. Currently, the server sets the
hidden field to:
time + ":" + md5(time + "some secret string")
When the form comes back, the server splits the string at the ":" and
computes md5(part-before-the-colon + "some secret string").
Checking that this md5 hash matches the part after the colon is
equivalent, I think, to looking up a unique ID in a server-side DB
(but simpler to do).
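That scheme can be transcribed almost directly into Python; the secret
string is of course a placeholder for a real one:

```python
# time + ":" + md5(time + secret), as described above, plus the
# corresponding verification on submission.
import hashlib
import time

SECRET = "some secret string"  # placeholder; use a real secret in practice

def make_field(now: int) -> str:
    """Build the hidden-field value from a Unix timestamp."""
    digest = hashlib.md5((str(now) + SECRET).encode()).hexdigest()
    return f"{now}:{digest}"

def verify_field(value: str) -> bool:
    """True iff the hash part matches a recomputation from the time part."""
    stamp, _, digest = value.partition(":")
    expected = hashlib.md5((stamp + SECRET).encode()).hexdigest()
    return digest == expected
```

A keyed HMAC (`hmac.new` with SHA-256) would be the more standard
construction today, but the sketch follows the scheme as described.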
--
Ben.
Jerry Stuckle <js*******@attglobal.net> writes:
Adrienne Boswell wrote:
>Gazing into my crystal ball I observed Ben Bacarisse <be********@bsb.me.uk> writing in news:87************@bsb.me.uk:
>>A while back I suggested a method of using timestamps to filter out at least some automatic form postings.
<snip>
>You could enhance it by placing the time into a db,
<snip>
Or, better yet, in the session.
See my reply to Adrienne Boswell. I don't think you gain much by
using session data. There is no reason not to store the data in the
session, but given the checks I make, I don't think it adds much.
--
Ben.
Ben Bacarisse wrote:
Jerry Stuckle <js*******@attglobal.net> writes:
>Adrienne Boswell wrote:
>>Gazing into my crystal ball I observed Ben Bacarisse <be********@bsb.me.uk> writing in news:87************@bsb.me.uk:
>>>A while back I suggested a method of using timestamps to filter out at least some automatic form postings.
<snip>
>>You could enhance it by placing the time into a db,
<snip>
>Or, better yet, in the session.
See my reply to Adrienne Boswell. I don't think you gain much by
using session data. There is no reason not to store the data in the
session, but given the checks I make, I don't think it adds much.
Just one more layer of security - it isn't in the web page.
--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp. js*******@attglobal.net
==================
Jerry Stuckle wrote:
Ben Bacarisse wrote:
>Jerry Stuckle <js*******@attglobal.net> writes:
>>Adrienne Boswell wrote:
>>>Gazing into my crystal ball I observed Ben Bacarisse <be********@bsb.me.uk> writing in news:87************@bsb.me.uk:
>>>>A while back I suggested a method of using timestamps to filter out at least some automatic form postings.
<snip>
>>>You could enhance it by placing the time into a db,
<snip>
>>Or, better yet, in the session.
See my reply to Adrienne Boswell. I don't think you gain much by using session data. There is no reason not to store the data in the session, but given the checks I make, I don't think it adds much.
Just one more layer of security - it isn't in the web page.
With any use of sessions I always have to wonder: what about people who
have cookies disabled?
Do you insist they enable cookies, or go with the flawed trans_sid method?
--
*****************************
Chuck Anderson • Boulder, CO http://www.CycleTourist.com
Nothing he's got he really needs
Twenty first century schizoid man.
***********************************
Chuck Anderson wrote:
Jerry Stuckle wrote:
>Ben Bacarisse wrote:
>>Jerry Stuckle <js*******@attglobal.net> writes:
>>>Adrienne Boswell wrote:
>>>>You could enhance it by placing the time into a db,
>>><snip>
Or, better yet, in the session.
See my reply to Adrienne Boswell. I don't think you gain much by using session data. There is no reason not to store the data in the session, but given the checks I make, I don't think it adds much.
Just one more layer of security - it isn't in the web page.
With any use of sessions I always have to wonder; what about people who
have cookies disabled?
Do you insist they enable cookies, or go with the flawed trans_sid method?
PHP will handle the session id through a GET parameter. Others do
similarly.
But then, people who surf with cookies disabled are used to sites which
don't work.
--
==================
Remove the "x" from my email address
Jerry Stuckle
JDS Computer Training Corp. js*******@attglobal.net
==================