23

I have made a little app for signing up for an event. Users input their data and click "sign me in".

Now sometimes people end up in the database twice: the exact same data inserted two times in very quick succession. This can only mean someone clicked the button twice, which caused two POSTs to happen.

This is a common web problem; credit card and forum apps often warn: "Clicking once is enough!".

I guess you could solve it by checking for the exact same data to see if the post is unique, but I wonder if there are other methods.

This of course does not apply to ASP.NET WebForms, because POST doesn't matter as much there.

JJJ
  • 32,902
  • 20
  • 89
  • 102

11 Answers

27

While JavaScript solutions can disable the submit button after it has been clicked, this will have no effect on those people who have JavaScript disabled. You should always make things work correctly without JavaScript before adding it in, otherwise there's no point as users will still be able to bypass the checks by just disabling JavaScript.

If the page where the form appears is dynamically generated, you can add a hidden field which contains some sort of sequence number, a hash, or anything unique. Then you have some server-side validation that will check if a request with that unique value has already come in. When the user submits the form, the unique value is checked against a list of "used" values. If it exists in the list, it's a dupe request and can be discarded. If it doesn't exist, then add it to the list and process as normal. As long as you make sure the value is unique, this guarantees the same form cannot be submitted twice.
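
Here is a minimal sketch of that approach, assuming a Node.js/Express back end; the route names and the in-memory Sets are only illustrations (in a real app the "used" list would live in a database or shared cache):

const crypto = require('crypto');
const express = require('express');

const app = express();
app.use(express.urlencoded({ extended: false }));

const issuedTokens = new Set();   // tokens handed out with a rendered form
const usedTokens = new Set();     // tokens already consumed by a submit

// Render the form with a hidden, unique token
app.get('/signup', (req, res) => {
  const token = crypto.randomUUID();
  issuedTokens.add(token);
  res.send(`
    <form method="POST" action="/signup">
      <input type="hidden" name="formToken" value="${token}">
      <input type="text" name="name">
      <button type="submit">Sign me in</button>
    </form>`);
});

// Accept each token exactly once
app.post('/signup', (req, res) => {
  const token = req.body.formToken;
  if (!issuedTokens.has(token) || usedTokens.has(token)) {
    return res.status(409).send('This form was already submitted.');
  }
  usedTokens.add(token);   // mark the token as used before processing
  // ... insert the visitor into the database here ...
  res.send('Thanks for signing up!');
});

app.listen(3000);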

Of course, if the page the form is on is not dynamically generated, then you'll need to do it the hard way on the server-side to check that the same information has not already been submitted.

Rich Adams
  • 26,096
  • 4
  • 39
  • 62
  • 3
    Your solution to the problem is fine. But a hidden field can be viewed and edited by the client, breaking the application so that it accepts the duplicate form again. I am searching for a solution which is completely server-oriented. – sud_shan Sep 11 '14 at 08:42
  • If you have a load balancer, sending a UUID (or any kind of unique number) to the server to store and read back will not work well if the servers are not aware of each other, because each request could be processed by a different server. – Dherik Dec 21 '17 at 13:14
  • 1
    Hey Rich Adams. I wanted to reproduce the problem that if you hit a button twice that this will submit the form twice. However, this never happens. Even if I hit the button 100 times the form will only be submitted once - do you have an idea why this works out of the box? Here is my example: https://stackoverflow.com/questions/51347585/why-is-this-form-not-submitted-twice-when-hitting-the-button-twice – Adam Jul 24 '18 at 16:13
20

Most of the answers so far have been client-side. On the server-side, you can generate a hidden field with a GUID when you first produce the form, and then record that GUID as a submitted form when the post is received. Check it before doing any more processing.

Jon Skeet
  • 1,421,763
  • 867
  • 9,128
  • 9,194
  • +1 client side is just not reliable enough where data corruption is a possibility, especially where the one-time key method works so well – annakata Jan 14 '09 at 12:05
  • 2
    (actually, just wanted to add I personally prefer to put that key on the querystring rather than create a hidden field) – annakata Jan 14 '09 at 12:07
  • The GUID is being used as an "idempotency key" here, to give anyone curious about the theory a term to google. – DharmaTurtle Aug 11 '20 at 20:25
5

Whenever a page is requested from the server, generate a unique requestToken, save it server-side, mark its status as NOT Processed, and pass it along with the requested page. Now whenever a form submit happens, get the requestToken from the POSTed data, check its status, and either save the data or take an alternate action.

Most banking applications use this technique to prevent double POSTing, so this is a time-proven and reliable way of preventing double submissions.
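
A rough sketch of that flow, assuming a Node.js/Express handler and an in-memory Map standing in for the server-side token store (all names here are illustrative):

const express = require('express');
const app = express();
app.use(express.urlencoded({ extended: false }));

// requestToken -> 'NOT_PROCESSED' | 'PROCESSED' (filled in when the page is rendered)
const tokenStore = new Map();

app.post('/signup', (req, res) => {
  const token = req.body.requestToken;

  if (tokenStore.get(token) !== 'NOT_PROCESSED') {
    // Unknown or already-processed token: do not save the data again.
    return res.redirect(303, '/already-submitted');
  }

  tokenStore.set(token, 'PROCESSED');
  // ... save the submitted data here ...

  // Redirect after POST (the PRG pattern) so a refresh issues a GET, not a repeat POST.
  res.redirect(303, '/thanks');
});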

Tito
  • 8,894
  • 12
  • 52
  • 86
  • Cherit, let's assume the client got the token and submits a POST. The POST made it to the server, but the reply back was lost to a network problem. The user didn't see a response. He refreshes the form page. Should he get a new token, since his earlier POST made it through? – devwannabe Jul 12 '17 at 14:11
  • 1
    @devwannabe when the user refreshes the page, the browser will ask to resend the form data as a courtesy. The browser can send the same form data, but the server can reject it based on the expired token. If the server is not coded to check the expired token, it can result in a double submit of the same data. The way to handle an expired token is to do a server-side redirect to another page with a message like 'your data is already submitted', or an appropriate error response if it's an ajax call. It's unofficially called the PRG pattern. More --> http://www.theserverside.com/news/1365146/Redirect-After-Post – Tito Jul 13 '17 at 14:03
  • That means the browser is OK to resend the same data as long as the backend knows how to verify the data being sent. The backend always has to verify whether the request has already been fulfilled and, if so, redirect the user to another page. – devwannabe Jul 14 '17 at 06:34
4

A user-side solution is to disable the submission button via JavaScript after the first click.

It has drawbacks, but I see it often used on e-commerce websites.

But it will never replace real server-side validation.

Pierre-Yves Gillier
  • 507
  • 1
  • 4
  • 17
  • It can be a useful addition to server-side detection, as it can stop your second click taking you to the "You double-posted, idiot" page. It's a good idea to re-enable the button shortly afterwards so that accidental double-clicks are stopped without breaking the form for a deliberate second use. – bobince Jan 14 '09 at 13:01
3

Client-side techniques are useful, but you may want to couple them with some server-side techniques.

One way to do this is to include a unique token in the form (e.g. a GUID or similar), so that when you come to process the form you can check to see whether the token has already been used, preventing a double submission.

In your case, if you have a table with event visitors, you might include this token as a column.
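
If the token is a column on the visitors table, a unique constraint lets the database itself reject the second insert. A rough sketch assuming PostgreSQL with node-postgres (the table and column names are invented for illustration):

// Assumes a table like:
//   CREATE TABLE event_visitors (name TEXT, email TEXT, form_token TEXT UNIQUE);
const { Pool } = require('pg');
const pool = new Pool();   // connection settings come from the usual PG* env vars

async function saveVisitor(name, email, formToken) {
  try {
    await pool.query(
      'INSERT INTO event_visitors (name, email, form_token) VALUES ($1, $2, $3)',
      [name, email, formToken]
    );
    return 'saved';
  } catch (err) {
    if (err.code === '23505') {   // unique_violation: this token was already inserted
      return 'duplicate';
    }
    throw err;
  }
}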

Paul Dixon
  • 295,876
  • 54
  • 310
  • 348
3

A client-only solution won't be enough, as stated in many of the answers here. You need to go with a server-side fail-safe.

An often overlooked reason that disabling the submit button doesn't work is that the user can simply refresh the submit target (and click OK on the "are you sure you want to resubmit the POST data?" dialog). Some browsers may even implicitly reload the submitted page when you try to save it to disk (for example, when you're trying to save a hard copy of an order confirmation).

Ates Goral
  • 137,716
  • 26
  • 137
  • 190
  • I was thinking about this; we cannot prevent double submits with JavaScript alone. – Tito Sep 20 '12 at 07:11
2

Almost no one has JavaScript disabled. Think about coding your e-commerce website for the 70-year-old woman who double-clicks every link and button. All you want to do is add some JavaScript to prevent her from clicking "Order Now" twice. Yes, check this on the server side too ("be defensive"), but don't design around the no-JavaScript case. For the sake of a better UI, do it on the client side as well.

Here are some scripts that I found:

//
// prevent double-click on submit
//
jQuery('input[type=submit]').click(function () {
  if (jQuery.data(this, 'clicked')) {
    return false;                          // already clicked once; ignore this click
  } else {
    jQuery.data(this, 'clicked', true);    // remember the first click
    return true;
  }
});

and

// Find ALL <form> tags on your page
$('form').submit(function(){
    // On submit disable its submit button
    $('input[type=submit]', this).attr('disabled', 'disabled');
});
aron
  • 2,856
  • 11
  • 49
  • 79
2

None of the solutions here address a load-balanced setup.

If you have a load balancer, sending a UUID (or any kind of unique number) to one server to store and read back again will not work well if the servers are not aware of each other, because each request could be processed by a different server in a stateless environment. The servers need to read/write from the same place.

If you have multiple servers you will need some shared cache (like Redis) among them, so they read/write the unique value in the same place (which could be an over-engineered solution, but it works).
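
A sketch of the shared-store idea, assuming node-redis v4 and a Redis instance reachable from every server (the URL, key prefix, and TTL are placeholders). SET with NX succeeds only for the first request that claims a token, so whichever server handles the duplicate request sees it has already been used:

const { createClient } = require('redis');

// All app servers behind the load balancer point at the same Redis instance.
const redis = createClient({ url: 'redis://shared-redis:6379' });
const ready = redis.connect();   // connect once at startup

// Returns true only for the first request that claims this token.
async function claimToken(token) {
  await ready;
  // NX: set only if the key does not exist yet; EX: let it expire after an hour.
  const result = await redis.set(`form-token:${token}`, 'used', { NX: true, EX: 3600 });
  return result === 'OK';
}

// In the POST handler on any server:
//   if (!(await claimToken(req.body.formToken))) { /* duplicate, discard */ }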

Dherik
  • 17,757
  • 11
  • 115
  • 164
  • Also reverse proxies such as Kong like to send automated retries when your backend takes too long. A single request sent by a client would appear as 5 separate requests to the backend. This is a problem we encountered in production once, and it was not pretty. – Zyl Jul 27 '19 at 14:56
0

Client-side alteration is a common technique:

  • Disable the submit button
  • Change the screen to a "please wait" screen
  • If the form was modal, change the screen back to the usual flow (this has the benefit of making things look really slick)

But it's not perfect. It all relies on JS being available, and if it isn't, you'll still get duplicates unless you also have back-end duplicate detection.

So my advice is to develop some sort of detection behind the scenes and then improve your form to stop people with JS being able to double-submit.

Oli
  • 235,628
  • 64
  • 220
  • 299
0

You can track the number of times the form's been submitted and compare it to the number of unique visits to the page with the form on it in the session.

Allain Lalonde
  • 91,574
  • 70
  • 187
  • 238
0

Besides the many good techniques already mentioned, another simple server-side method, which has the drawback of requiring a session, is to have a session variable that is switched off on the first submit.
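
For example, with express-session (assuming that middleware; the flag and route names are arbitrary), the idea could look like this sketch:

const express = require('express');
const session = require('express-session');

const app = express();
app.use(express.urlencoded({ extended: false }));
app.use(session({ secret: 'change-me', resave: false, saveUninitialized: true }));

app.post('/signup', (req, res) => {
  if (req.session.signupDone) {
    return res.send('You have already signed up.');
  }
  req.session.signupDone = true;   // the session variable "switched off" on first submit
  // ... save the submitted data here ...
  res.send('Thanks for signing up!');
});

app.listen(3000);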

xgMz
  • 3,334
  • 2
  • 30
  • 23