0

I have a big issue here, for which I couldn't find an answer anywhere on the web.

I have a basic MVC system, which has 3 components: Model, View and Controller.

I have an index.php file where I'm including everything based on the URI request of the visitor.

The controller connects the View and Model. I'm opening a file on page load and writing a dummy text into it. The problem is the following: if I put a sleep(1) into the controller to delay the page load, a visitor can easily hit refresh multiple times.

The file gets opened and multiple lines are inserted. I have tried to set a session variable upon writing the file, and if it already exists I skip the file writing entirely.

However, there is a problem with that. I've refreshed the page multiple times for testing, and even though the session check is there, the file still gets multiple lines written to it.

So I guess the refreshing produces simultaneous HTTP requests, and all of those requests see that no session variable is set yet.

After the page has fully loaded and I hit refresh, it doesn't insert any new line, because the session variable exists. The problem only occurs when the visitor refreshes the page multiple times during a single load.
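Roughly, the check I described looks like this (a simplified sketch; writing to dummy.txt stands in for the real work):

<?php
session_start();

if (!isset($_SESSION['file_written'])) {
    $_SESSION['file_written'] = true;

    sleep(1); // stands in for the slow work (e.g. sending an email)

    // On the very first load the browser has no session cookie yet, so
    // refreshes fired during that load arrive as separate sessions and
    // every one of them passes the isset() check above.
    file_put_contents('dummy.txt', "dummy line\n", FILE_APPEND);
}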

Any recommendation to avoid this?

Radical_Activity
    "if I put a sleep(1) into the controller, to delay before loading the page, a visitor can easily hit referesh multiple times." - So dont put sleep(1) in your code. – Gordon Sep 28 '14 at 10:40
  • @Gordon There is a reason that I'm putting sleep there. It's because another function will take the place of that code: an email send, which takes about 1 second. In that time the visitor can refresh the page, send the email multiple times and write to the file multiple times. – Radical_Activity Sep 28 '14 at 10:42
  • [Send the eMail asynchronously?](https://www.google.de/search?q=async+email+php) – Gordon Sep 28 '14 at 10:44
  • @Gordon What happens if the page doesn't load right away because of the massive amount of code, and they have the opportunity to refresh it? Then we have the same issue, don't we? – Radical_Activity Sep 28 '14 at 10:47
  • Well yes. But that's probably an issue of your design then. A refresh should not alter the server state. Browsers usually tell users when they are resubmitting a form via POST. So if the user confirms this, they actually do a resubmit, so it's okay to write it twice. A refresh, i.e. a GET request, on the other hand should not write anything, because it should be idempotent and safe. In any case, is http://stackoverflow.com/questions/2133964/how-to-prevent-multiple-inserts-when-submitting-a-form-in-php?rq=1 what you are looking for? – Gordon Sep 28 '14 at 10:52
  • @Gordon I guess, yes, my design is basically the problem, because I have tried this with a completely new file, and it worked perfectly. I have tried all the solutions in that thread, thank you, but they did not work. I don't know what causes this strange effect, but it's there even in the index.php file, before doing anything. – Radical_Activity Sep 28 '14 at 11:01
  • http://phpsense.com/2006/prevent-duplicate-form-submission/ – Avinash Babu Sep 28 '14 at 13:34

2 Answers

0

Locks

If you work in a multiuser environment with shared resources, you should use a locking system. The basic procedure is:

  1. Try to get a lock on a resource. If there is already a lock on the resource, you cannot get it; if there isn't, it's yours.
  2. Use the resource.
  3. Let go of the lock. This is also called: 'unlock' or 'release'.

The main advantage of locks is that they work directly on the resource and do not depend on a cookie in the visitor's browser, as sessions do.

You could create locks in many different ways. Some resources, like files, support locks themselves:

http://php.net/manual/en/function.flock.php
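For instance, the three steps above could look roughly like this with flock() (the file name and the written line are just examples):

<?php
$fp = fopen('dummy.txt', 'a');

// 1. Try to get an exclusive lock. LOCK_NB makes the call non-blocking,
//    so a simultaneous request fails here instead of waiting its turn.
if (flock($fp, LOCK_EX | LOCK_NB)) {
    // 2. Use the resource while holding the lock, including the slow
    //    part (the sleep/email from the question), so parallel refreshes
    //    are rejected for its whole duration.
    sleep(1);
    fwrite($fp, "dummy line\n");

    // 3. Release the lock.
    flock($fp, LOCK_UN);
}

fclose($fp);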

But others do not. You could create your own locking system, for instance with a database table.

The problem with locks is that sometimes the unlock command is not given. You need a way to deal with this; for instance, you could decide that a lock expires after some time. This is the most basic type of protection; there are many more advanced ways of dealing with it.
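As a sketch of such an expiring lock built on a database table, assuming a locks table with a name primary key and a created_at timestamp (the table layout and connection details are just examples):

<?php
// Assumes: CREATE TABLE locks (name VARCHAR(64) PRIMARY KEY,
//                              created_at INT NOT NULL);
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Treat locks older than 30 seconds as abandoned and remove them.
$pdo->prepare('DELETE FROM locks WHERE created_at < ?')
    ->execute([time() - 30]);

try {
    // 1. Take the lock; the primary key makes a second INSERT fail.
    $pdo->prepare('INSERT INTO locks (name, created_at) VALUES (?, ?)')
        ->execute(['write-dummy-file', time()]);

    // 2. Use the resource while holding the lock.
    file_put_contents('dummy.txt', "dummy line\n", FILE_APPEND);

    // 3. Release the lock.
    $pdo->prepare('DELETE FROM locks WHERE name = ?')
        ->execute(['write-dummy-file']);
} catch (PDOException $e) {
    // Another request holds the lock: do nothing.
}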

KIKO Software
  • I did not vote this down - that was someone else. However, this is not what I actually need. – Radical_Activity Sep 28 '14 at 11:18
  • Well, you may be right. But you can never prevent multiple submits entirely; you have to deal with them. Locks are part of this solution. Apart from locks, you have to know what to do when the same data is submitted multiple times. Replace the old data? Ignore the new input? – KIKO Software Sep 28 '14 at 11:23
-1

The problem is that you need to know how sessions work. The session ID is saved to a cookie and sent to the client at the end of the page. If you don't wait for PHP to finish executing, there will be a new session ID for every page refresh. What I would do, and it should work in 99% of cases, even for users that don't support cookies (most bots), is to use the user's IP address (there is a chance that two different users share the same IP, but that's ~1%):

// Determine the client IP, preferring proxy-supplied headers when present.
if (!empty($_SERVER['HTTP_CLIENT_IP'])) {
    // IP passed along by a shared internet gateway.
    $ip = $_SERVER['HTTP_CLIENT_IP'];
} elseif (!empty($_SERVER['HTTP_X_FORWARDED_FOR'])) {
    // IP forwarded by a proxy or load balancer.
    $ip = $_SERVER['HTTP_X_FORWARDED_FOR'];
} else {
    // Direct connection to the server.
    $ip = $_SERVER['REMOTE_ADDR'];
}

Store $ip as a unique user ID in the DB, and maybe add another timestamp field so you can expire the user ID after XX seconds. This has some pitfalls, but for many cases it will work.
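A rough sketch of that idea, assuming a visitors table with ip and created_at columns (the table layout, connection details and the 60-second window are just examples):

<?php
// Assumes: CREATE TABLE visitors (ip VARCHAR(45) PRIMARY KEY,
//                                 created_at INT NOT NULL);
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Expire entries older than 60 seconds.
$pdo->prepare('DELETE FROM visitors WHERE created_at < ?')
    ->execute([time() - 60]);

try {
    // The primary key on ip makes a second INSERT for the same IP fail,
    // so only the first request in the window gets past this line.
    $pdo->prepare('INSERT INTO visitors (ip, created_at) VALUES (?, ?)')
        ->execute([$ip, time()]);

    // First request from this IP in the window: do the one-time work here.
} catch (PDOException $e) {
    // Duplicate request within the window: skip the work.
}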

BojanT