How can I simulate lost packets to test my web app's handling of network glitches?
My background is in video game development ( http://xona.com/ ), and I know networking code can be tested by simulating lost packets, with a setting from 0% to 100%, to see how the game performs under poor network conditions. I believe Microsoft offered a tool for this, but I can't find it at the moment. I'm interested in any solution and advice on the matter.
The web app I wish to debug runs on a network with 0.06% to 0.07% packet loss on average. We think this is high enough to cause problems for users, but it is too low for developers to reproduce reliably. In weeks of working on this I have only reproduced an issue twice, and then couldn't reproduce it again immediately afterwards to debug it. I would like to simulate a higher rate of packet loss to reproduce the issue, but only between my machine and the server running the web app, not between everyone and that server.
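In game-dev terms, what I want is that 0%-to-100% loss slider, but applied only to my own machine's traffic to this one server. On a Linux dev box, one idea I've been sketching (just an assumption on my part, not something I've validated) is to randomly drop a fraction of outbound packets to the server's address using iptables' "statistic" match; the server IP and the 5% rate below are placeholders, and it needs root:

    # Sketch: randomly drop ~5% of outbound packets to just the app server,
    # using iptables' "statistic" match (Linux only, needs root).
    # SERVER_IP and LOSS are placeholders for my setup.
    import subprocess

    SERVER_IP = "203.0.113.10"   # placeholder for the web app server's IP
    LOSS = "0.05"                # probability of dropping each packet (5%)

    RULE = ["OUTPUT", "-d", SERVER_IP,
            "-m", "statistic", "--mode", "random", "--probability", LOSS,
            "-j", "DROP"]

    def start_loss():
        subprocess.run(["iptables", "-A"] + RULE, check=True)

    def stop_loss():
        subprocess.run(["iptables", "-D"] + RULE, check=True)

    if __name__ == "__main__":
        start_loss()
        input(f"Dropping packets to {SERVER_IP} with probability {LOSS}; press Enter to stop...")
        stop_loss()

The appeal is that it only touches packets from my machine to that one host, so nobody else using the server is affected, but I don't know if this is the best way to do it.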
I hope this explains the situation!
EDIT: Potential duplicate, but it only covers Linux machines, and we may need to do this on Windows machines too (it depends on which developer is given the task): Simulate delayed and dropped packets on Linux
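If the netem approach from that Linux question is the right direction, I'm assuming it could be narrowed to a single host by hanging netem off one band of a prio qdisc and steering only the server-bound traffic into that band with a u32 filter. Again only a sketch: eth0, the server IP, and the 5% loss are placeholders, it needs root, and a Windows equivalent would still be an open question:

    # Sketch: per-host packet loss with tc/netem (Linux only, needs root).
    # Attach netem to band 3 of a prio qdisc and steer only packets headed
    # for the server into that band. IFACE and SERVER_IP are placeholders.
    import subprocess

    IFACE = "eth0"
    SERVER_IP = "203.0.113.10"

    CMDS = [
        # root prio qdisc (3 bands by default)
        f"tc qdisc add dev {IFACE} root handle 1: prio",
        # netem on band 3, dropping 5% of packets at random
        f"tc qdisc add dev {IFACE} parent 1:3 handle 30: netem loss 5%",
        # send only traffic destined for the server through band 3
        f"tc filter add dev {IFACE} protocol ip parent 1:0 prio 3 "
        f"u32 match ip dst {SERVER_IP}/32 flowid 1:3",
    ]

    for cmd in CMDS:
        subprocess.run(cmd.split(), check=True)

    # To undo everything afterwards: tc qdisc del dev eth0 root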