Is a multi-threaded client necessary in order to cause packet loss, if both the server and client are on the same machine? What about the case of a remote server? Suppose I'm sending packets to the server from the client sequentially (in a for loop); is packet loss possible here?
-
Packet loss on the loopback interface (if the traffic never leaves your local machine, e.g. you're sending to 127.0.0.1) is unlikely. If you're looking to simulate packet loss, though, there are ways to do that, such as this answer: http://stackoverflow.com/questions/614795/simulate-delayed-and-dropped-packets-on-linux – Liam Gray Feb 26 '17 at 13:43
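A minimal sketch of that kind of loss injection, assuming a Linux machine with root access; one common tool is the `netem` queueing discipline driven by `tc`, wrapped here in Python's `subprocess` purely for illustration. The interface name `lo` and the 25% loss figure are arbitrary choices:

```python
import subprocess

# Inject an artificial 25% packet loss on the loopback interface using netem
# (Linux only, needs root). The 25% figure is arbitrary.
subprocess.run(
    ["tc", "qdisc", "add", "dev", "lo", "root", "netem", "loss", "25%"],
    check=True,
)
try:
    pass  # run the client/server test here
finally:
    # Remove the qdisc so normal loopback behaviour is restored.
    subprocess.run(["tc", "qdisc", "del", "dev", "lo", "root", "netem"], check=True)
```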
-
But is multithreading necessary? Can't a for loop also cause packet loss in the case of a remote server? @LiamGray – Jarvis Feb 26 '17 at 13:46
-
Packet loss and multithreading are two completely separate things. I'm not entirely sure what you're asking. – Liam Gray Feb 26 '17 at 14:44
-
I'm trying to simulate packet loss on localhost. Do I need to create multiple client threads and send messages from them to the server to cause packet loss, or can I use a for loop to send packets sequentially and still cause packet loss? – Jarvis Feb 26 '17 at 14:49
-
Whether you use one client or many will have no impact on simulating packet loss; the clients will still send packets as normal. You will need an intermediate tool (perhaps `iptables` firewall rules) to drop/delete the packets before they are delivered over the loopback interface. – Liam Gray Feb 26 '17 at 14:53
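As a sketch of that "intermediate tool" idea, an `iptables` rule with the `statistic` match can randomly discard UDP datagrams before the receiving socket ever sees them (Linux, needs root). The port 9999 and the 0.25 probability are illustrative assumptions:

```python
import subprocess

# Randomly drop roughly 25% of UDP datagrams addressed to port 9999.
# The kernel firewall discards them before the server socket sees them.
rule = ["INPUT", "-p", "udp", "--dport", "9999",
        "-m", "statistic", "--mode", "random", "--probability", "0.25",
        "-j", "DROP"]
subprocess.run(["iptables", "-A"] + rule, check=True)
try:
    pass  # run the client/server test here
finally:
    # Delete the same rule afterwards so loopback traffic flows normally again.
    subprocess.run(["iptables", "-D"] + rule, check=True)
```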
-
Just make the server send the data fast and the client read the data slowly - this way data will be lost. No need for multithreading. – Steffen Ullrich Feb 26 '17 at 14:55
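A minimal sketch of that idea, with the direction flipped to match the question (a client sending UDP datagrams sequentially in a for loop to a deliberately slow server); the address, port, message count, and sleep interval are illustrative. Because the server reads more slowly than the client sends, its receive buffer overflows and the kernel silently discards the excess datagrams, even on 127.0.0.1:

```python
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 9999   # illustrative address and port
COUNT = 10_000                   # illustrative number of datagrams

def server():
    srv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    srv.bind((HOST, PORT))
    srv.settimeout(2.0)          # stop once the client has gone quiet
    received = 0
    try:
        while True:
            srv.recvfrom(2048)   # read one datagram at a time
            received += 1
            time.sleep(0.001)    # deliberately slow consumer
    except socket.timeout:
        pass
    print(f"received {received} of {COUNT} datagrams")

# The server runs in a thread only so the example fits in one file;
# normally client and server would be separate processes.
t = threading.Thread(target=server)
t.start()
time.sleep(0.2)                  # give the server time to bind

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for i in range(COUNT):           # plain sequential for loop, single client thread
    client.sendto(str(i).encode(), (HOST, PORT))

t.join()
```

The client itself is a single thread with a plain for loop, which matches the question: no multithreading on the sending side is needed for datagrams to be dropped.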