I'm using the latest .NET Core-targeting Lidgren fork: https://github.com/soccermitchy/lidgren-network-gen3, and I'm trying to simulate packet loss and high latency.
There is documentation on how to do this here: https://github.com/lidgren/lidgren-network-gen3/wiki/Lag-Simulation.
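From my reading of that page, lag simulation boils down to setting a handful of properties on the configuration. The property names below are taken from the wiki and the linked source, so treat this as a sketch rather than code I have been able to compile:

config.SimulatedLoss = 0.5f;             // drop roughly 50% of sent packets
config.SimulatedDuplicatesChance = 0.1f; // duplicate roughly 10% of sent packets
config.SimulatedMinimumLatency = 0.1f;   // base one-way latency in seconds (100 ms)
config.SimulatedRandomLatency = 0.05f;   // plus up to 50 ms of random jitter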
This is how I set up my net peer configuration:
config = new NetPeerConfiguration(name);
// This line fails to compile
config.SimulatedLoss = 0.5f;
config.Port = NetConfig.port;
config.MaximumConnections = 200;
config.EnableMessageType(NetIncomingMessageType.ConnectionApproval);
The line config.SimulatedLoss = 0.5f; does not compile; the error says that NetPeerConfiguration does not contain a definition for SimulatedLoss.
When I take a look at the source code (https://github.com/soccermitchy/lidgren-network-gen3/blob/master/Lidgren.Network/NetPeerConfiguration.cs#L468), I see that SimulatedLoss is declared between #if DEBUG and #endif directives.
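So the declaration is shaped roughly like this (simplified from the linked file; m_loss is my shorthand for whatever the real backing field is called):

#if DEBUG
// Simulated fraction of sent packets lost, from 0.0f to 1.0f
public float SimulatedLoss
{
    get { return m_loss; }
    set { m_loss = value; }
}
#endif

Since the NuGet package is presumably built in Release mode, this whole block is compiled out of the assembly I'm referencing.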
How can I use code from a NuGet package that is only compiled in debug builds?
I tried enabling a few options under Tools -> Options -> Debugging and a few other things, but I really can't find an answer anywhere.
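For clarity, guarding the call on my own side, as below, shouldn't help either: as I understand it, the #if DEBUG here only controls whether my line is compiled, while the property itself is already missing from the Release-built package assembly.

#if DEBUG
// Compiles only when DEBUG is defined in *my* project, but still fails,
// because SimulatedLoss does not exist in the referenced Release DLL.
config.SimulatedLoss = 0.5f;
#endif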
I apologize in advance if this question has already been asked multiple times before.