
I am attempting to connect my laptop to my standalone PC using the C# `TcpClient` class.

The laptop is running a simple console application and plays the role of the server.

The PC is running a Unity application (2018.1.6f1 with .NET 4.x Mono).

The code for sending is

public void SendData() {
    Debug.Log("Sending data");
    NetworkStream ns = client.GetStream();
    BinaryFormatter bf = new BinaryFormatter();
    TCPData data = new TCPData(true);
    using (MemoryStream ms = new MemoryStream()) {
        bf.Serialize(ms, data);
        byte[] bytes = ms.ToArray();
        ns.Write(bytes, 0, bytes.Length);
    }
}

The same code is used in the laptop's project, except Debug.Log() is replaced by Console.WriteLine().

For data reception I use

public TCPData ReceiveData() {
    Debug.Log("Waiting for Data");
    using (MemoryStream ms = new MemoryStream()) {
        byte[] buffer = new byte[2048];
        int i = stream.Read(buffer, 0, buffer.Length);
        stream.Flush();
        ms.Write(buffer, 0, buffer.Length);
        ms.Seek(0, SeekOrigin.Begin);
        BinaryFormatter bf = new BinaryFormatter();
        bf.Binder = new CustomBinder();
        TCPData receivedData = (TCPData)bf.Deserialize(ms);
        Debug.Log("Got the data");
        foreach (string s in receivedData.stuff) {
            Debug.Log(s);
        }
        return receivedData;
    }
}

Again, the same code is used on both sides.

The data I am trying to transfer looks like this

[Serializable, StructLayout(LayoutKind.Sequential)]
public struct TCPData {
    public TCPData(bool predefined) {
        stuff = new string[2] { "Hello", "World" };
        ints = new List<int>() {
            0,1,2,3,4,5,6,7,8,9
        };
    }
    public string[] stuff;
    public List<int> ints;
}
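For what it's worth, this struct round-trips through BinaryFormatter fine within a single process, which suggests the failures are in the transport rather than the serialization itself. A quick local check (the `RoundTripCheck` wrapper is just for demonstration; the StructLayout attribute is omitted here since it doesn't affect BinaryFormatter):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
public struct TCPData {
    public TCPData(bool predefined) {
        stuff = new string[2] { "Hello", "World" };
        ints = new List<int>() { 0, 1, 2, 3, 4, 5, 6, 7, 8, 9 };
    }
    public string[] stuff;
    public List<int> ints;
}

class RoundTripCheck {
    static void Main() {
        var bf = new BinaryFormatter();
        using (var ms = new MemoryStream()) {
            // Serialize and immediately deserialize from the same stream.
            bf.Serialize(ms, new TCPData(true));
            ms.Seek(0, SeekOrigin.Begin);
            var back = (TCPData)bf.Deserialize(ms);
            Console.WriteLine(string.Join(" ", back.stuff)); // Hello World
        }
    }
}
```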

The custom binder is from here; without it I get an assembly error.

With it I get: "Binary stream '0' does not contain a valid BinaryHeader. Possible causes are invalid stream or object version change between serialization and deserialization."

Now the problem:

Sending this from PC to Laptop - 100% success rate
Sending this from Laptop to PC - 20% success rate
(80% is the Exception above)

How is it even possible that it "sometimes" works?
Shouldn't it be 100% or 0%?
How do I get it to work?

Thanks

E: OK, thanks to all the suggestions I managed to increase the chance of success, but it still occasionally fails.

I send a data-size "packet", which is received correctly about 80% of the time, but in some cases the number I get from the byte[] is 3096224743817216 (insanely big) compared to the sent ~500.

I am using the Int64 data type.
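As a side note, that exact figure is consistent with the 8 length bytes being misaligned in the receive buffer: on a little-endian machine, a single stray byte in a high position dominates the Int64 value. A small demonstration (the byte layout here is an assumption, chosen to reproduce the reported number):

```csharp
using System;

class LengthMisreadDemo {
    static void Main() {
        // The intended length ~500 occupies only the low bytes of a
        // little-endian Int64.
        Console.WriteLine(BitConverter.ToInt64(BitConverter.GetBytes(500L), 0)); // 500

        // If the prefix bytes arrive misaligned, a stray byte in a high
        // position dwarfs the real value: 0x0B at byte index 6 alone is
        // 11 * 2^48 = 3096224743817216.
        byte[] misaligned = { 0, 0, 0, 0, 0, 0, 0x0B, 0 };
        Console.WriteLine(BitConverter.ToInt64(misaligned, 0)); // 3096224743817216
    }
}
```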

E2: In E1 I was sending the data-length packet separately; now I have them merged, which does interpret the length properly, but now I am unable to deserialize the data... every time I get: The input stream is not a valid binary format. The starting contents (in bytes) are: 00-00-00-00-00-00-04-07-54-43-50-44-61-74-61-02-00 ...

I read the first 8 bytes from the stream and treat the remaining 'x' as the data; deserializing it on the server works, but deserializing the same data on the client throws.

E3: Fixed it by rewriting the stream handling code, I made a mistake somewhere in there ;)
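The rewritten code isn't shown, but the length-prefixed framing suggested in the comments typically looks something like the sketch below. `SendFramed`, `ReceiveFramed`, and `ReadExactly` are illustrative names, not from the original; the methods take a plain `Stream` so the same code works against a NetworkStream or a MemoryStream:

```csharp
using System;
using System.IO;

static class Framing {
    // Prefix every message with its payload length as an 8-byte
    // little-endian Int64, then write the payload itself.
    public static void SendFramed(Stream s, byte[] payload) {
        byte[] prefix = BitConverter.GetBytes((long)payload.Length);
        s.Write(prefix, 0, prefix.Length);
        s.Write(payload, 0, payload.Length);
    }

    // Read the 8-byte prefix, then keep reading until the whole
    // payload has arrived.
    public static byte[] ReceiveFramed(Stream s) {
        long length = BitConverter.ToInt64(ReadExactly(s, 8), 0);
        return ReadExactly(s, (int)length);
    }

    // A single Read() may return fewer bytes than requested, so loop.
    static byte[] ReadExactly(Stream s, int count) {
        byte[] buffer = new byte[count];
        int offset = 0;
        while (offset < count) {
            int read = s.Read(buffer, offset, count - offset);
            if (read == 0)
                throw new EndOfStreamException("Connection closed mid-message");
            offset += read;
        }
        return buffer;
    }
}

class Demo {
    static void Main() {
        var ms = new MemoryStream();
        Framing.SendFramed(ms, new byte[] { 1, 2, 3 });
        ms.Seek(0, SeekOrigin.Begin);
        Console.WriteLine(Framing.ReceiveFramed(ms).Length); // 3
    }
}
```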

MoonKillCZ
  • Did you put `TCPData` in its own file or in another .cs file? That matters in Unity – Programmer Jul 27 '18 at 16:15
  • @Programmer yes I did, but does it really matter, since the networking has nothing to do with Unity? (The Unity tag is here just in case the .NET Framework version has anything to do with it) – MoonKillCZ Jul 27 '18 at 16:49
  • Most likely the data from PC to Laptop is being sent as a single datagram while the reverse direction the data is being split into multiple datagrams. The issue is how does the receive end know where the end of the data is found. To prove my point check the number of bytes received before de-serializing – jdweng Jul 27 '18 at 17:22
  • @jdweng total amount 543 bytes, I receive 536 and then the remaining 7... is that the reason for the ~20% success, that those 7 bytes sometimes end up in the first "batch"? Thanks for pointing me in the direction ;) – MoonKillCZ Jul 27 '18 at 17:36
  • As I said, TCP allows routers and servers to split datagrams randomly. You need to add a byte count at the beginning of the message and then, when you receive, read until you get all the bytes. – jdweng Jul 27 '18 at 17:47

1 Answer


NetworkStream.Read() doesn't block until it reads the requested number of bytes:

"This method reads data into the buffer parameter and returns the number of bytes successfully read. If no data is available for reading, the Read method returns 0. The Read operation reads as much data as is available, up to the number of bytes specified by the size parameter. If the remote host shuts down the connection, and all available data has been received, the Read method completes immediately and return zero bytes."

You must:

1) Know how many bytes you are expecting, and

2) Loop on Read() until you have received the expected bytes.

If you use a higher-level protocol like HTTP or Web Sockets they will handle this "message framing" for you. If you code on TCP/IP directly, then that's your responsibility.
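A minimal sketch of step 2, assuming the expected byte count is already known (`ReadExactly` is an illustrative name):

```csharp
using System;
using System.IO;

class ReadLoop {
    // Loop until exactly 'count' bytes have been read: Read() returns
    // whatever is currently available, which may be less than requested.
    static byte[] ReadExactly(Stream stream, int count) {
        byte[] buffer = new byte[count];
        int offset = 0;
        while (offset < count) {
            int read = stream.Read(buffer, offset, count - offset);
            if (read == 0)  // remote side closed before sending everything
                throw new EndOfStreamException();
            offset += read;
        }
        return buffer;
    }

    static void Main() {
        var ms = new MemoryStream(new byte[] { 10, 20, 30, 40 });
        Console.WriteLine(ReadExactly(ms, 4)[3]); // 40
    }
}
```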

David Browne - Microsoft
  • So the scenario that is happening is: one side sends 'x' amount of data out of a total 'y', the other side immediately consumes those bytes and returns because it has nothing else to receive, and right after that the remaining bytes arrive and are effectively lost? Why does it fail in only one direction? This is my first time playing with networking. – MoonKillCZ Jul 27 '18 at 16:45
  • Typically, very small messages aren't broken across Read(), so a message size difference between the directions could explain it, but it might also depend on other factors. – David Browne - Microsoft Jul 27 '18 at 17:12