
I'm attempting to create a remote desktop server and client using C#. The server captures the screen and sends it to the client via a socket. I'm using the code below, but only part of the JPEG image is displayed on the client. I think this is because the image is sent in multiple packets and, at the moment, the code only reads one packet and displays it. Can anyone explain how I would change my code so it receives multiple packets (the whole image) before displaying it?

Server code:

Socket serverSocket;
Socket clientSocket;

public Form1()
{
    InitializeComponent();

    backgroundWorker1.RunWorkerAsync();
}

private void backgroundWorker1_DoWork(object sender, DoWorkEventArgs e)
{
    try
    {
        serverSocket = new Socket(AddressFamily.InterNetwork,
                                  SocketType.Stream,
                                  ProtocolType.Tcp);

        IPEndPoint ipEndPoint = new IPEndPoint(IPAddress.Any, 8221);

        serverSocket.Bind(ipEndPoint);
        serverSocket.Listen(4);

        //Accept the incoming clients
        serverSocket.BeginAccept(new AsyncCallback(OnAccept), null);
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message, "Stream Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
    }
}

private void timer1_Tick(object sender, EventArgs e)
{
    timer1.Stop();

    Rectangle bounds = new Rectangle(0, 0, 1280, 720);
    Bitmap bitmap = new Bitmap(bounds.Width, bounds.Height);

    using (Graphics g = Graphics.FromImage(bitmap))
    {
        g.CopyFromScreen(Point.Empty, Point.Empty, bounds.Size);
    }

    System.IO.MemoryStream stream = new System.IO.MemoryStream();

    ImageCodecInfo myImageCodecInfo;
    System.Drawing.Imaging.Encoder myEncoder;
    EncoderParameter myEncoderParameter;
    EncoderParameters myEncoderParameters;

    myEncoderParameters = new EncoderParameters(1);

    myImageCodecInfo = GetEncoderInfo("image/jpeg");
    myEncoder = System.Drawing.Imaging.Encoder.Quality;
    myEncoderParameter = new EncoderParameter(myEncoder, 40L);
    myEncoderParameters.Param[0] = myEncoderParameter;

    bitmap.Save(stream, myImageCodecInfo, myEncoderParameters);

    byte[] imageBytes = stream.ToArray();

    stream.Dispose();
    bitmap.Dispose(); // avoid leaking a Bitmap on every tick

    clientSocket.Send(imageBytes);

    timer1.Start();
}

As you can see, I'm using a timer with its interval set to 30 ms to send the image bytes.

Client code:

public Socket clientSocket;

byte[] byteData = new byte[2048];
MemoryStream ms;

public Form1()
{
    InitializeComponent();

    backgroundWorker1.RunWorkerAsync();

    this.DoubleBuffered = true;
}

private void backgroundWorker1_DoWork(object sender, DoWorkEventArgs e)
{
    try
    {
        clientSocket = new Socket(AddressFamily.InterNetwork,
                       SocketType.Stream, ProtocolType.Tcp);

        IPEndPoint ipEndPoint = new IPEndPoint(IPAddress.Parse("MY EXTERNAL IP HERE"), 8221);

        //Connect to the server
        clientSocket.BeginConnect(ipEndPoint,
            new AsyncCallback(OnConnect), null);

    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message, "SGSclient",
                        MessageBoxButtons.OK,
                        MessageBoxIcon.Error);
    }
}

private void OnConnect(IAsyncResult ar)
{
    try
    {
        //Start listening to the data asynchronously
        clientSocket.BeginReceive(byteData,
                                   0,
                                   byteData.Length,
                                   SocketFlags.None,
                                   new AsyncCallback(OnReceive),
                                   null);
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message, "Stream Error",
            MessageBoxButtons.OK, MessageBoxIcon.Error);
    }
}

private void OnReceive(IAsyncResult ar)
{
    try
    {
        int byteCount = clientSocket.EndReceive(ar);

        ms = new MemoryStream(byteData);

        using (BinaryReader br = new BinaryReader(ms))
        {
            this.BackgroundImage = Image.FromStream(ms).GetThumbnailImage(this.ClientRectangle.Width, this.ClientRectangle.Height, null, IntPtr.Zero);
        }

    }
    catch (ArgumentException e)
    {
        //MessageBox.Show(e.Message);
    }

    clientSocket.BeginReceive(byteData, 0, byteData.Length, SocketFlags.None, new AsyncCallback(OnReceive), null);
}

The client is meant to receive the image and then display it on the form's background.

  • You must set a delimiter at the end of your image, some set of bytes that will signal the receiver that the image has ended. It's also known as **end of file** (EOF). TCP won't split the image into logical packets for your app, so you must implement your own framing. – Andre Calil Aug 03 '12 at 18:57

3 Answers


You need to add an application-level protocol to your socket communications.

Add a header to every message you send. The header contains the count of bytes that follow. A byte count is simpler to code and performs better than kludging a termination sequence.

The client then does two kinds of reads:

1) Read the fixed number of bytes that make up the header.

2) After extracting the byte count from the header, keep reading in a loop until you have received that many bytes.

A must-read article for anyone writing socket communications: http://nitoprograms.blogspot.com/2009/04/message-framing.html From that article: Repeat this mantra three times: "TCP does not operate on packets of data. TCP operates on streams of data."
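To make this concrete, here is a minimal sketch of length-prefixed framing applied to the code in the question. It is not part of the original answer: the 4-byte length prefix (written with BitConverter, so both ends use the same host byte order) and the ReceiveExact helper are assumptions of mine, and the sockets are presumed to be connected as in the code above.

Sending side (inside timer1_Tick, after the JPEG has been written to the stream):

byte[] imageBytes = stream.ToArray();

// Send a 4-byte length header first, then the image itself.
byte[] header = BitConverter.GetBytes(imageBytes.Length);
clientSocket.Send(header);
clientSocket.Send(imageBytes);

Receiving side (a blocking read instead of the single BeginReceive):

// Reads exactly 'count' bytes, looping because one Receive call
// may return fewer bytes than requested.
private static byte[] ReceiveExact(Socket socket, int count)
{
    byte[] buffer = new byte[count];
    int offset = 0;
    while (offset < count)
    {
        int read = socket.Receive(buffer, offset, count - offset, SocketFlags.None);
        if (read == 0)
            throw new SocketException((int)SocketError.ConnectionReset); // remote end closed

        offset += read;
    }
    return buffer;
}

// For each frame: read the header, then read exactly that many payload bytes.
byte[] header = ReceiveExact(clientSocket, 4);
int length = BitConverter.ToInt32(header, 0);
byte[] imageBytes = ReceiveExact(clientSocket, length);

// Keep the stream alive while the Image is in use (a GDI+ requirement),
// and marshal back to the UI thread before touching the form.
Image frame = Image.FromStream(new MemoryStream(imageBytes));
this.Invoke((MethodInvoker)(() => this.BackgroundImage = frame));

If the two machines might not share the same endianness, the prefix could instead be written with IPAddress.HostToNetworkOrder and read back with IPAddress.NetworkToHostOrder.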

– aaaa bbbb

I have previously answered a similar question, and provided a fully working example that I think does exactly what you are trying to do. See: transferring a screenshot over a TCP connection

– Iridium

You must set a delimiter at the end of your image: some set of bytes that signals the receiver that the image has ended. It's also known as end of file (EOF) or end of message. TCP won't split the image into logical packets for your app, so you must implement your own framing.

The logic would be similar to this:

Sending side (in your setup, the server):

byte[] EndOfMessage = System.Text.Encoding.ASCII.GetBytes("image_end");
byte[] ImageBytes = GetImageBytes();
byte[] BytesToSend = new byte[ImageBytes.Length + EndOfMessage.Length];

// Copy the image first, then append the terminator.
Array.Copy(ImageBytes, 0, BytesToSend, 0, ImageBytes.Length);
Array.Copy(EndOfMessage, 0, BytesToSend, ImageBytes.Length, EndOfMessage.Length);

SendToServer(BytesToSend);

Receiving side (in your setup, the client):

byte[] EndOfMessage = System.Text.Encoding.ASCII.GetBytes("image_end");
byte[] ReceivedBytes = new byte[0];

while (!IsEndOfMessage(ReceivedBytes, EndOfMessage))
{
    // continue reading from the socket and appending to ReceivedBytes
}

ReceivedBytes = RemoveEndOfMessage(ReceivedBytes, EndOfMessage);
PrintImage(ReceivedBytes);

I'm at work right now and I can't provide a full running example, I'm sorry.

Regards


Support methods:

private bool IsEndOfMessage(byte[] MessageToCheck, byte[] EndOfMessage)
{
    // Not enough bytes received yet to hold the terminator.
    if (MessageToCheck.Length < EndOfMessage.Length)
        return false;

    // Compare the tail of the received bytes against the terminator.
    for (int i = 0; i < EndOfMessage.Length; i++)
    {
        if (MessageToCheck[MessageToCheck.Length - EndOfMessage.Length + i] != EndOfMessage[i])
            return false;
    }

    return true;
}

private byte[] RemoveEndOfMessage(byte[] MessageToClear, byte[] EndOfMessage)
{
    byte[] Return = new byte[MessageToClear.Length - EndOfMessage.Length];
    Array.Copy(MessageToClear, Return, Return.Length);

    return Return;
}

Again, I couldn't test them, so you may find some bugs.
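To show how these helpers could be wired up, here is a rough sketch of the receive loop, assuming the connected clientSocket from the question's client code. The 8192-byte chunk size and the MemoryStream used to accumulate the data are my own choices, and it only handles a single image; as the comments below point out, bytes belonging to the next frame could arrive in the same Receive call and would need extra handling.

byte[] EndOfMessage = System.Text.Encoding.ASCII.GetBytes("image_end");

using (MemoryStream received = new MemoryStream())
{
    byte[] chunk = new byte[8192];

    // Keep reading until the accumulated bytes end with the terminator.
    // (Calling ToArray on every pass is wasteful, but keeps the sketch short.)
    while (!IsEndOfMessage(received.ToArray(), EndOfMessage))
    {
        int read = clientSocket.Receive(chunk, 0, chunk.Length, SocketFlags.None);
        if (read == 0)
            break; // remote end closed the connection

        received.Write(chunk, 0, read);
    }

    byte[] imageBytes = RemoveEndOfMessage(received.ToArray(), EndOfMessage);

    // Keep the stream alive while the Image is in use (a GDI+ requirement).
    // (If this runs off the UI thread, marshal the assignment with Invoke.)
    this.BackgroundImage = Image.FromStream(new MemoryStream(imageBytes));
}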

– Andre Calil
  • It is better to use a header that has a count of bytes instead of a termination string. – aaaa bbbb Aug 03 '12 at 20:12
  • That's another solution. Anyway, you would still need to agree on a convention for where the header ends (and the message starts). So I wouldn't say that it's *better*. It's just another way. – Andre Calil Aug 03 '12 at 20:18
  • 1
    No, for binary data byte count header IS better. A convention that the first 4 bytes in the message are the message length is simple. Using a termination string requires expensive processing to detect the string in the bytes received. Another problem is when the bytes transmitted have the same pattern as the termination string. For even one more problem, the packets sent might get combined with the next message, and your scheme doesn't separate data received into two separate messages. – aaaa bbbb Aug 03 '12 at 20:52
  • (1) You're relying on conventions. (2) The OP is sending an image; there's no chance of a conflict with the EOM string. (3) What do you mean by "next message"? TCP has no *message* concept, we're just concatenating bytes. (4) Mind your answer; the OP will decide which serves better. Regards. – Andre Calil Aug 03 '12 at 20:56
  • Data sent over sockets can contain data from more than one Send call. What the network does with the data is the network's business. You might do a send with one image buffer, then a short time later do another send with another image buffer. Both buffers might arrive in the same packet, with the Receive call getting all the data in the one packet--the data from two different Send calls. – aaaa bbbb Aug 03 '12 at 21:15
  • All data transmissions over sockets rely on conventions. Header with byte count is a convention. Text with carriage return as terminator is a convention. "image_end" concatenated at the end of the buffer is a convention. – aaaa bbbb Aug 03 '12 at 21:17