
I was looking at the Create "Hello World" WebSocket example and began trying to adapt it to send a custom message, entered at the console, over the WebSocket, but I'm having a few issues understanding the code.

My question is: based on the answer I linked above, what is the correct way to encode my own message so that it can be sent correctly?

client.Send(my-own-message);
AreYouSure
  • From what little I can pick up from your question I'd guess that you have had very little to no prior experience with socket programming. I'd recommend that you bite off a smaller piece and start with basic TCP sockets before getting into this. – Spencer Ruport Nov 24 '12 at 11:16
  • Yea, I have very little experience with socket programming. Unfortunately it's something I need to try and get done. I would have preferred to start off smaller but time constraints have hampered me on this one. – AreYouSure Nov 24 '12 at 11:19
  • @Meekel What are you trying to send? Are you trying to send data to the web? Which class is your client ? – Danpe Nov 24 '12 at 11:25
  • @Danpe I was just trying to send a bit of JavaScript and then within my onmessage within the JS to eval it. I had seen an example called Waldo which was built on websocketpp which sort of did the same thing. – AreYouSure Nov 24 '12 at 11:33
  • Whoooa, what exactly are you asking? Is this a question about how to use an existing websocket implementation to send a message, or how to format your data according to the websocket specification, or how to read data from the standard input stream or, well, any of a dozen other things? Please try to narrow your question down to just *one* thing, and tell us the relevant background for that, and nothing else. If you are able to get a string from the console, then remove the "getting-string-from-console" part of the code and just show us what you do with the resulting string, for example – jalf Nov 24 '12 at 12:10
  • It would also help to know what type `client` is. Is that some class implementing the websocket protocol? Is it just a plain TCP socket? – jalf Nov 24 '12 at 12:11
  • @jalf Thanks. I've narrowed down the question. I want to encode my own message so it can be sent through the WebSocket based on the implementation I linked. – AreYouSure Nov 24 '12 at 12:19
  • @jalf Forgot to say, "client" is just a plain TCP socket based on the answer I linked in my question. – AreYouSure Nov 24 '12 at 12:21
  • @Meekel See the `SendString` method here http://stackoverflow.com/questions/13517712/websocket-server-client-cant-handshake/13520317#13520317 – L.B Nov 24 '12 at 12:26
  • @Meekel Thanks, it's a lot clearer now what you actually want to know. I've tried to post an answer. I hope it helps. :) – jalf Nov 24 '12 at 12:56

2 Answers


You should really go to the source. The Websocket specification is actually fairly straightforward to read, and it tells you exactly how your messages should be formatted.

But in short, and assuming you've already completed the initial handshake establishing the connection, here is what data a Websocket frame should contain:

  • an opcode byte, with the value 0x81 if the message is UTF-8 text and 0x82 if the message is binary data; this is the FIN bit (0x80) combined with opcode 0x1 for text or 0x2 for binary. (Note that a couple of browsers do not support binary frames.)
  • a length field of one or more bytes, describing the length of the message. The most significant bit of the first length byte must be set on messages sent by the client (it indicates that the payload is masked, which must be done on client-to-server messages and must not be done on server-to-client messages). The length field has a variable size: if the length is below 126 bytes, it is encoded directly in the remaining 7 bits of that single byte; if the length fits in 16 bits (126 up to 65,535 bytes), those 7 bits take the value 126 and the two subsequent bytes contain the length as a big-endian 16-bit integer; otherwise, the 7 bits take the value 127 and the subsequent 8 bytes contain the length as a big-endian 64-bit integer.
  • a 4-byte masking key, which must be picked randomly for every message
  • and finally, the actual message you wish to send. This must be masked using the masking key, simply by XOR'ing each byte of the message with a byte of the key: byte i of the message is XOR'ed with byte i % 4 of the masking key.

Do this, and you've created a valid websocket frame containing either UTF-8 text or raw binary data. As you can see, there are a few steps involved, but each is relatively straightforward. (And again, please check against the RFC I linked to, because I wrote all of this from memory, so there might be minor inaccuracies.)
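To make those steps concrete, here is a minimal, untested sketch of that encoding in C#. The name EncodeTextFrame is made up for illustration, and it assumes client is a connected plain TCP System.Net.Sockets.Socket:

static byte[] EncodeTextFrame(string message)
{
    byte[] payload = System.Text.Encoding.UTF8.GetBytes(message);
    var frame = new System.IO.MemoryStream();

    // First byte: FIN bit (0x80) combined with the text opcode (0x1) = 0x81.
    frame.WriteByte(0x81);

    // Length field; the mask bit (0x80) is set because client frames must be masked.
    if (payload.Length < 126)
    {
        frame.WriteByte((byte)(0x80 | payload.Length));
    }
    else if (payload.Length <= 0xFFFF)
    {
        frame.WriteByte(0x80 | 126);
        frame.WriteByte((byte)(payload.Length >> 8));   // 16-bit length, big-endian
        frame.WriteByte((byte)payload.Length);
    }
    else
    {
        frame.WriteByte(0x80 | 127);
        for (int shift = 56; shift >= 0; shift -= 8)    // 64-bit length, big-endian
            frame.WriteByte((byte)((long)payload.Length >> shift));
    }

    // 4-byte masking key, picked randomly for every message.
    byte[] mask = new byte[4];
    using (var rng = System.Security.Cryptography.RandomNumberGenerator.Create())
        rng.GetBytes(mask);
    frame.Write(mask, 0, 4);

    // Mask the payload: byte i is XOR'ed with byte i % 4 of the masking key.
    for (int i = 0; i < payload.Length; i++)
        frame.WriteByte((byte)(payload[i] ^ mask[i % 4]));

    return frame.ToArray();
}

Sending a line from the console would then look something like:

string msg = Console.ReadLine();
client.Send(EncodeTextFrame(msg));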

jalf

First of all, Console.Read() reads only a single character and returns an int representing that character.

If you want to send a whole message, you probably want Console.ReadLine(), which returns a string.

string msg = Console.ReadLine();
client.Send(GetBytes(msg));

// Copies the string's raw UTF-16 code units straight into a byte array
// (two bytes per char), with no re-encoding.
static byte[] GetBytes(string str)
{
    byte[] bytes = new byte[str.Length * sizeof(char)];
    System.Buffer.BlockCopy(str.ToCharArray(), 0, bytes, 0, bytes.Length);
    return bytes;
}

If the encoding matters:

byte[] ascii = System.Text.Encoding.ASCII.GetBytes(msg);
byte[] utf8 = System.Text.Encoding.UTF8.GetBytes(msg);
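
For example, with a hypothetical input string (using the GetBytes helper above), the three approaches produce different bytes:

string msg = "héllo";                                     // 5 chars
byte[] raw   = GetBytes(msg);                             // 10 bytes: raw UTF-16, 2 per char
byte[] ascii = System.Text.Encoding.ASCII.GetBytes(msg);  // 5 bytes: 'é' is lost, becomes '?' (0x3F)
byte[] utf8  = System.Text.Encoding.UTF8.GetBytes(msg);   // 6 bytes: 'é' becomes 0xC3 0xA9

Note that if the bytes are going into a WebSocket text frame, the payload must be UTF-8.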
Danpe
  • I believe the question is about making the message conform to the websocket protocol. So it is not simply copying the data, but prepending the correct header and masking the payload and so on so that it can be sent over a TCP socket, and received by a websocket server. :) – jalf Nov 24 '12 at 15:37