Short version
Is this an identity function?
f = (gₐ · hᵤ · gᵤ · hₐ)

where:

- hₐ is the UTF-16 conversion from bytes to string,
- gₐ is the UTF-16 conversion from string to bytes,
- gᵤ is `Encoding.UTF8.GetBytes()`,
- hᵤ is `Encoding.UTF8.GetString()`.
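To make the composition concrete, here is a minimal sketch in Python, using `utf-16-le` decoding/encoding as a stand-in for the .NET UTF-16 conversions (an assumption: .NET strings are little-endian UTF-16, as with `Encoding.Unicode`) and `utf-8` for the `Encoding.UTF8` calls:

```python
# Sketch of f = g_a . h_u . g_u . h_a, with Python codecs standing in
# for the .NET conversions (assumes little-endian UTF-16 throughout).
def f(data: bytes) -> bytes:
    s = data.decode("utf-16-le")    # h_a: bytes -> string (UTF-16)
    b = s.encode("utf-8")           # g_u: string -> bytes (UTF-8)
    t = b.decode("utf-8")           # h_u: bytes -> string (UTF-8)
    return t.encode("utf-16-le")    # g_a: string -> bytes (UTF-16)

# For bytes that happen to be well-formed UTF-16 text, f is the identity:
original = "hello".encode("utf-16-le")
assert f(original) == original
```

The open question is whether this still holds for arbitrary binary input, which need not be well-formed UTF-16.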
Long version
I'm using WebSocket4Net to send and receive messages through WebSockets between a C# application and a C# service.
Since some messages are binary, I have to convert them to and from strings when interacting with the library: while its `Send()` method accepts an array of bytes, its `MessageReceived` event delivers the received message only as a string.
To convert bytes to string and string to bytes, I follow the answer by Mehrdad, which uses the internal encoding of the .NET Framework, i.e. UTF-16.
On the other hand, according to the source code (see, for example, DraftHybi10Processor.cs, line 114), WebSocket4Net uses UTF-8 to convert string to bytes and bytes to string.
Would it cause issues? Is data loss possible?
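To illustrate the kind of input I am worried about, here is a small Python experiment with a lone UTF-16 surrogate, which is valid as raw bytes but not as well-formed text. Python refuses outright to encode a lone surrogate as UTF-8; my understanding (an assumption on my part) is that .NET's `Encoding.UTF8.GetBytes()` would instead substitute U+FFFD via its default replacement fallback, so the original bytes would not survive either way:

```python
# Arbitrary binary data need not decode to well-formed text.
# 0xD800 is an unpaired high surrogate in UTF-16-LE.
lone = b"\x00\xd8"
s = lone.decode("utf-16-le", errors="surrogatepass")
try:
    s.encode("utf-8")
    print("round-trippable")
except UnicodeEncodeError:
    print("lossy: a lone surrogate cannot be encoded as UTF-8")
```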