
I have an OpenSSL server and an Objective-C client. I send a message like this:

uint32_t testD = 161;
err = SSL_write(ssl_, &testD, sizeof(uint32_t));

and read it with NSInputStream like this:

case NSStreamEventHasBytesAvailable:
        {
                uint8_t buffer[4];
                int len;
                while ([inStream hasBytesAvailable])
                {
                    len = [inStream read:buffer maxLength:sizeof(buffer)];
                    if (len > 0)
                    {
                        NSString *output = [[NSString alloc] initWithBytes:buffer length:len encoding:NSASCIIStringEncoding];
                        NSData *theData = [[NSData alloc] initWithBytes:buffer length:len];
                        if (nil != output)
                        {
                            char buff;
                            [theData getBytes:&buff length:1];
                            uint32_t temp = (uint32_t)buffer;
                        }
        ...

So in `output` I have "¡", which is character 161 in extended ASCII; in `buff` I have `'\xa1'`; and in `temp` a very big number, but I actually need 161 in `temp`.

I read that `'\xa1'` is also 161, but I can't cast it to `uint32_t`. What is the problem?


ANSWER:

The problem was in casting. This works fine for me:

unsigned char buff;
int temp = buff;

or

char buff;
int b = (unsigned char) buff;
Vlad
    There is no encoding in SSL_write(), just encryption etc. It transmits whatever you give it. The problem lies elsewhere. If it exists. `\xa1 == 161` identically, not via any encoding. What was the 'very big number'? – user207421 Mar 01 '15 at 00:24
  • @EJP, it's changing from compilation to compilation: 3221078568 and 3220841000, for example. `output` isn't changing; it's always "¡". – Vlad Mar 01 '15 at 00:37
  • Well your code isn't valid. You sent 4 bytes, so you should be reading 4 bytes, and using all of them to form `temp`. `3221078568 == 0xBFFDC228` and `3220841000 == 0xBFFA2228`. I shouldn't have had to post those conversions for you. – user207421 Mar 01 '15 at 00:54
  • @EJP, what are you talking about? The max size of the buffer is 4 bytes, so I read exactly 4 bytes, and I use the whole buffer to form `temp`. What is wrong? – Vlad Mar 01 '15 at 01:14
  • You have `char buff` and `length:1`. Sure looks like one byte to me. – user207421 Mar 01 '15 at 07:10
  • @EJP, yes, but I cast the whole `buffer` to `temp`, not `char buff`: `uint32_t temp = (uint32_t)buffer;` – Vlad Mar 01 '15 at 11:03
  • If your test was `len >= 4` you would have four bytes. As it is you have at least one byte but up to three bytes of junk. – user207421 Mar 01 '15 at 18:51
  • Possible duplicate of [Network byte order and endianness issues](http://stackoverflow.com/questions/22889283/network-byte-order-and-endianness-issues), [Converting network byte order (big endian) to little endian](http://stackoverflow.com/q/21143877), [Network Byte Order in sockets](http://stackoverflow.com/q/28253786), etc. – jww Mar 01 '15 at 20:03
  • If that was the answer I retract my comment saying otherwise, but it only works for byte values, in which case why are you sending an `int`? – user207421 Mar 01 '15 at 23:58
  • @EJP, I encode information in an int, actually a uint32_t, like `0x80 | 0x21`, where 0x80 and 0x21 are pieces of information for the client. – Vlad Mar 02 '15 at 00:10
  • But why send 32 bits if you only use 8? And if you intend to use 32, or even 16, your solution above won't work. – user207421 Mar 02 '15 at 00:30

3 Answers


No encoding is used by `SSL_write()`, and `\xa1 == 161` is a mathematical identity, not the result of any encoding process. As you're successfully recovering `\xa1`, clearly no decoding is used by `NSInputStream` either.

It seems to me that you're casting the address of the buffer rather than its contents, which is why you get a high value that varies with compilation.

In addition you are possibly over-running the data by reading whatever is available and then only consuming four bytes of it: fewer, in fact, because you're testing `len > 0` rather than `len >= 4`.

You should:

  1. Use a buffer of exactly four bytes. No need to allocate it dynamically: you can declare it as a local array.
  2. Read until you have read four bytes. This requires a loop.
  3. Change the casting syntax (don't ask me how, I'm no Objective-C expert, but the code that recovers `buff` looks like a good start), so that you get the content of the buffer instead of the address.

After that you may then have endian issues.
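
A minimal sketch of all three points, assuming the partial-read state (`buffer`, `received`) lives in the stream delegate so it survives across events, and that the sender uses `htonl` as suggested in another answer below (hypothetical variable names):

#import <arpa/inet.h> // ntohl()

// Ivars in the stream delegate: uint8_t buffer[4]; NSUInteger received;
case NSStreamEventHasBytesAvailable:
{
    while ([inStream hasBytesAvailable] && received < sizeof(buffer))
    {
        NSInteger len = [inStream read:buffer + received
                             maxLength:sizeof(buffer) - received];
        if (len <= 0)
            break; // error or stream closed
        received += (NSUInteger)len;
    }
    if (received == sizeof(buffer))
    {
        uint32_t temp;
        memcpy(&temp, buffer, sizeof(temp)); // copy the contents, not the address
        temp = ntohl(temp); // only if the sender used htonl()
        received = 0;       // reset for the next message
    }
    break;
}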

Nothing to do with encoding.

user207421

We can get a single byte value like this:

unsigned char buff;
int temp = buff;

Or

char buff;
int b = (unsigned char) buff;
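
The reason the explicit `unsigned char` matters: plain `char` is signed on iOS and OS X, so `'\xa1'` holds the value -95, and converting it straight to a wider integer sign-extends it. A small illustration (hypothetical variable names):

char buff = '\xa1';              // bit pattern 0xA1, value -95 as signed char
int wrong = buff;                // sign-extended: -95
uint32_t big = (uint32_t)buff;   // -95 reinterpreted: 4294967201
int right = (unsigned char)buff; // 161, as intended
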
user207421
Vlad

What encoding is used in SSL_write and NSInputStream?

There is no encoding. It's bytes in and bytes out.

I think you are looking for network byte order/endianness.

Network byte order is big endian. So your code would become:

uint32_t testD = 161;
uint32_t be = htonl(testD);
err = SSL_write(ssl_, &be, sizeof(be));

Here's the description of htonl from the htonl(3) man page:

The htonl() function converts the unsigned integer hostlong from host byte order to network byte order.

To convert back, you would use ntohl.
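
On the receiving side, a minimal sketch (assuming the four bytes have already been collected into `buffer` as in the question; `ntohl` is declared in `<arpa/inet.h>`):

#include <arpa/inet.h> // ntohl()

uint32_t be;
memcpy(&be, buffer, sizeof(be)); // the bytes themselves, not the pointer
uint32_t testD = ntohl(be);      // network (big-endian) to host order: 161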

I'm not sure if Cocoa/Cocoa Touch offers a replacement for htonl and ntohl, so you might have to use them in your iPhone projects, too. See, for example, Using ntohl and htonl problems on iPhone.

jww
  • No, you are wrong. The problem was not byte order: I send these bytes and I receive these bytes, so whatever order I send in is the order I receive. The problem was in the missing `unsigned char`: `'\xa1'` is 161, but if I cast it straight to an integer it produces the wrong value, so I need to cast first to `unsigned char` and then to an integer. – Vlad Mar 01 '15 at 23:02
  • @Vladislav - your question was about encoding and byte ordering. Check your title. Given that, I'm not sure how the answer was wrong :o Surely you don't expect the community to have crystal balls.... – jww Mar 01 '15 at 23:09
  • @Vladislav Therefore the problem is in how you're casting, and has nothing to do with encoding. Your reasoning above about endianness is fallacious. – user207421 Mar 01 '15 at 23:26