
I have some image data (JPEG) that I want to send from my iPhone app to my webservice. To do this, I'm taking the NSData from the image and converting it into a string to place in my JSON.

Currently, I'm doing this:

    NSString *secondString = [[NSString alloc] initWithBytes:[result bytes]
                                                      length:[result length]
                                                    encoding:NSUTF8StringEncoding];

Here, result is of type NSData. However, secondString comes back null even though [result length] returns a real value (like 14189). I used this initializer because result is raw data and not null-terminated.

Am I doing something wrong? I've used this code elsewhere and it seems to work fine (though those places involve text, not image data).

TIA.

pschang

2 Answers


For binary data, it's better to encode it using Base64 and then decode it in your webservice. I use the NSData+Base64 class downloaded from here; this reference was also taken from Stack Overflow, from an answer by @Ken (thanks Ken!).
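For illustration, here's a minimal sketch of the round trip. It uses Foundation's built-in Base64 methods, which were added in iOS 7 / OS X 10.9; on the SDKs current when this was asked, the NSData+Base64 category provides equivalent methods instead:

    // Encode the raw JPEG bytes as a Base64 string that is safe to
    // embed in JSON. (Built-in on iOS 7+ / OS X 10.9+; on older SDKs,
    // substitute the corresponding NSData+Base64 category method.)
    NSString *secondString = [result base64EncodedStringWithOptions:0];

    // On the server side (or to verify the round trip), decode it back:
    NSData *roundTrip = [[NSData alloc] initWithBase64EncodedString:secondString
                                                            options:0];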

Manny
  • I tried Base64 before but got some illegal JSON characters. I'll look closer into the encoding and see if I can find the error. – pschang Nov 10 '10 at 16:11
  • Sweet, it worked. Looks like the base64 function I was using needed a little tweaking. Thanks! – pschang Nov 10 '10 at 17:34

You are not converting the data to a string. You are attempting to interpret it as a UTF-8 encoded string, which will fail unless the data really is a UTF-8 encoded string. Your best bet is to encode it somehow, perhaps with Base64 as Manny suggests, and then decode it again on the server.
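To see the failure concretely, here is a small sketch (not from the original answer): JPEG data begins with the byte 0xFF, which is never legal in UTF-8, so the UTF-8 interpretation returns nil.

    // JPEG data starts with the SOI marker 0xFF 0xD8, and 0xFF can
    // never appear in valid UTF-8, so this interpretation fails.
    const unsigned char jpegHeader[] = { 0xFF, 0xD8, 0xFF, 0xE0 };
    NSData *data = [NSData dataWithBytes:jpegHeader length:sizeof(jpegHeader)];
    NSString *s = [[NSString alloc] initWithBytes:[data bytes]
                                           length:[data length]
                                         encoding:NSUTF8StringEncoding];
    NSLog(@"%@", s); // logs "(null)"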

Lily Ballard
  • OK. Yeah, I wasn't sure if this was possible. I'm trying to address some code in an external library and wasn't sure if this made sense. Thanks. – pschang Nov 10 '10 at 16:20
  • 1
    Just a note for any future readers. From the wiki on UTF-8: "Not all sequences of bytes are valid UTF-8." And looking more closely at the definition of UTF-8, I can see where there's a problem. It seems that you run into problems by taking binary data (that is not UTF-8) and trying to interpret it as UTF-8. I looked around Stack Overflow and couldn't find anyone who explicitly said this. – pschang Nov 10 '10 at 16:21