
I'm working with the default emojis on iOS. I'm able to successfully encode and decode default emojis using NSNonLossyASCIIStringEncoding.

It works fine when I send emojis with plain text, but the decode returns nil when certain special characters are added to the string. How do I make it work?

Code:

    NSString *testString = @":;Hello \ud83d\ude09\ud83d\ude00 ., <> /?\"";
    NSData *data = [testString dataUsingEncoding:NSUTF8StringEncoding];
    NSString *strBody = [[NSString alloc] initWithData:data encoding:NSNonLossyASCIIStringEncoding]; 
    // here strBody is nil
aqsa arshad

2 Answers


The problem is that you used different encodings for encoding and decoding.

    NSString *testString = @":;Hello \ud83d\ude09\ud83d\ude00 ., <> /?\"";
    NSData *data = [testString dataUsingEncoding:NSUTF8StringEncoding];

Here you converted the string to data using UTF-8 encoding, which maps each Unicode character to 1-4 bytes depending on its code point. For example, the emoji 😉 (U+1F609) is encoded as the four bytes F0 9F 98 89. This is explained in the Wikipedia article on UTF-8; basically it uses the following byte patterns:

    Code point range       Byte 1     Byte 2     Byte 3     Byte 4
    U+0000  - U+007F       0xxxxxxx
    U+0080  - U+07FF       110xxxxx   10xxxxxx
    U+0800  - U+FFFF       1110xxxx   10xxxxxx   10xxxxxx
    U+10000 - U+10FFFF     11110xxx   10xxxxxx   10xxxxxx   10xxxxxx
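
To see this concretely, here is a minimal sketch (my addition, not from the original answer) that logs the bytes NSUTF8StringEncoding actually produces for the emoji:

    // Dump the UTF-8 bytes of U+1F609 (😉); expected output: F0 9F 98 89
    NSString *emoji = @"\U0001F609";
    NSData *utf8 = [emoji dataUsingEncoding:NSUTF8StringEncoding];
    const unsigned char *bytes = utf8.bytes;
    NSMutableString *hex = [NSMutableString string];
    for (NSUInteger i = 0; i < utf8.length; i++) {
        [hex appendFormat:@"%02X ", bytes[i]];
    }
    NSLog(@"%@", hex); // F0 9F 98 89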

Now if you try to decode this data back into a string using the non-lossy ASCII encoding, like below:

    NSString *strBody = [[NSString alloc] initWithData:data encoding:NSNonLossyASCIIStringEncoding];

The above is bound to fail, because bytes like F0 9F 98 89 are not ASCII and cannot be interpreted as non-lossy ASCII text. That's why it returns nil.

If the data had been produced with NSNonLossyASCIIStringEncoding instead, the non-ASCII characters would have been written out as literal ASCII escape sequences. The six ASCII characters of \ude09, for example, are the bytes 5C 75 64 65 30 39.
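
For comparison, a small sketch (my addition) showing that NSNonLossyASCIIStringEncoding really does produce plain ASCII escape text:

    // Encode the emoji with the non-lossy ASCII encoding; the resulting
    // data is pure ASCII, so it can be read back as an ASCII string.
    NSString *emoji = @"\U0001F609"; // 😉
    NSData *ascii = [emoji dataUsingEncoding:NSNonLossyASCIIStringEncoding];
    NSString *escaped = [[NSString alloc] initWithData:ascii
                                              encoding:NSASCIIStringEncoding];
    NSLog(@"%@", escaped); // \ud83d\ude09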

So the correct conversion would be:

    NSString *testString = @":;Hello \ud83d\ude09\ud83d\ude00 ., <> /?\"";
    NSData *data = [testString dataUsingEncoding:NSNonLossyASCIIStringEncoding];
    NSString *strBody = [[NSString alloc] initWithData:data encoding:NSNonLossyASCIIStringEncoding]; 
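With the same encoding on both sides, the round trip is lossless: the emoji are written out as \ud83d-style escapes and parsed straight back, so strBody comes out equal to testString.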

The question for you is: why do you want to encode as UTF-8 but decode as ASCII?


For emojis, please try the below:

    NSString *testString = @":;Hello \\ud83d\\ude09\\ud83d\\ude00 ., <> /?";
    NSData *data = [testString dataUsingEncoding:NSUTF8StringEncoding];
    NSString *strBody = [[NSString alloc] initWithData:data encoding:NSNonLossyASCIIStringEncoding];
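
Putting the two directions together, here is a sketch of the full escape/unescape round trip; the helper names are mine, not from the original answer:

    #import <Foundation/Foundation.h>

    // Escape before sending: non-ASCII characters become \uXXXX escapes,
    // so the resulting string is pure ASCII and survives transport.
    static NSString *EscapeEmojis(NSString *message) {
        NSData *ascii = [message dataUsingEncoding:NSNonLossyASCIIStringEncoding];
        return [[NSString alloc] initWithData:ascii encoding:NSUTF8StringEncoding];
    }

    // Unescape after receiving: turn the \uXXXX escapes back into emoji.
    // Note: this returns nil if the incoming string itself contains raw
    // non-ASCII bytes, which is exactly the failure in the question.
    static NSString *UnescapeEmojis(NSString *wire) {
        NSData *data = [wire dataUsingEncoding:NSUTF8StringEncoding];
        return [[NSString alloc] initWithData:data encoding:NSNonLossyASCIIStringEncoding];
    }

    // Usage (e.g. inside a method):
    NSString *original = @"Hello \U0001F609\U0001F600";
    NSString *wire = EscapeEmojis(original);  // "Hello \ud83d\ude09\ud83d\ude00"
    NSString *back = UnescapeEmojis(wire);    // "Hello 😉😀"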
manishg
  • The way I've done it is the standard way to escape and unescape emojis. If I did it the way you suggest, it would never display the emoji. Please give it a try in Xcode and then let me know a valid solution. – aqsa arshad Feb 26 '17 at 15:00
  • Well, I answered why it returns nil; that's the reason. Now you need a solution for emojis. The reason the above won't work is that you need to double-escape your Unicode characters, like this: testString = @":;Hello \\ud83d\\ude09\\ud83d\\ude00 ., <> /?"; – manishg Feb 26 '17 at 17:01
  • I added an edit to my answer which should work for you. – manishg Feb 26 '17 at 17:06
  • Let me try it and get back to you. – aqsa arshad Feb 27 '17 at 09:50
  • @manishg you have provided 2 solutions, but what if I need to display both kinds of content? For example: 1. "nikh\\n\\n\\nhhji\\n\\nghj\\n\\nghj" works if I use NSNonLossyASCIIStringEncoding, but not with the 2nd approach. 2. The emoji "\\uD83D\\uDE01hi" works if I use NSUTF8StringEncoding, but then the 1st doesn't. – Nikunj Dec 03 '19 at 17:24

If you simply want to have emojis in your code as literals, there are two options:

A. Just do it:

    NSString *hello = @"😀😎+_)(&#&)#&)$&$)&$)^#%!!#$%!";
    NSLog(@"%@", hello);

B. Add the codes as UTF-32 escapes:

    NSString *hello = @"\U0001F600\U0001F60E+_)(&#&)#&)$&$)&$)^#%!!#$%!";
    NSLog(@"%@", hello);

Both print: 😀😎+_)(&#&)#&)$&$)&$)^#%!!#$%!

I really do not get your problem.
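
A note of my own, not from this answer: option B compiles where the question's single-escaped literals do not, because \U takes the full UTF-32 code point, which is a valid universal character name, while \ud83d names half of a UTF-16 surrogate pair, which the compiler rejects inside a source literal:

    NSString *ok = @"\U0001F600";        // compiles: full code point U+1F600
    // NSString *bad = @"\ud83d\ude00";  // error: \ud83d is a surrogate half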

Amin Negm-Awad