
I have an NSString object and want to convert it into a unichar.

int decimal = [[temp substringFromIndex:2] intValue]; // decimal = 12298
NSString *hex = [NSString stringWithFormat:@"0x%x", decimal]; // hex = 0x300a
NSString *chineseChar = [NSString stringWithFormat:@"%C", hex];

// This statement logs a different Chinese char every time I run this code
NSLog(@"%@", chineseChar);

When I check the log, it gives a different character every time I run my code. Am I missing something?


2 Answers


The %C format specifier takes a 16-bit Unicode character (unichar) as its argument, not an NSString. You're passing in an NSString, so its pointer gets reinterpreted as an integer character code; since the string can be stored at a different address in memory on each run, you get a different Chinese character every time you run your code.

Just pass in the character as an integer:

unichar decimal = 12298;
NSString *charStr = [NSString stringWithFormat:@"%C", decimal];
// charStr is now a string containing the single character U+300A,
// LEFT DOUBLE ANGLE BRACKET
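
Putting it together with the question's parsing step (a sketch, assuming temp holds an HTML-style numeric reference like @"&#12298;", which is what the substringFromIndex:2 / intValue combination suggests):

// temp is assumed to look like @"&#12298;": skip the leading "&#",
// parse the decimal code point, then format it with %C.
NSString *temp = @"&#12298;";
unichar code = (unichar)[[temp substringFromIndex:2] intValue]; // 12298 == 0x300A
NSString *chineseChar = [NSString stringWithFormat:@"%C", code];
NSLog(@"%@", chineseChar); // always logs 《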

How about -[NSString characterAtIndex:]? It wants a character index and returns a unichar.
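
For example (a minimal sketch; the string and index here are made up):

NSString *s = @"《中文》";
unichar c = [s characterAtIndex:0];                       // c == 0x300A
NSString *charStr = [NSString stringWithFormat:@"%C", c]; // back to @"《"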
