
What is an elegant way of converting a Character to an Int, and an Int to a Character?

For example, for the common operation of a -> 0, how do I do this conversion in Swift? And how do I convert 0 -> a?

Currently I use Unicode scalars to achieve what I want:

let int = Int(char.unicodeScalars.first!.value - 65)

65 is the Unicode scalar value of "A", but I find this really verbose.

Does anyone know a better way of dealing with this kind of thing in Swift?
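For reference, the scalar-based approach from the question, extended to cover both directions (a minimal sketch assuming uppercase ASCII letters A–Z):

```swift
let char: Character = "A"

// Character -> Int: take the first Unicode scalar and subtract 65 ("A")
let index = Int(char.unicodeScalars.first!.value - 65)

// Int -> Character: add 65 back and wrap the scalar in a Character
let back = Character(UnicodeScalar(UInt32(index) + 65)!)
```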

progammingBeignner
  • Not clear what you're after here. Are you asking how to get the ASCII value of a character and vice versa? – keno Jun 27 '19 at 07:41
  • @keno Yes, that's what I mean. I know how to do it; I'm just wondering if there is a better way. – progammingBeignner Jun 27 '19 at 07:43
  • What would the expected result be for an emoji "" or a flag "" (which consists of multiple Unicode scalars)? – Martin R Jun 27 '19 at 07:44
  • @MartinR This is more for algorithm purposes. For example, in a lot of algorithm problems, we need to store characters in an array of [Int], where 0 means "a". I have been using a private conversion function based on Unicode scalars most of the time, but I'm just wondering if there is a better approach. – progammingBeignner Jun 27 '19 at 07:47
  • Check this SO answer -- https://stackoverflow.com/questions/29835242/whats-the-simplest-way-to-convert-from-a-single-character-string-to-an-ascii-va Apparently there are some new properties in Swift 5 that can help. – keno Jun 27 '19 at 07:49
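The Swift 5 property mentioned in the linked answer is `Character.asciiValue`. A minimal sketch of both conversions using it, assuming lowercase ASCII letters a–z (the function names are illustrative, not from the original post):

```swift
// Character -> 0-based alphabet index, using Swift 5's asciiValue.
// Returns nil for anything outside a-z (emoji, flags, etc. have no ASCII value).
func alphabetIndex(of char: Character) -> Int? {
    guard let ascii = char.asciiValue,
          ascii >= Character("a").asciiValue!,
          ascii <= Character("z").asciiValue! else { return nil }
    return Int(ascii - Character("a").asciiValue!)
}

// 0-based alphabet index -> Character (0 -> "a", 1 -> "b", ...)
func character(at index: Int) -> Character {
    return Character(UnicodeScalar(UInt8(index) + Character("a").asciiValue!))
}
```

Because `asciiValue` is optional, this also answers the multi-scalar concern above: characters with no single ASCII representation simply return nil.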

0 Answers