86

I just want to get the ASCII value of a single char string in Swift. This is how I'm currently doing it:

var singleChar = "a"
println(singleChar.unicodeScalars[singleChar.unicodeScalars.startIndex].value) //prints: 97

This is so ugly though. There must be a simpler way.

Imanou Petit
DavidNorman

15 Answers

142

edit/update Swift 5.2 or later

extension StringProtocol {
    var asciiValues: [UInt8] { compactMap(\.asciiValue) }
}

"abc".asciiValues  // [97, 98, 99]
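Since `compactMap` discards `nil`s, any non-ASCII character is silently dropped rather than reported, which is worth noting if you expect one value per character:

```swift
// "á" has no asciiValue, so it simply disappears from the result
"aáb".asciiValues        // [97, 98]
"aáb".asciiValues.count  // 2, not 3
```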

In Swift 5 you can use the new character properties isASCII and asciiValue

Character("a").isASCII       // true
Character("a").asciiValue    // 97

Character("á").isASCII       // false
Character("á").asciiValue    // nil

Old answer

You can create an extension:

Swift 4.2 or later

extension Character {
    var isAscii: Bool {
        return unicodeScalars.allSatisfy { $0.isASCII }
    }
    var ascii: UInt32? {
        return isAscii ? unicodeScalars.first?.value : nil
    }
}

extension StringProtocol {
    var asciiValues: [UInt32] {
        return compactMap { $0.ascii }
    }
}

Character("a").isAscii  // true
Character("a").ascii    // 97

Character("á").isAscii  // false
Character("á").ascii    // nil

"abc".asciiValues            // [97, 98, 99]
"abc".asciiValues[0]         // 97
"abc".asciiValues[1]         // 98
"abc".asciiValues[2]         // 99
Leo Dabus
15
UnicodeScalar("1")!.value // returns 49

Swift 3.1

Joe
  • @ixany forced unwraps exist for a reason, plus in this case we're 101% sure the initializer won't fail ("1" is a valid character). – Cristik Feb 06 '19 at 20:32
  • @Cristik Disagree. In general, you can’t be sure that an input value won’t fail forever. So why not play it safe? A better approach would be: `guard let value = UnicodeScalar("1")?.value else { return }`. – ixany Feb 06 '19 at 21:22
  • @ixany, you disagree that "1" is a valid character? :) – Cristik Feb 06 '19 at 21:23
  • @Cristik `if let` or `guard let` statements exist for a reason too. I’m convinced that, whenever possible, you should play things safe. And that we, as more experienced developers, should model good coding style and best practices for the newcomers here at Stack Overflow. It was not meant as an offence. – ixany Feb 06 '19 at 21:35
  • @ixany, yes, agreed, and myself avoid forced unwraps as much as possible, however there are (rare or very rare) cases when forced unwraps make sense. Your "never a good idea" seemed a little bit drastic to me in the answer context, that's why I replied :) – Cristik Feb 06 '19 at 21:37
11

Now in Xcode 7.1 and Swift 2.1

var singleChar = "a"

singleChar.unicodeScalars.first?.value
wakeupsumo
10

You can use NSString's characterAtIndex to accomplish this...

var singleCharString = "a" as NSString
var singleCharValue = singleCharString.characterAtIndex(0)
println("The value of \(singleCharString) is \(singleCharValue)")  // The value of a is 97
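Note that `characterAtIndex` returns a UTF-16 code unit (`unichar`), which matches the ASCII value only for ASCII characters. The same number is available from the pure-Swift `utf16` view without bridging to `NSString` (a small sketch, not part of the original answer):

```swift
let singleChar = "a"
// utf16.first gives the first UTF-16 code unit as an optional UInt16
if let codeUnit = singleChar.utf16.first {
    print(codeUnit)  // 97
}
```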
timgcarlson
8

Swift 4.2

The easiest way to get ASCII values from a Swift string is below

let str = "Swift string"
for ascii in str.utf8 {
    print(ascii)
}

Output:

83
119
105
102
116
32
115
116
114
105
110
103
aios
  • These are not ASCII values. This returns the string's UTF-8 bytes; a non-ASCII character yields multiple bytes with no relation to the ASCII table. – Leo Dabus Jul 06 '20 at 03:35
  • 1
    try `let str = ""` `for ascii in str.utf8 {` `print(ascii)` `}` // 240, 159, 152, 128` – Leo Dabus Jul 06 '20 at 03:42
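As the comments note, `utf8` yields raw UTF-8 bytes, so a non-ASCII character expands to several bytes that have nothing to do with the ASCII table. A quick sketch contrasting the two views (Swift 5, not part of the original answer):

```swift
let str = "Swift 😀"

// UTF-8 view: the emoji becomes four bytes
print(Array(str.utf8))               // [83, 119, 105, 102, 116, 32, 240, 159, 152, 128]

// asciiValue keeps only genuine ASCII characters
print(str.compactMap(\.asciiValue))  // [83, 119, 105, 102, 116, 32]
```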
3

The way you're doing it is right. If you don't like the verbosity of the indexing, you can avoid it by cycling through the unicode scalars:

var x : UInt32 = 0
let char = "a"
for sc in char.unicodeScalars {x = sc.value; break}

You can actually omit the break in this case, of course, since there is only one unicode scalar.

Or, convert to an Array and use Int indexing (the last resort of the desperate):

let char = "a"
let x = Array(char.unicodeScalars)[0].value
matt
3

A slightly shorter way of doing this could be:

first(singleChar.unicodeScalars)!.value

As with the subscript version, this will crash if your string is actually empty, so if you’re not 100% sure, use the optional:

if let ascii = first(singleChar.unicodeScalars)?.value {

}

Or, if you want to be extra-paranoid,

if let char = first(singleChar.unicodeScalars) where char.isASCII() {
    let ascii = char.value
}
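In current Swift the free `first()` function is gone; the equivalent defensive pattern uses the `first` property and `Unicode.Scalar`'s `isASCII` property (an updated sketch, not part of the original answer):

```swift
let singleChar = "a"
// Safely get the first scalar and confirm it is ASCII
if let scalar = singleChar.unicodeScalars.first, scalar.isASCII {
    let ascii = scalar.value
    print(ascii)  // 97
}
```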
Airspeed Velocity
3

Here's my implementation, it returns an array of the ASCII values.

extension String {

    func asciiValueOfString() -> [UInt32] {

        var retVal = [UInt32]()
        for val in self.unicodeScalars where val.isASCII {
            retVal.append(val.value)
        }
        return retVal
    }
}

Note: in Swift 2, isASCII was a method, so write `val.isASCII()` there; it became a property in later versions.

Sakiboy
3

Swift 4.1

https://oleb.net/blog/2017/11/swift-4-strings/

let flags = "99_problems"
flags.unicodeScalars.map {
    "\(String($0.value, radix: 16, uppercase: true))"
}

Result:

["39", "39", "5F", "70", "72", "6F", "62", "6C", "65", "6D", "73"]

rustyMagnet
  • try `let flags = "😀"; flags.unicodeScalars.map { String($0.value, radix: 16, uppercase: true) }` // ["1F600"] – Leo Dabus Jul 06 '20 at 03:44
2

Swift 4+

Char to ASCII

let ch: Character = "a"
let charVal = String(ch).unicodeScalars
let asciiVal = charVal[charVal.startIndex].value

ASCII to Char

let char = Character(UnicodeScalar(asciiVal)!)
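`UnicodeScalar(_: UInt32)` is failable, so the force unwrap above traps for invalid scalar values. Going through `asciiValue` (a `UInt8`) avoids both force unwraps, since `UnicodeScalar(_: UInt8)` never fails (a safer sketch, not part of the original answer):

```swift
let ch: Character = "a"
// Character → ASCII, nil for non-ASCII characters
if let value = ch.asciiValue {
    // ASCII → Character; UnicodeScalar(UInt8) is non-failable
    let back = Character(UnicodeScalar(value))
    print(back)  // a
}
```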
Mohit Kumar
1
var singchar = "a" as NSString

print(singchar.character(at: 0))

Swift 3.1

Muhammad Shauket
1

There's also the UInt8(ascii:) initializer, which takes a Unicode.Scalar.

var singleChar = "a"
UInt8(ascii: singleChar.unicodeScalars[singleChar.startIndex])
Anders
  • Note that this is a non-failable initializer. It traps if you pass a character whose value is outside the ASCII range 0..<128: **"Fatal error: Code point value does not fit into ASCII"** – Leo Dabus Jul 17 '18 at 00:08
  • try `var singleChar = "😀"; UInt8(ascii: singleChar.unicodeScalars[singleChar.startIndex])` // Fatal error: Code point value does not fit into ASCII – Leo Dabus Jul 06 '20 at 03:45
1

With Swift 5, you can pick one of the following approaches in order to get the ASCII numeric representation of a character.


#1. Using Character's asciiValue property

Character has a property called asciiValue. asciiValue has the following declaration:

var asciiValue: UInt8? { get }

The ASCII encoding value of this character, if it is an ASCII character.

The following Playground sample codes show how to use asciiValue in order to get the ASCII encoding value of a character:

let character: Character = "a"
print(character.asciiValue) //prints: Optional(97)
let string = "a"
print(string.first?.asciiValue) //prints: Optional(97)
let character: Character = "😀"
print(character.asciiValue) //prints: nil

#2. Using Character's isASCII property and Unicode.Scalar's value property

As an alternative, you can check that the first character of a string is an ASCII character (using Character's isASCII property), then get the numeric representation of its first Unicode scalar (using Unicode.Scalar's value property). The Playground sample code below shows how to proceed:

let character: Character = "a"
if character.isASCII, let scalar = character.unicodeScalars.first {
    print(scalar.value)
} else {
    print("Not an ASCII character")
}
/*
 prints: 97
 */
let string = "a"
if let character = string.first, character.isASCII, let scalar = character.unicodeScalars.first {
    print(scalar.value)
} else {
    print("Not an ASCII character")
}
/*
 prints: 97
 */
let character: Character = "😀"
if character.isASCII, let scalar = character.unicodeScalars.first {
    print(scalar.value)
} else {
    print("Not an ASCII character")
}
/*
 prints: Not an ASCII character
 */
Imanou Petit
0

Swift 4

print("c".utf8["c".utf8.startIndex])

or

let cu = "c".utf8
print(cu[cu.startIndex])

Both print 99. Works for any ASCII character.

PJ_Finnegan
0

// String.map already yields Characters, so the extra wrapping is unnecessary
// (note: asciiValue! traps for non-ASCII input)
let input = "Swift".map { $0.asciiValue! }

// [83, 119, 105, 102, 116]

SantMan