33

As mentioned in this post, before Xcode 6 Beta 4 one could use c.isDigit() and c.isAlpha() to determine whether c : Character was a digit or alphabetic. The post mentions that these were removed because they only worked for ASCII characters.

My question is: what's the replacement? Short of writing a function with a switch statement over the alphanumeric options, how can I test a character for its digit-ness?

Alex Mitchell
  • The accepted answer to the question you linked contains an example that continues to work. – Greg Hewgill Oct 14 '14 at 02:12
  • My problem is that I only have the one character. Suppose I have `var c : Character` and I want to see if it complies with /[0-9]/. According to that answer, I should do `NSCharacterSet.decimalDigitCharacterSet().longCharacterIsMember(c)`, but it tells me "'Character' is not convertible to 'UTF32Char'", so I try `NSCharacterSet.decimalDigitCharacterSet().longCharacterIsMember(UTF32Char(c))` and get that I "Cannot invoke 'init' with an argument of type 'UTF32Char'" – Alex Mitchell Oct 14 '14 at 02:35

6 Answers

76

The "problem" is that a Swift character does not directly correspond to a Unicode code point, but represents an "extended grapheme cluster" which can consist of multiple Unicode scalars. For example

let c : Character = "🇺🇸" // REGIONAL INDICATOR SYMBOL LETTERS US

is actually a sequence of two Unicode scalars.
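
A quick Playground check (a minimal sketch) confirms this:

let flag: Character = "🇺🇸"
print(String(flag).unicodeScalars.count) // 2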

If we ignore this fact, you can retrieve the initial Unicode scalar of the character (compare How can I get the Unicode code point(s) of a Character?) and test its membership in a character set:

let c : Character = "5"

let s = String(c).unicodeScalars
let uni = s[s.startIndex]

let digits = NSCharacterSet.decimalDigitCharacterSet()
let isADigit = digits.longCharacterIsMember(uni.value)

This returns true for the characters "0" ... "9", but actually for all Unicode scalars in the "decimal digit" category, for example:

let c1 : Character = "৯" // BENGALI DIGIT NINE U+09EF
let c2 : Character = "𝟙" // MATHEMATICAL DOUBLE-STRUCK DIGIT ONE U+1D7D9

If you care only for the (ASCII) digits "0" ... "9", then the easiest method is probably:

if c >= "0" && c <= "9" { }

or, using ranges:

if "0"..."9" ~= c { }

Update: As of Swift 5 you can check for ASCII digits with

if c.isASCII && c.isNumber { }

using the “Character properties” introduced with SE-0221.

This also solves the problem with digits modified by a variation selector U+FE0F, like the keycap emoji "1️⃣". (Thanks to Lukas Kukacka for reporting this problem.)

let c: Character = "1️⃣"
print(Array(c.unicodeScalars)) // ["1", "\u{FE0F}", "\u{20E3}"]
print(c.isASCII && c.isNumber) // false
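
If you need this check in several places, one possible sketch is to wrap it in a small Character extension (the property name isASCIIDigit is only illustrative, not part of the standard library):

extension Character {
    // Hypothetical helper: true only for the ASCII digits "0"..."9"
    var isASCIIDigit: Bool {
        return isASCII && isNumber
    }
}

print(Character("5").isASCIIDigit)  // true
print(Character("৯").isASCIIDigit)  // false
print(Character("1️⃣").isASCIIDigit) // false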
Martin R
  • The "only ASCII" check does not seem to be 100% reliable. It will fail for characters like "1️⃣" (keycap emoji: https://emojipedia.org/keycap-digit-one/). For example `("0"..."9").contains(Character("1️⃣"))` returns `true`, which is unexpected for simple ASCII 0-9 check.. – Lukas Kukacka Apr 22 '20 at 14:16
  • @LukasKukacka: You are right. I *think* that the way strings (or characters) are compared changed in Swift 4 (or 5). The easiest way to check for ASCII digits in Swift 5 would be `if c.isASCII && c.isNumber`. I'll check it later and update the answer accordingly. – Martin R Apr 22 '20 at 14:39
20

With Swift 5, depending on your needs, you may choose one of the following ways to solve your problem.


#1. Using Character's isNumber property

Character has a property called isNumber. isNumber has the following declaration:

var isNumber: Bool { get }

A Boolean value indicating whether this character represents a number.

The Playground sample code below shows how to check if a character represents a number using isNumber:

let character: Character = "9"
print(character.isNumber) // true
let character: Character = "½"
print(character.isNumber) // true
let character: Character = "④"
print(character.isNumber) // true
let character: Character = "1⃣"
print(character.isNumber) // true
let character: Character = "1️⃣"
print(character.isNumber) // true
let character: Character = "৯"
print(character.isNumber) // true
let character: Character = "𝟙"
print(character.isNumber) // true
let character: Character = "F"
print(character.isNumber) // false

#2. Using Character's isWholeNumber property

If you want to check if a character represents a whole number, you can use Character's isWholeNumber property:

let character: Character = "9"
print(character.isWholeNumber) // true
let character: Character = "½"
print(character.isWholeNumber) // false
let character: Character = "④"
print(character.isWholeNumber) // true
let character: Character = "1⃣"
print(character.isWholeNumber) // false
let character: Character = "1️⃣"
print(character.isWholeNumber) // false
let character: Character = "৯"
print(character.isWholeNumber) // true
let character: Character = "𝟙"
print(character.isWholeNumber) // true
let character: Character = "F"
print(character.isWholeNumber) // false
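
If you also need the numeric value itself, Character's wholeNumberValue property (from the same SE-0221 character properties) returns it as an optional Int; a quick sketch:

let character: Character = "9"
print(character.wholeNumberValue) // Optional(9)
let character: Character = "৯"
print(character.wholeNumberValue) // Optional(9)
let character: Character = "½"
print(character.wholeNumberValue) // nil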

#3. Using Unicode.Scalar.Properties's generalCategory property and Unicode.GeneralCategory.decimalNumber

The Playground sample code below shows how to check if the first Unicode scalar of a character is a decimal number using generalCategory and Unicode.GeneralCategory.decimalNumber:

let character: Character = "9"
let scalar = character.unicodeScalars.first! // DIGIT NINE
print(scalar.properties.generalCategory == .decimalNumber) // true
let character: Character = "½"
let scalar = character.unicodeScalars.first! // VULGAR FRACTION ONE HALF
print(scalar.properties.generalCategory == .decimalNumber) // false
let character: Character = "④"
let scalar = character.unicodeScalars.first! // CIRCLED DIGIT FOUR
print(scalar.properties.generalCategory == .decimalNumber) // false
let character: Character = "1⃣"
let scalar = character.unicodeScalars.first! // DIGIT ONE
print(scalar.properties.generalCategory == .decimalNumber) // true
let character: Character = "1️⃣"
let scalar = character.unicodeScalars.first! // DIGIT ONE
print(scalar.properties.generalCategory == .decimalNumber) // true
let character: Character = "৯"
let scalar = character.unicodeScalars.first! // BENGALI DIGIT NINE
print(scalar.properties.generalCategory == .decimalNumber) // true
let character: Character = "𝟙"
let scalar = character.unicodeScalars.first! // MATHEMATICAL DOUBLE-STRUCK DIGIT ONE
print(scalar.properties.generalCategory == .decimalNumber) // true
let character: Character = "F"
let scalar = character.unicodeScalars.first! // LATIN CAPITAL LETTER F
print(scalar.properties.generalCategory == .decimalNumber) // false

#4. Using Unicode.Scalar.Properties's generalCategory property and Unicode.GeneralCategory.otherNumber

Similarly, you can check that the first Unicode scalar of a character corresponds to the category Other_Number in the Unicode Standard using generalCategory and Unicode.GeneralCategory.otherNumber:

let character: Character = "9"
let scalar = character.unicodeScalars.first!
print(scalar.properties.generalCategory == .otherNumber) // false
let character: Character = "½"
let scalar = character.unicodeScalars.first!
print(scalar.properties.generalCategory == .otherNumber) // true
let character: Character = "④"
let scalar = character.unicodeScalars.first!
print(scalar.properties.generalCategory == .otherNumber) // true
let character: Character = "1⃣"
let scalar = character.unicodeScalars.first!
print(scalar.properties.generalCategory == .otherNumber) // false
let character: Character = "1️⃣"
let scalar = character.unicodeScalars.first!
print(scalar.properties.generalCategory == .otherNumber) // false
let character: Character = "৯"
let scalar = character.unicodeScalars.first!
print(scalar.properties.generalCategory == .otherNumber) // false
let character: Character = "𝟙"
let scalar = character.unicodeScalars.first!
print(scalar.properties.generalCategory == .otherNumber) // false
let character: Character = "F"
let scalar = character.unicodeScalars.first!
print(scalar.properties.generalCategory == .otherNumber) // false

#5. Using CharacterSet's decimalDigits property

As an alternative, you can import Foundation and check if CharacterSet.decimalDigits contains the first Unicode scalar of a character:

import Foundation

let character: Character = "9"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // true
import Foundation

let character: Character = "½"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // false
import Foundation

let character: Character = "④"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // false
import Foundation

let character: Character = "1⃣"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // true
import Foundation

let character: Character = "1️⃣"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // true
import Foundation

let character: Character = "৯"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // true
import Foundation

let character: Character = "𝟙"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // true
import Foundation

let character: Character = "F"
let scalar = character.unicodeScalars.first!
print(CharacterSet.decimalDigits.contains(scalar)) // false

#6. Using Unicode.Scalar.Properties's numericType

Apple documentation states for numericType:

For scalars that represent a number, numericType is the numeric type of the scalar. For all other scalars, this property is nil.

The sample code below shows the possible numeric types (decimal, digit, or numeric) for the first scalar of a given character:

let character: Character = "9"
let scalar = character.unicodeScalars.first!
print(scalar.properties.numericType) // Optional(Swift.Unicode.NumericType.decimal)
let character: Character = "½"
let scalar = character.unicodeScalars.first!
print(scalar.properties.numericType) // Optional(Swift.Unicode.NumericType.numeric)
let character: Character = "④"
let scalar = character.unicodeScalars.first!
print(scalar.properties.numericType) // Optional(Swift.Unicode.NumericType.digit)
let character: Character = "1⃣"
let scalar = character.unicodeScalars.first!
print(scalar.properties.numericType) // Optional(Swift.Unicode.NumericType.decimal)
let character: Character = "1️⃣"
let scalar = character.unicodeScalars.first!
print(scalar.properties.numericType) // Optional(Swift.Unicode.NumericType.decimal)
let character: Character = "৯"
let scalar = character.unicodeScalars.first!
print(scalar.properties.numericType) // Optional(Swift.Unicode.NumericType.decimal)
let character: Character = "𝟙"
let scalar = character.unicodeScalars.first!
print(scalar.properties.numericType) // Optional(Swift.Unicode.NumericType.decimal)
let character: Character = "F"
let scalar = character.unicodeScalars.first!
print(scalar.properties.numericType) // nil
Imanou Petit
9

Swift 3 seems a little easier:

import Foundation

let str = "abcdef12345"
let digitSet = CharacterSet.decimalDigits

for ch in str.unicodeScalars {
    if digitSet.contains(ch) {
        // is digit!
    }
}
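
For example, the same character set can also be used to count the digits in the string (a small illustrative addition, reusing str and digitSet from above):

let digitCount = str.unicodeScalars.filter { digitSet.contains($0) }.count
print(digitCount) // 5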
KeepWalking
4

Regarding Swift 3:

Following Martin R's great answer, for those who have problems with the newer Swift:

replace:

let digits = NSCharacterSet.decimalDigitCharacterSet()

with:

let digits = NSCharacterSet.decimalDigits as NSCharacterSet
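
Putting it together, a sketch of the example above adapted to Swift 3 might look like this (assuming Foundation is imported):

import Foundation

let c: Character = "5"
let uni = String(c).unicodeScalars.first!

let digits = NSCharacterSet.decimalDigits as NSCharacterSet
let isADigit = digits.longCharacterIsMember(uni.value) // true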
Alessandro Ornano
4

For a single character:

CharacterSet.decimalDigits.contains(string.unicodeScalars.first!)
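
A slightly fuller sketch of the same one-liner (note that, like the line above, it only inspects the first Unicode scalar and force-unwraps, so the string must not be empty):

import Foundation

let string = "7"
let isDigit = CharacterSet.decimalDigits.contains(string.unicodeScalars.first!)
print(isDigit) // true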
0

I created my own computed property on String to check whether it is a number or not (Foundation is needed for CharacterSet):

import Foundation

extension String {
    var isNumber: Bool {
        return !self.isEmpty && self.rangeOfCharacter(from: CharacterSet.decimalDigits.inverted) == nil
    }
}

// Use

var initial = "1"
if initial.isNumber { // true
    initial = "#"
}
Gurjinder Singh