
As my question title says, I want to map a letter to its position in the alphabet.
If I get A, I want to return 0.
If I get B, I want to return 1.
....
If I get Z, I want to return 25.
How can I make this function look better and simpler? Thanks.

func convertStringToInt(text: String) -> Int {
    switch text {
    case "A":
        return 0
    case "B":
        return 1

    // ... C to Y

    case "Z":
        return 25
    default:
        break
    }
    return 0
}
dd
    What about lowercase letters and other characters? – Rizwan Sep 17 '18 at 10:31
  • I think you should change your input method parameter type to Character instead of String – Leo Dabus Sep 17 '18 at 10:33
  • You can keep this method small by using the ASCII values of your characters, i.e. the ASCII value of 'A' is 65 and the ASCII value of 'B' is 66, so in the case of A you get 65 - 65 = 0, in the case of B you get 66 - 65 = 1, and likewise. – vivekDas Sep 17 '18 at 10:36
  • Btw you are returning zero for any Character that's not from a...z how would you differentiate "a" from something else? – Leo Dabus Sep 17 '18 at 10:36
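The ASCII-offset idea from the comments above can be sketched like this (a minimal sketch, not from any answer; `letterIndex` is an illustrative name, and `Character.asciiValue` requires Swift 5):

```swift
// Minimal sketch of the ASCII-offset idea: "A" is 65 in ASCII,
// so subtracting 65 maps "A" -> 0, "B" -> 1, ..., "Z" -> 25.
// Only the first character of the input is examined.
func letterIndex(_ text: String) -> Int? {
    guard let ascii = text.uppercased().first?.asciiValue,
          (65...90).contains(ascii) else { return nil }
    return Int(ascii) - 65
}

print(letterIndex("B") as Any)  // Optional(1)
```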

3 Answers


Using ASCII values you can shorten it to this (works with lowercase, too):

func convertStringToInt(characterText: String) -> Int? {
    guard let aValue = "A".unicodeScalars.first?.value,
        let zValue = "Z".unicodeScalars.first?.value,
        let characterValue = characterText.uppercased().unicodeScalars.first?.value,
        // the next line tests whether the input value is between A and Z
        characterValue >= aValue && characterValue <= zValue else {
            return nil // error
    }
    return Int(characterValue) - Int(aValue)
}

print("Value for A: \(convertStringToInt(characterText: "A"))")
print("Value for G: \(convertStringToInt(characterText: "G"))")
print("Value for Z: \(convertStringToInt(characterText: "Z"))")
print("Value for z: \(convertStringToInt(characterText: "z"))")
print("Value for ^: \(convertStringToInt(characterText: "^"))")

Prints:

Value for A: Optional(0)
Value for G: Optional(6)
Value for Z: Optional(25)
Value for z: Optional(25)
Value for ^: nil

Based on this question.

Or, if you want to play around with array indices:

func convertStringToInt(characterText: String) -> Int {
    let array = ["A","B","C","D","E","F","G","H","I","J","K","L","M","N","O","P","Q","R","S","T","U","V","W","X","Y","Z"]
    return array.firstIndex(of: characterText.uppercased()) ?? -1 // default value for a text that is not found
}
Milan Nosáľ
  • What if lowercase letters are passed to convertStringToInt? – Rizwan Sep 17 '18 at 10:41
  • @Rizwan changed it locally, forgot to edit the question, I fixed it now.. thanks for noting – Milan Nosáľ Sep 17 '18 at 10:43
  • Suggesting the use of "magic numbers" in Swift is a very bad practice. You should change your return type to optional, remove the nil coalescing operator and the default value (magic number). – Leo Dabus Sep 22 '18 at 01:18
  • Note that unicodeScalars would never return less than one element so it is safe to force unwrap it everywhere in your code. `unicodeScalars.first!` – Leo Dabus Sep 22 '18 at 01:22

First of all, your function returns 0 both for A and for an invalid character, which is certainly not intended.

This solution considers all uppercase letters in the range A-Z and returns nil on failure:

func convertStringToInt(text: String) -> Int? {
    guard let scalar = UnicodeScalar(text), 65..<91 ~= scalar.value else { return nil }
    return Int(scalar.value) - 65
}

To also handle lowercase characters, use a switch statement:

func convertStringToInt(text: String) -> Int? {
    guard let scalar = UnicodeScalar(text) else { return nil }
    let value = Int(scalar.value)
    switch value {
    case 65..<91: return value - 65
    case 97..<123: return value - 97
    default: return nil
    }
}
vadian
  • Eliminating *magic numbers*: `func convertStringToInt(text: String) -> Int? { let aValue = UnicodeScalar("A").value let zValue = UnicodeScalar("Z").value guard let value = UnicodeScalar(text)?.value, aValue...zValue ~= value else { return nil } return Int(value) - Int(aValue) }` – vacawama Sep 17 '18 at 11:30
  • @vacawama Actually a developer is supposed to know the *magic numbers* (ASCII values - 127 / 0x7F) by heart. – vadian Sep 17 '18 at 11:41
  • LOL, many of us do know the ASCII values by heart. It was drilled into me in college (aka University) not to put *magic numbers* in your code, and it still bothers me to this day when I see them (or when I use them). The `let`s make it clear what the values mean and any compiler worth a darn will reduce that to the values at compile time. – vacawama Sep 17 '18 at 12:12
  • `guard let scalar = UnicodeScalar(text), "A"..."Z" ~= scalar else` – Leo Dabus Sep 17 '18 at 14:59
  • or `guard let first = text.unicodeScalars.first, "A"..."Z" ~= first else { return nil } return Int(first.value) - 65` – Leo Dabus Sep 17 '18 at 15:02
  • @vacawama You can simply use a ClosedRange – Leo Dabus Sep 17 '18 at 15:13
  • the switch would look like this `guard let scalar = UnicodeScalar(text) else { return nil }` `switch scalar {` `case "a"..."z": return Int(scalar.value) - 32` `case "A"..."Z": return Int(scalar.value) - 65` `default: return nil` `}` – Leo Dabus Sep 17 '18 at 15:21
  • @LeoDabus, nice. That is actually a `ClosedRange`. – vacawama Sep 18 '18 at 02:43
  • @vacawama https://stackoverflow.com/a/47573104/2303865 is an example of ClosedRange that one above it is a String range https://www.dropbox.com/s/9qqvumcmsdjgizv/Screen%20Shot%202018-09-17%20at%2023.46.47.png?dl=1 – Leo Dabus Sep 18 '18 at 02:47
  • @LeoDabus, Because you are comparing a UnicodeScalar with `"A"..."Z"`, Swift infers `"A"` and `"Z"` to be of type `UnicodeScalar` just as it does here: `let x: UnicodeScalar = "X"`. If you try to compare a `ClosedRange` with a `UnicodeScalar` you get the error **Binary operator '~=' cannot be applied to operands of type 'ClosedRange' and 'Unicode.Scalar'** – vacawama Sep 18 '18 at 03:00
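Putting the comment thread's suggestions together, the scalar range-matching switch might look like this (a sketch based on the comments above, not vadian's exact code; the `aUpper`/`aLower` constants avoid the magic numbers the thread objects to):

```swift
// Sketch of the scalar range-matching approach discussed above.
func convertStringToInt(text: String) -> Int? {
    // UnicodeScalar(text) fails unless text is exactly one Unicode scalar.
    guard let scalar = UnicodeScalar(text) else { return nil }
    let aUpper = ("A" as UnicodeScalar).value
    let aLower = ("a" as UnicodeScalar).value
    switch scalar {
    case "A"..."Z": return Int(scalar.value) - Int(aUpper)
    case "a"..."z": return Int(scalar.value) - Int(aLower)
    default: return nil
    }
}

print(convertStringToInt(text: "b") as Any)  // Optional(1)
```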

I am a high school student and I am also new to Swift; my way of solving the problem is not going to be the most efficient one, but I hope it gives you new insight.

func convertStringToInt(text: String) -> Int {
    let stringarray = ["A","B","C","D","E","F","G","H","I","J","K","L","M","N","O","P","Q","R","S","T","U","V","W","X","Y","Z"]
    var defaultvalue = -1
    var value = -1000000
    for i in stringarray {
        defaultvalue += 1
        if text == i {
            value = defaultvalue
        }
    }
    return value
}

-1000000 is just a value for invalid input. I did not use ASCII codes in this code since people already did that, but using ASCII codes would be the most correct way, and it would also make it easier to distinguish lowercase from uppercase.

  • If you prefer explicit character declarations, it would be better to create a dictionary where each _key_ is a character with an appropriate Integer value. With that, you can easily write this function without any conditional statements: `characterDictionary["Z"]` simply returns 25. – hamsternik Sep 17 '18 at 14:55
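That dictionary idea could be sketched like this (`characterDictionary` is an illustrative name, and the comment suggested writing the pairs out explicitly; building the table programmatically is a variation on that suggestion):

```swift
// Build the character-to-index dictionary programmatically instead of
// writing out all 26 pairs by hand; lookups then need no conditionals.
let characterDictionary: [String: Int] = Dictionary(
    uniqueKeysWithValues: "ABCDEFGHIJKLMNOPQRSTUVWXYZ".enumerated().map {
        (String($0.element), $0.offset)
    }
)

print(characterDictionary["Z"] as Any)  // Optional(25)
```

A lookup returns `nil` for anything not in the table, so invalid input is handled for free by the optional return type.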