
In the following code, I iterate over a string rune by rune, but I'll actually need an int to perform some checksum calculation. Do I really need to encode the rune into a []byte, then convert it to a string and then use Atoi to get an int out of the rune? Is this the idiomatic way to do it?

// The string `s` only contains digits.
var factor int
for i, c := range s[:12] {
    if i % 2 == 0 {
        factor = 1
    } else {
        factor = 3
    }
    buf := make([]byte, 1)
    _ = utf8.EncodeRune(buf, c)
    value, _ := strconv.Atoi(string(buf))
    sum += value * factor
}

On the playground: http://play.golang.org/p/noWDYjn5rJ

miku
  • `int(r)` converts a rune to an int. See https://stackoverflow.com/a/62739051/12817546. `rune(i)` converts an int to a rune. See https://stackoverflow.com/a/62737936/12817546. –  Jul 09 '20 at 05:21
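A minimal sketch of the two conversions mentioned in that comment (the example rune and printed values are my own, for illustration):

package main

import "fmt"

func main() {
    r := 'A'        // a rune literal, Unicode code point 65
    i := int(r)     // rune -> int: the code point value
    back := rune(i) // int -> rune: interpret the int as a code point
    fmt.Println(i, string(back)) // 65 A
}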

4 Answers


The problem is simpler than it looks. You convert a rune value to an int value with int(r). But your code implies you want the integer value out of the ASCII (or UTF-8) representation of the digit, which you can trivially get with r - '0' as a rune, or int(r - '0') as an int. Be aware that out-of-range runes will corrupt that logic.
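A minimal sketch of both forms, with a digit check added for the out-of-range caveat (the check is my addition, not part of the answer):

package main

import "fmt"

func main() {
    r := '7'
    fmt.Println(int(r))       // 55, the Unicode code point of '7'
    fmt.Println(r - '0')      // 7 as a rune (int32)
    fmt.Println(int(r - '0')) // 7 as an int

    // Only trust r - '0' after confirming r is actually a decimal digit.
    if r < '0' || r > '9' {
        fmt.Println("not a decimal digit:", string(r))
    }
}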

Gustavo Niemeyer
  • For those who are wondering how `int(r-'0')` even works, a short explanation: subtracting the value of the rune '0' from any rune '0' through '9' gives you an integer 0 through 9. The resulting type of the subtraction `r-'0'` is int32 (the underlying type of rune), which is why you also need an `int()` conversion when you need an `int`. – Evgeniy Maynagashev Aug 12 '20 at 06:11
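To see the types that comment describes, a quick check (a sketch):

package main

import "fmt"

func main() {
    r := '9'
    fmt.Printf("%T %v\n", r-'0', r-'0')           // int32 9
    fmt.Printf("%T %v\n", int(r-'0'), int(r-'0')) // int 9
}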

For example, sum += (int(c) - '0') * factor:

package main

import (
    "fmt"
    "strconv"
    "unicode/utf8"
)

func main() {
    s := "9780486653556"
    var factor, sum1, sum2 int
    for i, c := range s[:12] {
        if i%2 == 0 {
            factor = 1
        } else {
            factor = 3
        }
        // The roundabout way from the question: encode the rune to UTF-8
        // bytes, convert to a string, then parse with Atoi.
        buf := make([]byte, 1)
        _ = utf8.EncodeRune(buf, c)
        value, _ := strconv.Atoi(string(buf))
        sum1 += value * factor
        // The direct way: a digit's value is its code point minus '0'.
        sum2 += (int(c) - '0') * factor
    }
    fmt.Println(sum1, sum2)
}

Output:

124 124
peterSO
  • Thanks, but that's not quite what I want. You are initializing the rune with 42, which would yield the `*` character according to my ASCII table. What I get is a string, and `range` yields runes that represent the characters. Using int on the rune would give me the value of the representation, not the *display* value, which I need. I made a short snippet that hints at the difference: http://play.golang.org/p/JLlmKnddGv – miku Jan 24 '14 at 00:48
  • @miku: see my revised answer. – peterSO Jan 24 '14 at 01:15
  • Ah, nice trick. I don't know why your answer has been downvoted. – miku Jan 24 '14 at 01:53

Why don't you just use string(rune) together with strconv.Atoi?

s:="12345678910"
var factor,sum int
for i,x:=range s{
    if i%2==0{
            factor=1
        }else{
        factor=3
    }
        xstr:=string(x) //x is rune converted to string
        xint,_:=strconv.Atoi(xstr)
        sum+=xint*factor
}
fmt.Println(sum)
Mowazzem Hosen
val, _ := strconv.Atoi(string(v))

Where v is a rune

More concise, but the same idea as above.
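If the input may contain non-digits, the same idea with the error checked (a sketch; the sample rune is my own):

package main

import (
    "fmt"
    "strconv"
)

func main() {
    v := '8' // v is a rune
    val, err := strconv.Atoi(string(v))
    if err != nil {
        fmt.Println("not a digit:", string(v))
        return
    }
    fmt.Println(val) // 8
}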

Suraj Rao
LilahTilt