
I should start by saying that I come from a Java background. That being said, I'm just starting to learn C and I really struggle with the use of pointers. The concepts are simple enough, but actually using them has proven to be a rather difficult and frustrating experience for me.

At any rate, I'm trying to create a function that replicates atoi without using the stdlib.h header. I'm thinking it's a simple matter of casting, but when I test it I get some really strange results. What I have is as follows:

int myatoi(const char* str){
    return (int)*str;
}

Given that I don't really know what I'm doing when it comes to pointers, I'm most certainly doing something wrong, but I have absolutely no idea what.

  • This does not convert your string to integer. – phoxis Mar 22 '18 at 21:53
  • This is your whole function ??? – llllllllll Mar 22 '18 at 21:53
  • Your function takes the binary data in the string and represents the first character as a numerical value (e.g., the character `'1'` might be represented by its binary ASCII value `49`). – Myst Mar 22 '18 at 21:54
  • If you have a `char*` that stores the string `"1234"`, then casting that to `int` will not convert the string `"1234"` to the number `1234`. You need to parse the string one character at a time to convert it to a number/integer. – Alex Reynolds Mar 22 '18 at 21:54
  • See https://github.com/gcc-mirror/gcc/blob/master/libiberty/strtol.c for how this is done. – phoxis Mar 22 '18 at 21:57
  • casting is not the same as converting. Casting tells the compiler to treat a variable/expression as if it were of a different type. What you are doing is treating the location where you store a sequence of characters as if they were a sequence of integers. – Pablo Mar 22 '18 at 21:58
  • Casts in C do two very different things, which is a constant source of confusion to learners, and a fundamental mistake IMHO. (1) Casts are a compile-time operator that direct the compiler to take a chunk of bits of some type, and interpret them as some other type, without changing the bits themselves. The cast itself produces no code. (2) Casts among number types may (or may not) actually produce runtime code that fiddles bits, truncating, extending, or even converting between floats and ints. You're expecting your cast to do the latter, when it's doing the former. – Lee Daniel Crocker Mar 22 '18 at 22:00
  • Consider: `int myatoi(const char* str) { size_t l = strlen(c) - 1; char c = str[l]; int i = 0; int radix = 10; int exp = 0; while (l >= 0) { if (((int)c < 48) || ((int)c > 59)) { exit(EXIT_FAILURE); } i += ((int)c - 48) * pow(radix, exp++); c = str[--l]; } return i; }` To learn why you subtract `48` from an individual character cast to `int`, look at an ASCII character table. – Alex Reynolds Mar 22 '18 at 22:02
  • @AlexReynolds There are too many things wrong with that comment. Have you read our "help center" yet? – autistic Mar 22 '18 at 22:03
  • No, I'm afraid not, but I've only been here four years longer than you. – Alex Reynolds Mar 22 '18 at 22:54
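
To make the distinction drawn in the comments concrete, here is a minimal sketch (hypothetical, not from the thread) of what the posted function actually returns on an ASCII system:

#include <stdio.h>

int myatoi(const char* str){
    return (int)*str;  /* only converts the first character's code, not the whole string */
}

int main(void){
    /* '1' has character code 49 in ASCII, so this prints 49, not 1234. */
    printf("%d\n", myatoi("1234"));
    return 0;
}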

1 Answer


That will not work. Casting only converts the first character (the one at `*str`) to its ASCII value as an integer; it does not parse the string.

You will need to process the string by iterating through its characters and parsing them.
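
A minimal sketch of that approach (the structure is illustrative, not the only way to write it), assuming base-10 digits with an optional leading sign, no leading whitespace, and no overflow handling:

int myatoi(const char* str){
    int sign = 1;
    int result = 0;

    /* Optional leading sign. */
    if (*str == '-' || *str == '+') {
        if (*str == '-') sign = -1;
        ++str;
    }

    /* Accumulate decimal digits: shift the running value left by one
       decimal place, then add the value of the current digit. */
    while (*str >= '0' && *str <= '9') {
        result = result * 10 + (*str - '0');
        ++str;
    }

    return sign * result;
}

The real atoi also skips leading whitespace and has undefined behavior on overflow; strtol is the more robust standard alternative.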

Alexander