0

So, let's say that a char contains the letter "a":

int main() {
    int i = 8;
    char test2[2] = "a" + i;
    return 0;
}

I just want to add a number to that letter's ASCII value, so that test2 would contain "h" (the ASCII value of h is the ASCII value of a plus 7).

Is there a simple way to do this? I have tried googling it, and I would think it is a basic thing to do, but I am clearly missing the easy way. I would appreciate any help.

user7977797
  • You're adding i to the address of the first element of the string literal `"a"`. Try using the character constant `'a'` instead. You're also trying to initialize an array with a scalar value. Your compiler should be warning you about it, and if not, enable all warnings. – Ilja Everilä Jun 02 '17 at 02:44
  • [Reading a good beginners book](http://stackoverflow.com/questions/562303/the-definitive-c-book-guide-and-list) should be a good start. – Some programmer dude Jun 02 '17 at 02:46
  • `char test2[2]={'a'+i-1};` – BLUEPIXY Jun 02 '17 at 02:46
  • Stop guessing and start reading. – autistic Jun 02 '17 at 06:43

4 Answers

3

You have to add to a char, not to the string:

char test2[2] = { 'a' + i, 0 };
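
For example, dropped into the original program (a sketch; the `printf` call and the `%s` format are just there to verify the result):

#include <stdio.h>

int main() {
    int i = 8;
    char test2[2] = { 'a' + i, 0 }; /* shifted character plus string terminator */
    printf("%s\n", test2);          /* prints "i" on an ASCII system */
    return 0;
}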
Barmar
2

"a" is not a character, but a string. You need single quotes to works with individual characters.

int i = 8;
char c = 'a' + i;
printf("c=%c\n", c);

Output:

c=i
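
Since character constants are just small integers, the same arithmetic works anywhere an int does. A small sketch (the loop bound is arbitrary) printing each offset from 0 through 8:

#include <stdio.h>

int main() {
    for (int i = 0; i <= 8; i++) {
        printf("%c", 'a' + i); /* char arithmetic: 'a' plus an offset */
    }
    printf("\n"); /* prints "abcdefghi" on an ASCII system */
    return 0;
}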
dbush
1

In C, a char holds an integer value, which on most systems is an ASCII code. In your case, to make h out of a you have to do this:

#include <stdio.h>

int main() {
    char test2 = 'a' + 7;
    printf("%c\n\n", test2); // just check
    return 0;
}
  • While `char` is a (`signed` or `unsigned`) integer type in C, encoding is _not_ ASCII by default; this is an implementation detail, and not specified by the Standard. – ad absurdum Jun 02 '17 at 03:19
  • For example, some IBM systems used to use [EBCDIC](https://en.wikipedia.org/wiki/Extended_Binary_Coded_Decimal_Interchange_Code) instead of ASCII. Here, the characters of the Latin alphabet are not encoded in a contiguous sequence, causing some ASCII-reliant code to fail. This problem still crops up in the real world on legacy systems. – ad absurdum Jun 02 '17 at 13:31
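
As the comments above point out, the Standard does not guarantee ASCII, so `'a' + 7` is not portable in principle. A sketch of an encoding-independent variant (assuming lowercase Latin letters only) looks the letter up in an explicit alphabet instead of relying on contiguous codes:

#include <stdio.h>
#include <string.h>

int main() {
    const char *alphabet = "abcdefghijklmnopqrstuvwxyz";
    char start = 'a';
    int shift = 7;

    /* Find the letter's position rather than assuming the
       encoding places letters in a contiguous sequence. */
    const char *pos = strchr(alphabet, start);
    if (pos != NULL && (pos - alphabet) + shift < 26) {
        printf("%c\n", alphabet[(pos - alphabet) + shift]); /* prints "h" */
    }
    return 0;
}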
0

Haven't compiled this, but wouldn't `(int)'a' + i` work?

A double-quoted literal isn't a char, it's a string literal.
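
It does work, though the cast is redundant: in C, a character constant such as `'a'` already has type `int`. A quick compiled check (a sketch; the variable names are illustrative):

#include <stdio.h>

int main() {
    int i = 8;
    int n = 'a' + i;   /* no cast needed; 'a' is already an int */
    printf("%c\n", n); /* prints "i" on an ASCII system */
    return 0;
}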

Ron Thompson