
Why am I getting blank output? The pointers are able to modify the value, but apparently can't read it. Why?

#include <iostream>
using namespace std;

int main(){
    int a = 0;
    char *x1,*x2,*x3,*x4;
    x1 = (char *)&a;
    x2 = x1;x2++;
    x3 = x2;x3++;
    x4 = x3;x4++;
    *x1=1;
    *x2=1;
    *x3=1;
    *x4=1;

    cout <<"@" << *x1 << " " << *x2 << " " << *x3 << " " << *x4 << "@"<<endl ;
    cout << a << endl;
}
[Desktop] g++ test_pointer.cpp
[Desktop] ./a.out
@   @
16843009

I want to read the value of the integer through pointers of type char, so that I can read it byte by byte.

  • you have undefined behaviour. You cannot point somewhere, increment, and then hope to get something meaningful. Pointer arithmetic is only well defined inside arrays – 463035818_is_not_an_ai May 17 '19 at 14:01
  • what output do you want to generate? I am sure that if you clarify that, there can be a solution that is valid (I even doubt you need pointers) – 463035818_is_not_an_ai May 17 '19 at 14:05
  • I disagree with the above comments about undefined behaviour and unallocated memory: the memory for `int a` is both allocated and initialized, and that memory is accessed via the `char *` pointers, which is permitted as long as `int` is at least 4 `char`s wide. The value `16843009` is also correct (`0x01010101`). I think the unexpected output is simply that the individual bytes are printed as characters (as opposed to their integer values) and `0x01` is a control character. – Arkku May 17 '19 at 14:08

5 Answers


You're streaming chars. These get automatically ASCII-ised for you by IOStreams*, so you're seeing (or rather, not seeing) unprintable characters (in fact, all 0x01 bytes).

You can cast to int to see the numerical value, and perhaps add std::hex for a conventional view.

Example:

#include <iostream>
#include <iomanip>

int main()
{
    int a = 0;

    // Alias the first four bytes of `a` using `char*`
    char* x1 = (char*)&a;
    char* x2 = x1 + 1;
    char* x3 = x1 + 2;
    char* x4 = x1 + 3;

    *x1 = 1;
    *x2 = 1;
    *x3 = 1;
    *x4 = 1;

    std::cout << std::hex << std::setfill('0');
    std::cout << '@' << "0x" << std::setw(2) << (int)*x1
              << ' ' << "0x" << std::setw(2) << (int)*x2
              << ' ' << "0x" << std::setw(2) << (int)*x3
              << ' ' << "0x" << std::setw(2) << (int)*x4
              << '@' << '\n';
    std::cout << "0x" << a << '\n';
}

// Output:
//   @0x01 0x01 0x01 0x01@
//   0x1010101


Those saying that your program has undefined behaviour are incorrect (assuming your int has at least four bytes in it); aliasing objects via char* is specifically permitted.

The 16843009 output is correct; that's equal to 0x01010101 which you'd again see if you put your stream into hex mode.


N.B. Some people will recommend reinterpret_cast<char*>(&a) and static_cast<int>(*x1), instead of C-style casts, though personally I find them ugly and unnecessary in this particular case. For the output you can at least write +*x1 to get a "free" promotion to int (via the unary + operator), but that's not terribly self-documenting.
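
For what it's worth, here is a minimal sketch (my own addition, not the code above) of the same byte access written with reinterpret_cast and the unary + trick, assuming int is at least four bytes wide:

#include <iostream>

int main()
{
    int a = 0;

    // reinterpret_cast is the named-cast equivalent of the C-style (char*) cast
    char* p = reinterpret_cast<char*>(&a);

    // Write 1 into each of the first four bytes of `a`
    p[0] = 1;
    p[1] = 1;
    p[2] = 1;
    p[3] = 1;

    // static_cast<int>(...) and +... both print the numeric value
    // rather than an unprintable control character
    std::cout << static_cast<int>(p[0]) << ' ' << +p[1] << '\n';  // 1 1
    std::cout << a << '\n';                                       // 16843009
}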


* Technically it's something like the opposite; IOStreams usually automatically converts your numbers and booleans and things into the right ASCII characters to appear correct on screen. For char it skips that step, assuming that you're already providing the ASCII value you want.
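
For instance, a tiny illustration of that footnote (my own example):

#include <iostream>

int main()
{
    std::cout << 65 << '\n';        // the int 65 is formatted as the digits "65"
    std::cout << char(65) << '\n';  // the char 65 is written as the character 'A'
}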

Lightness Races in Orbit

Assuming an int is at least 4 bytes long on your system, the program manipulates the 4 bytes of int a.

The result 16843009 is the decimal value of 0x01010101, so this is as you might expect.

You don't see anything in the first line of output because you write 4 characters with the binary value 1 (0x01), which are invisible control characters (ASCII SOH).

When you modify your program like this

*x1='1';
*x2='3';
*x3='5';
*x4='7';

you will see output with the expected characters

@1 3 5 7@
926233393

The value 926233393 is the decimal representation of 0x37353331 where 0x37 is the ASCII value of the character '7' etc.

(These results are valid for a little-endian architecture.)
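
As an illustration of that byte order (my own sketch, not part of the original answer; it assumes a little-endian machine, ASCII, and an int of at least four bytes), the value can be reconstructed from the individual character codes:

#include <iostream>

int main()
{
    int a = 0;
    char* p = (char*)&a;                  // same aliasing as in the question
    p[0] = '1'; p[1] = '3'; p[2] = '5'; p[3] = '7';

    // On a little-endian machine the first byte is the least significant,
    // so a becomes 0x37353331, i.e. 926233393 in decimal.
    unsigned expected = ('7' << 24) | ('5' << 16) | ('3' << 8) | '1';

    std::cout << a << '\n';         // 926233393 (little-endian)
    std::cout << expected << '\n';  // 926233393
}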

Bodo

Have a look at your declarations of the x's

char *x1,*x2,*x3,*x4;

these are pointers to chars (characters).

In your stream output they are interpreted as printable characters. A short look at the ASCII table shows that the low values are not printable.

Your int a starts out as zero, and after your assignments each byte the x's point to holds 1 (0x01); both values are unprintable control characters.

One possibility to get readable output would be to cast the characters to int, so that the stream prints the numerical representation instead of the ASCII character:

cout <<"@" << int(*x1) << " " << int(*x2) << " " << int(*x3) << " " << int(*x4) << "@"<<endl ;
vlad_tepesch

You can use unary + to convert a character type (printed as a symbol) into an integer type (printed as a number):

cout <<"@" << +*x1 << " " << +*x2 << " " << +*x3 << " " << +*x4 << "@"<<endl ;

See integral promotion.
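
As a small sketch (my own addition) of that promotion, the result of unary + applied to a char really does have type int:

#include <iostream>
#include <type_traits>

int main()
{
    char c = 1;

    // Unary + applies the integral promotions, so +c has type int
    static_assert(std::is_same<decltype(+c), int>::value, "char promotes to int");

    std::cout << c << '\n';   // writes an unprintable control character
    std::cout << +c << '\n';  // prints 1
}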


If I understood your problem correctly, this is the solution:

#include <stdio.h>
#include <iostream>
using namespace std;

int main(){
    int a = 0;
    char *x1,*x2,*x3,*x4;
    x1 = (char*)&a;
    x2 = x1;x2++;
    x3 = x2;x3++;
    x4 = x3;x4++;
    *x1=1;
    *x2=1;
    *x3=1;
    *x4=1;
    cout <<"@" << (int)*x1 << " " << (int)*x2 << " " << (int)*x3 << " " << (int)*x4 << "@"<<endl ;
    cout << a << endl;
}
pitprok
  • Great, then you should mark the answer that helped you so that people will know that it is answered. – pitprok May 17 '19 at 14:19
  • You should always _explain_ your proposed solution and how it resolves the stated problem. Don't just dump code. Also don't beg for answer accepts; it's only been 20 minutes since the question was posted. – Lightness Races in Orbit May 17 '19 at 14:20
  • and please don't use C-style casts and `<stdio.h>`. – Swordfish May 17 '19 at 14:20
  • TBH I'm fine with C-style casts for things like this cos `reinterpret_cast<char*>(&a)` is ugly AF but yeah – Lightness Races in Orbit May 17 '19 at 14:22
  • @LightnessRacesinOrbit *cos `reinterpret_cast<char*>(&a)` is ugly* – When casting the pointer it makes no difference but I meant the casts in the calls to `operator<<()` where a `static_cast<>()` would suffice. – Swordfish May 17 '19 at 14:26
  • @Swordfish Perhaps. I added a little thing to my answer about this a few mins ago – Lightness Races in Orbit May 17 '19 at 14:27
  • @LightnessRacesinOrbit You misunderstood my intention, I didn't mean that he should mark my own answer as accepted. I meant that he should mark the answer that helped him. Also, I found this question while using the c filter in stack overflow, that's why I used C-style casts – pitprok May 17 '19 at 14:32
  • The question (and your answer) are clearly C++, despite the bum tag ;) – Lightness Races in Orbit May 17 '19 at 14:32
  • @LightnessRacesinOrbit I'm such a noob that I didn't even know, I just threw the code in my online IDE, added casts, and it worked. – pitprok May 17 '19 at 14:34