
As far as I understand (correct me if I'm wrong), the output of a compiler depends on the target architecture, the compiler, and the operating system.

Let's say I'm using Ubuntu 16.04 on x86-64 and compiling a C file with GCC 5.4 (or any other mix of OS, architecture, and compiler for the example).

My understanding until now was that if I compiled the same C file on a different Ubuntu release, with the same architecture and compiler version, it would produce the same assembly code.

After a few tries I got the impression that this is not the case. How is this possible?

Does the output of a compiler depend on the specific OS release?

One example: compiling https://github.com/tbuktu/libntru on two different Ubuntu versions produces different assembly.

gilm501
  • Compiler output also depends upon optimization flags (and there are *many* of them). – Basile Starynkevitch Dec 13 '17 at 16:05
  • How exactly did you make sure you had the same compiler version on different OS versions? Is it possible they were configured differently? e.g. [one of them with `--enable-default-pie`](https://stackoverflow.com/questions/43367427/32-bit-absolute-addresses-no-longer-allowed-in-x86-64-linux/46493456#46493456), or a different default for `-fstack-protector`? – Peter Cordes Dec 13 '17 at 16:12
  • How exactly did you compare the asm? Did you [use `gcc -O3 -S` to get the compiler's *actual* asm output](https://stackoverflow.com/questions/38552116/how-to-remove-noise-from-gcc-clang-assembly-output), or did you disassemble a linked binary? Or did you compare binaries with `cmp` or something? (A sketch of both checks follows after these comments.) – Peter Cordes Dec 13 '17 at 16:17
  • @PeterCordes is it possible that some artefacts of address space layout randomization influence the linking? Not the randomization itself as this happens at load time, but some provisions for it, according to version X supported on one OS, and a later but compatible version Y on another? – Vroomfondel Dec 13 '17 at 17:08
  • @Vroomfondel: Well sort of maybe. `--enable-default-pie` makes `-pie` and `-fpie` the default to enable ASLR for the executable as well as for libraries. (See my first comment). Other than that, I think the same gcc version is going to make the same asm. A different version of binutils might possibly set some bits in the ELF headers differently, but the OP asked about asm diffs, not binary diffs. Different ways of doing ASLR still just need PIC code, so the difference is just `-fpie` or not. – Peter Cordes Dec 13 '17 at 17:14
  • Even the binaries of `gcc 5.4` in the official repositories may be built with a different configuration for each distro, so the compiler itself may be slightly different, even when it reports the same version number and was built from the same source. So while the output of the compiler would be OS-agnostic, it may differ, as it is not exactly the same compiler. – Ped7g Dec 13 '17 at 23:29
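A minimal sketch of the checks the comments above point at, assuming both machines have the project checked out; the file names and flags are placeholders, not the project's actual build commands:

```sh
# 1. Verify that the "same" gcc 5.4 on both machines is really built the same way.
#    The configure line shows distro-specific defaults such as --enable-default-pie.
gcc -v 2>&1 | grep -E 'gcc version|Configured with'

# 2. Compare the compiler's actual asm output rather than a linked or disassembled binary.
#    foo.c stands in for one of the project's source files; use the flags your build uses.
gcc -O3 -S foo.c -o foo-16.04.s     # run the equivalent command on the other release
diff foo-16.04.s foo-18.04.s        # hypothetical names for the two machines' output
```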

2 Answers


Different OS releases may have different versions of the default libraries installed (which get linked into your final application). Thus the end result may be slightly different.

Buddy
    Almost everything uses dynamic libs, which aren't linked in. What can differ is the *headers* for these libraries, which may include different function signatures, or different types. The only extra stuff normally linked in is the CRT startup code (`/usr/lib/crt*.o`) that comes with gcc. And anyway, that's not part of the compiler's asm output, that's part of the linker's binary output. – Peter Cordes Dec 13 '17 at 16:15
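One concrete way the *headers* (rather than the libraries themselves) can change the emitted asm: Ubuntu's gcc packages have historically turned on `-D_FORTIFY_SOURCE=2` by default when optimizing, which makes glibc's headers route `memcpy` through a checked variant. This is only an illustration of the mechanism, not necessarily the cause of the difference asked about; the file and function below are made up:

```sh
cat > fortify.c <<'EOF'
#include <string.h>

char buf[64];

/* The destination size is known at compile time but the copy length is not,
   so a fortified memcpy has something left to check at run time. */
void copy(const char *src, size_t n) {
    memcpy(buf, src, n);
}
EOF

gcc -O2 -S -U_FORTIFY_SOURCE   fortify.c -o plain.s
gcc -O2 -S -D_FORTIFY_SOURCE=2 fortify.c -o fortified.s

# plain.s should reference memcpy, fortified.s __memcpy_chk
grep -n 'mem' plain.s fortified.s
```

Two distros whose gcc packages default this setting differently can produce exactly this kind of asm diff from identical source and identical command lines.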

If you are only comparing a few Ubuntu versions, the odds of seeing differences go down: the overall architectural differences may not be exercised by your test, and within the same OS family and compiler family things may not change for long periods of time. Where you are more likely to see differences in a test like that is with older releases of the same distro, where newer versions of the compiler are not ported/packaged directly for apt-get. Maybe you can get them to work by hand-building, but GCC in particular is picky about that: its code only builds with relatively recent prior or following compiler versions, and if they get too far apart GCC can't build GCC. So the first differences I would expect to see are strictly due to GCC version differences.

A better test is to take a simple .c file and build it for any version of Windows (using the same version of GCC built for that system) and any version of Ubuntu/Linux. You should see differences more quickly.
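A rough sketch of such a test, with a made-up minimal file (any small function will do); the output file names are just labels for where each run happened:

```sh
cat > min.c <<'EOF'
/* Trivial function: calling conventions and asm directives already
   differ between Windows and Linux targets, even for code this small. */
int add(int a, int b) {
    return a + b;
}
EOF

# Run the same command on each system, then copy one .s file over and compare.
gcc -O2 -S min.c -o min-linux.s      # on the Ubuntu box
gcc -O2 -S min.c -o min-windows.s    # on the Windows box (MinGW gcc)
diff min-linux.s min-windows.s
```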

Two different compilers should show differences on reasonably sized projects, or on small code samples deliberately chosen to expose them; llvm/clang vs. GCC, for example. Different versions of the same compiler or compiler family will, almost by definition, show differences over time. Does GCC 6.x vs. 6.x+1 show differences? Yes, if you know where to look, but often not. GCC 3.x vs. GCC 7.x should, and depending on the test you can narrow it down from there.
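Likewise for two compiler families on the same machine, reusing the made-up min.c from above (assuming clang is installed):

```sh
gcc   -O2 -S min.c -o min-gcc.s
clang -O2 -S min.c -o min-clang.s
diff min-gcc.s min-clang.s           # expect differences in directives, labels, and often the code itself
```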

You have compiler-to-compiler differences on the same OS and system that are expected to show up.

You have various reasons why system-to-system differences with the same compiler will show up.

And combinations of the above would naturally also show differences.

The bigger question is why you care. The takeaway is that you should not expect the same C source code to build the same way if you change the compiler, the compiler settings, or the operating system. The result can range from no differences to huge differences based on any of the above, starting quite simply with optimization and other tuning settings and going from there.
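For instance, the cheapest difference to demonstrate needs nothing but a flag change on one machine (again using the made-up min.c from above); `-fverbose-asm` is also handy because GCC records the compiler version and option set as comments in the .s file, which helps explain diffs between two machines:

```sh
gcc -O0 -S min.c -o min-O0.s
gcc -O3 -S min.c -o min-O3.s
diff min-O0.s min-O3.s               # same compiler, same OS, very different asm

gcc -O2 -fverbose-asm -S min.c -o min-verbose.s
head -n 20 min-verbose.s             # header comments list the options in effect
```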

Peter Cordes
old_timer