Let's say I have bootloader assembly code to debug which uses .code16 and .code32 to emit code for the different modes the CPU runs in. The target for this bootloader is a 64-bit x86 CPU.
Now what mode should be used during disassembly (with tools like objdump, gdb, etc.)? i8086? i386? x86-64?
As per my understanding and observation, we should use a combination of all of them, depending on which block of code (.code16 or .code32) we are analyzing, as that gives the expected results (for me). For example:
.code16
mov %ax, %bx        # 16-bit operands: no prefix needed in 16-bit code
mov %ecx, %edx      # 32-bit operands: assembled with a 0x66 operand-size prefix
.code32
mov %eax, %ebx      # 32-bit operands: no prefix needed in 32-bit code
mov %cx, %dx        # 16-bit operands: assembled with a 0x66 operand-size prefix
Assembled like this:
$ as -o test.o test.S   # 16-bit and 32-bit code packed into a 64-bit ELF; as defaults to 64-bit since the host is 64-bit
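To sanity-check what the assembler produced, I look at the ELF header (assuming GNU binutils is installed; readelf should report Class ELF64 and an x86-64 machine here):
$ readelf -h test.o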
Disassembling for a 16-bit CPU: the 16-bit code is displayed fine, whereas the 32-bit code is messed up.
$ objdump -m i8086 -d test.o
test.o: file format elf64-x86-64
Disassembly of section .text:
0000000000000000 <.text>:
0: 89 c3 mov %ax,%bx
2: 66 89 ca mov %ecx,%edx
5: 89 c3 mov %ax,%bx
7: 66 89 ca mov %ecx,%edx
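If the boundary between the two blocks is known (in the listing above the .code32 part starts at offset 0x5), one refinement that seems to work is restricting each objdump pass to its own byte range with --start-address/--stop-address. A rough sketch for this toy object (the offsets are specific to it):
$ objdump -m i8086 -d --stop-address=0x5 test.o    # decode only the .code16 bytes as 16-bit
$ objdump -m i386 -d --start-address=0x5 test.o    # decode only the .code32 bytes as 32-bit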
Analyzing in 32-bit mode: now the 32-bit code is disassembled perfectly, while the 16-bit code is messed up.
$ objdump -m i386 -d test.o
test.o: file format elf64-x86-64
Disassembly of section .text:
0000000000000000 <.text>:
0: 89 c3 mov %eax,%ebx
2: 66 89 ca mov %cx,%dx
5: 89 c3 mov %eax,%ebx
7: 66 89 ca mov %cx,%dx
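The same per-region switching is what I do in gdb (e.g. attached to QEMU via target remote); the architecture names below are the ones my build of gdb accepts and may differ elsewhere:
(gdb) set architecture i8086          # while stepping through real-mode (.code16) code
(gdb) x/5i $pc                        # disassemble at the current point
(gdb) set architecture i386           # after the switch to protected mode
(gdb) set architecture i386:x86-64    # after the switch to long mode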
Please confirm whether this strategy is correct; if not, please tell me the best method for disassembling mixed (16/32/64-bit) assembly code.