I have a "shared object" file my.so
that I tried to load in python:
from ctypes import cdll
lib = cdll.LoadLibrary("my.so")
This fails with
OSError: dlopen(my.so, 6): no suitable image found. Did find:
my.so: unknown file type, first eight bytes: 0x7F 0x45 0x4C 0x46 0x02 0x01 0x01 0x00
/path/to/.../my.so: unknown file type, first eight bytes: 0x7F 0x45 0x4C 0x46 0x02 0x01 0x01 0x00
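For reference, those eight bytes are the standard ELF identification header: 0x7F 'E' 'L' 'F', then 0x02 for a 64-bit object and 0x01 for little-endian. A minimal sketch to confirm that locally (assuming my.so is in the current working directory):

# Read the first 8 bytes of the library and compare against the ELF magic.
with open("my.so", "rb") as f:
    header = f.read(8)
print(header)                     # b'\x7fELF\x02\x01\x01\x00'
print(header[:4] == b"\x7fELF")   # True -> the file really is an ELF binary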
Research into other SO questions reveals:
Python ctypes not loading dynamic library on Mac OS X
That user hit the same "no suitable image found" error. For them, the cause was a mismatch between the architecture of the processor running Python (x86_64) and the target architecture of their .so file (ARM). The way to confirm this issue is to run, from the command line:
>file my.so
This outputs:
ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=..., stripped
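The same check can also be scripted from Python, purely for convenience; this is just a sketch that assumes file is on the PATH and my.so is in the current directory:

import subprocess
# Run the same `file` check from Python.
print(subprocess.run(["file", "my.so"], capture_output=True, text=True).stdout)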
The focus of Error "mach-o, but wrong architecture" after installing anaconda on mac was 32-bit vs 64-bit. From Python I can see that import platform; platform.architecture() gives ('64bit', '').
In a response I can no longer find, it was stated that this technique can be unreliable and that the maximum int size should be checked instead: import sys; sys.maxsize gives 9223372036854775807, which means 64-bit.
Finally, in a blog post on ELF binaries the author gives the Python snippet import os; os.uname()[4] to report the machine architecture. For me this returns x86_64.
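For completeness, here are the three interpreter checks collected into one runnable snippet (nothing new, just the one-liners above, so the output can be pasted in one go):

import os
import platform
import sys

# Interpreter bitness as reported by platform
print(platform.architecture())   # ('64bit', '')
# Pointer-size check suggested as the more reliable alternative
print(sys.maxsize > 2**32)       # True -> 64-bit interpreter
# Machine architecture from uname
print(os.uname()[4])             # x86_64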
To sum up: cdll doesn't recognise the ELF file ("unknown file type"), yet the environment and the binary are both 64-bit and the architectures match (both x86_64).
Why is this not working as expected?
Edit (credit to Max): the OS running Python is macOS 10.15.7 Catalina.