I am searching for some statistics regarding the runtime overhead that occurs when a program is loaded by the runtime linker (e.g. `ld.so`). I am not an expert on how the runtime linker works, but as I understand it, it usually performs the following actions:
- Searching for the shared libraries in the well-known paths or in `LD_LIBRARY_PATH`
- Loading the shared libraries
- Resolving the symbols of the functions that are used
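These actions can be observed directly with glibc's dynamic loader, which honors the `LD_DEBUG` environment variable (a sketch assuming a Linux system with glibc; `/bin/true` is just an arbitrary small dynamically linked binary):

```shell
# Trace the library search: one line per directory tried
# for each DT_NEEDED dependency of the binary
LD_DEBUG=libs /bin/true 2>&1 | head -n 20

# Trace symbol binding (very verbose): one line per symbol
# as it is resolved to a definition in some shared object
LD_DEBUG=bindings /bin/true 2>&1 | head -n 5
```

`LD_DEBUG=help /bin/true` lists the other available trace categories.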
So when I start a program through a GUI or from the command line, at some point a system call to `exec` happens and the requested program is started. Let's take a quick look at what happens then:
1. `exec(myprogram)` is called
2. The operating system loads `myprogram` into memory
3. The operating system turns over execution to `_start`
4. Some initialization happens and the runtime linker is run
5. `main()` is called
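The hand-off to the runtime linker in the list above happens because the kernel reads the `PT_INTERP` program header of the ELF binary, which names the loader to map in at `exec` time. A quick way to inspect this (assuming `binutils` is installed; `/bin/true` again stands in for any dynamically linked program):

```shell
# The INTERP header names the runtime linker the kernel maps at exec time,
# e.g. /lib64/ld-linux-x86-64.so.2 on x86-64 glibc systems
readelf -l /bin/true | grep -i interpreter

# ldd asks that same loader to resolve and print the dependencies
ldd /bin/true
```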
Assuming that the above list is correct and I did not leave out any major steps, I would be interested in two things:
- What is the overhead of step 4 according to theory?
- How can I determine the overhead of step 4 in practice (e.g. for real programs such as Firefox or Chrome)?
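For the practical part, a sketch of two approaches, assuming glibc (substitute the program you care about for `/bin/true`; Firefox or Chrome would work the same way but take longer):

```shell
# glibc's loader can report its own startup cost (relocation work and
# total time spent in the dynamic loader, in CPU cycles)
LD_DEBUG=statistics /bin/true 2>&1 | tail -n 15

# Bound the symbol-resolution cost by forcing eager binding:
# LD_BIND_NOW=1 makes the loader resolve all symbols at startup
# instead of lazily on first call, so comparing wall-clock startup
# times gives a rough upper bound on that part of the overhead
time /bin/true
time env LD_BIND_NOW=1 /bin/true
```

Note that `LD_DEBUG` itself adds some overhead, so treat the numbers as estimates rather than exact costs.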