
While trying to build OpenCV 3.3.0 on a Raspberry Pi, I keep getting compiler-level segmentation faults. I am following the guide here on building the optimized OpenCV library.

The cmake step works perfectly fine; however, when I attempt to run `make -j4`, a number of segmentation faults come from the compiler.
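
For reference, the build steps were roughly the standard out-of-tree CMake build; the exact CMake flags come from the guide and are omitted here, so treat this as a sketch rather than the literal invocation (the source directory path is hypothetical):

# Sketch of the build steps (CMake options from the guide omitted)
cd ~/opencv-3.3.0
mkdir build && cd build
cmake ..        # the configure step completes without problems
make -j4        # several compiler processes segfault partway through the build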

  • Does it happen on the same file every time? Does the compiler print an error before segfaulting? Does the system have enough resources to run multiple compilers at the same time or does it run out of memory and cause the error? Is there maybe an update available for the compiler that fixes the crash (which compiler are you using)? – nwp Jan 15 '18 at 17:28
  • It's on different files at different points of the build process - someone mentioned below that it was due to running out of memory, which makes sense given my experience. The odd part was the lack of an OOM error; rather, it would just segfault. – rreichel Jan 15 '18 at 23:27
  • You might also want to increase the swap file size, which by default is only 100 MB. Edit `/etc/dphys-swapfile` and `/sbin/dphys-swapfile` to increase the maximum allowed size, e.g. to 4 GB (4096) or 8 GB (8192). Then restart the `dphys-swapfile` service or reboot (see the sketch just below these comments). – ccpizza Oct 04 '18 at 16:05
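
A minimal sketch of that swap change, assuming the stock Raspbian `dphys-swapfile` setup and the 4 GB value suggested in the comment:

# Raise the configured swap size and the hard cap, then restart the service
sudo sed -i 's/^CONF_SWAPSIZE=.*/CONF_SWAPSIZE=4096/' /etc/dphys-swapfile
sudo sed -i 's/^CONF_MAXSWAP=.*/CONF_MAXSWAP=4096/' /sbin/dphys-swapfile
sudo systemctl restart dphys-swapfile   # or simply reboot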

2 Answers


The solution ended up being related to the use of multiple parallel jobs. I am still not 100% sure of the exact cause; however, running the make command with the `-j2` flag instead of the `-j4` flag compiled perfectly fine, albeit much more slowly. I suspect the segfaults came from memory allocation failures that arise when several compile jobs run at once on a machine with such limited resources.

# Fixed command:
make -j2

Edit: Modified the text to more accurately describe what the -j flag does.
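
A rough heuristic (my assumption, not something stated in the answer) is about one compile job per gigabyte of RAM, capped at the core count; on a 1 GB Raspberry Pi that works out to a single job unless swap is enlarged:

# Pick a conservative job count from total memory (heuristic, adjust to taste)
mem_gb=$(awk '/MemTotal/ {print int($2/1048576)}' /proc/meminfo)
jobs=$mem_gb
[ "$jobs" -lt 1 ] && jobs=1               # never fewer than one job
cores=$(nproc)
[ "$jobs" -gt "$cores" ] && jobs=$cores   # and never more jobs than cores
make -j"$jobs"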

  • `make`'s jobs and threads are unrelated. `make` makes processes, not threads. – nwp Jan 15 '18 at 17:25
  • The `-j` flag of `make` is used to start multiple [parallel build processes](https://www.gnu.org/software/make/manual/html_node/Parallel.html). A Raspberry Pi has rather little memory (compared to a modern regular PC), and compiling can be memory-intensive, so starting many processes may exhaust your memory and cause a segfault (depending on the current load, etc.). If you plan to compile a lot of stuff for your Raspberry Pi, you may want to look into [cross compiling](https://medium.com/@au42/the-useful-raspberrypi-cross-compile-guide-ea56054de187). – jdehesa Jan 15 '18 at 17:29
  • Thanks for the feedback guys! @jdehesa, do you know why it would segfault rather than get some sort of OOM error? – rreichel Jan 15 '18 at 23:26
  • @rreichel On Linux, each dynamic memory allocation (in this case by the compiler) is a [`malloc`](https://linux.die.net/man/3/malloc) in the end. Because of Linux's optimistic allocation, the function can still return an address when memory is short; the crash then happens later, when that memory is actually used. See [this question](https://stackoverflow.com/questions/1655650/linux-optimistic-malloc-will-new-always-throw-when-out-of-memory) for a further explanation of this behavior; a quick way to check for it is sketched below. – jdehesa Jan 15 '18 at 23:47
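
If you want to confirm that memory exhaustion was the culprit, the kernel log usually records OOM-killer activity and compiler segfaults; a quick check (assuming `dmesg` access on Raspbian) looks roughly like this:

# Watch memory while the build runs, then inspect the kernel log after a failure
free -h
dmesg | grep -iE 'out of memory|oom|segfault' | tail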

Hi, I ran into a similar problem with the xtensor library on an Xavier NX. The gcc version was 7.5.0. After I switched to gcc version 8.4.0, the internal compiler error went away.

Here is a tutorial on how to switch between multiple gcc versions:

https://linuxconfig.org/how-to-switch-between-multiple-gcc-and-g-compiler-versions-on-ubuntu-20-04-lts-focal-fossa
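
The gist of that tutorial is Debian/Ubuntu's `update-alternatives` mechanism; a condensed sketch, assuming the gcc-7 and gcc-8 packages are available in your distribution's repositories:

sudo apt install gcc-8 g++-8
# Register both versions as alternatives (higher number = higher default priority)
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-7 7
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-7 7
sudo update-alternatives --install /usr/bin/gcc gcc /usr/bin/gcc-8 8
sudo update-alternatives --install /usr/bin/g++ g++ /usr/bin/g++-8 8
# Choose the active compiler interactively and verify
sudo update-alternatives --config gcc
sudo update-alternatives --config g++
gcc --version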

Best

Alexander