I embedded Python in my C project to extend it: some data is collected in C, passed to a Python function that modifies it, then stored back into C structures and used for more complex operations (a minimal sketch of the setup is shown below). When I run:
$ valgrind ./my_c_project
I get the well-known cascade of false positives caused by the Python C API's internal allocator. I searched the web, but the answers seem to be either outdated or about the other way around: C modules extending Python.
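For context, this is roughly the embedding pattern I use. It is only a minimal sketch: the module and function names (transform.process) and the build command are made up for illustration and depend on the actual installation.

    /* sketch.c -- minimal sketch of the embedding pattern described above.
     * "transform" and its process() function are hypothetical placeholders.
     * Build roughly like (flags vary with the Python version/installation):
     *   gcc sketch.c $(python3-config --cflags) $(python3-config --ldflags --embed) -o sketch
     */
    #include <Python.h>
    #include <stdio.h>

    typedef struct {
        double value;                 /* produced in C, updated from Python */
    } record_t;

    int main(void)
    {
        Py_Initialize();

        record_t rec = { .value = 2.0 };                     /* data taken in C */

        PyObject *mod = PyImport_ImportModule("transform");  /* hypothetical module */
        if (mod != NULL) {
            /* call transform.process(rec.value) and store the result back in C */
            PyObject *res = PyObject_CallMethod(mod, "process", "d", rec.value);
            if (res != NULL) {
                rec.value = PyFloat_AsDouble(res);
                Py_DECREF(res);
            }
            Py_DECREF(mod);
        }
        if (PyErr_Occurred())
            PyErr_Print();

        printf("value after the Python step: %f\n", rec.value);  /* used for further C work */

        Py_Finalize();
        return 0;
    }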
I would think that whether you use Python to extend C or C to extend Python, the solution should be the same, since the core "problem" comes from the same malloc/PyMem mismatch. That said, the most recent and promising resource I found is cpython/README.valgrind, which in particular states:
UPDATE: Python 3.6 now supports PYTHONMALLOC=malloc environment variable which can be used to force the usage of the malloc() allocator of the C library.
Does this mean that, instead of rebuilding Python (as the older resources suggest), I can just set the variable above appropriately?
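For instance, would exporting the variable from the embedding program itself, before the interpreter starts, be enough? Something like this untested sketch (I am not even sure Py_Initialize() reads PYTHONMALLOC when Python is embedded):

    #include <Python.h>
    #include <stdlib.h>

    int main(void)
    {
        /* untested assumption: set the variable before any Py_* call,
         * hoping Py_Initialize() picks it up like the python3 binary does */
        setenv("PYTHONMALLOC", "malloc", 1);   /* POSIX setenv */

        Py_Initialize();
        /* ... usual embedding code ... */
        Py_Finalize();
        return 0;
    }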
Another solved question on this website seems to give an affirmative answer (see dequis's contribution), but the example shown there runs a Python script (I quote):
PYTHONMALLOC=malloc python3 foobar.py
which is not my case, since I have a compiled C executable. I naively tried:
$ PYTHONMALLOC=malloc valgrind ./my_c_project
which didn't work (as expected...). I have not rebuilt Python, since I share my workstation with colleagues; I guess I could try a local build, but I don't have a good feeling about it since I use a lot of external modules and I might end up in chaos (I'll do it as a last resort; it's all experience).
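For completeness, if I do end up building a local copy, my (possibly wrong) understanding of what the older resources and README.valgrind suggest is a configure flag plus the suppressions file shipped with the CPython sources, roughly:

$ ./configure --with-valgrind    (or --without-pymalloc, per the older advice)
$ make
$ valgrind --suppressions=path/to/cpython/Misc/valgrind-python.supp ./my_c_project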
PS: I also tried simply commenting on dequis's answer to ask for more explanation, but my reputation is not high enough (absolutely no complaint here, only an implicit apology for my long, but hopefully detailed, post).