I am trying to hand a NumPy array created in Cython to a C struct. This works sometimes, but crashes if the array becomes too 'large'. 'Large' is not actually large here (fewer than 100k entries of type double), and I cannot figure out where this limit comes from. A few code snippets below (full code attached).
Building on Win7, Cython 0.28.4, Python 3.6, mingw64
I tried this with different array lengths. The latest thing I discovered is that it always crashes once the array is longer than 2**16 - 512 entries, but I have no clue why.
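In case the number rings a bell for someone, here is that threshold expressed in bytes (doubles are 8 bytes each; just arithmetic, no claim about the cause):

```python
# The crash threshold I measured, converted to bytes.
threshold_entries = 2**16 - 512      # 65024 entries
threshold_bytes = threshold_entries * 8  # sizeof(double) == 8
print(threshold_entries, threshold_bytes, threshold_bytes / 1024)
# 65024 520192 508.0  -> the limit sits at exactly 508 KiB
```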
The cython file:
# cy_file.pyx
import numpy
cimport numpy
# ...

cdef public struct Container:
    double* np_array
    # ...

cdef public void create_np_array(Container *container):
    cdef numpy.ndarray[numpy.float_t, ndim=1, mode='c'] np_array
    # just create a numpy array
    longarray = numpy.linspace(1.0, 5.0, 100000)
    np_array = numpy.ascontiguousarray(numpy.array(longarray), dtype=float)
    container.np_array = <double*> np_array.data
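As a sanity check on the NumPy side alone, this pure-Python sketch mirrors what `create_np_array` does (no C involved, variable names just for illustration). It behaves fine at any length, so the array construction itself does not seem to be the problem:

```python
import numpy

# Mirror the body of create_np_array in pure Python.
n = 100000  # well above the 2**16 - 512 threshold I observed
longarray = numpy.linspace(1.0, 5.0, n)
np_array = numpy.ascontiguousarray(numpy.array(longarray), dtype=float)

# The buffer is C-contiguous and the endpoints are what linspace promises.
assert np_array.flags['C_CONTIGUOUS']
assert np_array.size == n
assert np_array[0] == 1.0 and np_array[-1] == 5.0
```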
And the C file (the snippet is abbreviated; I fixed the call to match the signature declared above):

// c_file.c
#include <Python.h>
#include "cy_file.h"

struct Container container;
// ...
Py_Initialize();
PyInit_cy_file();

// call the Cython function that fills up the numpy array
create_np_array(&container);

// shutdown of the Python interpreter
Py_Finalize();

// *** here comes the crash if longarray is 'too long' ***
double first = container.np_array[0];
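To rule out the raw-pointer read itself, I reproduced the `container.np_array[0]` access in pure Python via ctypes, reading the buffer through a `double*` obtained from the array (a sketch; this works at any length as long as the Python-side reference `np_array` is still alive):

```python
import ctypes
import numpy

# Same array as in the Cython function.
np_array = numpy.ascontiguousarray(numpy.linspace(1.0, 5.0, 100000), dtype=float)

# Read elements through a raw double*, the way the C code does.
ptr = np_array.ctypes.data_as(ctypes.POINTER(ctypes.c_double))
print(ptr[0], ptr[99999])
# 1.0 5.0
```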
Can anyone give me a hint as to what goes wrong here, or how to debug it?
Thanks and cheers, Tim