I have a C++ program that processes Wavefront OBJ files (3D models). It has worked on numerous files, but not the latest one.
The OBJ file is plain text and contains lists of vertices, normals, faces, etc. The first one or two characters of each line specify its type, so I scan the file once and count the number of lines of each type. I then declare arrays to hold the data, at which point the program segfaults. Perhaps this file is unusually large and the program has run out of memory, or perhaps I've corrupted something during the initial scan that hasn't come to light until this point.
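For context, the counting pass looks roughly like this (a simplified sketch; the standalone function and the name count_types are just for illustration, and the real code does some extra bookkeeping):

#include <fstream>
#include <string>
#include <unordered_map>

//Count how many lines of each type the file contains, keyed by the
//first two characters of the line (e.g. "v ", "vt", "f ").
std::unordered_map<std::string, int> count_types(const std::string &path)
{
    std::unordered_map<std::string, int> type_count;
    std::ifstream in(path);
    std::string line;
    while (std::getline(in, line)) {
        if (line.size() >= 2)
            ++type_count[line.substr(0, 2)];
    }
    return type_count;
}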
Compiled with:
g++ processobj.cpp -o processobj -lgd -lpng -lz -ljpeg -lfreetype -lm -lmysqlcppconn -lpthread -ggdb
gdb tells me the crash is here:
//Now declare the arrays.
//They begin from 0, but the OBJ indexes start at 1, so we have to add an extra element.
//x,y,z of each point
double *v_x = new double[type_count["v "]+1]{};
double *v_y = new double[type_count["v "]+1]{};
double *v_z = new double[type_count["v "]+1]{};
//and a reference to the UVs at each point:
//Array indexed by vertex, then by opposite vertex, giving the uv index at
//the first vertex
std::unordered_map<int, int> v_edge_uv[type_count["v "]+1];
//An array indexed by vertex, and then by face, giving the uv there.
std::unordered_map<int, int> v_uv[type_count["v "]+1];
//Links to other vertices: for each vertex, the set of vertices it shares an edge with
std::set<int> v_edges[type_count["v "]+1]; <---- Segmentation fault
//u,v
double *vt_u = new double[type_count["vt"]+1]{};
double *vt_v = new double[type_count["vt"]+1]{};
//the vertices for each face.
int *faces[type_count["f "]+1]{};
//and the UV for each
int *face_uvs[type_count["f "]+1]{};
//the number of vertices in each face
int face_vertices[type_count["f "]+1]{};
type_count shows some large but reasonable counts:
(gdb) p type_count
$1 = std::unordered_map with 9 elements = {["f "] = 57461, ["us"] = 1, ["vt"] = 58517, ["s "] = 2, ["v "] = 57661, ["o "] = 1, ["mt"] = 1, ["vn"] = 55184, ["# "] = 2}
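Back-of-the-envelope, the heap allocations look modest: each of the three v arrays is (57,661 + 1) × 8 bytes ≈ 460 KB, and the two vt arrays are about the same, so roughly 2.3 MB in total across the new[] calls, which doesn't seem like enough to exhaust memory on its own.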
Any ideas on how I can track this one down?