
I have a C++ program that processes a Wavefront OBJ file (a 3D model). It has worked for numerous files, but not the latest one.

The OBJ file is plain text and contains lists of vertices, normals, faces, etc. The first one or two characters of each line specify that line's type, so I scan the file first and count the number of each type. I then declare arrays to hold the data, at which point it segmentation faults. Perhaps this file is unusually large and it has run out of memory, or perhaps I've corrupted something in the initial scan that hasn't come to light until this point.
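Roughly, the first pass looks like this (an illustrative sketch, not my exact code; `count_types` and the two-character prefix handling are simplified):

```cpp
#include <fstream>
#include <string>
#include <unordered_map>

// First pass: count each line type by its two-character prefix
// (e.g. "v ", "vt", "f "), so the arrays can be sized afterwards.
std::unordered_map<std::string, int> count_types(const std::string &path) {
    std::unordered_map<std::string, int> type_count;
    std::ifstream in(path);
    std::string line;
    while (std::getline(in, line)) {
        if (line.size() >= 2)
            ++type_count[line.substr(0, 2)];
    }
    return type_count;
}
```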

Compiled with

g++ processobj.cpp -o processobj -lgd -lpng -lz -ljpeg -lfreetype -lm -lmysqlcppconn -lpthread -ggdb

gdb tells me that it's here:

//Now declare the arrays.
//They begin from 0, but the OBJ indexes start at 1, so we have to add an extra element.

//x,y,z of each point
double *v_x = new double[type_count["v "]+1]{};
double *v_y = new double[type_count["v "]+1]{};
double *v_z = new double[type_count["v "]+1]{};
//and a reference to the UVs at each point:
//Array indexed by vertex, then by opposite vertex, giving the uv index at
//the first vertex
std::unordered_map<int, int> v_edge_uv[type_count["v "]+1];
//An array indexed by vertex, and then by face, giving the uv there.
std::unordered_map<int, int> v_uv[type_count["v "]+1];
//Links to other vertices - a two dimensional array of vertices
std::set<int> v_edges[type_count["v "]+1];             <---- Segmentation fault
//u,v
double *vt_u = new double[type_count["vt"]+1]{};
double *vt_v = new double[type_count["vt"]+1]{};

//the vertices for each face.
int *faces[type_count["f "]+1]{};
//and the UV for each
int *face_uvs[type_count["f "]+1]{};
int face_vertices[type_count["f "]+1]{};

type_count shows some large but reasonable counts:

(gdb) p type_count
$1 = std::unordered_map with 9 elements = {["f "] = 57461, ["us"] = 1, ["vt"] = 58517, ["s "] = 2, ["v "] = 57661, ["o "] = 1, ["mt"] = 1, ["vn"] = 55184, ["# "] = 2}

Any ideas on how I can track this one down?

Tim Owens
  • Well, you are using a non-standard extension (variable-length arrays) which allocates on the stack. Stack space is quite limited, so running out of it is not unlikely at these sizes – UnholySheep Jul 21 '22 at 22:32
  • You are already using other standard library containers, so I don't see any reason that you couldn't replace all of these arrays and `new`s with `std::vector`. Then there wouldn't be any problem and the code would be much cleaner. – user17732522 Jul 21 '22 at 22:36
  • Use dynamic allocation instead of C-style arrays. `vector` is easier to use than `new` – M.M Jul 21 '22 at 22:36
  • Oh, yes, huge arrays on the stack. That's your segfault right there. Just say "No" to **Pointless Use Of Pointers**. – Sam Varshavchik Jul 21 '22 at 22:42
  • Thank you all. Stackoverflow at its finest. And most literal... – Tim Owens Jul 22 '22 at 01:26
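A minimal sketch of the `std::vector` approach the commenters suggest, which moves all of the per-element storage onto the heap instead of the stack (the function wrapper and parameter names `nv`, `nvt`, `nf` are illustrative stand-ins for `type_count["v "]` etc., not the original code):

```cpp
#include <cstddef>
#include <set>
#include <unordered_map>
#include <vector>

// Replace the stack-allocated variable-length arrays with std::vector.
// Each vector's element storage lives on the heap, so counts in the
// tens of thousands are no problem. Returns one of the sizes so the
// allocation can be sanity-checked.
std::size_t declare_arrays(int nv, int nvt, int nf) {
    // x,y,z of each point (+1 because OBJ indices start at 1)
    std::vector<double> v_x(nv + 1), v_y(nv + 1), v_z(nv + 1);
    // per-vertex maps: opposite vertex -> uv index, and face -> uv index
    std::vector<std::unordered_map<int, int>> v_edge_uv(nv + 1);
    std::vector<std::unordered_map<int, int>> v_uv(nv + 1);
    // links to other vertices
    std::vector<std::set<int>> v_edges(nv + 1);
    // u,v
    std::vector<double> vt_u(nvt + 1), vt_v(nvt + 1);
    // the vertices and UVs for each face (inner vectors replace int*)
    std::vector<std::vector<int>> faces(nf + 1), face_uvs(nf + 1);
    std::vector<int> face_vertices(nf + 1);
    return v_edges.size();
}
```

Note that this also removes the need to `delete[]` anything by hand: the vectors release their memory automatically when they go out of scope.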

0 Answers