
I do apologize for the terrible question. I'm a 3D guy who dabbles in Python for plugins and scripts.

I've successfully come up with the worst possible way to export particle information (two vectors per particle per frame, for position and alignment). My first method was to write out a billion vectors per line to a .txt, where each line represented a frame. Now I have it writing out one .txt per frame and opening and closing the right one depending on the frame.
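For reference, the per-frame export boils down to something like this rough sketch; `particles`, `position`, and `alignment` are placeholder names, not my actual plugin API:

```python
# Rough sketch of the current approach: one .txt per frame, six floats
# per particle (position xyz + alignment xyz). Names are placeholders.
def export_frame(particles, frame, out_dir):
    path = '%s/frame_%04d.txt' % (out_dir, frame)
    with open(path, 'w') as f:
        for p in particles:
            f.write('%f %f %f %f %f %f\n' %
                    (tuple(p.position) + tuple(p.alignment)))
```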

Yeah, it's slow. And dumb. Whatever. What direction would you suggest I go/research? A different file type? A :checks google: bin, perhaps? Or should my naive method actually not take very long, and something else is making things move more slowly? I don't need an exhaustive answer, just some general information to get me moving in the right direction.

Thanks a million.

1 Answer


If this info is going to be read by another Python application (especially if it's the same application that wrote it out), look into just pickling your data structures. Build them in memory and use pickle to dump them out to a binary file. The caveats here:

1. Do you have the memory to do it all at once, or does it have to be one frame at a time? You can make big combined files in the first case; you'd need one-frame-per-file (or one pickle record per frame) in the second. If you're running out of memory, the `yield` statement is your friend (see the sketch after this list).
2. Pickled files need to be written and read by the same Python version to be reliable, so you need to be sure all the reading and writing apps are on the same Python version.
3. Pickled files are binary, so not human-readable.
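A minimal sketch of the pickle route, covering both the all-in-memory and the frame-at-a-time cases; the frame layout and all names here are assumptions for illustration, not from the original post:

```python
import pickle

def dump_all(frames, path):
    # Case 1: enough memory -- dump the whole structure in one go.
    with open(path, 'wb') as f:
        pickle.dump(frames, f, protocol=pickle.HIGHEST_PROTOCOL)

def dump_per_frame(frames, path):
    # Case 2: tight memory -- append one pickle record per frame.
    with open(path, 'wb') as f:
        for frame in frames:
            pickle.dump(frame, f, protocol=pickle.HIGHEST_PROTOCOL)

def stream_frames(path):
    # Generator matching dump_per_frame: yields one frame at a time,
    # so the reader never holds the whole sequence in memory.
    with open(path, 'rb') as f:
        while True:
            try:
                yield pickle.load(f)
            except EOFError:
                break
```

Successive `pickle.load` calls on the same file object pick up where the previous one left off, which is what makes the one-record-per-frame layout work.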

If you need to exchange with other applications, look into Alembic, which is an open-source file format designed for exactly this sort of problem: baking out large volumes of particle or simulation data. There's a commercial exporter available from Exocortex which comes with a Python module for dealing with Alembic data.
