
I have a large OBJ file of 306 MB, so I converted it to a glb file to reduce its size. That brought it down a lot, to 82 MB, but it is still big. Is there a way to make this file smaller? If there is, please let me know.

If the glb file can't be reduced further, please let me know more effective ways to shrink the OBJ file. One thing I've already tried is converting the OBJ file to JSON, compressing it, and then decompressing and loading it with pako.js. I rejected that approach because decompression was too slow.

김태은

2 Answers


There might be, if it is the vertex data that is making the file that big. In that case you can use the DRACO compression library to get the size down even further.

First, to test the compressor, you can run

npx gltf-pipeline -i original.glb -d --draco.compressionLevel 10 -o compressed.glb

(you need to have a current version of node.js installed for this to work)

If vertex-data was the reason for the file being that big, the compressed file should be considerably smaller than the original.

Now you have to go through some extra steps to load the file, as the regular GLTFLoader doesn't support DRACO-compressed meshes on its own.

Essentially, you need to import the THREE.DRACOLoader and the draco-decoder. Finally, you need to tell your GLTFLoader that you know how to handle DRACO-compression:

const dracoLoader = new DRACOLoader();
dracoLoader.setDecoderPath('path/to/draco-decoder/');
gltfLoader.setDRACOLoader(dracoLoader);

After that, you can use the GLTFLoader as before.
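Put together, a minimal loading sketch might look like the following. The import paths and the decoder location are assumptions that depend on your build setup (the decoder files usually get copied from `three/examples/jsm/libs/draco/` to somewhere your server can reach), and `scene` is assumed to be an existing `THREE.Scene`:

```javascript
// Sketch: loading a DRACO-compressed .glb with three.js.
// Paths below are assumptions for a typical bundler setup.
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
import { DRACOLoader } from 'three/examples/jsm/loaders/DRACOLoader.js';

const dracoLoader = new DRACOLoader();
// Point this at wherever you host the decoder files (draco_decoder.wasm etc.)
dracoLoader.setDecoderPath('/draco/');

const gltfLoader = new GLTFLoader();
gltfLoader.setDRACOLoader(dracoLoader);

gltfLoader.load('compressed.glb', (gltf) => {
  // `scene` is assumed to exist in your app
  scene.add(gltf.scene);
});
```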

The only downside is that the decoder needs some resources of its own: decoding isn't free, and the decoder is another ~320 kB of data for the browser to load. I think it's still worth it if it saves you megabytes of mesh data.

Martin Schuhfuß
  • To compress it you need to install node.js, then run npm install --global gltf-pipeline in a terminal. – aLx13 Jan 10 '23 at 11:04
  • @aLx13 you don't need to install it globally, you don't even need to install it at all. `npx` handles that for you and installs it temporarily. – Martin Schuhfuß Jan 27 '23 at 11:15

I'm surprised that no one has mentioned the obvious, simple way of lossily reducing the size of a .glb file that's just a container for separate mesh and texture data:

Reduce your vertex count by collapsing adjacent vertices that are close together or coplanar, and reduce your image data by trimming out, scaling down, or using a lower bit depth for unnecessary details.

Every 2X decrease in surface polygon/pixel density should yield roughly a 4X decrease in file size.
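As a sanity check on that estimate (reading "2X decrease in density" as halving the linear resolution): halving a texture's width and height quarters its raw pixel data. A quick sketch in Python, with made-up dimensions:

```python
def raw_texture_bytes(width, height, bytes_per_pixel=4):
    """Uncompressed size of an RGBA texture in bytes."""
    return width * height * bytes_per_pixel

full = raw_texture_bytes(2048, 2048)  # 16,777,216 bytes (~16 MB)
half = raw_texture_bytes(1024, 1024)  #  4,194,304 bytes (~4 MB)
print(full // half)  # 4: a 2X linear downscale is a 4X data reduction
```

The same square-law reasoning applies to surface polygon density, which is why trimming detail before reaching for compression pays off so quickly.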

And then, once you've removed unneeded detail, start looking at things like DRACO, Basis texture compression, fewer JPEG chroma samples, and OptiPNG.

Will Chen
  • 1
    I have been searching around and was equally surprised that nobody is talking about just making the .glb smaller. I would like to do something like this to load preview 3D files that when opened point to the full size model but the preview is just a lossy reduced model. Similarly if you have an 2400x2400px image on a mobile device, you can simply scale the image down to 640x640 with little/no perceived loss. Obviously 3D files are far less trivial, but same concept. Any suggestions on how to go about this? preferably in node.js backend? – AhrenFullStop Oct 22 '21 at 17:10
  • 1
    @AhrenFullStop Google for "Mesh decimation library" and "Mesh simplification library". If you can call external commands and don't mind overkill, you could also run Blender as a Python library and use its "Decimate" modifier to re-export the mesh— This can automatically take care of preserving UV mappings and such. – Will Chen Oct 23 '21 at 22:10
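The Blender route from the comment above can be scripted headlessly. A rough sketch, assuming Blender 2.8+ with its bundled glTF importer/exporter, and hypothetical file names (`model.glb`, `model_small.glb`):

```python
# Run with: blender --background --python decimate.py
import bpy

# Start from an empty scene so only the imported model is exported
bpy.ops.wm.read_factory_settings(use_empty=True)
bpy.ops.import_scene.gltf(filepath="model.glb")

# Add a Decimate modifier to every mesh, keeping ~25% of the faces
for obj in bpy.data.objects:
    if obj.type == 'MESH':
        mod = obj.modifiers.new(name="Decimate", type='DECIMATE')
        mod.ratio = 0.25

# Export applies the modifiers; GLB keeps everything in one file
bpy.ops.export_scene.gltf(filepath="model_small.glb", export_format='GLB')
```

The ratio is the knob to tune per model: a preview asset can often tolerate 0.1 or lower, while a hero asset may need 0.5.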