TL;DR: Yes, it is safe to perform git repository optimization, but do make backups and test them.
I guess that by "compression" you mean `git gc`.
The operation is as safe as it can be given the environment (machine stability, RAM and storage reliability).
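For reference, a typical invocation is just the following; `git count-objects -vH` is a handy way to see how much the repository actually shrank (this is only a sketch, not a prescribed procedure):

```bash
# Show loose/packed object counts and on-disk size before optimizing
git count-objects -vH

# Run the standard garbage collection / repacking pass
git gc

# (Optional, much slower) more aggressive repacking
# git gc --aggressive

# Compare the numbers afterwards
git count-objects -vH
```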
Nevertheless, there is one practical weak point on any machine: available storage space.
Be aware that `git gc` can sometimes (paradoxically) temporarily increase the size of the repository, because objects that are candidates for removal get unpacked before they are actually removed. If the machine is low on storage space, this can prevent the operation from succeeding, or hinder subsequent work. Also, `git gc` can require a huge amount of memory (e.g. larger than the on-disk repository size) and fails if the system can't cope.
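A minimal pre-flight check along these lines can help you avoid running out of space mid-operation (the "comfortably exceeds" threshold is my own rule of thumb, not a documented requirement):

```bash
# How big is the repository's object store?
du -sh .git

# How much free space is left on the filesystem holding it?
df -h .

# Only proceed if free space comfortably exceeds the .git size,
# since git gc may need roughly that much (or more) temporarily
# while it repacks.
git gc
```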
That said, I have never seen repository corruption that seemed to be caused by `git gc`.
If your backup is a clone repository, be careful: some items (branches, lightweight tags, regular tags, configuration, hooks, etc.) are not automatically transferred between repositories, and some are transferred only partially or only in certain cases, with complicated rules.
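If you do back up by cloning, a `--mirror` clone or a bundle captures every ref (branches, both kinds of tags, notes), although per-repository configuration and hooks are still not included and must be saved separately. A sketch, with illustrative paths:

```bash
# Mirror clone: copies every ref (branches, tags, notes), but NOT
# .git/config or .git/hooks -- copy those separately if you need them.
git clone --mirror /path/to/repo /backups/repo.git

# Alternative: a single-file bundle containing all refs
cd /path/to/repo
git bundle create /backups/repo.bundle --all
```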
Since you're worried about data safety, the best bet (and this is general advice, not specific to git) is to set up a regular backup and crash-recovery process. Then, from time to time, give yourself an isolated test recovery environment (it can be as simple as a folder on another computer, or a virtual machine, depending on the context). In that environment, run your recovery procedure from start to finish and check that your precious data and processes are fully functional again, restored from the backup alone, without touching your main storage. That way, you know that if the main storage crashes you're still safe.
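Such a recovery drill can be as simple as restoring into a scratch directory and letting git verify the result (again, paths are purely illustrative and assume the mirror clone above):

```bash
# Restore from the backup into a throwaway location
git clone /backups/repo.git /tmp/recovery-test

# Verify object integrity and connectivity of the restored repository
cd /tmp/recovery-test
git fsck --full

# Spot-check that the history and branches you expect are there
git log --oneline -5
git branch -a
```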