10

I have a central repository with a subset of files that I want to protect from being changed (via push) by other users. If I add these files to .gitignore, they would not be cloned.

Is it possible to let users clone all files, but add some of them to .gitignore on the client side after cloning?

DrCord
andrexus
  • Is .gitignore committed to the repository? Why do you want to protect those files? Does .gitignore add any kind of per-user access, or would your change mean that *nobody* could change those files? What are you trying to accomplish? – Lasse V. Karlsen Jul 29 '10 at 06:50
  • Would it be an option to simply reject pushes which modify those files? You could also offer a client-side hook to reject bad commits. – Cascabel Jul 29 '10 at 12:25

4 Answers

1

You can keep the files in the repository and commit them, then add them to .gitignore and remove them from tracking in the next commit.

You can still fetch the files from the commit directly prior to that removal (perhaps tag it so that state can be fetched by name more easily). This preserves the state of the files while making them less likely to be edited in the repository by accident.

To give users access to those files after cloning, just write a rake task that fetches them for the user of your repository.
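A minimal sketch of that workflow (the file name, tag name, and commit messages below are placeholders):

# maintainer: commit the protected files once, then tag that state
git add config/protected.yml
git commit -m "add protected files"
git tag protected-files

# then ignore them and stop tracking them in the next commit
echo "config/protected.yml" >> .gitignore
git rm --cached config/protected.yml
git commit -m "stop tracking protected files"

# users restore the protected state from the tag after cloning
# (this is the step a rake task could wrap)
git checkout protected-files -- config/protected.yml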

vgoff
1

It sounds to me like the problem lies elsewhere if you need this level of access restriction.

Nonetheless, if you do indeed want to implement this, consider Gitolite. It allows you to define rather detailed access rules and should probably suffice for your needs.

Gitolite documentation: http://gitolite.com/gitolite/master-toc.html

You can define 'virtual refs' to control access at the file level. More on that: http://gitolite.com/gitolite/vref.html
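For illustration, a gitolite.conf fragment along these lines (the repo, group, and path names are hypothetical) would reject any push from @devs that touches the protected path while leaving the rest of the repository writable:

repo protected-repo
    RW+                             =   @admins
    RW                              =   @devs
    -   VREF/NAME/config/protected  =   @devs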

jsageryd
1

I initially thought about a filter driver (see the Pro Git book), which would:

  • on the smudge step, save your files' content
  • on the clean step, restore your files' content

(figure from the Pro Git book illustrating the smudge and clean filter drivers)

But that is not a good solution, since those scripts are meant for stateless file-content transformations (see this SO answer).

You can try enforcing a save/restore mechanism in hooks (see the same SO answer), but note that it will be local to your repo (it will protect your files in your repo only; hooks aren't pushed).
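As an illustration of the rejection variant Cascabel's comment mentions (rather than save/restore), a client-side pre-commit hook could look like this; the path is a placeholder:

#!/bin/sh
# .git/hooks/pre-commit -- reject commits that stage the protected file
if git diff --cached --name-only | grep -qx "config/protected.yml"; then
    echo "config/protected.yml is protected; unstage it before committing" >&2
    exit 1
fi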

You can also use:

git update-index --assume-unchanged file

See "With git, temporary exclude a changed tracked file from commit in command line"; again, this is local protection only. It will protect the files against an external push to your repo ("import"), but if you publish them ("export"), they can be modified on the client side.
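For example (placeholder path; note the flag has an inverse for when you do want to commit a change):

# stop noticing local modifications to the file (local to this clone)
git update-index --assume-unchanged config/protected.yml
# undo it later if needed
git update-index --no-assume-unchanged config/protected.yml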

ax.
VonC
  • Thanks. I know about git update-index --assume-unchanged, but it is applied only on the client side. I know also about hooks, but they are not cloned from the central repo. – andrexus Jul 29 '10 at 08:54
  • @andrexus: I know: filter driver is the only pushable solution, but it will require bending the smudge script to force it to know about the path of the relevant files (to save/restore), whereas it should only know about the content of the files... – VonC Jul 29 '10 at 09:03
0

Is there a specific reason why Git must itself be the answer to this?

How about making the files read-only, and dictating as policy that these files shouldn't be pushed?

Sometimes a technological solution is not the simplest way.

If somebody does push changes to these files, these changes can always be reverted.
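A sketch of both steps (the path and commit id are placeholders):

# make the files read-only in working copies, as a matter of policy
chmod a-w config/protected.yml
# if an unwanted change is pushed anyway, undo it with a new commit
git revert <commit-id-of-unwanted-change>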

mfontani