Of course it will still be in the repository.
You can always update back to older revisions, and if you update back to the revision that was created when you committed the file, it'll be there in all its glory.
There are two ways to mitigate this (when you're committing, not now):
- One of the big-files extensions. These essentially add big files to a secondary repository and link the two, so that if you update to a revision where the file doesn't exist, and you don't already have it, it won't be pulled down. In other words, it's more of an "on-demand" style of pulling.
- If the file never changes, keep it available on the network and just create some kind of link to it instead of a full copy
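As a concrete sketch of the first approach, here is how the largefiles extension that ships with Mercurial is typically enabled and used (the file name is a hypothetical placeholder):

```shell
# Enable the largefiles extension in your per-user .hgrc (or Mercurial.ini)
cat >> ~/.hgrc <<'EOF'
[extensions]
largefiles =
EOF

# Track the big file as a "largefile": only a hash goes into history;
# the content lives in a separate store and is fetched on demand
hg add --large huge-dataset.bin
hg commit -m "Track huge-dataset.bin as a largefile"
```

Clones that never update to a revision referencing the file never download its content.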
Right now, you have four options:
- Strip away the changeset that added the file, and all the changesets that came after it. You can do that using the Mercurial Queues extension. Note that you need to do this stripping in all clones. If just one of your users pushes a repository that still has that file in its history back to the central clone, the changesets are back.
- Rebuild the repository from scratch manually
- Use the hg convert command with some filtering; its --filemap option can be used for this.
- Leave it as is. How big is the file? Will it really be much of a problem?
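The stripping and converting options look roughly like this (the revision number and file name are placeholders, so adjust them to your history):

```shell
# Option 1: strip the changeset that first added the file, plus all its
# descendants. hg strip is provided by the mq extension, so enable it first:
cat >> ~/.hgrc <<'EOF'
[extensions]
mq =
EOF
hg strip 42   # 42 = the revision that introduced the big file

# Option 3: rewrite history with hg convert, excluding the file everywhere.
# The filemap file uses one directive per line (include/exclude/rename).
cat > filemap.txt <<'EOF'
exclude huge-dataset.bin
EOF
hg convert --filemap filemap.txt old-repo cleaned-repo
```

Note that both routes rewrite or remove history, which is exactly why they invalidate existing clones, as described below.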
Note that rebuilding the repository, either manually or through hg convert, will invalidate all existing clones. Anyone trying to push to your new central clone from an old clone will get a message about unrelated repositories.
. If any of your users are stupi^H^H^H^H^Hnot smart enough to realize that forcing the push is a bad idea, then you will have problems with this approach.