
I have a repository that I finally got working correctly with a few other developers (we are migrating away from killing each other with FTP overwrites).

One of the developers uploaded changes to the server using FTP, and I downloaded those changes, overwriting my local folders. I expected Git to simply recognize the files that had changed, but that is not the case; in fact, Git thinks the files have not changed.

Is there a command that tells Git to "go physically rescan every file for changes and see if there are any"? At this point, I am unsure which files were changed by the FTP download, so when I do a git push/pull, it says there are no changes, even though I know there are.

Joe Smack

3 Answers


Look at this thread: Git Push into Production (FTP)

It covers a rather superb tool named git-ftp for pushing your production-ready changes to your server via FTP.

https://github.com/resmo/git-ftp

I use it to push to FTP whenever I think the dev changes I have made are ready for my server.
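
For instance, a typical git-ftp workflow looks roughly like this (the URL and credentials are placeholder values):

    # One-time setup: tell git-ftp where and how to upload
    git config git-ftp.url "ftp://ftp.example.com/public_html"
    git config git-ftp.user "ftpuser"
    git config git-ftp.password "secret"

    # First run: upload everything and record which commit was uploaded
    git ftp init

    # Later runs: upload only the files changed since the last uploaded commit
    git ftp push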

Sheriff Md

Git tracks the state of the repository using a number of data structures stored in a directory named .git at the root of your tree, so if you overwrote that directory with your colleague's .git directory from the FTP server, then as far as Git can tell you have the same repository state he had when he uploaded it, and of course it reports no changes.

This is not a great workflow for Git, but if you must combine the FTP uploads and downloads with local versioning in Git, you can set the GIT_DIR environment variable to refer to a directory outside of your source tree (one that won't get overwritten when you download a new FTP snapshot), where Git can keep the repository and history metadata.
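
As a sketch, with ~/repos/project.git as a hypothetical metadata location and /var/www/project as the downloaded tree, you would also set GIT_WORK_TREE so Git knows where the files live:

    # Move the metadata out of the source tree once (paths are examples)
    mv /var/www/project/.git ~/repos/project.git

    # Point Git at the relocated metadata, per command via the environment...
    GIT_DIR=~/repos/project.git GIT_WORK_TREE=/var/www/project git status

    # ...or with the equivalent command-line flags
    git --git-dir="$HOME/repos/project.git" --work-tree=/var/www/project status

An FTP download can then clobber /var/www/project freely; git status will simply report the changed files, since the history lives elsewhere.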

Matt Enright

You should seriously consider setting up a proper Git server, as transferring changes via FTP defeats the purpose of using a version control system such as Git. Assuming everyone has SSH access to the server, it's fairly simple to set up: you just need to create a bare repository in a directory on the server that everyone has read and write access to. It's also possible to set it up without giving everyone shell access.
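
As a rough sketch (the server path and hostname are placeholders):

    # On the server: create a shared bare repository
    git init --bare --shared /srv/git/project.git

    # On each developer's machine: point at it over SSH and push
    git remote add origin user@server.example.com:/srv/git/project.git
    git push origin master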

You can read more about setting up a git server in the free Pro Git book, which details several different techniques. You can also use a hosted git service if you'd rather go that route.

That said, Matt's answer is correct and likely describes what happened: you inadvertently overwrote the directory Git uses to track information about the local repository.

If what's on GitHub is still good, you can do git fetch and then git reset origin/master to bring your local repository back to the state of the remote repo on GitHub while keeping all the local differences in your working tree. This will discard any commits made locally that weren't pushed to GitHub, but will not lose any data in your working files. If you already pushed to GitHub after the accidental overwrite, then this probably won't help (though it's likely you couldn't have pushed without an explicit --force). You can alternatively reset to a specific commit by giving that commit's hash instead of origin/master.
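
A minimal sketch of that recovery, assuming the remote is named origin and the branch is master:

    # Fetch the remote state without touching your files
    git fetch origin

    # Mixed reset (the default): move HEAD and the index to the remote tip
    # while leaving every file in the working tree exactly as it is
    git reset origin/master

    # git status now shows the real differences between your files
    # and what's on GitHub, ready to be reviewed and committed
    git status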

Andrew Marshall
  • we are using GitHub and the FTP thing was just an oversight. There was a single developer who didn't get the message that we moved to Git, and he overwrote some of the files via FTP (not sure which ones were changed), so not all the repositories match, and the files on the file server are different from what's in our repositories. Do we all need to delete our repositories, download what's in FTP, and restart the repository? – Joe Smack Feb 16 '11 at 15:17
  • I updated my answer based on your new information. Let me know if that helps. – Andrew Marshall Feb 16 '11 at 17:35