I have a working, live website that is not managed with Git, and I'm certain it has the latest, correct version of every file. I also have a Bitbucket Git repo that's ostensibly for that site, but it has major problems and a lot of out-of-date content in it.
What I want to do is git init the live files, then somehow push them to the remote so that the final remote HEAD state exactly matches the state of my live files, while the remote commit history remains intact.
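In rough pseudo-commands (the path and URL below are placeholders), I imagine it going something like this, though I have no idea whether the reset step actually leaves the files on disk untouched:

    cd /var/www/mysite                                     # placeholder path to the live document root
    git init
    git remote add origin git@bitbucket.org:me/site.git    # placeholder URL
    git fetch origin
    git reset origin/master        # ??? supposedly a mixed reset doesn't touch the working tree
    git add -A
    git commit -m "Bring the repo in line with the live site"
    git push origin master

(That assumes the remote's default branch is master, which it is in my case.)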
The key constraint is that the live working tree must never end up in a different state, even briefly. I've thought of committing the live files on a new branch, pulling the remote master, and then force-merging the two (roughly the sketch below), but as far as I can tell I would have to check out the master branch, which would clobber the live files at least for the moments before and during the merge, even assuming everything went correctly.
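Something like this is what I was picturing (placeholder URL again), and I'm not at all sure the -X theirs merge even produces an exact copy of the live files:

    git init
    git checkout -b live-snapshot
    git add -A
    git commit -m "Snapshot of the live files"
    git remote add origin git@bitbucket.org:me/site.git    # placeholder URL
    git fetch origin
    git checkout -b master origin/master    # <-- this is the step that overwrites the live files
    git merge -X theirs live-snapshot       # and I doubt this ends up identical to the snapshot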
I also thought of just FTPing the files down, working out all the conflicts/merging on my local machine, pushing from there to Bitbucket, and then pulling everything back down onto the live server. But that feels unreliable and risky too.
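For completeness, the FTP route I'd pictured goes roughly like this (names are placeholders again):

    # on my local machine
    git clone git@bitbucket.org:me/site.git site && cd site
    # ...overwrite this checkout with the files downloaded over FTP...
    git add -A
    git commit -m "Replace repo contents with the current live files"
    git push origin master
    # ...and then somehow get a clone of this onto the live server
    # without disturbing the running site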
I know that I should never manipulate live code like this, but this is the situation I've been dropped into, and I just want the easiest/safest way to make it work.
(I'm sure this is a duplicate question, but my Google-fu is failing me; please be gentle when directing me to previous answers.)