Workflow Git FTP for old projects with potential external non-git users

I have a few old legacy projects that I work on occasionally, which I would like to integrate into a Git workflow – mainly just for me, but it would be nice to have the option to collaborate on some level later.

  1. The first problem is that the only way to access the project’s code is via (S)FTP.
    The provider does not give any shell access.

  2. The second problem, or constraint, is that third parties might change the files on the live FTP server (I know, ugly) without using Git at all. If I used some post-hook pushing method, this would break it. Unfortunately, teaching those external users a Git workflow is not an option – ignorance is bliss.

What I would really like to do is somehow keep track of those external changes (2) and benefit from them in my development workflow by using Git.

Imagine I have not touched a project for a while; when I come back, I’d like to see what has been done.

But I’m really confused on how and where I should setup repositories.

2 Solutions collected from the web for “Workflow Git FTP for old projects with potential external non-git users”

    A super-nice tool for doing Git deployment over FTP is git-ftp. Unfortunately, it’s one-way, i.e. it assumes the destination files are only ever updated by you (and via git-ftp).
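A minimal git-ftp setup might look like the sketch below. The host, path, and credentials are placeholders, and this assumes git-ftp’s usual `git ftp init` / `git ftp push` subcommands and `git-ftp.*` config keys:

```shell
# One-way deployment with git-ftp (host/credentials are placeholders).
# Store the connection settings once in the repo's git config:
git config git-ftp.url "ftp://ftp.example.com/htdocs"
git config git-ftp.user "ftpuser"
git config git-ftp.password "secret"   # consider a .netrc file instead

# First run: upload everything and record the deployed commit on the server
git ftp init

# Subsequent runs: upload only files changed since the last push
git ftp push
```

Note that storing a plaintext password in the repo config is only acceptable for a private, local-only repo.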

    So in your case you’d have to create some sort of synthetic history of the remote changes as they occur, and the only way to do that is to fetch the files and then manually create “the delta” between them. So, for this, I’m with @VonC: you’d have to transfer the files back (rsync does this efficiently; if it’s not available, google for a utility that can crawl the remote hierarchy using the FTP protocol itself and fetch only the missing/more recent files) and then make the changes known to Git.
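If rsync is not available over the FTP-only connection, lftp’s mirror mode is one such utility. The host, credentials, and paths below are placeholders:

```shell
# Pull the remote tree down into the local working copy over plain FTP.
# lftp's mirror mode fetches only new/changed files; --delete also
# removes local files that disappeared on the server, which matters
# for building an accurate delta later.
lftp -u ftpuser,secret ftp.example.com \
     -e "mirror --delete --only-newer /htdocs/ ./ ; quit"
```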

    This process may be done either on a separate branch or on a temporary throw-away branch. Basically, you fork a branch off the point you think represented the state of the remote files the last time you looked at them, and then run a synchronization program over the checkout. (Note that such a program has to track removals, that is, it has to delete files locally that were removed on the remote side.) Then run git add --all . to make Git stage all updated files and all removals, and also add all presently untracked files. Then record a commit. This commit will actually represent the changes made on the remote. You can then base your further work on this state.
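The steps above can be sketched as follows; `last-known-server-state` is a placeholder for whichever tag or commit last matched the server:

```shell
# Fork a throw-away branch from the commit that last matched the server
# ('last-known-server-state' is a placeholder tag):
git checkout -b remote-sync last-known-server-state

# ... run the FTP/rsync mirror step here so the working tree matches
# the server, including deletions ...

# Stage everything: modifications, deletions, and untracked files
git add --all .

# This commit now represents the changes made on the remote
git commit -m "Sync: import external changes from live FTP server"
```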

    If you don’t have shell access, that means you need to set up your repo locally.

    Its content would be what you get from the server through FTP.
    The question “How to use rsync over FTP” lists some ways to mirror the content of a remote set of files over SFTP.

    Once you have that content, you can do:

    • one mirror branch, that you would update with a mirror image of the server (before pushing anything back)
    • one dev branch, that you would merge the mirror branch into, and where you would add new features. You could push new files from there through ftp back to the server.

    Having a branch mirroring what is on the server is a good way to quickly compare what you left on dev with what is currently on the server.
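A minimal local sketch of this two-branch layout (the branch names `mirror` and `dev` are just suggestions):

```shell
# 'mirror' tracks the server verbatim; 'dev' is where new work happens.
git checkout -b mirror            # refresh this one from FTP snapshots
git checkout -b dev mirror        # fork development off the mirrored state

# Later, after refreshing 'mirror' with a new snapshot from the server:
git checkout dev
git merge mirror                  # bring external changes into your work
git diff mirror dev               # compare your work against the server state
```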
