Workflow Git FTP for old projects with potential external non-git users
I have a few old legacy projects that I work on occasionally, which I would like to integrate into a Git workflow – mainly just for me, but it would be nice to have the option to collaborate on some level later.
The first problem is that the only way to access the project’s code is via (S)FTP.
There is no shell-access given by the provider.
The second problem, or constraint, is that third parties might change the files on the live FTP server (I know, ugly) without using Git at all. If I used some post-hook pushing method, this would break. Unfortunately, teaching those external users a Git workflow is not an option – ignorance is bliss.
What I would really like to do is somehow keep track of those external changes (problem 2) and benefit from Git in my development workflow.
Imagine I have not touched a project for a while; when I come back, I’d like to see what has been done in the meantime.
But I’m really confused on how and where I should setup repositories.
2 Answers
A super-nice tool for doing Git deployment over FTP is git-ftp. Unfortunately, it’s one-way, i.e. it assumes the destination files are only ever updated by you (and via git-ftp).
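For reference, a typical git-ftp session looks roughly like this (host, path and credentials are placeholders; git-ftp records the last-pushed commit on the server so that subsequent pushes transfer only the files changed since then):

```shell
# From the root of your local repository (host/path/user are placeholders):
git config git-ftp.url "ftp://ftp.example.com/htdocs"
git config git-ftp.user "ftpuser"

git ftp init    # first run: uploads everything and records the pushed commit
git ftp push    # later runs: uploads only files changed since the last push
```

Note that this one-way model is exactly what breaks down here: git-ftp has no idea about files the external users edit directly on the server.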
So in your case you’d have to create some sort of synthetic history of the remote changes when they occur, and the only way to do that is to fetch the files and then create “the delta” between them yourself. For this, I’m with @VonC – you’d have to transfer the files back (rsync does this efficiently; if it’s not available, look for a utility that can crawl the remote hierarchy over the FTP protocol itself and fetch only the missing or more recent files) and then make the changes known to Git.
This can be done either on a separate branch or on a temporary throw-away branch. Basically, you fork a branch off the point that you think represented the state of the remote files the last time you looked at them, and then run a synchronization program over the checkout. (Note that such a program has to track removals, i.e. it has to delete files locally that were removed on the remote side.) Then run git add --all . to make Git stage all updated files and all removals, and also add all presently untracked files. Then record a commit. This commit represents the changes made on the remote, and you can base your further work on this state.
If you don’t have shell access, that means you need to set up your repo locally.
Its content would be what you get from the server through FTP.
The question “How to use rsync over FTP” describes some ways to mirror a remote set of files over SFTP.
Once you have that content, you can create two branches:
- a mirror branch, which you update with a mirror image of the server (before pushing anything back);
- a dev branch, which you merge the mirror branch into and where you add new features. From there, you can push new files back to the server through FTP.
Having a branch mirroring what is on the server is a good way to quickly compare what you left on dev with what is currently on the server.
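The two-branch layout above could be used roughly like this (the branch names mirror and dev are just the suggestions from this answer, and the actual file sync from the server is elided):

```shell
# Record a fresh snapshot of the live server on the mirror branch.
git checkout mirror
# ... sync the files down from the server here (rsync, lftp, etc.) ...
git add --all .
git commit -m "Snapshot of live server"

# Then, from dev, inspect and integrate the external edits:
git checkout dev
git diff dev mirror        # what changed on the server since your last deploy?
git merge mirror           # fold those external edits into your own line of work
```

The diff step is what answers the original question: after a long absence, it shows in one place everything the external users have done on the live site.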