Should a git repo contain all binary & static files needed to deploy?
I am one of two developers on our team. We have been using git internally for six months with great success. However, we still “deploy manually” via FTP to our production server; we didn’t want to risk any automated deployment until we were comfortable with the git workflow. We now have a development server set up and are looking into some kind of “one click” deployment. I’ve heard that some people do a “git pull” on the dev server from the code repo, and I’ve also heard that some people “rsync” the files across from a staging server.
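For reference, the two approaches mentioned can be sketched roughly as below. The hostnames and paths (devserver, /var/www/app) are placeholders, not real configuration, and the demo builds a throwaway repository so the export step actually runs anywhere:

```shell
#!/bin/sh
set -e
# Throwaway repo so the commands are runnable as-is; in real use you
# would run the export inside your own project repository.
work=$(mktemp -d)
cd "$work"
git init -q site && cd site
git config user.email you@example.com
git config user.name "Demo"
echo '<h1>hello</h1>' > index.html
git add index.html && git commit -qm "initial commit"

# Style 1 -- the server pulls for itself (run on the dev server):
#   cd /var/www/app && git pull origin master

# Style 2 -- export the committed tree (no .git directory) and rsync it:
mkdir -p "$work/stage"
git archive --format=tar HEAD | tar -x -C "$work/stage"
#   rsync -az --delete "$work/stage/" devserver:/var/www/app/
ls "$work/stage"
```

The `git archive` variant has the advantage that the web root never contains a .git directory for the web server to accidentally expose.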
Should our git repo contain all the large files that never change, such as the .cab files used by our Java photo-uploader applet? Our cached HTML files obviously don’t belong in the repo, but what about our PDF product guide and the like? Should these all be tracked by git? (I currently have them in .gitignore to save space on GitHub and on our machines.)
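If you do keep them out, an explicit, commented .gitignore at least documents that decision for the next developer. The paths below are illustrative, not your actual layout:

```
# Large, rarely-changing artifacts deliberately kept out of the repo;
# they are copied to the server by the deploy step instead.
*.cab
docs/*.pdf
# Generated files never belong in the repo.
cache/*.html
```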
What would be a suitable “one action” deploy strategy for our setup?
- Dev Laptop 1 (Office / Home)
- Dev Laptop 2 (Office / Home)
- Dev Server (Office)
- Production Server (Off site)
One solution collected from the web:
The usual strategy assumes that your git repository contains all the files needed to build the complete target product: source code, additional binary libraries, drivers, build scripts, and so on.
Whether the output is a .zip package or an installer for the whole application, the repository should be self-sufficient: any developer should be able to clone it and produce the full output.
Of course, the prerequisite is a complete development environment.
So you might either include the .cab files in the repository or build them from source.
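If disk space is the deciding factor, it may help to measure what the binaries actually cost before choosing. This is a generic sketch (the -H flag needs git 1.8.3 or later); it builds a throwaway repo so it runs anywhere, but in practice you would run the two commands at the bottom inside your real repository:

```shell
#!/bin/sh
set -e
# Throwaway repo with a stand-in "large binary" for demonstration.
work=$(mktemp -d) && cd "$work"
git init -q .
git config user.email you@example.com
git config user.name "Demo"
head -c 100000 /dev/zero > uploader.cab
git add uploader.cab && git commit -qm "add cab"

# Total size of the object store, in human-readable units.
git count-objects -vH

# The five largest blobs anywhere in history.
git rev-list --objects --all \
  | git cat-file --batch-check='%(objecttype) %(objectsize) %(rest)' \
  | awk '$1 == "blob"' \
  | sort -k2 -n | tail -5
```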
However, instead of simply cloning and pulling the repository on your production server, you might use a build server (e.g. Hudson) that pulls, builds, and syncs the complete target to production. That way even the PDFs can be built, e.g. from LaTeX sources.
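To get a true “one action” deploy to the dev server, one widely used pattern (a sketch, not part of the answer above) is a bare repository on the server with a post-receive hook that force-checks-out the pushed branch into the web root. The path /var/www/app and the branch name master are placeholders:

```shell
#!/bin/sh
# hooks/post-receive inside the bare repository on the dev server.
# After every push, check the pushed code out into the web root.
GIT_WORK_TREE=/var/www/app git checkout -f master
# From here you could hand off to production as well, e.g.:
#   rsync -az --delete /var/www/app/ production:/var/www/app/
```

Deploying then becomes a single `git push devserver master` from either laptop.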