From development to deployment with Git

I am very puzzled. I have read many sources about ‘development & deploy’ with Git and haven’t understood any of it.

Now I have:

  1. Local (Development) repository with project (git init)
  2. Gitlab/Github Server where I store/commit/push my project.
  3. Production server

The local repository and the GitLab/GitHub server are clear to me: I make changes and push them to the server. Next, I need to deploy to the production server, and here is my question: how do I do it correctly?

Many sources write that I need two repositories on the production server:

  1. A bare repository (git init --bare)
  2. A non-bare repository in the /var/www folder


Then I push changes from my local development repository to the bare repository, and pull the project into /var/www from that bare repository.
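For reference, the two-repository scheme described above can be demonstrated end to end. This is a minimal sketch using temporary directories in place of real servers; all paths, the branch name `deploy`, and the file `index.php` are made up for the demo.

```shell
# Demo of the bare + working-copy flow, entirely on one machine.
set -e
DEPLOY=$(mktemp -d)

# 1. The bare repository on the "production server" (no working tree):
git init --bare "$DEPLOY/site.git"

# 2. The local development repository that pushes to it:
git init "$DEPLOY/dev"
cd "$DEPLOY/dev"
git config user.email dev@example.com   # identity needed to commit
git config user.name "Dev"
echo "hello" > index.php
git add index.php
git commit -m "first commit"
git push "$DEPLOY/site.git" HEAD:refs/heads/deploy

# 3. The web root (/var/www in the question) pulls from the bare repo:
git clone -b deploy "$DEPLOY/site.git" "$DEPLOY/www"
cat "$DEPLOY/www/index.php"   # the deployed file
```

On subsequent deploys you would run `git pull` inside the web-root clone instead of cloning again.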

But if I have a GitHub/GitLab server, then I don’t see the point of this bare repository, because I already have the code on that server. Am I right?

I am using Laravel for my projects, and after some commits a composer install is required. If I pull such a commit on the production server, it may break the site, which will be down until I finish all the required steps. Of course, all the steps can be done with an automatic script, but I think that would still break the site for a short time.
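For what it’s worth, such a script usually wraps the risky steps in Laravel’s maintenance mode so visitors see a maintenance page rather than a half-updated site. Below is a hedged sketch of that sequence; the branch name and composer flags are assumptions, and each command is routed through a stub (`run`) so the sequence can be dry-run anywhere without a real server.

```shell
# Sketch of a scripted Laravel deploy that hides the update behind a
# maintenance page. Replace the body of run() with "$@" on a real server.
run() { echo "+ $*"; }

run php artisan down                                  # show maintenance page
run git pull origin main                              # fetch the new code
run composer install --no-dev --optimize-autoloader   # update dependencies
run php artisan migrate --force                       # apply DB migrations
run php artisan config:cache                          # rebuild cached config
run php artisan up                                    # site back online
```

The site is still “down” between `artisan down` and `artisan up`, but it is down in a controlled way; the symlink strategy in the second answer below is one way to remove even that window.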

I want to understand the right scheme for deploying applications to a production server.

2 solutions for “From development to deployment with Git”

    The best approach, I think (and this is what we follow on a large project we are working on), is to set up a separate git clone in a separate environment (call it the “Test” or “Stage” environment), cloned from the git project hosted on GitLab.

    After cloning to stage and doing the necessary preparatory steps, you can pull the changes committed to the GitLab/GitHub server. This way you can test the website on stage after each pull. If everything is OK and nothing is broken, you can pull the same content from the production environment too, following the same logic as described above. The idea is that the stage environment has to have the same PHP version, server settings, etc. as production.

    Of course, you can use automation tools like Vagrant, Docker or Jenkins (just to name a few; there are many more), which free you from the headaches caused by differing system configurations by creating images with the exact system configuration, including the project, and deploying the project to separate virtual machines.

    [The original answer included a diagram of this deployment process here; the image is no longer available.]

    More extensive tools like Capistrano can automate a lot of this for you, but they also have a learning curve.

    It’s hard to know the best solution without more details, but one strategy I’ve seen for atomic deploys is using symlinks. I’ll give an example.

    Clone your repository twice on your server, and set up a symlink from /var/www pointing to one of the clones. When you deploy, go into the other folder, git pull, run any needed install commands, and then update the symlink to point to the updated folder. I believe this is similar to how Capistrano works under the hood.
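    The symlink swap above can be demonstrated in a few lines. This is a sketch with made-up paths (a temp directory stands in for /var/www and the two clones); the swap with `ln -sfn` replaces the old link in one step, so visitors never see a half-updated tree.

```shell
# Demo of the symlink deploy strategy, using a temp dir as the "server".
set -e
ROOT=$(mktemp -d)
mkdir "$ROOT/release-a" "$ROOT/release-b"   # the two clones
echo "version A" > "$ROOT/release-a/index.php"
echo "version B" > "$ROOT/release-b/index.php"

# /var/www equivalent: a symlink pointing at the live release.
ln -s "$ROOT/release-a" "$ROOT/current"
cat "$ROOT/current/index.php"   # serving version A

# Deploy: fully prepare release-b (git pull, composer install, ...),
# then swap the symlink in one step (-n replaces the link itself
# instead of descending into the directory it points to).
ln -sfn "$ROOT/release-b" "$ROOT/current"
cat "$ROOT/current/index.php"   # now serving version B
```

    Rolling back is the same operation in reverse: point the symlink at the previous release.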

    Just my two cents!
