NodeJS/Jenkins/GIT and Jenkins Slave as Web server

I want to do Continuous Integration for one of my web server applications.
We are using Postgres as the backend database. To achieve this, we are planning to use Node.js, Jenkins, and Git.

Once any developer checks in their changes, the build has to start immediately and do some basic testing.
This involves starting the Node.js server and executing a few test cases.

I have integrated Jenkins and Git so that whenever a change is pushed, the Jenkins build starts.

Can we run the Node.js web server on the same Jenkins machine, or should we use a Windows slave machine for this?

Please suggest best practices for this.

The assumption is that the server machine will have the Postgres database and npm installed.

Thanks.

2 Solutions collected from the web for “NodeJS/Jenkins/GIT and Jenkins Slave as Web server”

To have reproducible builds, I recommend the Docker integration for Jenkins. This allows multiple builds to be processed simultaneously, so if a build takes longer than expected, other developers can still push.

Since each container has its own network namespace, you can run as many containers as you want at the same time, each with its own Node server listening on the same port. As long as you don’t need to access the Node server from outside the container (for example, from the build server itself), there is no conflict.
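
A minimal sketch of what this looks like with plain docker run (the image name myapp-test and the container names are placeholders):

    # Each container has its own network namespace, so both can have Node
    # listening on port 3000 internally without conflicting.
    docker run -d --name build-41 myapp-test
    docker run -d --name build-42 myapp-test

    # Only publish a port if something on the build host itself must reach
    # the server, e.g. map container port 3000 to host port 8041:
    # docker run -d -p 8041:3000 --name build-43 myapp-test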

For the PostgreSQL database, you have two options. You can spawn a database server inside each container, which means you end up with a lot of short-lived DB servers; depending on the amount of test data you need to import, this might not be practical. The other way is to run the PostgreSQL server on the build server and allow the containers to access it. In each build, you create a new database in PostgreSQL, probably from a template, and destroy it afterwards.
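
As a sketch of the second approach, assuming a pre-loaded template database called ci_template and a CI role ci_user on the build server (both names are placeholders), and using Jenkins’ built-in BUILD_NUMBER variable:

    # Create a throwaway database for this build from the template ...
    createdb -h localhost -U ci_user -T ci_template "test_${BUILD_NUMBER}"

    # ... run the tests against test_${BUILD_NUMBER} here ...

    # ... and drop it again once the build has finished.
    dropdb -h localhost -U ci_user "test_${BUILD_NUMBER}"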

So the process is:

    [Developer pushes] -> [git notifies Jenkins] -> [Jenkins creates database] -> [Jenkins runs container] -> [Container builds and tests] -> [Container processes results] -> [Jenkins destroys database]
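
Glued together, a single “Execute shell” build step in Jenkins could drive this flow roughly as follows. This is only a sketch: the image name, the database names, the DATABASE_URL variable read by the app, and the docker0 bridge address are all assumptions to adapt to your setup.

    #!/bin/bash
    set -e
    DB="test_${BUILD_NUMBER}"

    # Jenkins creates the per-build database from the template.
    createdb -h localhost -U ci_user -T ci_template "$DB"

    # Make sure the database is destroyed even if the tests fail.
    trap 'dropdb -h localhost -U ci_user "$DB"' EXIT

    # Build the image from the checked-out workspace and run the tests in a
    # container, pointing the app at the per-build database on the host.
    # 172.17.0.1 is the default docker0 bridge address; PostgreSQL must be
    # configured to accept connections from the containers.
    docker build -t "myapp-test:${BUILD_NUMBER}" .
    docker run --rm \
        -e DATABASE_URL="postgres://ci_user@172.17.0.1:5432/${DB}" \
        "myapp-test:${BUILD_NUMBER}" \
        npm test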
    

For the result processing step, there are multiple ways to do it. The easiest way (which does not require any additional software) is to have Jenkins merge the branch automatically: each developer pushes a new branch for their work, and if the tests succeed, Jenkins merges the branch into master. You can forbid developers from writing to the master branch directly to enforce this.
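
A hedged sketch of such a post-test merge step, assuming the tested branch name is available in an environment variable (here called FEATURE_BRANCH, a placeholder) and that the Jenkins user may push to master while developers may not:

    # Bring the local clone up to date and merge the tested branch into master.
    git fetch origin
    git checkout master
    git reset --hard origin/master
    git merge --no-ff "origin/${FEATURE_BRANCH}" \
        -m "Merge ${FEATURE_BRANCH} after successful build ${BUILD_NUMBER}"
    git push origin master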

The other way is to integrate Jenkins with your git hosting solution (if you use one). GitLab EE supports this, which means that you can see the test results for each commit in GitLab. Of course, this is not an option if you don’t use anything like GitLab/Bitbucket/…
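
If you go that route, Jenkins can report the outcome back per commit through GitLab’s commit status API. The snippet below is a sketch against the v4 API; the host, project ID, and token are placeholders, and older GitLab versions use different API paths.

    # Mark the tested commit as successful (use state=failed on failure).
    # GIT_COMMIT is provided by the Jenkins Git plugin.
    curl --request POST \
         --header "PRIVATE-TOKEN: ${GITLAB_TOKEN}" \
         "https://gitlab.example.com/api/v4/projects/${PROJECT_ID}/statuses/${GIT_COMMIT}?state=success&name=jenkins"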

I suggest using the PM2 tool; it has a rich set of features.
http://pm2.keymetrics.io/docs/usage/deployment/#force-deployment

This tool can then be configured in Jenkins, and scripts can be written in package.json.
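
For reference, a minimal PM2 sketch; the entry point server.js, the process name, and the ecosystem.config.js file are assumptions, see the linked deployment docs for the full setup:

    npm install -g pm2

    # Keep the Node server running and restart it automatically if it crashes.
    pm2 start server.js --name webapp

    # Deploy using an ecosystem.config.js as described in the PM2 docs.
    pm2 deploy ecosystem.config.js production setup   # one-time remote setup
    pm2 deploy ecosystem.config.js production         # subsequent deployments

These commands can also be wrapped as scripts in package.json so that Jenkins only needs to call the corresponding npm run script.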
