Building multiple jobs in Jenkins: performance

In Jenkins I have 100 Java projects, each with its own build file.
On every build I want to clear the previous build output and compile all source files again.
Using the Bulk Builder plugin I tried building all the jobs at once, so 100 jobs run in parallel.
But performance is very bad: a job that takes 1 minute on its own takes 20 minutes in the batch, and the larger the batch, the longer it takes. I am running this on a powerful server, so memory and CPU should not be a problem.

Please suggest how to overcome this and what configuration needs to be done in Jenkins.
I am launching Jenkins from the war file.

Thanks.

  One Solution for “building multiple jobs in jenkins performance”

    Even though you say you have enough memory and CPU resources, you seem to imply there is some kind of bottleneck when you increase the number of parallel jobs. I think this is understandable: even though I am not a Java developer, I believe most Java build tools are able to parallelize the build internally, i.e. building a single job may well consume more than one CPU core and quite a lot of memory.

    Because of this, I suggest you monitor your build server and experiment with different batch sizes to find the optimal number. Run e.g. “vmstat 5” while builds are running and see whether any idle CPU is left. Also keep an eye on disk I/O: if you increase the batch size but disk throughput does not increase, you are already consuming all of the I/O capacity, and increasing the batch size further will probably not help much.
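    The kind of idle-CPU figure “vmstat 5” reports can also be sketched directly from /proc/stat. The snippet below is a minimal sketch, assuming a Linux build server and a Bourne-style shell; run it while a batch of builds is executing:

```shell
# Two samples of the first line of /proc/stat, one second apart
# (fields: cpu user nice system idle ...). The delta of the idle
# counter over the delta of the summed counters approximates the
# "id" column of vmstat. Note the sum ignores iowait/irq, so this
# is only a rough figure.
read _ u1 n1 s1 i1 _ < /proc/stat
sleep 1
read _ u2 n2 s2 i2 _ < /proc/stat
total=$(( (u2 + n2 + s2 + i2) - (u1 + n1 + s1 + i1) ))
[ "$total" -gt 0 ] || total=1     # guard against division by zero
idle=$(( 100 * (i2 - i1) / total ))
echo "idle CPU over the last second: ${idle}%"
```

    If this figure stays high while builds queue up, the bottleneck is more likely disk I/O than CPU.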

    When you have found the optimal batch size (i.e. how many executors to configure on the build server), you can tweak a few other things to make builds faster:

    • Try to spend as little time checking out code as possible. Instead of deleting the workspace before the build starts, configure the SCM plugin to remove only files that are not under version control. If you use git, you can use a local reference repository, do a shallow clone, or similar.

    • You can also try to speed things up by using SSD disks.

    • You can add more servers, run Jenkins slaves on them, and utilize the CPU and I/O capacity of multiple servers instead of just one.
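    The shallow-clone-plus-reference-repository trick mentioned above can be sketched end to end. Everything below runs against a throwaway local repository; all paths and the committer identity are made up for the demo:

```shell
set -e
tmp=$(mktemp -d)

# Stand-in for the real upstream repository (two commits of history).
git init -q "$tmp/upstream"
git -C "$tmp/upstream" -c user.name=ci -c user.email=ci@example.com \
    commit -q --allow-empty -m "first"
git -C "$tmp/upstream" -c user.name=ci -c user.email=ci@example.com \
    commit -q --allow-empty -m "second"

# One-time setup on the build server: keep a bare mirror as a reference.
git clone -q --mirror "$tmp/upstream" "$tmp/reference.git"

# Per-build checkout: shallow (history depth 1), borrowing objects from
# the local reference so little data has to be fetched from upstream.
git clone -q --depth 1 --reference "$tmp/reference.git" \
    "file://$tmp/upstream" "$tmp/workspace"

git -C "$tmp/workspace" rev-list --count HEAD   # -> 1 (shallow history)
```

    In a real job, the mirror would be refreshed periodically (e.g. by a separate job running “git remote update”) and every build of that project would point its Git plugin reference-repository setting at it.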
