Linux – run multiple parallel commands, print sequential output

I am a bit new to bash, and I need to run a short command several hundred times in parallel while printing the output sequentially. The command prints fairly short output to stdout that I do not want to lose, or to have garbled/mixed up with the output of another thread. Is there a way in Linux to run several commands in parallel (e.g. no more than N at a time) so that all command outputs are printed sequentially (in any order, as long as they don't overlap)?

Current bash script (full code here):

    declare -a UPDATE_ERRORS
    UPDATE_ERRORS=( )
    
    function pull {
        git pull  # Assumes current dir is set
        if [[ $? -ne 0 ]]; then
          UPDATE_ERRORS+=("error message")
        fi
    }
    
    for f in extensions/*; do
      if [[ -d $f ]]; then
        ########## This code should run in parallel, but output of each thread
        ########## should be cached and printed sequentially one after another
        ########## pull function also updates a global var that will be used later
        pushd $f > /dev/null
        pull
        popd > /dev/null
      fi
    done
    
    if [[ ${#UPDATE_ERRORS[@]} -ne 0 ]]; then
      # print errors again
    fi
    

3 Solutions for “Linux – run multiple parallel commands, print sequential output”

    You can use flock for this. I have emulated a similar situation to test it. The do_the_things procedure generates output that overlaps in time. In a for loop, the text generation is called several times simultaneously, so the output would normally get mixed up; instead it is fed to the locked_print procedure, which waits until the lock is free and then prints the received input to stdout. The exports are needed so the procedures can be called from inside a pipe.

    #!/bin/bash
    
    do_the_things() {
        # Start after a random delay, then emit ten numbered lines.
        rand="$((RANDOM % 10))"
        sleep "$rand"
        for i in $(seq 1 10); do sleep 1; echo "${rand}-$i"; done
    }
    
    locked_print() {
        echo Started
        # Wait for an exclusive lock on ./testlock, then copy stdin to
        # stdout in one uninterrupted block (the pipe buffers meanwhile).
        flock -e testlock cat
    }
    
    export -f do_the_things
    export -f locked_print
    
    for f in a b c d; do
        (do_the_things | locked_print) &
    done
    wait
    
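If you would rather avoid flock, another option is to buffer each job's output in its own temp file and print the files afterwards. This is only a sketch (it assumes bash 4.3+ for `wait -n`, and `job` is a stand-in for the real command), but it also caps the number of parallel jobs at N, as the question asks:

```shell
#!/bin/bash
# Buffer each job's output in a temp file, cap concurrency at N,
# then print the buffers one after another so lines never interleave.
N=4
tmpdir=$(mktemp -d)

job() {    # stand-in for the real short command
    echo "job $1 line 1"
    echo "job $1 line 2"
}

running=0
for t in 1 2 3 4 5 6 7 8; do
    job "$t" > "$tmpdir/$t.out" &
    if (( ++running >= N )); then
        wait -n            # block until one of the N jobs exits
        (( running-- ))
    fi
done
wait                       # wait for the stragglers

cat "$tmpdir"/*.out        # sequential, never overlapping
rm -r "$tmpdir"
```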

    Try something like this. I don’t have/use git, so I have used a dummy command to simulate it in my version.

    #!/bin/bash
    declare -a ERRORS
    ERRORS=( )
    
    function pull {
        cd "$1" || return
        echo "Starting pull in $1"
        for i in {0..9}; do echo "$1 Output line $i"; done
        sleep 5
        echo "GITERROR: Dummy error in directory $1"
    }
    
    export -f pull
    
    for f in extensions/*; do
      if [[ -d $f ]]; then
        ########## This code should run in parallel, but output of each thread
        ########## should be cached and printed sequentially one after another
        ########## pull function also updates a global var that will be used later
        echo "$f"
      fi
    done | parallel -k pull | tee errors.tmp
    
    IFS=$'\n' ERRORS=($(grep "^GITERROR:" errors.tmp))
    rm errors.tmp
    
    for i in "${ERRORS[@]}"; do
       echo "$i"
    done
    

    You will see that even if there are 4 directories to pull, the entire script takes only about 5 seconds, despite executing 4 lots of sleep 5, because the pulls run in parallel while parallel -k keeps their outputs separate and in order.
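As a side note, the error-harvesting step can also be written with mapfile (bash 4+) instead of the IFS trick, which copes better with embedded spaces. A minimal sketch, using a hypothetical log string in place of errors.tmp:

```shell
#!/bin/bash
# Collect every line starting with "GITERROR:" into an array (bash 4+).
log=$'Starting pull in a\nGITERROR: Dummy error in directory a\nStarting pull in b'

mapfile -t ERRORS < <(grep '^GITERROR:' <<< "$log")

printf '%s\n' "${ERRORS[@]}"
```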

    List the dirs by appending /. Parallel spawns a shell that cds into each dir. If git pull fails, a magic string is printed. All output is also kept as copies in out/1/*. When all pulls are done, check which files the magic string occurs in and print the STDOUT/STDERR of those commands. Then clean up.

    parallel --results out 'cd {} && (git pull || echo e_R_r_O_r)' ::: extensions/*/
    grep -l e_R_r_O_r out/*/stdout | parallel 'grep -v e_R_r_O_r {//}/stdout; cat {//}/stderr >&2'
    rm -r out
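The magic-string trick also works without GNU parallel. A rough sketch, where good and bad are hypothetical stand-ins for a passing and a failing command:

```shell
#!/bin/bash
# Buffer each command's output in a file and append a sentinel on failure;
# afterwards, re-print only the output of the commands that failed.
outdir=$(mktemp -d)

good() { echo "good output"; }
bad()  { echo "bad output"; return 1; }   # hypothetical failing command

for cmd in good bad; do
    ( "$cmd" || echo e_R_r_O_r ) > "$outdir/$cmd.stdout" &
done
wait

# Find the files containing the sentinel, print them minus the sentinel.
grep -l e_R_r_O_r "$outdir"/*.stdout |
    while read -r f; do grep -v e_R_r_O_r "$f"; done

rm -r "$outdir"
```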
    