Managing library dependencies using git

I have a project which is built for multiple OSes (Linux and Windows for now, maybe OS X later) and processors. The project has a handful of library dependencies, mostly external but also a couple of internal ones, which I have in source form and compile (or cross-compile) for each OS-processor combination relevant in my context.

Most of the external libraries do not change very often, only in case of a local bugfix or when a feature/bugfix implemented in a newer version would benefit the project. The internal libraries change quite often (roughly one-month cycles) and are provided by another team in my company in binary form, although I also have access to the source code; if I need a bug fixed I can do that myself and generate new binaries for my own use until the next release cycle. The setup I have right now is the following (filesystem only):

    -- dependencies
       |-- library_A_v1.0
       |   |-- include
       |   `-- lib
       |-- library_A_v1.2
       |   |-- include
       |   `-- lib
       |-- library_B
       |   |-- include
       |   `-- lib
       `-- ...

The libraries are kept on a server, and every time I make an update I have to copy any new binaries and header files to the server. Synchronization on the client side is done using a file synchronization utility. Of course, any updates to the libraries need to be announced to the other developers, and everyone has to remember to synchronize their “dependencies” folder.

Needless to say, I don’t like this scheme very much, so I was thinking of putting my libraries under version control (git): build them, pack them into a tgz/zip and push them to the repo. Each library would have its own git repository so that I could easily tag/branch the versions already in use and test-drive new versions; a “stream” of data for each library that I could easily get, combine and update. I would like to have the following:
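For concreteness, the packaging step of such a scheme could look like the sketch below. Everything here is illustrative: `library_A`, the fake artifacts and the version number are made up, and a local bare repository stands in for the real server.

```shell
# Sketch: one git repo per library, with the built headers/binaries
# committed and each release tagged. A local bare repo plays the server.
set -e
work=$(mktemp -d); cd "$work"

git init -q --bare library_A.git            # "server-side" repository

git clone -q "$work/library_A.git" library_A
cd library_A
mkdir -p include lib
echo '/* public header */' > include/library_A.h
echo 'fake binary'         > lib/liblibrary_A.a   # stand-in for a real build
git add include lib
git -c user.name=ci -c user.email=ci@example.com \
    commit -qm "library_A 1.2 (linux-x86_64)"
git tag v1.2                                # the version the project pins
git push -q origin HEAD v1.2                # publish branch and tag
```

A new release of the library is then just another commit plus a new tag, and consumers pick releases by tag name.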

• get rid of this plain-filesystem way of keeping the libraries; right now completely separate folders are kept and managed for each OS and each version, and sometimes they get out of sync, resulting in a mess

• more control over them, to be able to have a clear history of which versions of the libs we used for which version of our project, much like what we get from git (or any VCS) with our source code

• be able to tag/branch the versions of the dependencies I’m using (for each and every one of them); e.g. I have my v2.0.0 tag/branch for library_A from which I normally take it for my project, but I would like to test-drive version 2.1.0, so I just build it, push it to the server on a different branch and call my build script with this particular dependency pointing to the new branch

• have simpler build scripts – just pull the sources from the server, pull the dependencies and build; that would also allow using different versions of the same library for different processor-OS combinations (more often than not we need that)
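The dependency-fetch step of such a build script could be sketched as below. The repo location, library name, tag and the `LIBRARY_A_REF` override variable are all hypothetical, and a local fixture repo again stands in for the server.

```shell
# Fixture: a "server-side" library repo with one tagged release.
set -e
work=$(mktemp -d); cd "$work"
git init -q --bare library_A.git
git clone -q "$work/library_A.git" seed
cd seed
mkdir -p include
echo '/* public header */' > include/library_A.h
git add include
git -c user.name=ci -c user.email=ci@example.com commit -qm "release 2.0.0"
git tag v2.0.0
git push -q origin HEAD v2.0.0
cd "$work"

# Build-script fragment: clone each dependency at a pinned ref.
fetch_dep() {    # fetch_dep <name> <ref>; <ref> may be a tag or a branch
    git clone -q --branch "$2" "$work/$1.git" "deps/$1"
}

# Normal builds pin a tag; a test drive just overrides the ref, e.g.
#   LIBRARY_A_REF=test-2.1.0 ./build.sh
fetch_dep library_A "${LIBRARY_A_REF:-v2.0.0}"
```

Because the ref is just a clone argument, pointing one dependency at an experimental branch does not disturb the pinned versions of the others.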

I tried to find some alternatives to the direct git-based solution, but without much success – e.g. git-annex, which seems overly complicated for what I’m trying to do.

What I’m facing right now is that there seems to be a very strong opinion against putting binary files under git (or any VCS), and some of my colleagues, driven by that shared opinion, are against this scheme of things. Technically I would also have header files; I could even push the folder structure described above directly to git instead of tgz/zip archives, but I would still be storing the library binaries. I perfectly understand that git tracks content and not files, but to some extent I would be tracking content too, and I believe it would definitely be an improvement over the current scheme of things.

What would be a better solution to this situation? Do you know of any alternatives to the git (VCS) based scheme? Would it be such a monstrous thing to have my scheme under git :)? Please share your opinions and especially your experience in handling these types of situations.

    Thanks

One solution:

An alternative, which would still follow your approach, would be to use git-annex, which would allow you to track the header files while keeping the binaries stored elsewhere.
Each library’s git repo can then be added as a submodule to your main project.
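A sketch of the submodule side of that suggestion is below, assuming each library lives in its own repo. The git-annex part (running `git annex add` on the binaries inside each library repo) requires git-annex to be installed, so only the submodule wiring is shown; all names are illustrative and local repos stand in for the server.

```shell
# Sketch: each library repo becomes a submodule of the main project.
set -e
work=$(mktemp -d); cd "$work"

# Fixture: a library repo that would hold headers (and annexed binaries).
git init -q library_A
cd library_A
mkdir -p include
echo '/* public header */' > include/library_A.h
git add include
git -c user.name=ci -c user.email=ci@example.com commit -qm "initial import"
cd "$work"

# Main project: wire the library in as a submodule under deps/.
# (protocol.file.allow is needed for file:// submodules on recent git.)
git init -q project
cd project
git -c protocol.file.allow=always \
    submodule add "file://$work/library_A" deps/library_A
git -c user.name=ci -c user.email=ci@example.com \
    commit -qm "add library_A as submodule"
```

The main project then records which commit of each library it depends on, and updating a dependency is an explicit, reviewable commit in the superproject.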
