Migrating virtualenv and GitHub between computers
I primarily work these days with Python 2.7 and Django 1.3.3 (hosted on Heroku) and I have multiple projects that I maintain. I’ve been working on a desktop with Ubuntu running inside of a VirtualBox, but recently had to take a trip and wanted to get everything loaded up on my notebook. What I quickly discovered was that virtualenv + GitHub is really easy for creating projects, but I struggled to get them moved over to my notebook. The approach I came up with was to create a new virtualenv and then clone the code from GitHub. But I couldn’t clone into the folder I really wanted because git would say the folder is not empty. So I would clone it to a tmp folder and then cut/paste everything into where I really wanted it. Not TERRIBLE, but I just feel like I’m missing something here and that it should be easier. Maybe clone first, then mkvirtualenv?
It’s not a crushing problem, but I’m thinking about making some more changes (like getting rid of the VirtualBox and just going with a dual-boot system) and it would be great if I could make it a bit smoother. 🙂
Finally, I found and read a few posts about moving git repos between computers, but I didn’t see any dealing with Virtualenv (maybe I just missed it).
EDIT: Just to be clear and avoid confusion, I’m not trying to “move” the virtualenv. I’m just asking about the best way to create a new one, install the packages, and then clone the repo from GitHub.
4 solutions for “Migrating virtualenv and GitHub between computers”
The only workflow you should need is:
git clone repo_url somedir
cd somedir
virtualenv <name of environment directory>
source <name of environment directory>/bin/activate
pip install -r requirements.txt
This assumes that you have run pip freeze > requirements.txt (while the virtualenv is activated) to list all the libraries pip installed into the virtualenv, and checked that file into the repo.
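For reference, a requirements.txt is just a plain-text list of pinned packages. A minimal one for the setup described in the question might look like this (the Django pin comes from the question; the commented entries are purely illustrative placeholders, not from the original post):

```text
Django==1.3.3
# hypothetical extras -- replace with your own `pip freeze` output, e.g.:
# South==0.7.3
# gunicorn==0.14.2
```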
That’s because you’re not even supposed to move virtualenvs to different locations on one system (there’s relocation support, but it’s experimental), let alone from one system to another. Create a new virtualenv:
- Install virtualenv on the other system
- Get a requirements.txt, either by writing one or by storing (and editing) the output of pip freeze
- Move the requirements.txt to the other system, create a new virtualenv, and install the libraries via pip install -r requirements.txt
- Clone the git repository on the other system
For more advanced needs, you can create a bootstrapping script which includes virtualenv + custom code to set up anything else.
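Such a bootstrapping script could be as small as the sketch below. It is only an illustration of the idea, not the author's script: the file name, the .venv directory, and the use of python3 -m venv (the stdlib successor to the virtualenv tool) are all assumptions.

```shell
#!/bin/sh
# bootstrap.sh -- minimal sketch of a bootstrapping script.
# Assumes a POSIX shell and a Python with the venv module available.
set -e

ENV_DIR=".venv"

# Create a fresh environment next to the code.
python3 -m venv "$ENV_DIR"

# Install the pinned dependencies, if a requirements file is present.
if [ -f requirements.txt ]; then
    "$ENV_DIR/bin/pip" install -r requirements.txt
fi

echo "Environment ready in $ENV_DIR"
```

Checked into the repository, this makes “clone, then run bootstrap.sh” the entire setup on a new machine.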
EDIT: Having the root of the virtualenv and the root of your repository in the same directory seems like a pretty bad idea to me. Put the repository in a directory inside the virtualenv root, or put them into completely separate trees. Not only do you avoid git complaining about existing files (rightfully so: anything not tracked by git is usually fair game to delete), you can also use the virtualenv for multiple repositories and avoid name collisions.
In addition to scripting the creation of a new virtualenv, you should keep a requirements.txt file listing all of your dependencies (e.g. Django==1.3.3). You can then run pip install -r requirements.txt and pip will install all of the dependencies for you.
You can even have pip create this file for you by running pip freeze > stable-req.txt, which writes out your dependencies exactly as they exist in your current virtualenv. You can then keep the resulting requirements.txt under version control.
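Putting those two steps together might look like the sketch below. The throwaway repo under /tmp only exists to make the example self-contained; in practice you would run the last three commands inside your existing project checkout, with its virtualenv activated.

```shell
# Self-contained demo: set up a throwaway git repo to commit into.
mkdir -p /tmp/pin-demo && cd /tmp/pin-demo
git init -q .
git config user.email you@example.com
git config user.name "Demo"

# Snapshot the active environment and put the pin file under version control.
python3 -m pip freeze > requirements.txt
git add requirements.txt
git commit -q -m "Pin dependencies"
```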
The nice thing about a virtualenv is that you can describe how to make one, and you can make it repeatedly on multiple platforms.
So, instead of cloning the whole thing, clone a method to create the virtualenv consistently, and have that in your git repository. This way you avoid platform-specific nasties.