In Python, virtual environments are used to isolate projects from each other (if they require different versions of the same library, for example). They let you install and manage packages without administrative privileges, and without conflicting with the system package manager. They also allow you to quickly recreate an environment somewhere else with the same dependencies.
Virtual environments are a crucial tool for any Python developer, and a very simple one to work with.
Let’s get started!
The best tool for creating virtual environments is the venv module, which has been part of the standard library since Python 3.3. Because venv is built into Python, most users don’t need to install anything. However, Debian/Ubuntu users will need to run sudo apt-get install python3-venv to make it work (because Debian does not install some components venv needs by default).
The alternative (and original, and previously standard) virtual environment tool is virtualenv. It also works with Python 2.7, and has a couple of extra features (that you generally won’t need). virtualenv can be installed with your system package manager, or with pip install --user virtualenv.
Which one to use? Probably venv. Both tools achieve the same goal in similar ways, and if one of them does not work, you can try the other, which might fare better. (Terminology note: most of the time, the names of both tools are used interchangeably; “venv” was often used as an abbreviation for “virtualenv” before the stdlib tool was created.)
To create a virtual environment named env, you need to run the tool with the Python you want to use in that environment: venv runs as a module of that Python, while virtualenv takes the desired interpreter as a command-line option.
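Concretely, the creation commands look like this (env is just an example name, and python3 stands for whichever interpreter you want inside the environment):

```shell
# Stdlib venv: run the venv module of the Python you want in the environment
python3 -m venv env

# virtualenv alternative (only if you have it installed): select the
# interpreter with -p
if command -v virtualenv >/dev/null; then
    virtualenv -p python3 env2
fi
```

Either command creates a fresh, isolated environment; the directory name is entirely up to you.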
Afterwards, you will end up with a folder named env that contains the folders bin (Scripts on Windows), which holds executables and scripts installed by packages, including python and pip themselves; lib, which contains the installed code; and include, which contains C headers.
Both tools install pip and setuptools into the new environment, but venv does not ship with wheel. In addition, the default versions of those packages tend to be more-or-less outdated. Let’s upgrade them real quick.
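For example, with an environment named env in the current directory (an assumed name), the upgrade can be done with the environment’s own interpreter:

```shell
# Create an environment (if you don't have one yet) and upgrade the
# packages it was seeded with, using the environment's own Python
python3 -m venv env
env/bin/python -m pip install -U pip setuptools wheel
```

On Windows, the interpreter lives at env\Scripts\python instead.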
Where to store virtual environments?
While the tools allow you to put your virtual environments anywhere on the system, scattering them around is not desirable. There are two sensible options: have one global place for all of them (a dedicated directory in your home folder, for example), or store one in each project’s directory (under a consistent name, such as env).
The first option can be easier to manage, and there are tools that can help (virtualenvwrapper, shell auto-activation scripts, or the workon function described below). The second option is equally easy to work with, but comes with one caveat: you must add the venv directory to your .gitignore file (or .git/info/exclude if you don’t want to commit .gitignore), since you don’t want it in your repository (it’s binary bloat, and works only on your machine).
If you pick the global virtual environment store option, you can use the following short function (put it in .zshrc / your shell configuration file) to get a simple way to activate an environment (by running workon followed by the environment’s name). virtualenvwrapper also has a workon feature; I don’t think virtualenvwrapper itself is really necessary or all that helpful, but that feature is handy, and so here’s a way to get it without the rest of the package.
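Here is a minimal sketch of such a function for bash/zsh, assuming your environments live in ~/virtualenvs (WORKON_HOME is my own name for that location; adjust it to your setup):

```shell
# Sketch of a workon helper; assumes environments live under $WORKON_HOME
# (~/virtualenvs by default -- adjust to wherever you keep yours)
WORKON_HOME="${WORKON_HOME:-$HOME/virtualenvs}"

workon() {
    if [ -z "$1" ]; then
        # With no argument, list the available environments
        ls "$WORKON_HOME"
    else
        # Activate the named environment in the current shell
        source "$WORKON_HOME/$1/bin/activate"
    fi
}
```

With this in your shell configuration, workon myenv activates ~/virtualenvs/myenv, workon alone lists your environments, and the usual deactivate leaves the environment.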
PowerShell fans can define an equivalent function in their $PROFILE. And for cmd.exe fans… you should switch to PowerShell; it’s a very nice and friendly shell (though it may take some effort to learn to be productive with it).
There are three ways of working with virtual environments interactively (in a shell):

1. Activating them (source env/bin/activate on *nix; env\Scripts\activate on Windows): it simplifies work and requires less typing, although it can sometimes fail to work properly. (After installing scripts, hash -r may be necessary on *nix to use them.)
2. Running the environment’s interpreter (env/bin/python or env\Scripts\python) and other scripts directly: activation only changes $PATH and some helper variables. Those variables are not mandatory for operation; running the correct python is, which makes this method failsafe.
3. Working in subshells with the environment active (IMO, it’s bad UX).
Whichever method you use, you must remember that without doing any of these things, you will still be working with the system Python.
For non-interactive work (eg. crontab entries, system services, etc.), activation and subshells are not viable solutions. In these cases, you must always use the full path to Python.
Here are some usage examples (paths can be relative, of course):
```
## *nix, activation ##
$ source /path/to/env/bin/activate
(env)$ pip install Django
(env)$ deactivate

## *nix, manual execution ##
$ /path/to/env/bin/pip install Django

## Windows, activation ##
> C:\path\to\env\Scripts\activate
(env)> pip install Django
(env)> deactivate

## Windows, manual execution ##
> C:\path\to\env\Scripts\pip install Django

## Windows, updating pip/setuptools/wheel ##
> C:\path\to\env\Scripts\python -m pip install -U pip setuptools wheel
```
The same principle applies to running Python itself, or any other script installed by a package. (With Django’s manage.py, calling it as ./manage.py requires activation; alternatively, you can run it as /path/to/env/bin/python manage.py without activating anything.)
If you try to copy or rename a virtual environment, you will discover that the
copied environment does not work. This is because a virtual environment is
closely tied to both the Python it was created with, and the location it was
created in. (The “relocatable” option of
virtualenv does not work and is deprecated.) 
However, this is very easy to fix. Instead of moving/copying, just create a new
environment in the new location. Then, run
pip freeze > requirements.txt in
the old environment to create a list of packages installed in it. With that,
you can just run
pip install -r requirements.txt in the new environment to
install packages from the saved list. (Of course, you can also copy requirements.txt between machines. In many cases, it will just work; sometimes, you might need a few edits to requirements.txt to remove OS-specific entries.)
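The whole recreate-instead-of-copy procedure can be sketched like this (old and new stand in for the real environment paths; here, old is created on the spot just so the sketch runs end to end):

```shell
# "old" stands in for the environment you want to move
python3 -m venv old
# Snapshot the packages installed in the old environment
old/bin/pip freeze > requirements.txt
# Create a fresh environment at the new location
python3 -m venv new
# Reinstall the snapshot into it
new/bin/pip install -r requirements.txt
```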
Note that it might also be necessary to re-create your virtual environment after a Python upgrade, so it might be handy to keep an up-to-date requirements.txt for your virtual environments (for many projects, it makes sense to put that file in the repository).
To manage those requirements.txt files in a more organized yet still simple way, you might be interested in pip-tools.
Frequently Asked Questions
I’m using virtualenv. Do I need to install it for each Python I want to use it with?
In most cases, you can use virtualenv -p pythonX env to specify a different Python version, but some combinations of Python versions might not work. (The venv module is tied to the Python version it ships with, so there is nothing extra to install: just run the venv module of the Python you want.)
I’m the only user on my system. Do I still need virtual environments?
Yes, you do. First, you will still need separation between projects, sooner or
later. Moreover, if you were to install packages system-wide with pip, you
might end up causing conflicts between packages installed by the system package
manager and by pip. Running
sudo pip is never a good idea because of this.
I’m using Docker. Do I still need virtual environments?
They are still a good idea in that case. They protect you against any bad system-wide Python packages your OS image might have (and one popular base OS is famous for those). They don’t introduce any extra overhead, while giving you a clean environment and the ability to re-create it outside of Docker (eg. for local development without Docker).
What about Pipenv?
Pipenv is a dependency management tool. It isn’t compatible with most workflows, and comes with many issues. In my opinion, it’s not worth using. (Also, that thing about it being an officially recommended tool? Turns out it’s not true.)
I also wrote a blog post detailing concerns with that tool, titled Pipenv: promises a lot, delivers very little.
Consider using pip-tools instead.