Managing Python environments properly is the difference between a Django project that builds cleanly on every machine and one that breaks the moment a second developer touches it. This guide covers virtual environment creation, dependency management, Python version control, and the lockfile discipline that keeps production deployments reproducible. I have debugged enough “works on my machine” incidents to know that this is where most teams cut corners too early. For a broader look at the framework, visit our Framework overview.
The core idea is isolation. Every Django project should have its own Python environment with its own installed packages. Mixing dependencies across projects eventually causes version conflicts that are painful to untangle, especially when one project pins Django 4.2 and another needs 5.1.
Virtual environments with venv
Python ships with venv in the standard library, which is the simplest way to create an isolated environment. No extra tools needed.
python3 -m venv .venv
Activate it:
# macOS / Linux
source .venv/bin/activate
# Windows PowerShell
.venv\Scripts\Activate.ps1
# Windows cmd
.venv\Scripts\activate.bat
Once activated, any pip install command installs packages into that environment only. Your system Python stays clean.
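If you are ever unsure whether an environment is active, the interpreter itself can tell you. A minimal standard-library check: inside a venv, sys.prefix points at the environment directory while sys.base_prefix still points at the base installation.

```python
import sys

def in_virtualenv() -> bool:
    """Report whether the running interpreter belongs to a virtual environment.

    Inside a venv, sys.prefix points at the environment directory while
    sys.base_prefix still points at the base Python installation; outside
    one, the two are equal.
    """
    return sys.prefix != sys.base_prefix

print(in_virtualenv())  # True when run from an activated .venv
```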
Always add .venv/ to your .gitignore. Virtual environments are local, machine-specific artifacts. You distribute requirements files, not environment folders.
Managing dependencies with pip
After activating your environment, install Django and any other packages your project needs:
pip install django psycopg2-binary redis celery
Capture the current state with a requirements file:
pip freeze > requirements.txt
To rebuild the same environment elsewhere:
pip install -r requirements.txt
This workflow has served Django projects well for over a decade, but it has a weakness: pip freeze does not distinguish between packages you chose to install and their transitive dependencies. When you need to upgrade one package, you cannot easily tell which other packages are safe to remove.
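The gap is easy to demonstrate with a small sketch: given the set of packages you intended to install and a pip freeze snapshot, everything left over is a transitive pin that freeze alone cannot label for you. The package names below are a hypothetical snapshot, not output from a real project.

```python
def _norm(name: str) -> str:
    # Normalize package names the way PyPI does: case-insensitive, "-" == "_"
    return name.lower().replace("_", "-")

def transitive_pins(direct: set[str], frozen: list[str]) -> set[str]:
    """Return pip-freeze lines ("name==version") whose package is not a
    direct dependency, i.e. the pins freeze cannot distinguish for you."""
    direct_names = {_norm(name) for name in direct}
    return {
        line for line in frozen
        if _norm(line.split("==")[0]) not in direct_names
    }

# Hypothetical snapshot: celery was installed directly, the rest came with it.
frozen = ["celery==5.4.0", "billiard==4.2.0", "kombu==5.3.7"]
print(sorted(transitive_pins({"celery"}, frozen)))
# ['billiard==4.2.0', 'kombu==5.3.7']
```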
Layered requirements files
A practical pattern is to maintain separate requirements files for different contexts:
requirements/
base.txt # Core dependencies shared everywhere
dev.txt # Development tools: debug toolbar, pytest, etc.
production.txt # Production-only: gunicorn, sentry-sdk, etc.
In dev.txt:
-r base.txt
django-debug-toolbar
pytest
pytest-django
factory-boy
In production.txt:
-r base.txt
gunicorn
sentry-sdk
whitenoise
This keeps your production image lean and avoids shipping test tooling to live servers. The production settings guide covers the settings side of this separation.
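A lightweight guard can catch a layered file that forgot its -r base.txt include before it causes a broken deploy. This is a sketch of a check you might run in CI, not an established tool:

```python
def includes_base(requirements_text: str) -> bool:
    """Return True if a layered requirements file pulls in base.txt via -r."""
    return any(
        line.strip() == "-r base.txt"
        for line in requirements_text.splitlines()
    )

print(includes_base("-r base.txt\npytest\npytest-django\n"))  # True
print(includes_base("gunicorn\nwhitenoise\n"))                # False
```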
Dependency locking for reproducibility
A requirements.txt with pinned versions is a basic lock, but it lacks hash verification and does not separate direct from transitive dependencies cleanly.
Tools like pip-tools solve this:
pip install pip-tools
Create requirements.in with your direct dependencies:
django>=5.1,<5.2
psycopg2-binary
redis
celery
Compile a locked requirements file with hashes:
pip-compile --generate-hashes requirements.in
The resulting requirements.txt pins every transitive dependency with exact versions and cryptographic hashes. This means your production deployment installs exactly what you tested, not whatever the latest compatible release happens to be that day.
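If you want a belt-and-braces check that a compiled file really carries hashes, the format is simple enough to inspect: each requirement spans backslash-continued lines ending in --hash entries. A sketch of such a guard (the hash value below is a placeholder, not a real digest):

```python
def all_lines_hashed(locked: str) -> bool:
    """Check that every requirement in pip-compile output carries a --hash.

    A requirement plus its hashes spans backslash-continued lines, so we
    join continuations into logical entries first; blank lines and comments
    are skipped.
    """
    logical, buf = [], ""
    for raw in locked.splitlines():
        line = raw.strip()
        if not line or line.startswith("#"):
            continue
        buf += " " + line.rstrip("\\")
        if not line.endswith("\\"):
            logical.append(buf.strip())
            buf = ""
    return all("--hash=" in entry for entry in logical)

sample = (
    "django==5.1.1 \\\n"
    "    --hash=sha256:aaaa\n"   # placeholder digest, not real
)
print(all_lines_hashed(sample))  # True
```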
For further reading on Python packaging and dependency management, consult the official Python documentation, which covers venv, pip, and the packaging ecosystem authoritatively.
Python version management with pyenv
If you work on multiple projects that need different Python versions, pyenv is the standard tool for managing installations:
pyenv install 3.12.2
pyenv local 3.12.2
The pyenv local command creates a .python-version file in the project directory. When any developer enters that directory, pyenv automatically activates the correct Python version. No manual switching.
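Because .python-version is plain text, a startup script can verify that the interpreter actually matches the pin, which is handy on machines where pyenv is not installed. A sketch; matches_pinned_version is our own helper, not part of pyenv:

```python
import sys
import tempfile
from pathlib import Path

def matches_pinned_version(pin_file: Path) -> bool:
    """Compare the running interpreter to a pyenv-style .python-version pin.

    pyenv pins look like "3.12.2"; we compare major.minor.micro only.
    """
    pinned = pin_file.read_text().strip()
    running = ".".join(str(part) for part in sys.version_info[:3])
    return running == pinned

# Demonstrate against a temporary pin matching the current interpreter.
with tempfile.TemporaryDirectory() as d:
    pin = Path(d) / ".python-version"
    pin.write_text(".".join(str(p) for p in sys.version_info[:3]) + "\n")
    print(matches_pinned_version(pin))  # True
```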
On macOS, install pyenv through Homebrew. On Linux, use the pyenv-installer script. On Windows, pyenv-win provides similar functionality, though Docker-based environments are often simpler on Windows.
Project directory conventions
A clean layout for a Django project with proper environment management:
myproject/
.venv/ # Local, gitignored
.python-version # Committed, pyenv reads this
requirements/
base.txt
dev.txt
production.txt
src/
manage.py
myproject/
settings/
__init__.py
base.py
dev.py
production.py
urls.py
wsgi.py
tests/
.gitignore
The project structure guide covers the full layout in more detail, including where to place apps, templates, static files, and configuration.
Environment variables and secrets
Never hard-code database passwords, API keys, or secret keys in settings files. Use environment variables:
import os
SECRET_KEY = os.environ['DJANGO_SECRET_KEY']
DATABASE_URL = os.environ.get('DATABASE_URL', 'sqlite:///db.sqlite3')
For local development, a .env file loaded by django-environ or python-dotenv keeps things convenient without risking secrets in version control. In production, inject variables through your hosting platform, container orchestrator, or secret manager.
Common environment pitfalls
These are the problems I see most often in Django projects with poor environment discipline:
Global installs causing conflicts. If Django is installed globally, manage.py might pick up the wrong version. Always confirm that which python and which django-admin resolve to paths inside your .venv.
Missing .gitignore entries. If .venv/ or *.pyc files end up in your repository, you create merge conflicts and bloat that slow down every clone.
Unpinned dependencies in production. Running pip install django without a version pin means a patch release on PyPI can change your production behavior between deploys. Always pin.
Forgetting to update the lock file. When you add a new dependency, compile the lock file immediately. Do not leave it until deploy day.
Frequently asked questions
Should I use Poetry or Pipenv instead of pip-tools?
Both are valid choices. Poetry handles virtual environment creation and dependency resolution together. Pipenv does similar work. If your team already uses one, keep using it. If you are starting fresh and want simplicity, venv plus pip-tools covers the same ground without adding a heavy abstraction layer.
Can I use conda for Django projects?
You can, but conda is optimized for scientific computing packages with C extensions. For web application projects, pip-based workflows are simpler and more widely supported by hosting platforms.
How often should I update dependencies?
Run pip-compile --upgrade at least monthly. Security patches for Django and its dependencies ship regularly. Automated tools like Dependabot or Renovate can create pull requests when updates are available.
What about Docker?
Docker wraps the entire environment, including the OS layer. It is excellent for production and CI, but during local development, a virtual environment is faster to iterate with. Many teams use both: venv for local coding, Docker for integration testing and deployment.