pipenv --system option for Docker: what is the suggested way to get all the Python packages into Docker?
Question:
I use pipenv for my Django app.
$ mkdir djangoapp && cd djangoapp
$ pipenv install django==2.1
$ pipenv shell
(djangoapp) $ django-admin startproject example_project .
(djangoapp) $ python manage.py runserver
Now I am shifting to a Docker environment.
As per my understanding, pipenv only installs packages inside a virtualenv. You don’t need a virtual env inside a container; a Docker container IS a virtual environment in itself.
Later, after going through many Dockerfiles, I found the --system option, which installs packages into the system Python instead. For example, I found the following:
https://testdriven.io/blog/dockerizing-django-with-postgres-gunicorn-and-nginx/
COPY ./Pipfile /usr/src/app/Pipfile
RUN pipenv install --skip-lock --system --dev
https://hub.docker.com/r/kennethreitz/pipenv/dockerfile
# -- Install dependencies:
ONBUILD RUN set -ex && pipenv install --deploy --system
https://wsvincent.com/beginners-guide-to-docker/
# Set work directory
WORKDIR /code
# Copy Pipfile
COPY Pipfile /code
# Install dependencies
RUN pip install pipenv
RUN pipenv install --system
So is --system alone sufficient, or is --deploy --system the better way? And --skip-lock --system --dev is different again.
Can someone guide me on how to get my environment back in Docker?
Answers:
A typical Docker deployment involves a requirements.txt file (a file where you store your pip dependencies, including Django itself), and then in your Dockerfile you do something like:
# use whatever Python version you need
FROM python:3.7
WORKDIR /code
COPY requirements.txt /code/
# install your Python dependencies
RUN pip install -r requirements.txt
# run Django
CMD ["python", "./manage.py", "runserver", "0.0.0.0:8000"]
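If you are moving from pipenv to plain pip, you can generate that requirements.txt from your existing lockfile rather than writing it by hand. The exact command depends on your pipenv version (newer releases provide `pipenv requirements`; older ones used `pipenv lock -r`):

```shell
# newer pipenv releases: export the lockfile as requirements syntax
pipenv requirements > requirements.txt

# older pipenv releases used the (now removed) -r flag on lock
pipenv lock -r > requirements.txt
```

Run this once in your project directory and commit the resulting requirements.txt alongside the Dockerfile.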
You don’t need pipenv here at all, since, as you say, you no longer have a virtual environment.
Even better, you can configure a lot of that in a docker-compose.yml file and then use docker-compose to run and manage your services, not just Django. Docker has a very good tutorial on dockerising Django with it. And if you’re unsure what’s going on in the Dockerfile itself, check the manual.
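As a rough illustration, a minimal docker-compose.yml for the image above might look like this (the service name, port, and volume mapping are illustrative assumptions, not from the original post):

```yaml
services:
  web:
    build: .                                      # builds the Dockerfile above
    command: python manage.py runserver 0.0.0.0:8000
    ports:
      - "8000:8000"                               # host:container
    volumes:
      - .:/code                                   # mount source for live reload in development
```

With this in place, `docker-compose up` builds and starts the Django service, and you can later add databases or other services as sibling entries under `services:`.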
Whether in a Docker image, a CI pipeline, a production server, or even on your development workstation: you should always include the --deploy flag in your installs, unless you intend to potentially relock all dependencies, e.g. while evolving your requirements. It checks that the lockfile is up to date and will never install anything that is not listed there.
As for the --system flag, you’d better drop it. There is no real harm in using a virtual environment inside Docker images, and there are some subtle benefits. See this comment by @anishtain4. Pipenv now recommends against system-wide installs: https://github.com/pypa/pipenv/pull/2762.
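Putting those two recommendations together, a Dockerfile that keeps pipenv's virtualenv inside the image might be sketched like this (the Python version and paths are illustrative assumptions):

```dockerfile
FROM python:3.7

WORKDIR /code
RUN pip install pipenv

# copy both Pipfile and Pipfile.lock so --deploy can verify the lock
COPY Pipfile Pipfile.lock /code/

# --deploy aborts the build if Pipfile.lock is missing or out of date;
# without --system, packages go into a project virtualenv inside the image
RUN pipenv install --deploy

COPY . /code

# pipenv run executes the command inside that virtualenv
CMD ["pipenv", "run", "python", "manage.py", "runserver", "0.0.0.0:8000"]
```

The build fails fast on a stale lockfile instead of silently relocking, which is exactly the behaviour you want in an image build.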