Installing Python packages without internet, using source code as .tar.gz and .whl
Question:
We are trying to install a couple of Python packages without internet access.
For example: python-keystoneclient.
We downloaded the packages from https://pypi.python.org/pypi/python-keystoneclient/1.7.1 and placed them on the server.
However, when installing the .tar.gz and .whl packages, the installation looks for the dependent packages to be installed first. Since the server has no internet connection, the installation fails.
For example, python-keystoneclient has the following dependencies:
stevedore (>=1.5.0)
six (>=1.9.0)
requests (>=2.5.2)
PrettyTable (<0.8,>=0.7)
oslo.utils (>=2.0.0)
oslo.serialization (>=1.4.0)
oslo.i18n (>=1.5.0)
oslo.config (>=2.3.0)
netaddr (!=0.7.16,>=0.7.12)
debtcollector (>=0.3.0)
iso8601 (>=0.1.9)
Babel (>=1.3)
argparse
pbr (<2.0,>=1.6)
When I try to install the packages one by one from the list above, each install in turn looks for nested dependencies.
Is there any way to list ALL the dependent packages required to install a Python module like python-keystoneclient?
Answers:
pipdeptree
is a command-line utility that displays the Python packages installed in a virtualenv as a dependency tree.
Just use it:
https://github.com/naiquevin/pipdeptree
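If the package is already installed somewhere (for example on the internet-connected host), you can also inspect its direct declared dependencies with nothing but the standard library; pipdeptree does the same thing recursively. A minimal sketch, assuming Python 3.8+ (the helper name `declared_deps` is mine):

```python
from importlib.metadata import PackageNotFoundError, requires

def declared_deps(dist_name):
    """Direct dependencies declared in an installed distribution's metadata."""
    try:
        return requires(dist_name) or []
    except PackageNotFoundError:
        return None  # distribution is not installed

# Example: show what "pip" itself declares (pip is present in most virtualenvs)
print(declared_deps("pip"))
```

This only lists one level of dependencies; walking the result recursively gives you the full tree that pipdeptree prints.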
This is how I handle this case:
On the machine where I have access to the internet:
mkdir keystone-deps
pip download python-keystoneclient -d "/home/aviuser/keystone-deps"
tar cvfz keystone-deps.tgz keystone-deps
Then move the tar file to the destination machine that does not have Internet access and perform the following:
tar xvfz keystone-deps.tgz
cd keystone-deps
pip install python_keystoneclient-2.3.1-py2.py3-none-any.whl -f ./ --no-index
You may need to add --no-deps to the command as follows:
pip install python_keystoneclient-2.3.1-py2.py3-none-any.whl -f ./ --no-index --no-deps
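Before shipping the directory, you can sanity-check which dependencies each downloaded wheel declares without installing anything: a wheel is just a zip archive with a METADATA file inside its .dist-info directory. A stdlib-only sketch (the helper name `wheel_requires` is mine):

```python
import zipfile

def wheel_requires(wheel_path):
    """List the Requires-Dist entries from a wheel's METADATA file."""
    with zipfile.ZipFile(wheel_path) as zf:
        meta_name = next(n for n in zf.namelist()
                         if n.endswith(".dist-info/METADATA"))
        metadata = zf.read(meta_name).decode("utf-8")
    return [line.split(":", 1)[1].strip()
            for line in metadata.splitlines()
            if line.startswith("Requires-Dist:")]
```

Running it over every *.whl in keystone-deps tells you up front whether anything is missing from the bundle.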
We have a similar situation at work, where the production machines have no access to the Internet; therefore everything has to be managed offline and off-host.
Here is what I tried, with varying degrees of success:
-
basket
, a small utility that you run on your internet-connected host. Instead of trying to install a package, it downloads the package, and everything it requires, into a directory. You then move this directory onto your target machine. Pros: very easy and simple to use; no server headaches; no ports to configure. Cons: there aren’t any real showstoppers, but the biggest one is that it doesn’t respect any version pinning you may have; it will always download the latest version of a package.
-
Run a local PyPI server. I used pypiserver
and devpi
. pypiserver
is super simple to install and set up; devpi
takes a bit more finagling. They both do the same thing: act as a proxy/cache for the real PyPI and as a local PyPI server for any home-grown packages. localshop
is a newer one that wasn’t around when I was looking; it has the same idea. How it works: your internet-restricted machines connect to these servers, which are in turn connected to the internet so that they can cache and proxy the actual repository.
The problem with the second approach is that, although you get maximum compatibility and access to the entire repository of Python packages, you still need to make sure any and all dependencies are installed on your target machines (for example, any headers for database drivers and a build toolchain). Further, these solutions do not cater for non-PyPI repositories (for example, packages hosted on GitHub).
We got very far with the second option though, so I would definitely recommend it.
Eventually, tired of dealing with compatibility issues and libraries, we migrated the entire circus of servers to commercially supported Docker containers.
This means that we ship everything pre-configured; nothing actually needs to be installed on the production machines, and it has been the most headache-free solution for us.
We replaced the PyPI repositories with a local Docker image server.
If you want to install a bunch of dependencies from, say, a requirements.txt, you would do:
mkdir dependencies
pip download -r requirements.txt -d "./dependencies"
tar cvfz dependencies.tar.gz dependencies
And once you transfer dependencies.tar.gz to the machine that does not have internet access, you would do:
tar zxvf dependencies.tar.gz
cd dependencies
pip install * -f ./ --no-index
This isn’t a new answer, but I was struggling until I realized that my install was trying to connect to the internet to download dependencies.
So I downloaded and installed the dependencies first, and then installed the package with the command below. It worked:
python -m pip install filename.tar.gz
You can manually download the .whl file from PyPI:
https://pypi.org/project/google-cloud-debugger-client/#files
Then place it in your current folder and install it via pip:
pip install google_cloud_debugger_client-1.2.1-py2.py3-none-any.whl
This is an addendum to the answer by
Praveen Yalagandula, since downloading for a specific version of Python (--python-version)
and a specific platform such as Linux (--platform manylinux1_x86_64)
was not elaborated in that answer.
pip3 download somePackage --platform manylinux1_x86_64 --only-binary=:all: -d "/Users/ajaytomgeorge/Dev/wheels/"
There are also advanced arguments you can pass; see the full list of arguments in the pip documentation.
Examples:
--progress-bar
--no-build-isolation
--use-pep517
--check-build-dependencies
--ignore-requires-python
-d
--platform
--python-version
--implementation
Another example, for Python 2.7 on macOS:
pip download --only-binary=:all: --platform macosx-10_10_x86_64 --python-version 27 --implementation cp SomePackage
For Windows users who want to install packages in an environment that is not connected to the internet:
On a host that is connected to the internet:
mkdir dependencies
pip download -r requirements.txt -d "./dependencies"
tar cvfz dependencies.tar.gz dependencies
On the host that is not:
tar zxvf dependencies.tar.gz
cd dependencies
for %f in (*.whl) do pip install --no-index --find-links=./ %f
(Note: inside a .bat file, double the loop variable: %%f.)
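A cross-platform alternative to the shell loops above is to drive pip from Python itself, so the same script works on Windows and Linux. A hedged sketch (the function name `install_offline` is mine; it shells out to the same pip command shown above):

```python
import pathlib
import subprocess
import sys

def install_offline(wheel_dir):
    """Install every wheel in wheel_dir, resolving deps only from that directory."""
    wheel_dir = pathlib.Path(wheel_dir)
    for whl in sorted(wheel_dir.glob("*.whl")):
        subprocess.check_call([
            sys.executable, "-m", "pip", "install",
            "--no-index", "--find-links", str(wheel_dir), str(whl),
        ])
```

Using `sys.executable -m pip` ensures the wheels land in the same interpreter that runs the script.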