How can I call 'git pull' from within Python?
Question:
Using GitHub webhooks, I would like to be able to pull any changes to a remote development server. At the moment, when in the appropriate directory, git pull
fetches any changes that need to be made. However, I can’t figure out how to call that command from within Python. I have tried the following:
import subprocess
process = subprocess.Popen("git pull", stdout=subprocess.PIPE)
output = process.communicate()[0]
But this results in the following error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python2.7/subprocess.py", line 679, in __init__
errread, errwrite)
File "/usr/lib/python2.7/subprocess.py", line 1249, in _execute_child
raise child_exception
OSError: [Errno 2] No such file or directory
Is there a way that I can call this bash command from within Python?
Answers:
subprocess.Popen
expects a list of the program name and arguments. You’re passing it a single string, which (with the default shell=False
) is equivalent to:
['git pull']
That means subprocess tries to find a program literally named git pull
and fails to do so: in Python 3.3, your code raises the exception FileNotFoundError: [Errno 2] No such file or directory: 'git pull'
. Instead, pass in a list, like this:
import subprocess
process = subprocess.Popen(["git", "pull"], stdout=subprocess.PIPE)
output = process.communicate()[0]
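A small helper can wrap this pattern so the exit status and stderr are captured too; this is only a sketch, and the helper name run_cmd and the /srv/myrepo path are illustrative, not part of the answer:

```python
import subprocess

def run_cmd(args, cwd=None):
    """Run a command, returning (returncode, stdout, stderr) as text."""
    process = subprocess.Popen(
        args,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,   # capture errors instead of leaking them to the terminal
        cwd=cwd,                  # directory to run in, e.g. the repository root
        universal_newlines=True,  # decode bytes to str
    )
    out, err = process.communicate()
    return process.returncode, out, err

# Hypothetical usage: code, out, err = run_cmd(["git", "pull"], cwd="/srv/myrepo")
```

Capturing stderr separately matters for git, which writes progress and error messages there rather than to stdout.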
By the way, in Python 2.7+, you can simplify this code with the check_output
convenience function:
import subprocess
output = subprocess.check_output(["git", "pull"])
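Note that check_output raises subprocess.CalledProcessError when the command exits non-zero, so a failed pull won’t pass silently. A sketch of handling that (the wrapper name safe_check_output is made up for illustration):

```python
import subprocess

def safe_check_output(args, cwd=None):
    """Run a command via check_output; return None instead of raising on failure."""
    try:
        return subprocess.check_output(args, cwd=cwd)
    except subprocess.CalledProcessError as exc:
        # check_output raises on a non-zero exit; exc.returncode is the
        # exit status and exc.output is whatever stdout was captured
        print("command failed with code", exc.returncode)
        return None
```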
Also, calling the git binary is by no means necessary to use git functionality (although it is simple and portable). Consider using GitPython or Dulwich.
Have you considered using GitPython? It’s designed to handle all this nonsense for you.
import git
g = git.cmd.Git(git_dir)
g.pull()
Try:
subprocess.Popen("git pull", stdout=subprocess.PIPE, shell=True)
This is a sample recipe I’ve been using in one of my projects. Agreed that there are multiple ways to do this, though. 🙂
>>> import subprocess, shlex
>>> git_cmd = 'git status'
>>> kwargs = {}
>>> kwargs['stdout'] = subprocess.PIPE
>>> kwargs['stderr'] = subprocess.PIPE
>>> proc = subprocess.Popen(shlex.split(git_cmd), **kwargs)
>>> (stdout_str, stderr_str) = proc.communicate()
>>> return_code = proc.wait()
>>> print(return_code)
0
>>> print(stdout_str)
# On branch dev
# Untracked files:
# (use "git add <file>..." to include in what will be committed)
#
# file1
# file2
nothing added to commit but untracked files present (use "git add" to track)
>>> print(stderr_str)
The problem with your code was that you were not passing a list to subprocess.Popen()
, so it was trying to run a single binary called git pull
. Instead it needs to execute the binary git
with the first argument being pull
, and so on.
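The shlex.split call in the recipe above is what turns a shell-style command string into that argument list; a quick illustrative demonstration:

```python
import shlex

# shlex.split turns a shell-style command string into the argument
# list that subprocess.Popen expects when shell=False
args = shlex.split('git pull origin master')
print(args)  # -> ['git', 'pull', 'origin', 'master']
```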
The accepted answer using GitPython is little better than just using subprocess
directly.
The problem with this approach is that if you want to parse the output, you end up looking at the result of a “porcelain” command, which is a bad idea.
Using GitPython in this way is like getting a shiny new toolbox, and then using it for the pile of screws that hold it together instead of the tools inside. Here’s how the API was designed to be used:
import git
repo = git.Repo('Path/to/repo')
repo.remotes.origin.pull()
If you want to check whether something changed, you can use:
current = repo.head.commit
repo.remotes.origin.pull()
if current != repo.head.commit:
print("It changed")
If you’re using Python 3.5+, prefer subprocess.run
to subprocess.Popen
for scenarios it can handle. For example:
import subprocess
subprocess.run(["git", "pull"], check=True, stdout=subprocess.PIPE).stdout
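subprocess.run can also decode the output for you and pin the working directory; a minimal sketch of that, where the helper name run and the /srv/myrepo path are illustrative assumptions:

```python
import subprocess

def run(args, cwd=None):
    """Run a command with subprocess.run and return its stdout as text."""
    result = subprocess.run(
        args,
        cwd=cwd,                  # e.g. the repository directory
        check=True,               # raise CalledProcessError on a non-zero exit
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        universal_newlines=True,  # decode bytes to str (text=True in 3.7+)
    )
    return result.stdout

# Hypothetical usage: print(run(["git", "pull"], cwd="/srv/myrepo"))
```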
Try:
import subprocess
cwd = '/path/to/relevant/dir'
command = 'git pull'
process = subprocess.Popen(command.split(), stdout=subprocess.PIPE, cwd=cwd)
output, unused_err = process.communicate()
print(output)