Pipe output from a shell command to a Python script
Question:
I want to run a mysql command and set its output as a variable in my Python script.
Here is the shell command I’m trying to run:
$ mysql my_database --html -e "select * from limbs" | ./script.py
Here is the python script:
#!/usr/bin/env python
import sys

def hello(variable):
    print variable
How would I accept the variable in the python script and have it print the output?
Answers:
When you pipe the output of one command to a Python script, it arrives on sys.stdin, which you can read from just like a file. Example:

import sys
print(sys.stdin.read())

This program simply echoes its input back to stdout.
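If you want to test that logic without an actual pipe, one option is to wrap the read in a function that accepts any file-like object (the name read_piped_input below is hypothetical, chosen just for this sketch):

```python
import io
import sys

def read_piped_input(stream=None):
    """Return everything from the given file-like object (default: sys.stdin)."""
    stream = sys.stdin if stream is None else stream
    return stream.read()

# Simulate piped input without an actual pipe:
fake_pipe = io.StringIO("col1\tcol2\narm\tleg\n")
data = read_piped_input(fake_pipe)
```

Passing a StringIO here stands in for the pipe; when the script runs behind a real pipe, calling read_piped_input() with no argument reads sys.stdin instead.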
You need to read from stdin to retrieve the data in the Python script, e.g.:

#!/usr/bin/env python
import sys

def hello(variable):
    print(variable)

data = sys.stdin.read()
hello(data)
If all you want to do here is grab some data from a MySQL database and then manipulate it with Python, I would skip piping it into the script and just use a Python MySQL driver to run the query directly.
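For illustration, this is roughly what the query-in-Python pattern looks like. The sketch below uses the standard-library sqlite3 module as a stand-in driver so it is self-contained (an assumption for demonstration; a real MySQL driver exposes the same DB-API shape, only the connect() call differs), and the table contents are invented:

```python
import sqlite3  # stand-in driver; MySQL drivers expose the same DB-API calls

# Hypothetical in-memory table mirroring the question's `limbs` table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE limbs (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO limbs VALUES (?, ?)", [(1, "arm"), (2, "leg")])

# The query result lands directly in a Python variable -- no pipe needed.
rows = conn.execute("SELECT * FROM limbs").fetchall()
conn.close()
```

With this approach the rows arrive as tuples rather than as HTML text you would have to re-parse, which is usually what you want anyway.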
If you want your script to behave like many Unix command-line tools and accept either a pipe or a filename as its first argument, you can use the following:

#!/usr/bin/env python
import sys

# use stdin if something is being piped in
if not sys.stdin.isatty():
    input_stream = sys.stdin
# otherwise, read from the filename given as the first argument
else:
    try:
        input_filename = sys.argv[1]
    except IndexError:
        message = 'need filename as first argument if stdin is not full'
        raise IndexError(message)
    else:
        input_stream = open(input_filename, 'r')

for line in input_stream:
    print(line)  # do something useful with each line
Since this answer pops up at the top of Google when searching for piping data to a Python script, I'd like to add another method, which I found in [D. Beazley's Python Cookbook][1] while searching for a less 'gritty' approach than using sys directly. IMO it is more Pythonic and self-explanatory, even to new users.
import fileinput

with fileinput.input() as f_input:
    for line in f_input:
        print(line, end='')
This approach also works for commands structured like this:
$ ls | ./filein.py # Prints a directory listing to stdout.
$ ./filein.py /etc/passwd # Reads /etc/passwd to stdout.
$ ./filein.py < /etc/passwd # Reads /etc/passwd to stdout.
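The same loop can be exercised with an explicit filename instead of stdin; this small self-contained sketch (the temp-file contents are invented) shows fileinput reading a named file exactly the way the script above does:

```python
import fileinput
import os
import tempfile

# Write a throwaway file so the loop has something to read; with no
# `files` argument, fileinput falls back to stdin instead.
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as tmp:
    tmp.write("first\nsecond\n")

lines = []
with fileinput.input(files=[tmp.name]) as f_input:
    for line in f_input:
        lines.append(line.rstrip("\n"))
os.remove(tmp.name)
```

Because fileinput treats filenames and stdin ('-') uniformly, the loop body does not need to know which one it is reading from.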
If you require a more complex solution, you can combine argparse and fileinput [as shown in this gist by martinth][2]:
import argparse
import fileinput

if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('--dummy', help='dummy argument')
    parser.add_argument('files', metavar='FILE', nargs='*', help='files to read; if empty, stdin is used')
    args = parser.parse_args()

    # Calling fileinput.input() without files would try to process all
    # command-line arguments. Pass '-' as the only file when argparse got
    # no files, which makes fileinput read from stdin.
    for line in fileinput.input(files=args.files if len(args.files) > 0 else ('-',)):
        print(line, end='')
[1]: https://library.oreilly.com/book/0636920027072/python-cookbook-3rd-edition/199.xhtml?ref=toc#_accepting_script_input_via_redirection_pipes_or_input_files
[2]: https://gist.github.com/martinth/ed991fb8cdcac3dfadf7
I stumbled on this while trying to pipe a bash command to a Python script that I did not write (and didn't want to modify to accept sys.stdin). I found that process substitution, mentioned here (https://superuser.com/questions/461946/can-i-use-pipe-output-as-a-shell-script-argument), works fine. Ex.

some_script.py -arg1 <(bash command)
You can use the command-line tool xargs:

echo 'arg1' | xargs python script.py

arg1 is now accessible as sys.argv[1] in script.py.
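Inside script.py the received argument is just an ordinary sys.argv entry; a minimal sketch (the helper name first_arg is hypothetical) of pulling it out safely:

```python
import sys

def first_arg(argv=None):
    """Return the first command-line argument, as xargs would pass it, or None."""
    argv = sys.argv if argv is None else argv
    return argv[1] if len(argv) > 1 else None
```

Note that xargs splits its input on whitespace by default, so a multi-word input would arrive as several arguments rather than one.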
A one-liner that also works on Windows and on Python 3.10.3 is to use sys.stdin.read(), like this:

echo 'Hello!' | python -c "import sys; d = sys.stdin.read(); print('{}\n'.format(d))"