What is the idiomatic way to iterate over a binary file?

Question:

With a text file, I can write this:

with open(path, 'r') as file:
    for line in file:
        # handle the line

This is equivalent to:

with open(path, 'r') as file:
    for line in iter(file.readline, ''):
        # handle the line

This idiom is documented in PEP 234, but I have failed to locate a similar idiom for binary files.

With a binary file, I can write this:

with open(path, 'rb') as file:
    while True:
        chunk = file.read(1024 * 64)
        if not chunk:
            break
        # handle the chunk

I have tried the same idiom as with a text file:

def make_read(file, size):
    def read():
        return file.read(size)
    return read

with open(path, 'rb') as file:
    for chunk in iter(make_read(file, 1024 * 64), b''):
        # handle the chunk

Is this the idiomatic way to iterate over a binary file in Python?

Asked By: dawg


Answers:

Try:

chunk_size = 4 * 1024 * 1024  # 4 MB

with open('large_file.dat','rb') as f:
    for chunk in iter(lambda: f.read(chunk_size), b''):
        handle(chunk)

iter needs a zero-argument callable.

  • a plain f.read would read the whole file, since the size parameter is missing;
  • f.read(1024) means calling the function and passing its return value (the data read from the file) to iter, so iter never receives a function at all;
  • (lambda: f.read(1234)) is a function that takes zero arguments (nothing between lambda and :) and calls f.read(1234).
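To see the sentinel behavior concretely, here is a small sketch using an in-memory io.BytesIO stream in place of a real file (the 10-byte payload and 4-byte chunk size are arbitrary choices for illustration):

```python
import io

# Simulate a 10-byte binary file in memory.
f = io.BytesIO(b'0123456789')

# iter() calls the lambda repeatedly; iteration stops as soon as
# the callable returns the sentinel b'' (i.e. end of file).
chunks = list(iter(lambda: f.read(4), b''))
print(chunks)  # [b'0123', b'4567', b'89']
```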
Answered By: liori

I don’t know of any built-in way to do this, but a wrapper function is easy enough to write:

def read_in_chunks(infile, chunk_size=1024*64):
    while True:
        chunk = infile.read(chunk_size)
        if chunk:
            yield chunk
        else:
            # The chunk was empty, which means we're at the end
            # of the file
            return

Then at the interactive prompt:

>>> from chunks import read_in_chunks
>>> infile = open('quicklisp.lisp', 'rb')
>>> for chunk in read_in_chunks(infile):
...     print(chunk)
... 
<contents of quicklisp.lisp in chunks>

Of course, you can easily adapt this to use a with block:

with open('quicklisp.lisp', 'rb') as infile:
    for chunk in read_in_chunks(infile):
        print(chunk)

And you can eliminate the if statement like this:

def read_in_chunks(infile, chunk_size=1024*64):
    chunk = infile.read(chunk_size)
    while chunk:
        yield chunk
        chunk = infile.read(chunk_size)
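The generator can be exercised without a file on disk by substituting an io.BytesIO stream; the 4-byte chunk size here is chosen small just to make the splitting visible:

```python
import io

def read_in_chunks(infile, chunk_size=1024*64):
    chunk = infile.read(chunk_size)
    while chunk:
        yield chunk
        chunk = infile.read(chunk_size)

# In-memory stand-in for a binary file.
data = io.BytesIO(b'abcdefghij')
print(list(read_in_chunks(data, chunk_size=4)))  # [b'abcd', b'efgh', b'ij']
```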
Answered By: Jason Baker

The Pythonic way to read a binary file iteratively is using the built-in function iter with two arguments and the standard function functools.partial, as described in the Python library documentation:

iter(object[, sentinel])

Return an iterator object. The first argument is interpreted very differently depending on the presence of the second argument. Without a second argument, object must be a collection object which supports the iteration protocol (the __iter__() method), or it must support the sequence protocol (the __getitem__() method with integer arguments starting at 0). If it does not support either of those protocols, TypeError is raised. If the second argument, sentinel, is given, then object must be a callable object. The iterator created in this case will call object with no arguments for each call to its __next__() method; if the value returned is equal to sentinel, StopIteration will be raised, otherwise the value will be returned.

See also Iterator Types.

One useful application of the second form of iter() is to build a block-reader. For example, reading fixed-width blocks from a binary database file until the end of file is reached:

from functools import partial

with open('mydata.db', 'rb') as f:
    for block in iter(partial(f.read, 64), b''):
        process_block(block)
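The same pattern can be checked against an in-memory stream: functools.partial(f.read, 8) builds the zero-argument callable that two-argument iter() requires (the 8-byte block size is arbitrary):

```python
import io
from functools import partial

f = io.BytesIO(b'0123456789abcdef')

# partial(f.read, 8) behaves like lambda: f.read(8);
# iter() stops when it returns the sentinel b''.
blocks = list(iter(partial(f.read, 8), b''))
print(blocks)  # [b'01234567', b'89abcdef']
```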
Answered By: Maggyero

Nearly 10 years after this question was asked, Python 3.8 introduced the := walrus operator, described in PEP 572.

To read a file in chunks idiomatically and expressively (with Python 3.8 or later) you can do:

# Loop until read() returns b'', which is falsy at end of file.
while chunk := file.read(1024 * 64):
    process(chunk)
Answered By: dawg

In Python 3.8+, there is a new assignment expression := – known as the "walrus operator" – that assigns a value to a variable as part of a larger expression. See PEP 572 for more details. Thus, to read a file in chunks, you could do:

def read_in_chunks(file_path, chunk_size=1024):
    with open(file_path, 'rb') as f:
        while chunk := f.read(chunk_size):
            yield chunk  # or process the chunk as desired
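A quick check of this generator against a real file on disk; tempfile is used here only to make the example self-contained, and the 3-byte chunk size is arbitrary:

```python
import os
import tempfile

def read_in_chunks(file_path, chunk_size=1024):
    with open(file_path, 'rb') as f:
        while chunk := f.read(chunk_size):
            yield chunk

# Write a small binary file, then read it back in 3-byte chunks.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(b'\x00\x01\x02\x03\x04\x05\x06')
    path = tmp.name

print(list(read_in_chunks(path, chunk_size=3)))
# [b'\x00\x01\x02', b'\x03\x04\x05', b'\x06']

os.remove(path)
```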
Answered By: Chris