Python HTTP server that supports chunked encoding?

Question:

I’m looking for a well-supported multithreaded Python HTTP server that supports chunked-encoding replies (i.e. “Transfer-Encoding: chunked” on responses). What’s the best HTTP server base to start with for this purpose?
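For reference, the chunked wire format itself is simple: each chunk is its size in hexadecimal, CRLF, the data, CRLF, and the body ends with a zero-length chunk. A minimal encoder sketch (the helper names `encode_chunk`/`encode_body` are just for illustration, not from any library):

```python
def encode_chunk(data: bytes) -> bytes:
    """Frame one chunk: hex length, CRLF, payload, CRLF."""
    return b"%X\r\n%s\r\n" % (len(data), data)


def encode_body(chunks) -> bytes:
    """Encode an iterable of byte strings as a chunked body,
    ending with the zero-length terminator chunk."""
    return b"".join(encode_chunk(c) for c in chunks if c) + b"0\r\n\r\n"
```

For example, `encode_body([b"hello", b"world"])` yields `b"5\r\nhello\r\n5\r\nworld\r\n0\r\n\r\n"`.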

Asked By: slacy


Answers:

I am pretty sure that WSGI-compliant servers should support that. Essentially, WSGI applications return iterable chunks, which the web server writes out as they are produced. I don’t have first-hand experience with this, but here is a list of compliant servers.

I should think it would be fairly easy to roll your own, though, if WSGI servers don’t meet what you are looking for, using Python’s built-in CGIHTTPServer. It is already multithreaded, so it would just be up to you to chunk the responses.
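To illustrate the WSGI side of this: an application can simply yield its body piece by piece. Whether the server then actually uses chunked transfer encoding is up to the server (it needs HTTP/1.1 and no Content-Length header), but the application itself is just an iterable — a sketch:

```python
def app(environ, start_response):
    # No Content-Length header is set, so an HTTP/1.1 server is free
    # to deliver the iterable below using chunked transfer encoding.
    start_response('200 OK', [('Content-Type', 'text/plain')])
    for i in range(3):
        yield ('part %d\n' % i).encode('utf-8')
```

The server calls the application, iterates the result, and sends each yielded byte string as it arrives.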

Answered By: Shane C. Mason

Twisted supports chunked transfer encoding (API link) (see also the API doc for HTTPChannel). There are a number of production-grade projects using Twisted (for example, Apple uses it for the iCalendar server in Mac OS X Server), so it’s quite well supported and very robust.

Answered By: Jarret Hardie

Twisted supports chunked transfer encoding and does so transparently: if your request handler does not specify a response length, Twisted automatically switches to chunked transfer and generates one chunk per call to Request.write.

Answered By: mathieu

I managed to do it using Tornado:

#!/usr/bin/env python

import logging

import tornado.httpserver
import tornado.ioloop
import tornado.options
import tornado.web

from tornado.options import define, options

define("port", default=8080, help="run on the given port", type=int)

@tornado.web.stream_request_body
class MainHandler(tornado.web.RequestHandler):
    def post(self):
        # Called once the whole request body has been streamed in.
        pass

    def data_received(self, chunk):
        # Echo each request-body chunk back; flush() sends it
        # immediately, so the response goes out with
        # Transfer-Encoding: chunked.
        self.write(chunk)
        self.flush()

        logging.info(chunk)

def main():
    tornado.options.parse_command_line()

    application = tornado.web.Application([
        (r"/", MainHandler),
    ])

    http_server = tornado.httpserver.HTTPServer(application)
    http_server.listen(options.port)

    tornado.ioloop.IOLoop.current().start()

if __name__ == "__main__":
    main()
Answered By: Iulian Onofrei

You can implement a simple chunked server using Python’s HTTPServer, by adding this to your serve function:

    def _serve(self, res):
        # res is an iterator yielding string chunks
        self.send_response(200)
        self.send_header('Content-Type', 'application/json')
        self.send_header('Transfer-Encoding', 'chunked')
        self.end_headers()

        try:
            while True:
                chunk = next(res).encode('utf-8')
                # Each chunk: hex length, CRLF, data, CRLF
                self.wfile.write('{:X}\r\n'.format(len(chunk)).encode('utf-8'))
                self.wfile.write(chunk + b'\r\n')
        except StopIteration:
            pass

        # A zero-length chunk terminates the body
        self.wfile.write(b'0\r\n\r\n')

I would not recommend it for production use.

Answered By: Orwellophile

The script below is a full working example. It could be used as a CGI script to stream data under Apache or IIS:

#!/usr/bin/env pythonw
import sys
import os
import time

# Minimal length of response to avoid its buffering by IIS+FastCGI.
# This value was found by testing this script from a browser and
# ensuring that every event is received separately and in full
response_padding = 284


def send_chunk(r):
    # Binary write into stdout
    os.write(1, "{:X}\r\n{}\r\n".format(len(r), r).encode('utf-8'))


class Unbuffered(object):
    """
    Stream wrapper to disable output buffering
    To be used in the CGI scripts
    https://stackoverflow.com/a/107717/9921853
    """
    def __init__(self, stream):
        self.stream = stream

    def write(self, data):
        self.stream.write(data)
        self.stream.flush()

    def writelines(self, lines):
        self.stream.writelines(lines)
        self.stream.flush()

    def __getattr__(self, attr):
        return getattr(self.stream, attr)


# Ensure stdout is unbuffered to avoid problems serving this CGI script on IIS
# Also web.config should have responseBufferLimit="0" applied to the CgiModule handler
sys.stdout = Unbuffered(sys.stdout)
print(
    "Transfer-Encoding: chunked\n"
    "Content-Type: text/event-stream; charset=utf-8\n"
)

# Fix an issue where IIS provides a wrong file descriptor for stdin if no data is passed in the POST request
sys.stdin = sys.stdin or open(os.devnull, 'r')

progress = 0

send_chunk(
    (
        "event: started\n"
        f"data: {progress}"
    ).ljust(response_padding) + "\n\n"
)

while progress < 5:
    time.sleep(2)
    progress += 1

    send_chunk(
        (
            "event: progress\n"
            f"data: {progress}"
        ).ljust(response_padding) + "\n\n"
    )

time.sleep(2)

send_chunk(
    (
        "event: completed\n"
        f"data: {progress+1}"
    ).ljust(response_padding) + "\n\n"
)

# To close stream
send_chunk('')

########################################################
# All Done
Answered By: Sergey Nudnov