v1.21.0

Chunked detection does not work because generators never have an `__iter__`
attribute; they do have `__next__`.

Example that now works with this commit:

    import requests

    def read_in_chunks(file_object, chunk_size=4096):
        # Yield the file in fixed-size chunks; passing a generator as the
        # request body triggers a chunked (streaming) upload.
        while True:
            data = file_object.read(chunk_size)
            if not data:
                break
            yield data

    file = open(filename, "rb")
    r = requests.post(url, data=read_in_chunks(file))
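
For context, a minimal sketch of the kind of check described above (the helper
name `looks_like_stream` and the exact attribute tests are illustrative
assumptions, not the library's actual internals):

    def looks_like_stream(data):
        # Strings, bytes, and plain dicts are not streamed; anything else
        # that exposes __iter__ or __next__ (e.g. a generator) is treated
        # as a streaming body and sent with chunked transfer encoding.
        if isinstance(data, (str, bytes, dict)):
            return False
        return hasattr(data, "__iter__") or hasattr(data, "__next__")

With the additional `__next__` test, the generator returned by
`read_in_chunks` above is detected as a streaming body.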