
In 1.4.1, unsupported codec in consumer no longer gives clear error #1443

Closed · @ekimekim

Description

In 1.3.5, attempting to consume a message encoded with an unavailable codec (in my case, lz4) produced a clear error message: `LZ4 decompression unsupported`. In 1.4.1, however, I get a far more confusing error:

  File "/usr/lib/python2.7/site-packages/kafka_python-1.4.1-py2.7.egg/kafka/consumer/group.py", line 603, in poll
    records = self._poll_once(remaining, max_records)
  File "/usr/lib/python2.7/site-packages/kafka_python-1.4.1-py2.7.egg/kafka/consumer/group.py", line 652, in _poll_once
    records, _ = self._fetcher.fetched_records(max_records)
  File "/usr/lib/python2.7/site-packages/kafka_python-1.4.1-py2.7.egg/kafka/consumer/fetcher.py", line 337, in fetched_records
    self._next_partition_records = self._parse_fetched_data(completion)
  File "/usr/lib/python2.7/site-packages/kafka_python-1.4.1-py2.7.egg/kafka/consumer/fetcher.py", line 788, in _parse_fetched_data
    unpacked = list(self._unpack_message_set(tp, records))
  File "/usr/lib/python2.7/site-packages/kafka_python-1.4.1-py2.7.egg/kafka/consumer/fetcher.py", line 455, in _unpack_message_set
    for record in batch:
  File "/usr/lib/python2.7/site-packages/kafka_python-1.4.1-py2.7.egg/kafka/record/default_records.py", line 249, in __iter__
    self._maybe_uncompress()
  File "/usr/lib/python2.7/site-packages/kafka_python-1.4.1-py2.7.egg/kafka/record/default_records.py", line 165, in _maybe_uncompress
    uncompressed = lz4_decode(data.tobytes())
TypeError: 'NoneType' object is not callable

I haven't looked too closely, but it appears the new `kafka.record` code path doesn't check `has_lz4()` before calling `lz4_decode`, causing the above.
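For illustration, here is a minimal self-contained sketch of the failure mode and the kind of guard being suggested. It is not the actual kafka-python code: the stub below only mimics `kafka.codec`, where `lz4_decode` is `None` when the optional lz4 package is not installed, so calling it unguarded raises the confusing `TypeError` above.

```python
# Stand-in for kafka.codec.lz4_decode when the lz4 package is missing.
lz4_decode = None

def has_lz4():
    # Mirrors the idea of kafka.codec.has_lz4(): True only when a
    # working decoder is actually available.
    return lz4_decode is not None

def maybe_uncompress(data):
    # Guarded version: check codec availability first and raise a clear,
    # actionable error instead of letting None be called.
    if not has_lz4():
        raise RuntimeError("LZ4 decompression unsupported (lz4 not installed)")
    return lz4_decode(data)

try:
    maybe_uncompress(b"\x00")
except RuntimeError as e:
    print(e)  # clear message, not "'NoneType' object is not callable"
```

Without the `has_lz4()` check, `maybe_uncompress` would execute `None(data)` and produce exactly the `TypeError: 'NoneType' object is not callable` shown in the traceback.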
