AttributeError: 'NoneType' object has no attribute 'failed' #1278

Closed
rbrunhuber opened this issue Oct 24, 2017 · 1 comment


rbrunhuber commented Oct 24, 2017

After fixing bugs #1276 and #1277 I have now hit another roadblock. Here are my environment and the traceback:

Environment:

- Kafka 0.10.0.2.6
- Kerberos 5 version 1.12.2 (MS AD server)
- Python 3.6.1 | Anaconda 4.4.0 (64-bit)
- kafka-python (1.3.6.dev0)
- gssapi (1.2.2)

Traceback (most recent call last):
  File "/home/myname/workspace/python/kafkatest/kafkatest.py", line 23, in <module>
    for message in consumer:
  File "/opt/continuum/anaconda3/lib/python3.6/site-packages/kafka/consumer/group.py", line 1095, in __next__
    return next(self._iterator)
  File "/opt/continuum/anaconda3/lib/python3.6/site-packages/kafka/consumer/group.py", line 1018, in _message_generator
    self._coordinator.ensure_coordinator_ready()
  File "/opt/continuum/anaconda3/lib/python3.6/site-packages/kafka/coordinator/base.py", line 216, in ensure_coordinator_ready
    if future.failed():
AttributeError: 'NoneType' object has no attribute 'failed'
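
Judging from the traceback, the coordinator lookup seems to return nothing when no broker node is available (see the bootstrap failure in the log further below), and the caller then calls .failed() on the result unconditionally. Here is a rough, self-contained sketch of that pattern and of a guard that would avoid the crash (hypothetical names only, not the library's actual code and not necessarily the eventual fix):

# Hypothetical simplification, not kafka/coordinator/base.py itself.
def lookup_coordinator():
    return None  # stand-in: no broker node available, so no future is created

# Failing pattern from the traceback: the result is used unconditionally.
future = lookup_coordinator()
try:
    if future.failed():  # reproduces: 'NoneType' object has no attribute 'failed'
        print(future.exception)
except AttributeError as exc:
    print(exc)

# A guard of this shape avoids the crash (sketch only).
future = lookup_coordinator()
if future is not None and future.failed():
    print(future.exception)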

I'm using Python 3.6.1 from Anaconda and kafka-python 1.3.6.dev0 (commit a345dcd) with the following code:

from kafka import KafkaConsumer
import logging

logging.basicConfig(level=logging.DEBUG)
#logging.getLogger().setLevel(logging.WARN)
#logging.getLogger('BrokerConnection').setLevel(logging.WARN)

consumer = KafkaConsumer('testsos1',
                         bootstrap_servers=['kafka1-lab:6667'],
                         security_protocol='SASL_PLAINTEXT',
                         api_version=(0, 10, 0),
                         group_id='python_client',
                         sasl_mechanism='GSSAPI',
                         auto_offset_reset='earliest')

for message in consumer:
    print(message)

The complete log looks like this:

DEBUG:kafka.metrics.metrics:Added sensor with name connections-closed
DEBUG:kafka.metrics.metrics:Added sensor with name connections-created
DEBUG:kafka.metrics.metrics:Added sensor with name select-time
DEBUG:kafka.metrics.metrics:Added sensor with name io-time
INFO:kafka.client:Bootstrapping cluster metadata from [('kafka1-lab', 6667, <AddressFamily.AF_UNSPEC: 0>)]
DEBUG:kafka.client:Attempting to bootstrap via node at kafka1-lab:6667
DEBUG:kafka.metrics.metrics:Added sensor with name bytes-sent-received
DEBUG:kafka.metrics.metrics:Added sensor with name bytes-sent
DEBUG:kafka.metrics.metrics:Added sensor with name bytes-received
DEBUG:kafka.metrics.metrics:Added sensor with name request-latency
DEBUG:kafka.metrics.metrics:Added sensor with name node-bootstrap.bytes-sent
DEBUG:kafka.metrics.metrics:Added sensor with name node-bootstrap.bytes-received
DEBUG:kafka.metrics.metrics:Added sensor with name node-bootstrap.latency
DEBUG:kafka.conn:<BrokerConnection node_id=bootstrap host=kafka1-lab/kafka1-lab port=6667>: creating new socket
DEBUG:kafka.conn:<BrokerConnection node_id=bootstrap host=kafka1-lab/10.20.30.40 port=6667>: setting socket option (6, 1, 1)
INFO:kafka.conn:<BrokerConnection node_id=bootstrap host=kafka1-lab/10.20.30.40 port=6667>: connecting to 10.20.30.40:6667
DEBUG:kafka.conn:<BrokerConnection node_id=bootstrap host=kafka1-lab/10.20.30.40 port=6667>: established TCP connection
DEBUG:kafka.conn:<BrokerConnection node_id=bootstrap host=kafka1-lab/10.20.30.40 port=6667>: initiating SASL authentication
DEBUG:kafka.protocol.parser:Sending request SaslHandShakeRequest_v0(mechanism='GSSAPI')
DEBUG:kafka.conn:<BrokerConnection node_id=bootstrap host=kafka1-lab/10.20.30.40 port=6667> Request 1: SaslHandShakeRequest_v0(mechanism='GSSAPI')
DEBUG:kafka.protocol.parser:Received correlation id: 1
DEBUG:kafka.protocol.parser:Processing response SaslHandShakeResponse_v0
DEBUG:kafka.conn:<BrokerConnection node_id=bootstrap host=kafka1-lab/10.20.30.40 port=6667> Response 1 (0.6361007690429688 ms): SaslHandShakeResponse_v0(error_code=0, enabled_mechanisms=['GSSAPI'])
DEBUG:kafka.conn:<BrokerConnection node_id=bootstrap host=kafka1-lab/10.20.30.40 port=6667>: GSSAPI name: kafka/sername.domain.com@DOMAIN.COM
INFO:kafka.conn:<BrokerConnection node_id=bootstrap host=kafka1-lab/10.20.30.40 port=6667>: Authenticated as kafka/servername.domain.com@DOMAIN.COM via GSSAPI
DEBUG:kafka.conn:<BrokerConnection node_id=bootstrap host=kafka1-lab/10.20.30.40 port=6667>: Connection complete.
DEBUG:kafka.client:Node bootstrap connected
DEBUG:kafka.protocol.parser:Sending request MetadataRequest_v1(topics=NULL)
DEBUG:kafka.conn:<BrokerConnection node_id=bootstrap host=kafka1-lab/10.20.30.40 port=6667> Request 2: MetadataRequest_v1(topics=NULL)
ERROR:kafka.conn:<BrokerConnection node_id=bootstrap host=kafka1-lab/10.20.30.40 port=6667>: socket disconnected
INFO:kafka.conn:<BrokerConnection node_id=bootstrap host=kafka1-lab/10.20.30.40 port=6667>: Closing connection. ConnectionError: socket disconnected
DEBUG:kafka.conn:<BrokerConnection node_id=bootstrap host=kafka1-lab/10.20.30.40 port=6667>: reconnect backoff 0.04703295241241606 after 1 failures
ERROR:kafka.client:Unable to bootstrap from [('kafka1-lab', 6667, <AddressFamily.AF_UNSPEC: 0>)]
DEBUG:kafka.metrics.metrics:Added sensor with name bytes-fetched
DEBUG:kafka.metrics.metrics:Added sensor with name records-fetched
DEBUG:kafka.metrics.metrics:Added sensor with name fetch-latency
DEBUG:kafka.metrics.metrics:Added sensor with name records-lag
DEBUG:kafka.metrics.metrics:Added sensor with name fetch-throttle-time
DEBUG:kafka.metrics.metrics:Added sensor with name heartbeat-latency
DEBUG:kafka.metrics.metrics:Added sensor with name join-latency
DEBUG:kafka.metrics.metrics:Added sensor with name sync-latency
DEBUG:kafka.metrics.metrics:Added sensor with name commit-latency
INFO:kafka.consumer.subscription_state:Updating subscribed topics to: ('testsos1',)
DEBUG:kafka.client:Give up sending metadata request since no node is available
Traceback (most recent call last):
  File "/home/ny30180a/workspace/python/kafkatest/kafkatest.py", line 23, in <module>
    for message in consumer:
  File "/opt/continuum/anaconda3/lib/python3.6/site-packages/kafka/consumer/group.py", line 1095, in __next__
    return next(self._iterator)
  File "/opt/continuum/anaconda3/lib/python3.6/site-packages/kafka/consumer/group.py", line 1018, in _message_generator
    self._coordinator.ensure_coordinator_ready()
  File "/opt/continuum/anaconda3/lib/python3.6/site-packages/kafka/coordinator/base.py", line 216, in ensure_coordinator_ready
    if future.failed():
AttributeError: 'NoneType' object has no attribute 'failed'
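
The log shows that SASL/GSSAPI authentication succeeds, but the connection is dropped right after the first MetadataRequest_v1 is sent, so bootstrapping never completes and no node is left for the coordinator lookup. To confirm the metadata step in isolation I can run a small probe with the same connection settings (hypothetical snippet, not part of the failing script):

from kafka import KafkaConsumer
from kafka.errors import KafkaError

try:
    probe = KafkaConsumer(bootstrap_servers=['kafka1-lab:6667'],
                          security_protocol='SASL_PLAINTEXT',
                          sasl_mechanism='GSSAPI',
                          api_version=(0, 10, 0))
    print(probe.topics())  # forces a metadata request against the cluster
except KafkaError as exc:
    print('bootstrap/metadata failed:', exc)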

Can anybody please help with debugging/fixing this issue?

dpkp (Owner) commented Oct 24, 2017

Thanks for the bug report -- fix is in PR 1279
