
kafka-python | Python client for Apache Kafka | Pub Sub library

 by   dpkp Python Version: 2.0.2 License: Apache-2.0


kandi X-RAY | kafka-python Summary

kafka-python is a Python library typically used in Messaging, Pub Sub, and Kafka applications. It has no reported bugs or vulnerabilities, a build file is available, it carries a permissive license, and it has high support. You can install it with 'pip install kafka-python' or download it from GitHub or PyPI.
Python client for Apache Kafka
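As a quick orientation, a minimal producer sketch might look like the following. The broker address, topic name, and payload are assumptions for illustration; only the serializer lambda runs without a broker, so the connection lines are left commented out.

```python
import json

# A JSON value serializer of the kind commonly passed to KafkaProducer
serialize = lambda v: json.dumps(v).encode("utf-8")

# Assumes a broker at localhost:9092 and a topic named "my-topic":
# from kafka import KafkaProducer
# producer = KafkaProducer(bootstrap_servers="localhost:9092",
#                          value_serializer=serialize)
# producer.send("my-topic", {"id": 1})

print(serialize({"id": 1}))  # b'{"id": 1}'
```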

Support

  • kafka-python has a highly active ecosystem.
  • It has 4490 stars, 1183 forks, and 149 watchers.
  • It had no major release in the last 12 months.
  • There are 180 open issues and 1158 closed issues; on average, issues are closed in 75 days. There are 28 open pull requests and 0 closed ones.
  • It has a negative sentiment in the developer community.
  • The latest version of kafka-python is 2.0.2.

Quality

  • kafka-python has 0 bugs and 0 code smells.

Security

  • kafka-python has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
  • kafka-python code analysis shows 0 unresolved vulnerabilities.
  • There are 0 security hotspots that need review.

License

  • kafka-python is licensed under the Apache-2.0 License. This license is Permissive.
  • Permissive licenses have the least restrictions, and you can use them in most projects.

Reuse

  • kafka-python releases are available to install and integrate.
  • Deployable package is available in PyPI.
  • Build file is available. You can build the component from source.
  • kafka-python saves you 10196 person hours of effort in developing the same functionality from scratch.
  • It has 20744 lines of code, 1500 functions and 140 files.
  • It has medium code complexity. Code complexity directly impacts maintainability of the code.
Top functions reviewed by kandi - BETA

kandi has reviewed kafka-python and surfaced the functions below as its top functions. This is intended to give you an instant insight into the functionality kafka-python implements and to help you decide whether it suits your requirements.

  • Return the set of nodes that are ready to send
  • Return the leader for a given partition
  • Return the number of pending waiters
  • Return True if all items in the queue have been consumed
  • Append a record to the accumulator
  • Increment the counter
  • Decrement the counter
  • Allocate a buffer from the pool
  • Handle a heartbeat response
  • Start the Kafka consumer
  • Get the committed offset of a given partition
  • Process a produce response
  • Commit offsets to Kafka
  • Return a list of consumer group offsets
  • Alter configs of given config resources
  • Handle a group coordinator response
  • Perform assignment on given members
  • Parse command-line arguments
  • Create new topics
  • Encode a varint
  • Handle a JoinGroupResponse
  • Check the version of the broker
  • Start the consumer
  • Populate the sorted set of topics
  • Invoked when a member assignment is received
  • Subscribe the consumer
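One of the helpers listed above, varint encoding, is simple enough to sketch. This is a generic base-128 varint encoder for non-negative integers, not necessarily byte-for-byte what kafka-python ships (the Kafka record format also zigzag-encodes signed values before varint-encoding them).

```python
def encode_varint(value: int) -> bytes:
    """Encode a non-negative int as a base-128 varint, least-significant group first."""
    if value < 0:
        raise ValueError("this sketch only covers non-negative values")
    out = bytearray()
    while True:
        byte = value & 0x7F          # take the low 7 bits
        value >>= 7
        if value:
            out.append(byte | 0x80)  # continuation bit: more groups follow
        else:
            out.append(byte)
            return bytes(out)

print(encode_varint(300))  # b'\xac\x02'
```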


kafka-python Key Features

Python client for Apache Kafka

kafka-python Examples and Code Snippets

Set consumer offset

    from kafka import KafkaConsumer, TopicPartition

    def demoListPageMessages(topicName):
        consumer = KafkaConsumer(bootstrap_servers="localhost:9092",
                                 auto_offset_reset='earliest',
                                 consumer_timeout_ms=1000)
        tp = TopicPartition(topicName, 0)
        consumer.assign([tp])
        consumer.seek_to_beginning()  # rewind all assigned partitions
        consumer.seek(tp, 5)          # then position this partition at offset 5

        for msg in consumer:
            print(msg.value)

get results from kafka for a specific period of time

    partitions = consumer.assignment()

    month_ago_timestamp = int(month_ago.timestamp() * 1000)
    partition_to_timestamp = {part: month_ago_timestamp for part in partitions}
    mapping = consumer.offsets_for_times(partition_to_timestamp)

    # offsets_for_times returns {TopicPartition: OffsetAndTimestamp};
    # seek each partition to the first offset at or after the timestamp
    for partition, offset_and_timestamp in mapping.items():
        consumer.seek(partition, offset_and_timestamp[0])

    topic = 'some_topic_name'
    consumer = KafkaConsumer(bootstrap_servers=PROD_KAFKA_SERVER,
                             security_protocol=PROTOCOL,
                             group_id=GROUP_ID,
                             sasl_mechanism=SASL_MECHANISM, sasl_plain_username=SASL_USERNAME,
                             sasl_plain_password=SASL_PASSWORD)

    month_ago = (datetime.now() - relativedelta(months=1)).timestamp()
    topic_partition = TopicPartition(topic, 0)
    assigned_topic = [topic_partition]
    consumer.assign(assigned_topic)

    partitions = consumer.assignment()
    partition_to_timestamp = {part: int(month_ago * 1000) for part in partitions}
    end_offsets = consumer.end_offsets(list(partition_to_timestamp.keys()))

    mapping = consumer.offsets_for_times(partition_to_timestamp)
    for partition, ts in mapping.items():
        end_offset = end_offsets.get(partition)
        consumer.seek(partition, ts[0])
        for msg in consumer:
            value = json.loads(msg.value.decode('utf-8'))
            # do something
            if msg.offset == end_offset - 1:
                consumer.close()
                break
                                      
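The timestamp arithmetic is the only part of the snippet above that runs without a broker. A stdlib-only sketch of it (approximating a month as 30 days, since relativedelta comes from the third-party dateutil package) looks like this:

```python
from datetime import datetime, timedelta, timezone

def month_ago_ms(now: datetime) -> int:
    """Epoch milliseconds roughly one month (30 days) before `now`,
    in the unit that offsets_for_times() expects."""
    return int((now - timedelta(days=30)).timestamp() * 1000)

ref = datetime(2022, 1, 31, tzinfo=timezone.utc)
print(month_ago_ms(ref))  # 1640995200000 (2022-01-01 00:00:00 UTC)
```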

How to consume messages in last N days using confluent-kafka-python?

    topicparts = [TopicPartition(topic_name, i) for i in range(0, 8)]

    whents = datetime.fromisoformat("2022-01-01T12:34:56.000")
    whenms = int(whents.timestamp() * 1000)   # epoch milliseconds

    topicparts = [TopicPartition(topic_name, i, whenms) for i in range(0, 8)]


How do I get the offset of the last message of a Kafka topic using confluent-kafka-python?

    from confluent_kafka import Consumer, TopicPartition

    # create the Consumer with a connection to your brokers

    topic_name = "my.topic"

    topicparts = [TopicPartition(topic_name, i) for i in range(0, 8)]

    # get_watermark_offsets takes a single TopicPartition
    # and returns a (low, high) tuple for that partition
    for tp in topicparts:
        low, high = consumer.get_watermark_offsets(tp)
        print("partition {p} starting offset {so} last offset {lo}".format(
            p=tp.partition, so=low, lo=high))

get result from concurrent.futures which runs a kafka consumer in a python ThreadPoolExecutor

    consumer = KafkaConsumer(CONSUMER_TOPIC, group_id='ME2',
                             bootstrap_servers=[f"{KAFKA_SERVER_HOST}:{KAFKA_SERVER_PORT}"],
                             value_deserializer=lambda x: json.loads(x.decode('utf-8')),
                             enable_auto_commit=True,
                             auto_offset_reset='latest',
                             max_poll_records=1,
                             max_poll_interval_ms=300000,
                             consumer_timeout_ms=300000)
                                      
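The snippet above only shows the consumer construction; the futures side of the question, collecting results from a ThreadPoolExecutor, can be sketched independently of Kafka. Here consume() is a hypothetical stand-in for a poll loop that returns a summary value.

```python
from concurrent.futures import ThreadPoolExecutor

def consume(partition: int) -> int:
    # Hypothetical stand-in for a poll loop; a real worker would poll a
    # KafkaConsumer and return e.g. a count of processed messages.
    return partition * 2

with ThreadPoolExecutor(max_workers=2) as pool:
    futures = [pool.submit(consume, i) for i in range(3)]
    results = [f.result() for f in futures]  # .result() blocks until each is done

print(results)  # [0, 2, 4]
```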

TypeError: partitions must be TopicPartition namedtuples

    from kafka.structs import TopicPartition


    ...
    consumer.seek_to_beginning(TopicPartition(topic_name, 0))

    # seek_to_beginning(*partitions) takes TopicPartition arguments,
    # so unpack the list rather than passing it directly
    tps = [TopicPartition(topic_name, i) for i in range(32)]
    consumer.seek_to_beginning(*tps)

Kafka-python KafkaProducer __init__ takes 1 positional argument but 2 were given

    connection = KafkaProducer(bootstrap_servers=kafka_settings['bootstrap_servers'])
    future = connection.send(kafka_settings['topic'], b'your_message_here')

How to decorate a Python process in order to capture HTTP requests and responses?

    # http_profiler/connection_decorator.py
    import logging
    import sys
    from http.client import HTTPConnection

    logger = logging.getLogger(__name__)
    logger.setLevel(logging.DEBUG)
    handler = logging.StreamHandler(stream=sys.stdout)
    formatter = logging.Formatter(fmt="%(name)s %(funcName)s %(levelname)s: %(message)s")
    handler.setFormatter(formatter)
    logger.addHandler(handler)

    put_request_content = []
    get_response_content = []
    request_bodies = []


    def decorate_HTTPConnection():
        """Taken loosely from https://github.com/getsentry/sentry-python/blob/master/sentry_sdk/integrations/stdlib.py"""

        global put_request_content, get_response_content, request_bodies

        real_putrequest = HTTPConnection.putrequest
        real_getresponse = HTTPConnection.getresponse
        real__send_output = HTTPConnection._send_output

        def new_putrequest(self, method, url, skip_host=False, skip_accept_encoding=False):
            logger.info(f'{method}: {url}')
            put_request_content.append((method, url))

            real_putrequest(self, method, url, skip_host=skip_host, skip_accept_encoding=skip_accept_encoding)

        def new_getresponse(self):
            returned_response = real_getresponse(self)

            logger.info(returned_response)
            get_response_content.append(returned_response)

            return returned_response

        def new__send_output(self, message_body=None, encode_chunked=False):
            logger.info(f'Message body: {message_body}')
            request_bodies.append(message_body)

            real__send_output(self, message_body=message_body, encode_chunked=encode_chunked)

        HTTPConnection.putrequest = new_putrequest
        HTTPConnection.getresponse = new_getresponse
        HTTPConnection._send_output = new__send_output


    decorate_HTTPConnection()


    # test module, which imports the decorator module above
    import logging
    import sys
    import requests

    from http_profiler.connection_decorator import put_request_content, get_response_content, request_bodies

    logger = logging.getLogger(__name__)
    logger.setLevel(logging.DEBUG)
    handler = logging.StreamHandler(stream=sys.stdout)
    formatter = logging.Formatter(fmt="%(name)s %(funcName)s %(levelname)s: %(message)s")
    handler.setFormatter(formatter)
    logger.addHandler(handler)


    def test_profile_http_get_via_requests_library(url):
        prev_len_put_request_content = len(put_request_content)
        prev_len_get_repsonse_content = len(get_response_content)
        prev_len_request_bodies = len(request_bodies)

        logger.info(f"Starting the test: GET {url}")
        resp = requests.get(url=url)

        assert resp is not None
        assert len(put_request_content) - prev_len_put_request_content == 1
        assert len(get_response_content) - prev_len_get_repsonse_content == 1
        assert len(request_bodies) - prev_len_request_bodies == 1


    def test_profile_http_post_via_requests_library(url, data=None):
        if data is None:
            data = {"message": "Hello world!"}

        prev_len_put_request_content = len(put_request_content)
        prev_len_get_repsonse_content = len(get_response_content)
        prev_len_request_bodies = len(request_bodies)

        logger.info(f"Starting the test: POST {url} with {data}")
        resp = requests.post(url=url, data=data)

        assert resp is not None
        assert len(put_request_content) - prev_len_put_request_content == 1
        assert len(get_response_content) - prev_len_get_repsonse_content == 1
        assert len(request_bodies) - prev_len_request_bodies == 1


    if __name__ == "__main__":
        test_profile_http_get_via_requests_library("https://example.com")
        test_profile_http_post_via_requests_library("https://example.com")
        logger.info(f'Requests: {put_request_content}')
        logger.info(f'Request bodies: {request_bodies}')
        logger.info(f'Responses: {[f"{response.status} {response.reason}" for response in get_response_content]}')


    # output:
    __main__ test_profile_http_get_via_requests_library INFO: Starting the test: GET https://example.com
    http_profiler.connection_decorator new_putrequest INFO: GET: /
    http_profiler.connection_decorator new__send_output INFO: Message body: None
    http_profiler.connection_decorator new_getresponse INFO: <http.client.HTTPResponse object at 0x7ff40aa5df10>
    __main__ test_profile_http_post_via_requests_library INFO: Starting the test: POST https://example.com with {'message': 'Hello world!'}
    http_profiler.connection_decorator new_putrequest INFO: POST: /
    http_profiler.connection_decorator new__send_output INFO: Message body: b'message=Hello+world%21'
    http_profiler.connection_decorator new_getresponse INFO: <http.client.HTTPResponse object at 0x7ff40aa5deb0>
    __main__ <module> INFO: Requests: [('GET', '/'), ('POST', '/')]
    __main__ <module> INFO: Request bodies: [None, b'message=Hello+world%21']
    __main__ <module> INFO: Responses: ['200 OK', '200 OK']
                                      
                                      import logging
                                      import sys
                                      from http.client import HTTPConnection
                                      
                                      logger = logging.getLogger(__name__)
                                      logger.setLevel(logging.DEBUG)
                                      handler = logging.StreamHandler(stream=sys.stdout)
                                      formatter = logging.Formatter(fmt="%(name)s %(funcName)s %(levelname)s: %(message)s")
                                      handler.setFormatter(formatter)
                                      logger.addHandler(handler)
                                      
                                      put_request_content = []
                                      get_response_content = []
                                      request_bodies = []
                                      
                                      
                                      def decorate_HTTPConnection():
                                          """Taken loosely from https://github.com/getsentry/sentry-python/blob/master/sentry_sdk/integrations/stdlib.py"""
                                      
                                          global put_request_content, get_response_content, request_bodies
                                      
                                          real_putrequest = HTTPConnection.putrequest
                                          real_getresponse = HTTPConnection.getresponse
                                          real__send_output = HTTPConnection._send_output
                                      
                                          def new_putrequest(self, method, url, skip_host=False, skip_accept_encoding=False):
                                              logger.info(f'{method}: {url}')
                                              put_request_content.append((method, url))
                                      
                                              real_putrequest(self, method, url, skip_host=skip_host, skip_accept_encoding=skip_accept_encoding)
                                      
                                          def new_getresponse(self):
                                              returned_response = real_getresponse(self)
                                      
                                              logger.info(returned_response)
                                              get_response_content.append(returned_response)
                                      
                                              return returned_response
                                      
                                          def new__send_output(self, message_body=None, encode_chunked=False):
                                              logger.info(f'Message body: {message_body}')
                                              request_bodies.append(message_body)
                                      
                                              real__send_output(self, message_body=message_body, encode_chunked=encode_chunked)
                                      
                                          HTTPConnection.putrequest = new_putrequest
                                          HTTPConnection.getresponse = new_getresponse
                                          HTTPConnection._send_output = new__send_output
                                      
                                      
                                      decorate_HTTPConnection()
                                      
                                      
                                      import logging
                                      import sys
                                      import requests
                                      
                                      from http_profiler.connection_decorator import put_request_content, get_response_content, request_bodies
                                      
                                      logger = logging.getLogger(__name__)
                                      logger.setLevel(logging.DEBUG)
                                      handler = logging.StreamHandler(stream=sys.stdout)
                                      formatter = logging.Formatter(fmt="%(name)s %(funcName)s %(levelname)s: %(message)s")
                                      handler.setFormatter(formatter)
                                      logger.addHandler(handler)
                                      
                                      
                                      def test_profile_http_get_via_requests_library(url):
                                          prev_len_put_request_content = len(put_request_content)
                                          prev_len_get_repsonse_content = len(get_response_content)
                                          prev_len_request_bodies = len(request_bodies)
                                      
                                          logger.info(f"Starting the test: GET {url}")
                                          resp = requests.get(url=url)
                                      
                                          assert resp is not None
                                          assert len(put_request_content) - prev_len_put_request_content == 1
                                          assert len(get_response_content) - prev_len_get_repsonse_content == 1
                                          assert len(request_bodies) - prev_len_request_bodies == 1
                                      
                                      
                                      def test_profile_http_post_via_requests_library(url, data=None):
                                          if data is None:
                                              data = {"message": "Hello world!"}
                                      
                                          prev_len_put_request_content = len(put_request_content)
                                          prev_len_get_repsonse_content = len(get_response_content)
                                          prev_len_request_bodies = len(request_bodies)
                                      
                                          logger.info(f"Starting the test: POST {url} with {data}")
                                          resp = requests.post(url=url, data=data)
                                      
                                          assert resp is not None
                                          assert len(put_request_content) - prev_len_put_request_content == 1
                                          assert len(get_response_content) - prev_len_get_repsonse_content == 1
                                          assert len(request_bodies) - prev_len_request_bodies == 1
                                      
                                      
                                      if __name__ == "__main__":
                                          test_profile_http_get_via_requests_library("https://example.com")
                                          test_profile_http_post_via_requests_library("https://example.com")
                                          logger.info(f'Requests: {put_request_content}')
                                          logger.info(f'Request bodies: {request_bodies}')
                                          logger.info(f'Responses: {[f"{response.status} {response.reason}" for response in get_response_content]}')
                                      
                                      
                                      __main__ test_profile_http_get_via_requests_library INFO: Starting the test: GET https://example.com
                                      http_profiler.connection_decorator new_putrequest INFO: GET: /
                                      http_profiler.connection_decorator new__send_output INFO: Message body: None
                                      http_profiler.connection_decorator new_getresponse INFO: <http.client.HTTPResponse object at 0x7ff40aa5df10>
                                      __main__ test_profile_http_post_via_requests_library INFO: Starting the test: POST https://example.com with {'message': 'Hello world!'}
                                      http_profiler.connection_decorator new_putrequest INFO: POST: /
                                      http_profiler.connection_decorator new__send_output INFO: Message body: b'message=Hello+world%21'
                                      http_profiler.connection_decorator new_getresponse INFO: <http.client.HTTPResponse object at 0x7ff40aa5deb0>
                                      __main__ <module> INFO: Requests: [('GET', '/'), ('POST', '/')]
                                      __main__ <module> INFO: Request bodies: [None, b'message=Hello+world%21']
                                      __main__ <module> INFO: Responses: ['200 OK', '200 OK']
                                      
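The patching trick above can be exercised without any network traffic, because `putrequest` only buffers the request line until the connection is flushed. Here is a minimal, self-contained sketch of the same idea (names are hypothetical; unlike the profiler module, it also restores the original method afterwards):

```python
from http.client import HTTPConnection

captured = []
real_putrequest = HTTPConnection.putrequest

def logging_putrequest(self, method, url, skip_host=False, skip_accept_encoding=False):
    # Record the call, then delegate to the real implementation
    captured.append((method, url))
    return real_putrequest(self, method, url,
                           skip_host=skip_host,
                           skip_accept_encoding=skip_accept_encoding)

HTTPConnection.putrequest = logging_putrequest
try:
    conn = HTTPConnection("example.com")
    conn.putrequest("GET", "/index.html")  # buffered only; no socket I/O yet
finally:
    HTTPConnection.putrequest = real_putrequest  # undo the patch

print(captured)  # [('GET', '/index.html')]
```

Restoring the original attribute in a `finally` block keeps the patch from leaking into unrelated code, which the module-level `decorate_HTTPConnection()` approach does not attempt.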
                                      

                                      Disable Certificate validation in SchemaRegistryClient Confluent Kafka

import io
from urllib.parse import urljoin

import avro.schema
import requests
from avro.io import BinaryDecoder, DatumReader
from confluent_kafka import Consumer

topic = "mytopic"
registry_configuration = "schema registry url"
url = urljoin(registry_configuration, f'/subjects/{topic}-value/versions/latest')

# Fetch the latest schema with plain requests, skipping certificate validation
schema_registry_response = requests.get(url, verify=False)
schema_registry_response.raise_for_status()

consumption_schema = schema_registry_response.json()['schema']
consumption_schema = avro.schema.parse(consumption_schema)

basic_conf = _get_basic_configuration()  # user-defined broker/security settings
consumer_conf = {'group.id': 'myconsumergroupid',
                 'auto.offset.reset': 'earliest'}
consumer_conf.update(basic_conf)

cn = Consumer(consumer_conf)
cn.subscribe([topic])
reader = DatumReader(consumption_schema)

while True:
    msg = cn.poll(10)
    if msg is None:
        break
    message_bytes = io.BytesIO(msg.value())
    # Skip the 5-byte Confluent header (magic byte + schema id)
    message_bytes.seek(5)
    decoder = BinaryDecoder(message_bytes)
    event_dict = reader.read(decoder)
    print(event_dict)
                                        
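The `message_bytes.seek(5)` in the loop above skips the Confluent wire-format header that prefixes every serialized message: one magic byte (0) followed by a 4-byte big-endian schema id. A small sketch with a made-up payload shows what those five bytes contain:

```python
import struct

# Made-up message: magic byte 0, schema id 42, then the Avro-encoded body
payload = b"\x00" + struct.pack(">I", 42) + b"avro-encoded-body"

magic, schema_id = struct.unpack(">bI", payload[:5])
body = payload[5:]  # this is the part the DatumReader actually decodes

print(magic, schema_id)  # 0 42
```

In a full deserializer the `schema_id` would normally be used to look up the writer schema in the registry; the snippet above sidesteps that by always fetching the latest version up front.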
                                      

                                      How to read text files in a folder and write to Kafka Topic on windows

                                      bin\windows\connect-standalone config\connect-standalone.properties spooldir.properties
                                      
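The `spooldir.properties` file passed to `connect-standalone` above is not shown; a hypothetical minimal version for the community kafka-connect-spooldir plugin might look like this (all paths and names are placeholders, and the three directories must exist before the connector starts):

```properties
name=spooldir-source
connector.class=com.github.jcustenborder.kafka.connect.spooldir.SpoolDirLineDelimitedSourceConnector
tasks.max=1
topic=mytopic
input.path=C:/data/input
finished.path=C:/data/finished
error.path=C:/data/error
input.file.pattern=.*\.txt
```

Files matching `input.file.pattern` in `input.path` are read line by line into the topic, then moved to `finished.path` (or `error.path` on failure).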


                                      Community Discussions

                                      Trending Discussions on kafka-python
                                      • "The filename or extension is too long" while installing confluent-kafka?
                                      • Set consumer offset
                                      • get results from kafka for a specific period of time
                                      • How to consume messages in last N days using confluent-kafka-python?
                                      • How do I get the offset of the last message of a Kafka topic using confluent-kafka-python?
                                      • Kafka-connect to PostgreSQL - org.apache.kafka.connect.errors.DataException: Failed to deserialize topic to to Avro
                                      • confluent-kafka-python json_producer : Unrecognized field: schemaType
                                      • Deserialize Protobuf kafka messages with Flink
                                      • How to run Faust from Docker - ERROR: Failed building wheel for python-rocksdb
                                      • How to use kafka connect with JDBC sink and source using python

                                      QUESTION

                                      "The filename or extension is too long" while installing confluent-kafka?

                                      Asked 2022-Mar-30 at 05:53

                                      I am having trouble installing confluent-kafka with "pip install confluent-kafka". The install fails with the error "The filename or extension is too long." Details are below.

                                      Collecting confluent-kafka
                                        Using cached confluent-kafka-1.8.2.tar.gz (104 kB)
                                        Preparing metadata (setup.py) ... done
                                      Building wheels for collected packages: confluent-kafka
                                        Building wheel for confluent-kafka (setup.py) ... error
                                        error: subprocess-exited-with-error
                                      
                                        × python setup.py bdist_wheel did not run successfully.
                                        │ exit code: 1
                                        ╰─> [48 lines of output]
                                            running bdist_wheel
                                            running build
                                            running build_py
                                            creating build
                                            creating build\lib.win-amd64-3.10
                                            creating build\lib.win-amd64-3.10\confluent_kafka
                                            copying src\confluent_kafka\deserializing_consumer.py -> build\lib.win-amd64-3.10\confluent_kafka
                                            copying src\confluent_kafka\error.py -> build\lib.win-amd64-3.10\confluent_kafka
                                            copying src\confluent_kafka\serializing_producer.py -> build\lib.win-amd64-3.10\confluent_kafka
                                            copying src\confluent_kafka\__init__.py -> build\lib.win-amd64-3.10\confluent_kafka
                                            creating build\lib.win-amd64-3.10\confluent_kafka\admin
                                            copying src\confluent_kafka\admin\__init__.py -> build\lib.win-amd64-3.10\confluent_kafka\admin
                                            creating build\lib.win-amd64-3.10\confluent_kafka\avro
                                            copying src\confluent_kafka\avro\cached_schema_registry_client.py -> build\lib.win-amd64-3.10\confluent_kafka\avro
                                            copying src\confluent_kafka\avro\error.py -> build\lib.win-amd64-3.10\confluent_kafka\avro
                                            copying src\confluent_kafka\avro\load.py -> build\lib.win-amd64-3.10\confluent_kafka\avro
                                            copying src\confluent_kafka\avro\__init__.py -> build\lib.win-amd64-3.10\confluent_kafka\avro
                                            creating build\lib.win-amd64-3.10\confluent_kafka\kafkatest
                                            copying src\confluent_kafka\kafkatest\verifiable_client.py -> build\lib.win-amd64-3.10\confluent_kafka\kafkatest
                                            copying src\confluent_kafka\kafkatest\verifiable_consumer.py -> build\lib.win-amd64-3.10\confluent_kafka\kafkatest
                                            copying src\confluent_kafka\kafkatest\verifiable_producer.py -> build\lib.win-amd64-3.10\confluent_kafka\kafkatest
                                            copying src\confluent_kafka\kafkatest\__init__.py -> build\lib.win-amd64-3.10\confluent_kafka\kafkatest
                                            creating build\lib.win-amd64-3.10\confluent_kafka\schema_registry
                                            copying src\confluent_kafka\schema_registry\avro.py -> build\lib.win-amd64-3.10\confluent_kafka\schema_registry
                                            copying src\confluent_kafka\schema_registry\error.py -> build\lib.win-amd64-3.10\confluent_kafka\schema_registry
                                            copying src\confluent_kafka\schema_registry\json_schema.py -> build\lib.win-amd64-3.10\confluent_kafka\schema_registry
                                            copying src\confluent_kafka\schema_registry\protobuf.py -> build\lib.win-amd64-3.10\confluent_kafka\schema_registry
                                            copying src\confluent_kafka\schema_registry\schema_registry_client.py -> build\lib.win-amd64-3.10\confluent_kafka\schema_registry
                                            copying src\confluent_kafka\schema_registry\__init__.py -> build\lib.win-amd64-3.10\confluent_kafka\schema_registry
                                            creating build\lib.win-amd64-3.10\confluent_kafka\serialization
                                            copying src\confluent_kafka\serialization\__init__.py -> build\lib.win-amd64-3.10\confluent_kafka\serialization
                                            creating build\lib.win-amd64-3.10\confluent_kafka\avro\serializer
                                            copying src\confluent_kafka\avro\serializer\message_serializer.py -> build\lib.win-amd64-3.10\confluent_kafka\avro\serializer
                                            copying src\confluent_kafka\avro\serializer\__init__.py -> build\lib.win-amd64-3.10\confluent_kafka\avro\serializer
                                            running build_ext
                                            building 'confluent_kafka.cimpl' extension
                                            creating build\temp.win-amd64-3.10
                                            creating build\temp.win-amd64-3.10\Release
                                            creating build\temp.win-amd64-3.10\Release\Users
                                            creating build\temp.win-amd64-3.10\Release\Users\Among
                                            creating build\temp.win-amd64-3.10\Release\Users\Among\AppData
                                            creating build\temp.win-amd64-3.10\Release\Users\Among\AppData\Local
                                            creating build\temp.win-amd64-3.10\Release\Users\Among\AppData\Local\Temp
                                            creating build\temp.win-amd64-3.10\Release\Users\Among\AppData\Local\Temp\pip-install-e4mpsn3s
                                            creating build\temp.win-amd64-3.10\Release\Users\Among\AppData\Local\Temp\pip-install-e4mpsn3s\confluent-kafka_a53471ea97464e83aa35d4164a2c7040
                                            creating build\temp.win-amd64-3.10\Release\Users\Among\AppData\Local\Temp\pip-install-e4mpsn3s\confluent-kafka_a53471ea97464e83aa35d4164a2c7040\src
                                            creating build\temp.win-amd64-3.10\Release\Users\Among\AppData\Local\Temp\pip-install-e4mpsn3s\confluent-kafka_a53471ea97464e83aa35d4164a2c7040\src\confluent_kafka
                                            error: could not create 'build\temp.win-amd64-3.10\Release\Users\Among\AppData\Local\Temp\pip-install-e4mpsn3s\confluent-kafka_a53471ea97464e83aa35d4164a2c7040\src\confluent_kafka': The filename or extension is too long
                                            [end of output]
                                      
                                        note: This error originates from a subprocess, and is likely not a problem with pip.
                                        ERROR: Failed building wheel for confluent-kafka
                                        Running setup.py clean for confluent-kafka
                                      Failed to build confluent-kafka
                                      Installing collected packages: confluent-kafka
                                        Running setup.py install for confluent-kafka ... error
                                        error: subprocess-exited-with-error
                                      
                                        × Running setup.py install for confluent-kafka did not run successfully.
                                        │ exit code: 1
                                        ╰─> [50 lines of output]
                                            running install
                                            C:\Users\Among\AppData\Local\Programs\Python\Python310\lib\site-packages\setuptools\command\install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
                                              warnings.warn(
                                            running build
                                            running build_py
                                            creating build
                                            creating build\lib.win-amd64-3.10
                                            creating build\lib.win-amd64-3.10\confluent_kafka
                                            copying src\confluent_kafka\deserializing_consumer.py -> build\lib.win-amd64-3.10\confluent_kafka
                                            copying src\confluent_kafka\error.py -> build\lib.win-amd64-3.10\confluent_kafka
                                            copying src\confluent_kafka\serializing_producer.py -> build\lib.win-amd64-3.10\confluent_kafka
                                            copying src\confluent_kafka\__init__.py -> build\lib.win-amd64-3.10\confluent_kafka
                                            creating build\lib.win-amd64-3.10\confluent_kafka\admin
                                            copying src\confluent_kafka\admin\__init__.py -> build\lib.win-amd64-3.10\confluent_kafka\admin
                                            creating build\lib.win-amd64-3.10\confluent_kafka\avro
                                            copying src\confluent_kafka\avro\cached_schema_registry_client.py -> build\lib.win-amd64-3.10\confluent_kafka\avro
                                            copying src\confluent_kafka\avro\error.py -> build\lib.win-amd64-3.10\confluent_kafka\avro
                                            copying src\confluent_kafka\avro\load.py -> build\lib.win-amd64-3.10\confluent_kafka\avro
                                            copying src\confluent_kafka\avro\__init__.py -> build\lib.win-amd64-3.10\confluent_kafka\avro
                                            creating build\lib.win-amd64-3.10\confluent_kafka\kafkatest
                                            copying src\confluent_kafka\kafkatest\verifiable_client.py -> build\lib.win-amd64-3.10\confluent_kafka\kafkatest
                                            copying src\confluent_kafka\kafkatest\verifiable_consumer.py -> build\lib.win-amd64-3.10\confluent_kafka\kafkatest
                                            copying src\confluent_kafka\kafkatest\verifiable_producer.py -> build\lib.win-amd64-3.10\confluent_kafka\kafkatest
                                            copying src\confluent_kafka\kafkatest\__init__.py -> build\lib.win-amd64-3.10\confluent_kafka\kafkatest
                                            creating build\lib.win-amd64-3.10\confluent_kafka\schema_registry
                                            copying src\confluent_kafka\schema_registry\avro.py -> build\lib.win-amd64-3.10\confluent_kafka\schema_registry
                                            copying src\confluent_kafka\schema_registry\error.py -> build\lib.win-amd64-3.10\confluent_kafka\schema_registry
                                            copying src\confluent_kafka\schema_registry\json_schema.py -> build\lib.win-amd64-3.10\confluent_kafka\schema_registry
                                            copying src\confluent_kafka\schema_registry\protobuf.py -> build\lib.win-amd64-3.10\confluent_kafka\schema_registry
                                            copying src\confluent_kafka\schema_registry\schema_registry_client.py -> build\lib.win-amd64-3.10\confluent_kafka\schema_registry
                                            copying src\confluent_kafka\schema_registry\__init__.py -> build\lib.win-amd64-3.10\confluent_kafka\schema_registry
                                            creating build\lib.win-amd64-3.10\confluent_kafka\serialization
                                            copying src\confluent_kafka\serialization\__init__.py -> build\lib.win-amd64-3.10\confluent_kafka\serialization
                                            creating build\lib.win-amd64-3.10\confluent_kafka\avro\serializer
                                            copying src\confluent_kafka\avro\serializer\message_serializer.py -> build\lib.win-amd64-3.10\confluent_kafka\avro\serializer
                                            copying src\confluent_kafka\avro\serializer\__init__.py -> build\lib.win-amd64-3.10\confluent_kafka\avro\serializer
                                            running build_ext
                                            building 'confluent_kafka.cimpl' extension
                                            creating build\temp.win-amd64-3.10
                                            creating build\temp.win-amd64-3.10\Release
                                            creating build\temp.win-amd64-3.10\Release\Users
                                            creating build\temp.win-amd64-3.10\Release\Users\Among
                                            creating build\temp.win-amd64-3.10\Release\Users\Among\AppData
                                            creating build\temp.win-amd64-3.10\Release\Users\Among\AppData\Local
                                            creating build\temp.win-amd64-3.10\Release\Users\Among\AppData\Local\Temp
                                            creating build\temp.win-amd64-3.10\Release\Users\Among\AppData\Local\Temp\pip-install-e4mpsn3s
                                            creating build\temp.win-amd64-3.10\Release\Users\Among\AppData\Local\Temp\pip-install-e4mpsn3s\confluent-kafka_a53471ea97464e83aa35d4164a2c7040
                                            creating build\temp.win-amd64-3.10\Release\Users\Among\AppData\Local\Temp\pip-install-e4mpsn3s\confluent-kafka_a53471ea97464e83aa35d4164a2c7040\src
                                            creating build\temp.win-amd64-3.10\Release\Users\Among\AppData\Local\Temp\pip-install-e4mpsn3s\confluent-kafka_a53471ea97464e83aa35d4164a2c7040\src\confluent_kafka
                                            error: could not create 'build\temp.win-amd64-3.10\Release\Users\Among\AppData\Local\Temp\pip-install-e4mpsn3s\confluent-kafka_a53471ea97464e83aa35d4164a2c7040\src\confluent_kafka': The filename or extension is too long
                                            [end of output]
                                      
                                        note: This error originates from a subprocess, and is likely not a problem with pip.
                                      error: legacy-install-failure
                                      
                                      × Encountered error while trying to install package.
                                      ╰─> confluent-kafka
                                      
                                      note: This is an issue with the package mentioned above, not pip.
                                      hint: See above for output from the failure.
                                      

                                      I also tried the method described in "https://github.com/confluentinc/confluent-kafka-python/issues/1002", but it doesn't work either.

                                      Please help me!!

                                      ANSWER

                                      Answered 2022-Mar-30 at 05:53

                                      Windows versions prior to Windows 10 version 1607 enforce a maximum path length (MAX_PATH), which caps file paths at 260 characters.

                                      Fortunately, if you are running Windows 10 version 1607 or later, you can enable support for long paths:

                                      1. Press Win+R
                                      2. Type regedit and press Enter
                                      3. Go to Computer\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem
                                      4. Edit or create a REG_DWORD value named LongPathsEnabled
                                      5. Set its data to 1 and press OK.
                                      6. Restart your system and try again. It should work now.

                                      Read more: Maximum Path Length Limitation in Windows
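The build log above also shows how the limit is hit: setup.py recreates the absolute temp path underneath the build directory, so the temp prefix effectively appears twice in the failing path. Reconstructing it from the log (as an illustration; on Windows, creating a directory fails once the path exceeds roughly MAX_PATH minus 12, i.e. about 248 characters, to leave room for an 8.3 file name):

```python
# Paths copied from the pip output above
prefix = r"C:\Users\Among\AppData\Local\Temp\pip-install-e4mpsn3s\confluent-kafka_a53471ea97464e83aa35d4164a2c7040"
rel = r"build\temp.win-amd64-3.10\Release\Users\Among\AppData\Local\Temp\pip-install-e4mpsn3s\confluent-kafka_a53471ea97464e83aa35d4164a2c7040\src\confluent_kafka"

full = prefix + "\\" + rel
print(len(full))  # already past the ~248-character directory limit
```

This is also why the other common workaround, shortening the pip build directory (e.g. via the TMP environment variable), can sidestep the error without touching the registry.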

                                      Source https://stackoverflow.com/questions/71477633

                                      Community Discussions, Code Snippets contain sources that include Stack Exchange Network

                                      Vulnerabilities

                                      No vulnerabilities reported

                                      Install kafka-python

                                      You can install using 'pip install kafka-python' or download it from GitHub, PyPI.
                                      You can use kafka-python like any standard Python library. You will need to make sure that you have a development environment consisting of a Python distribution including header files, a compiler, pip, and git installed. Make sure that your pip, setuptools, and wheel are up to date. When using pip it is generally recommended to install packages in a virtual environment to avoid changes to the system.
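The virtual-environment install recommended above might look like the following. These commands are a sketch; the environment name `venv` is an arbitrary choice:

```shell
# Create and activate an isolated virtual environment,
# then install kafka-python inside it.
python -m venv venv
source venv/bin/activate          # on Windows: venv\Scripts\activate
python -m pip install --upgrade pip setuptools wheel
python -m pip install kafka-python
```

Installing into a virtual environment keeps kafka-python and its dependencies separate from the system Python, so upgrades or removals cannot affect other projects.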

                                      Support

                                      For new features, suggestions, and bug reports, create an issue on GitHub. If you have questions, check for and ask them on the community page at Stack Overflow.


                                      • © 2022 Open Weaver Inc.