confluent-kafka-python | Confluent's Kafka Python Client | Pub Sub library

 by confluentinc · Python · Version: v2.1.1 · License: Non-SPDX

kandi X-RAY | confluent-kafka-python Summary

confluent-kafka-python is a Python library typically used in Messaging, Pub Sub and Kafka applications. confluent-kafka-python has no bugs, no vulnerabilities, a build file available and high support. However, it has a Non-SPDX license. You can install it with 'pip install confluent-kafka' (note: the PyPI package name is confluent-kafka, not confluent-kafka-python) or download it from GitHub or PyPI.

confluent-kafka-python provides a high-level Producer, Consumer and AdminClient compatible with all Apache Kafka™ brokers >= v0.8, Confluent Cloud and Confluent Platform. See the API documentation for more info.
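
A minimal produce-and-consume sketch (the broker address, topic name and group id below are placeholders, not defaults):

from confluent_kafka import Consumer, Producer

p = Producer({'bootstrap.servers': 'localhost:9092'})
p.produce('my-topic', key='k', value='hello')
p.flush()

c = Consumer({'bootstrap.servers': 'localhost:9092',
              'group.id': 'demo',
              'auto.offset.reset': 'earliest'})
c.subscribe(['my-topic'])
msg = c.poll(10.0)
if msg is not None and msg.error() is None:
    print(msg.key(), msg.value())
c.close()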

            kandi-support Support

              confluent-kafka-python has a highly active ecosystem.
              It has 3292 stars and 842 forks. There are 336 watchers for this library.
              There was 1 major release in the last 12 months.
              There are 213 open issues and 827 have been closed. On average, issues are closed in 117 days. There are 50 open pull requests and 0 closed requests.
              It has a positive sentiment in the developer community.
              The latest version of confluent-kafka-python is v2.1.1.

            kandi-Quality Quality

              confluent-kafka-python has 0 bugs and 0 code smells.

            kandi-Security Security

              confluent-kafka-python has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
              confluent-kafka-python code analysis shows 0 unresolved vulnerabilities.
              There are 0 security hotspots that need review.

            kandi-License License

              confluent-kafka-python has a Non-SPDX License.
              Non-SPDX licenses can be open-source licenses that are not SPDX-compliant, or non-open-source licenses; you need to review them closely before use.

            kandi-Reuse Reuse

              confluent-kafka-python releases are available to install and integrate.
              A deployable package is available on PyPI.
              A build file is available, so you can build the component from source.
              Installation instructions, examples and code snippets are available.

            Top functions reviewed by kandi - BETA

            kandi has reviewed confluent-kafka-python and discovered the below as its top functions. This is intended to give you instant insight into the functionality confluent-kafka-python implements, and to help you decide if it suits your requirements.
            • Temporarily alter configs
            • Create futures for each key in futmap_keys
            • Create a future
            • Alters configured configs
            • Register a new Avro schema
            • Store a schema
            • Create an ACL
            • Create an ACL binding
            • Delete ACL
            • Parse a string or None
            • Update compatibility level
            • Make a Kafka result from a future
            • Make resource result from futures
            • Get the compatibility level
            • Returns the compatibility level
            • Delete a subject
            • Collect artifacts matching the git ref
            • Gets a registered schema
            • Tests whether a subject is compatible with the given schema
            • Retrieve a message from the server
            • List topics
            • Check the schema for a subject
            • Describe ACLs
            • Test whether a subject is compatible
            • Get a schema by its ID
            • Get the version of a subject

            confluent-kafka-python Key Features

            No Key Features are available at this moment for confluent-kafka-python.

            confluent-kafka-python Examples and Code Snippets

            How to consume messages in last N days using confluent-kafka-python?
            Python · 7 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            from datetime import datetime
            from confluent_kafka import TopicPartition

            # consumer is an existing confluent_kafka.Consumer; topic_name is your topic
            whents = datetime.fromisoformat("2022-01-01T12:34:56.000")
            whenms = int(whents.timestamp()) * 1000   # to get milliseconds

            # offsets_for_times() reads the target timestamp (in ms) from the offset field
            topicparts = [TopicPartition(topic_name, i, whenms) for i in range(0, 8)]
            offsets = consumer.offsets_for_times(topicparts)
            How do I get the offset of the last message of a Kafka topic using confluent-kafka-python?
            Python · 15 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            
            from confluent_kafka import Consumer, TopicPartition

            # create the Consumer with a connection to your brokers
            consumer = Consumer({'bootstrap.servers': 'localhost:9092',   # replace appropriately
                                 'group.id': 'my-group'})

            topic_name = "my.topic"

            topicparts = [TopicPartition(topic_name, i) for i in range(0, 8)]

            # one (low, high) tuple per partition; the last message is at offset high - 1
            offsets = [consumer.get_watermark_offsets(tp) for tp in topicparts]
            Not receiving messages from Kafka Topic
            Python · 85 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            from queue import Queue
            import threading
            import time
            import json
            from kafka_message_consumer import KafkaMessageConsumer
            from kafka_discovery_executor import KafkaDiscoveryExecutor

            def main():
                with open('kafka_properties.json') as f:
                    kafka_properties = json.load(f)
                # ... (the remaining lines of this 85-line snippet are truncated in the original answer)
            
            Python confluent_kafka - List All Consumer currently Listening to a Topic
            Python · 12 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            from confluent_kafka.admin import AdminClient

            broker = '1.1.1.1:9092'  # replace appropriately
            a = AdminClient({'bootstrap.servers': broker})
            groups = a.list_groups(timeout=10)
            print(" {} consumer groups".format(len(groups)))
            for g in groups:
                print(g.id)   # each entry is a GroupMetadata object
            Kafka message key is None
            Python · 3 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            # pass the key explicitly when producing, otherwise it defaults to None
            p.produce(topic, key="key", value="value")
            
            
            How do I decode the schema id from avro event in Kafka wire format?
            Python · 3 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            from struct import unpack

            # Confluent wire format: 1 magic byte (always 0) followed by a 4-byte big-endian schema ID
            magic, schema_id = unpack('>bI', header_bytes)   # header_bytes: first 5 bytes of the message
            
            How to read and process high priority messages in kafka consumer?
            Python · 88 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            Topic1 (Priority Low):    1 partition
            Topic2 (Priority Medium): 5 partitions
            Topic3 (Priority High):  20 partitions

            topic = %MY_TOPIC_NAME_INJECTED_BY_ENV_VAR%
            consumer.subscribe([topic])   # subscribe() expects a list of topics

            while True:
                messages = consumer.consume(num_messages=10, timeout=1.0)
                # ... (the remaining lines of this 88-line snippet are truncated in the original answer)
            Convert avro serialized messages into json using python consumer
            Python · 30 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            from confluent_kafka.avro import AvroConsumer
            from confluent_kafka.avro.serializer import SerializerError


            c = AvroConsumer({
                'bootstrap.servers': 'mybroker,mybroker2',
                'group.id': 'groupid',
                'schema.registry.url': 'http://127.0.0.1:8081'})   # replace with your registry URL

            c.subscribe(['my_topic'])

            while True:
                try:
                    msg = c.poll(10)
                except SerializerError as e:
                    print("Message deserialization failed: {}".format(e))
                    break
                if msg is not None:
                    print(msg.value())   # already decoded from Avro into a dict
            Change retention policy topic kafka
            Python · 11 lines of code · License: Strong Copyleft (CC BY-SA 4.0)
            from kafka.admin import KafkaAdminClient, ConfigResource, ConfigResourceType

            admin_client = KafkaAdminClient(
                bootstrap_servers="localhost:9092",
                client_id='test'
            )

            topic_list = []
            topic_list.append(ConfigResource(ConfigResourceType.TOPIC, 'your_topic_name',
                                             configs={'retention.ms': '360000'}))   # retention value is illustrative
            admin_client.alter_configs(config_resources=topic_list)

            Community Discussions

            QUESTION

            "The filename or extension is too long" while installing confluent-kafka?
            Asked 2022-Mar-30 at 05:53

            I have some trouble installing confluent-kafka using "pip install confluent-kafka". I got this error: "The filename or extension is too long." Details are below.

            ...

            ANSWER

            Answered 2022-Mar-30 at 05:53

            Windows versions older than Windows 10 1607 enforce a maximum path length (set by MAX_PATH), which caps file path lengths at 260 characters.

            Fortunately, if you are running Windows 10 version 1607 or later, you can enable support for long paths:

            1. Press Win+R
            2. Type regedit and press Enter
            3. Go to Computer\HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\FileSystem
            4. Edit or create a value named LongPathsEnabled (type: REG_DWORD)
            5. Enter 1 as its value and press OK
            6. Restart your system and try again. It should work now.

            Read more: Maximum Path Length Limitation in Windows

            Source https://stackoverflow.com/questions/71477633

            QUESTION

            How to consume messages in last N days using confluent-kafka-python?
            Asked 2022-Feb-20 at 21:19

            This question is similar to Python KafkaConsumer start consuming messages from a timestamp, except I want to know how to do it in the official Python Kafka client by Confluent.

            I looked into the Consumer.offsets_for_times function, but I'm confused that it accepts timestamps in the TopicPartition.offset field.

            How is an offset equivalent to a timestamp?

            ...

            ANSWER

            Answered 2022-Feb-19 at 14:14

            That method doesn't accept timestamps; only partitions that you want to find timestamps for.

            https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/index.html#confluent_kafka.TopicPartition.TopicPartition

            Perhaps you mean the timeout parameter?

            Source https://stackoverflow.com/questions/71181728

            QUESTION

            How do I get the offset of the last message of a Kafka topic using confluent-kafka-python?
            Asked 2022-Feb-19 at 23:51

            I need to retrieve the last N messages of a topic using confluent-kafka-python.

            I've been reading https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/index.html# for a day, but I haven't found any appropriate method for getting the offset of the last message, so I cannot calculate the offset for the consumer to start from.

            Please help. Thanks!

            ...

            ANSWER

            Answered 2022-Feb-19 at 23:51

            You need the get_watermark_offsets() function of the Consumer. You call it with a TopicPartition and it returns a (low, high) tuple of offsets for that partition.

            https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/index.html#confluent_kafka.Consumer.get_watermark_offsets

            Something like this:
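
            A minimal sketch (broker address, group id, topic name and the 8-partition count are placeholders):

            from confluent_kafka import Consumer, TopicPartition

            consumer = Consumer({'bootstrap.servers': 'localhost:9092', 'group.id': 'probe'})
            for tp in [TopicPartition("my.topic", i) for i in range(0, 8)]:
                low, high = consumer.get_watermark_offsets(tp)
                # high is the offset of the next message to be written,
                # so the last message sits at offset high - 1
                print("partition {}: last offset {}".format(tp.partition, high - 1))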

            Source https://stackoverflow.com/questions/71190043

            QUESTION

            Kafka-connect to PostgreSQL - org.apache.kafka.connect.errors.DataException: Failed to deserialize topic to Avro
            Asked 2022-Feb-11 at 14:44
            Setup

            I've installed the latest (7.0.1) version of Confluent Platform in standalone mode on an Ubuntu virtual machine.

            Python producer for Avro format

            Using this sample Avro producer to generate a stream of data to a Kafka topic (pmu214).

            The producer seems to work OK. I'll give the full code on request. Producer output:

            ...

            ANSWER

            Answered 2022-Feb-11 at 14:42

            If you literally ran the Python sample code, then the key is not Avro, so a failure on the key.converter would be expected, as shown:

            Error converting message key

            Source https://stackoverflow.com/questions/71079242

            QUESTION

            confluent-kafka-python json_producer : Unrecognized field: schemaType
            Asked 2022-Jan-08 at 15:08

            I encountered the error "Unrecognized field: schemaType (HTTP status code 422, SR code 422)" when I executed the json_producer.py example in the Confluent GitHub repository.

            This is my docker-compose:

            ...

            ANSWER

            Answered 2022-Jan-08 at 15:08

            JSON Schema support was not added to the Confluent Schema Registry until version 6.0. That is why the error reports an unrecognized schemaType field: the request/response payload of older Registry versions does not know about that field.

            Upgrading to at least that version, or using the latest version of the image, will solve the error.

            If you just want to produce JSON, then you don't need the Registry. More details at https://www.confluent.io/blog/kafka-connect-deep-dive-converters-serialization-explained/
            You can use the regular producer.py example and provide JSON objects as strings on the CLI, as in the sketch below.
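
            A minimal sketch of Registry-free JSON production (broker address and topic name are placeholders):

            import json
            from confluent_kafka import Producer

            p = Producer({'bootstrap.servers': 'localhost:9092'})
            # plain JSON bytes on the wire; no Schema Registry involved
            p.produce('my-topic', value=json.dumps({"id": 1}).encode('utf-8'))
            p.flush()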

            Source https://stackoverflow.com/questions/70558327

            QUESTION

            Deserialize Protobuf kafka messages with Flink
            Asked 2022-Jan-03 at 21:36

            I am trying to read and print Protobuf message from Kafka using Apache Flink.

            I followed the official docs with no success: https://nightlies.apache.org/flink/flink-docs-master/docs/dev/datastream/fault-tolerance/serialization/third_party_serializers/

            The Flink consumer code is:

            ...

            ANSWER

            Answered 2022-Jan-03 at 20:50

            The confluent protobuf serializer doesn't produce content that can be directly deserialized by other deserializers. The format is described in confluent's documentation: it starts with a magic byte (that is always zero), followed by a four byte schema ID. The protobuf payload follows, starting with byte 5.

            The getProducedType method should return appropriate TypeInformation, in this case TypeInformation.of(User.class). Without this you may run into problems at runtime.

            Deserializers used with KafkaSource don't need to implement isEndOfStream, but it won't hurt anything.
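
            A hedged Python illustration of that framing (the function name is made up here, and the assumption that the Protobuf payload starts at byte 5 comes from the description above):

            from struct import unpack

            def strip_confluent_framing(raw: bytes):
                # byte 0 is the magic byte (always zero); bytes 1-4 are the big-endian schema ID
                magic, schema_id = unpack('>bI', raw[:5])
                assert magic == 0, "not Confluent wire format"
                return schema_id, raw[5:]   # hand the remainder to the Protobuf parser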

            Source https://stackoverflow.com/questions/70568413

            QUESTION

            How to decorate a Python process in order to capture HTTP requests and responses?
            Asked 2021-Oct-14 at 20:23

            I'm currently investigating a very large Python codebase with lots of side effects and unexpected behavior, and I'd like to get a grasp on what it's doing by seeing all of the outbound HTTP requests that it makes throughout its execution, at any point in the call stack. Are there any utilities or integration paths that allow me to automatically profile the complete set of network calls made by code written in Python?

            Specifically, as opposed to a solely external tool, I would like to be able to interact with the captured HTTP requests and responses programmatically from within the profiled module or an adjacent module; for example to:

            I've looked at the offerings of different observability tools. For instance, Sentry appears to automatically integrate with Python's httplib to create a "breadcrumb" for each request; however Sentry only records this information when an exception is being thrown, and its default behavior is only to publish to its Web UI. New Relic also offers the ability to view "external service" calls as part of its application performance monitoring offerings, again through its own dashboard. In both cases, however, they each lack an officially-supported Python handler that would permit the tasks described above to occur within the process that generates the outbound network requests.

            ...

            ANSWER

            Answered 2021-Oct-14 at 20:23

            I looked at Sentry's Python SDK source code, to see how they integrated with http.client, and adapted their approach in a way that generalizes to meet my needs.

            Here is the code that I wrote to decorate the http.client.HTTPConnection object to tap into requests, request bodies, and response objects. This particular example appends the data I'd like to collect to global lists that live under the profiling module, as well as logging that same data to standard out. You can easily substitute whatever bespoke functionality you'd like in place of those calls to list.append and logger.info:
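
            The author's actual code was not captured here; below is a hedged reconstruction of the described approach (the list and logger names are illustrative):

            import http.client
            import logging

            logger = logging.getLogger("http_tap")
            captured_requests = []   # illustrative module-level collection point

            _orig_send = http.client.HTTPConnection.send

            def _tapped_send(self, data):
                # record the outbound bytes, then delegate to the original method
                captured_requests.append(data)
                logger.info("HTTP send: %r", data[:200])
                return _orig_send(self, data)

            http.client.HTTPConnection.send = _tapped_send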

            Source https://stackoverflow.com/questions/69565444

            QUESTION

            Disable Certificate validation in SchemaRegistryClient Confluent Kafka
            Asked 2021-Sep-12 at 03:13

            So, I want to read a topic from Kafka (Confluent) where the data is in Avro format.

            For certain unavoidable reasons , I would like to disable certificate validation.

            I am using security.protocol= SASL_SSL and sasl.mechanisms= OAUTHBEARER

            I can connect to Kafka by disabling the ssl certificate validation

            ...

            ANSWER

            Answered 2021-Sep-10 at 21:54

            I have found an answer to this.

            Basically, this S.O. post here, especially the answer after the accepted answer if you are using Confluent Kafka,

            and the avro documentation here, because my schema wasn't coming from a file but as an HTTP response, so I had to parse it using avro.schema.parse.

            Final skeleton code

            Source https://stackoverflow.com/questions/69134926

            QUESTION

            Kafka authorization failed only on port 9092
            Asked 2021-Aug-03 at 09:58

            I use the Confluent Kafka docker image and have enabled authentication and authorization with the following config: KAFKA_ADVERTISED_LISTENERS=PLAINTEXT://:9092,SASL_SSL://:9093

            => 9093 SASL_SSL
            => 9092 PLAINTEXT

            Here is a part of my config:
            Container environment variables

            ...

            ANSWER

            Answered 2021-Aug-03 at 09:58

            You did not enable authentication on port 9092. In combination with KAFKA_ALLOW_EVERYONE_IF_NO_ACL_FOUND, that produces the authorization failure.

            To fix it, change PLAINTEXT to SASL_PLAINTEXT so that port 9092 requires SASL authentication without TLS encryption:

            PLAINTEXT://:9092 -> SASL_PLAINTEXT://:9092
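
            Applied to the advertised listeners line from the question, the config becomes:

            KAFKA_ADVERTISED_LISTENERS=SASL_PLAINTEXT://:9092,SASL_SSL://:9093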

            Source https://stackoverflow.com/questions/68630429

            QUESTION

            How to overwrite default config values in kafka for confluent-python client?
            Asked 2021-Apr-10 at 14:30

            As a beginner, I'm exploring Apache Kafka and the confluent-kafka-python client. When I tried to send simple messages from the producer, the consumer was able to consume them successfully. I thought I would try sending an image as the payload. Going forward with a 1 MB (png) image, my producer was unable to produce messages. The error I encountered was

            ...

            ANSWER

            Answered 2021-Apr-10 at 14:30

            It doesn't look like you changed the broker defaults very much; the limit is still around 1MB.

            For your client errors, you need to add message.max.bytes to the Producer config (see the sketch after the link below).

            If you need any other client properties, such as the consumer max fetch bytes, those are documented here:

            https://github.com/edenhill/librdkafka/blob/master/CONFIGURATION.md
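
            A hedged sketch of that Producer change (the broker address and the 2 MB limit are placeholders; the broker-side message.max.bytes must be raised to match):

            from confluent_kafka import Producer

            p = Producer({
                'bootstrap.servers': 'localhost:9092',
                'message.max.bytes': 2097152,   # allow messages up to 2 MB
            })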

            Overall, the recommendation would be to upload your images to a centralized filestore, then send their URI location via Kafka as plain strings. This will increase throughput and reduce storage needs for your brokers, especially if you're sending/copying the same image data over multiple topics

            Source https://stackoverflow.com/questions/67031208

            Community Discussions and Code Snippets contain sources from the Stack Exchange Network.

            Vulnerabilities

            No vulnerabilities reported

            Install confluent-kafka-python

            NOTE: The pre-built Linux wheels do NOT contain SASL Kerberos/GSSAPI support. If you need SASL Kerberos/GSSAPI support, you must install librdkafka and its dependencies using the repositories below and then build confluent-kafka using the command in the "Install from source from PyPI" section below.
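
            For the common case without Kerberos, the prebuilt wheels install straight from PyPI (note the package name is confluent-kafka, not confluent-kafka-python):

            pip install confluent-kafka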

            Support

            For any new features, suggestions and bugs, create an issue on GitHub. If you have any questions, check and ask them on Stack Overflow.
            CLONE
          • HTTPS

            https://github.com/confluentinc/confluent-kafka-python.git

          • CLI

            gh repo clone confluentinc/confluent-kafka-python

          • SSH URL

            git@github.com:confluentinc/confluent-kafka-python.git


            Consider Popular Pub Sub Libraries

            • EventBus by greenrobot
            • kafka by apache
            • celery by celery
            • rocketmq by apache
            • pulsar by apache

            Try Top Libraries by confluentinc

            • librdkafka (C)
            • ksql (Java)
            • confluent-kafka-go (Go)
            • confluent-kafka-dotnet (C#)
            • kafka-streams-examples (Java)