kafkacat | Generic command line non-JVM Apache Kafka producer and consumer | Pub Sub library

by edenhill | C | Version: 1.6.0 | License: Non-SPDX

kandi X-RAY | kafkacat Summary

kafkacat is a C library typically used in Messaging, Pub Sub, and Kafka applications. kafkacat has no reported bugs and no reported vulnerabilities, and it has medium support. However, kafkacat has a Non-SPDX license. You can download it from GitHub.

kafkacat is a generic non-JVM producer and consumer for Apache Kafka >=0.8; think of it as a netcat for Kafka.

In producer mode kafkacat reads messages from stdin, delimited with a configurable delimiter (-D, defaults to newline), and produces them to the provided Kafka cluster (-b), topic (-t) and partition (-p). In consumer mode kafkacat reads messages from a topic and partition and prints them to stdout using the configured message delimiter. There is also support for the Kafka >=0.9 high-level balanced consumer: use the -G switch and provide a list of topics to join the group.

kafkacat also features a metadata list (-L) mode to display the current state of the Kafka cluster and its topics and partitions. It supports Avro message deserialization using the Confluent Schema Registry, as well as generic primitive deserializers. kafkacat is fast and lightweight; statically linked it is no more than 150 KB.
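The modes described above map directly onto command-line switches; a minimal usage sketch (the broker address, topic, and group names are placeholders):

```shell
# Producer mode: read newline-delimited messages from stdin and produce to a topic
echo "hello kafka" | kafkacat -b localhost:9092 -t mytopic -P

# Consumer mode: print messages from a topic/partition to stdout
kafkacat -b localhost:9092 -t mytopic -p 0 -C

# High-level balanced consumer (Kafka >= 0.9): join a consumer group on one or more topics
kafkacat -b localhost:9092 -G mygroup mytopic

# Metadata list mode: show brokers, topics and partitions of the cluster
kafkacat -b localhost:9092 -L
```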

Support

              kafkacat has a medium active ecosystem.
              It has 3403 star(s) with 321 fork(s). There are 77 watchers for this library.
              It had no major release in the last 12 months.
              There are 72 open issues and 189 have been closed. On average issues are closed in 201 days. There are 6 open pull requests and 0 closed requests.
              It has a neutral sentiment in the developer community.
The latest version of kafkacat is 1.6.0.

Quality

              kafkacat has no bugs reported.

Security

              kafkacat has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.

License

              kafkacat has a Non-SPDX License.
Non-SPDX licenses can be open-source licenses that are simply not SPDX-compliant, or they can be non-open-source licenses; review them closely before use.

Reuse

              kafkacat releases are available to install and integrate.
              Installation instructions, examples and code snippets are available.


            kafkacat Key Features

            No Key Features are available at this moment for kafkacat.

            kafkacat Examples and Code Snippets

            No Code Snippets are available at this moment for kafkacat.

            Community Discussions

            QUESTION

Kerberos GSSAPI doesn't work within kafkacat alpine container
            Asked 2021-May-13 at 11:50

            Previously I've reported it into kafkacat tracker but the issue has been closed as related to cyrus-sasl/krb5.

            ...

            ANSWER

            Answered 2021-May-13 at 11:50

            Very strange issue, and honestly I can't say why, but adding into krb5.conf:

            Source https://stackoverflow.com/questions/67509575

            QUESTION

            Sending Avro messages to Kafka
            Asked 2021-Apr-01 at 13:14

            I have an app that produces an array of messages in raw JSON periodically. I was able to convert that to Avro using the avro-tools. I did that because I needed the messages to include schema due to the limitations of Kafka-Connect JDBC sink. I can open this file on notepad++ and see that it includes the schema and a few lines of data.

            Now I would like to send this to my central Kafka Broker and then use Kafka Connect JDBC sink to put the data in a database. I am having a hard time understanding how I should be sending these Avro files I have to my Kafka Broker. Do I need a schema registry for my purposes? I believe Kafkacat does not support Avro so I suppose I will have to stick with the kafka-producer.sh that comes with the Kafka installation (please correct me if I am wrong).

Question is: can someone please share the steps to produce my Avro file to a Kafka broker without getting Confluent involved?

            Thanks,

            ...

            ANSWER

            Answered 2021-Apr-01 at 13:14

To use the Kafka Connect JDBC Sink, your data needs an explicit schema. The converter that you specify in your connector configuration determines where the schema is held. This can either be embedded within the JSON message (org.apache.kafka.connect.json.JsonConverter with schemas.enable=true) or held in the Schema Registry (one of io.confluent.connect.avro.AvroConverter, io.confluent.connect.protobuf.ProtobufConverter, or io.confluent.connect.json.JsonSchemaConverter).
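For the JsonConverter route, the schema is embedded in every message alongside the payload; a rough sketch of the envelope (field names and types here are illustrative):

```json
{
  "schema": {
    "type": "struct",
    "name": "example.record",
    "optional": false,
    "fields": [
      { "field": "id",   "type": "int64",  "optional": false },
      { "field": "name", "type": "string", "optional": true }
    ]
  },
  "payload": { "id": 1, "name": "alice" }
}
```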

            To learn more about this see https://www.confluent.io/blog/kafka-connect-deep-dive-converters-serialization-explained

To write an Avro message to Kafka you should serialise it as Avro and store the schema in the Schema Registry. There is a Go client library for this, with examples.

without getting Confluent involved.

            It's not entirely clear what you mean by this. The Kafka Connect JDBC Sink is written by Confluent. The best way to manage schemas is with the Schema Registry. If you don't want to use the Schema Registry then you can embed the schema in your JSON message but it's a suboptimal way of doing things.

            Source https://stackoverflow.com/questions/66896740

            QUESTION

            Kafkacat how to republish a binary message maintaining key
            Asked 2021-Feb-23 at 19:57

I have been trying to use kafkacat to find a message in a topic and publish it back into the topic. We use protobuf, so the message values should be in bytes (keys can be of other types, such as strings or bytes). However, I am unable to publish a message that can be deserialized properly.

            How can I do this with kafkacat? I am also open to using other recommended tools for doing this.

            Example attempt:

            ...

            ANSWER

            Answered 2021-Feb-23 at 19:57

            the producer is treating each line as a new message

            That's correct.

If you have a binary file, I suggest writing code for this, as kafkacat assumes UTF-8 encoded strings as input.

            Source https://stackoverflow.com/questions/66338402

            QUESTION

Problem with ADVERTISED_LISTENER on macOS
            Asked 2020-Nov-14 at 16:00

            I start kafka with this docker-compose.yml on my Mac:

            ...

            ANSWER

            Answered 2020-Nov-12 at 14:55

            I played around with your particular example, and couldn't get it to work.

            For what it's worth, this Docker Compose is how I run Kafka on Docker locally, and it's accessible both from the host machine, and other containers.

            You might find this blog useful if you want to continue with your existing approach and debug it further.

            Source https://stackoverflow.com/questions/64805874

            QUESTION

            Kafka Java Producer is not able to send message to kafka instance
            Asked 2020-Sep-02 at 08:57

            I am running a kafka instance in a docker container with following docker-compose.yml file.

            ...

            ANSWER

            Answered 2020-Aug-09 at 21:40

            Broker configuration seems to be fine since you get back the correct metadata.

            I think the problem is in your code. kafkaTemplate.send() is an asynchronous operation and most likely your process ends before the producer manages to actually send the message. Try adding a .get() to that send method to force it in being synchronous.

            Source https://stackoverflow.com/questions/63331265

            QUESTION

            Configuration of a specific topic. Kafkacat
            Asked 2020-Aug-18 at 10:50

            I have a topic "topic-one" and I want to know if it has "log.cleanup.policy = compact" configured or not.

            Is it possible with kafkacat, extract the properties and / or configuration of a specific topic?

            ...

            ANSWER

            Answered 2020-Aug-18 at 10:50

kafkacat does not yet support the Topic Admin API (which allows you to alter and view cluster configs). Suggest you use kafka-configs.sh from the Apache Kafka distribution in the meantime.
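As a sketch, describing a topic's configuration with the stock tooling looks like this (broker address and topic name are placeholders; older distributions use --zookeeper instead of --bootstrap-server):

```shell
# Lists config overrides set on the topic, including log.cleanup.policy if set
kafka-configs.sh --bootstrap-server localhost:9092 \
  --entity-type topics --entity-name topic-one --describe
```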

            Source https://stackoverflow.com/questions/63447582

            QUESTION

            How to talk to Kafka KUDO Internal Setup?
            Asked 2020-Jun-12 at 16:33

            I have Kafka setup via KUDO:

            ...

            ANSWER

            Answered 2020-Jun-12 at 16:33

            Found out the answer:

            Use kubefwd CLI

            Source https://stackoverflow.com/questions/62268622

            QUESTION

NiFi in docker container fails to talk to Kafka: TimeoutException, kafkacat is working just fine
            Asked 2020-Jun-10 at 12:03

            I have set up NiFi (1.11.4) & Kafka(2.5) via docker (docker-compose file below, actual NiFi flow definition https://github.com/geoHeil/streaming-reference). When trying to follow up on basic getting started tutorials (such as https://towardsdatascience.com/big-data-managing-the-flow-of-data-with-apache-nifi-and-apache-kafka-af674cd8f926) which combine processors such as:

            • generate flowfile (CSV)
            • update attribute
            • PublishKafka2.0

            I run into issues of timeoutException:

            ...

            ANSWER

            Answered 2020-Jun-10 at 12:03

            You're using the wrong port to connect to the broker. By connecting to 9092 you connect to the listener that advertises localhost:9092 to the client for subsequent connections. That's why it works when you use kafkacat from your local machine (because 9092 is exposed to your local machine)

            If you use broker:29092 then the broker will give the client the correct address for the connection (i.e. broker:29092).

            To understand more about advertised listeners see this blog
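A common shape for this in docker-compose is two listeners, one advertised to other containers and one to the host; a sketch of the broker environment (service name, listener names, and ports are illustrative):

```yaml
environment:
  # Containers connect to broker:29092 and are told to keep using broker:29092;
  # host clients connect to localhost:9092 and are told to keep using localhost:9092.
  KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT
  KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:29092,PLAINTEXT_HOST://0.0.0.0:9092
  KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://broker:29092,PLAINTEXT_HOST://localhost:9092
```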

            Source https://stackoverflow.com/questions/62302509

            QUESTION

Healthcheck not working at all when using docker-compose (my service does not wait for Kafka to be started before launching)
            Asked 2020-Jun-09 at 04:00

            I have three services on my docker-compose:

            ...

            ANSWER

            Answered 2020-Jun-09 at 04:00

            Not clear why you need a JAR file. This should work just as well

            Source https://stackoverflow.com/questions/62135235

            QUESTION

Cannot get data from table in KSQL
            Asked 2020-May-13 at 16:15

            I create a rekeyed stream

            ...

            ANSWER

            Answered 2018-May-14 at 08:50

            A KSQL table differs from a KSQL Stream, in that it gives you the latest value for a given key. So if you are expecting to see the same number of messages in your table as your source stream, you should have the same number of unique keys.

            If you're seeing fewer messages then it suggests that ROOT is not unique.

            Depending on the problem that you're modelling, you should either :

            • (a) be using a Stream not a Table, or
            • (b) change the key that you are using

            Ref:

            Source https://stackoverflow.com/questions/50315383

            Community Discussions, Code Snippets contain sources that include Stack Exchange Network

            Vulnerabilities

            No vulnerabilities reported

            Install kafkacat

On recent enough Debian systems, kafkacat can be installed from the distribution repositories.
Alternatively, the bootstrap.sh build script will download and build the required dependencies, providing a quick and easy means of building kafkacat. Internet connectivity and wget/curl are required by this script. The resulting kafkacat binary will be linked statically to avoid runtime dependencies. NOTE: Requires curl and cmake (for yajl) to be installed.
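Both install paths sketched above, assuming a Debian/Ubuntu system where the package is available (the package name and availability vary by release):

```shell
# From the distribution repositories
sudo apt-get install kafkacat

# Or build a statically linked binary from source (requires curl and cmake)
git clone https://github.com/edenhill/kafkacat.git
cd kafkacat
./bootstrap.sh
```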

            Support

For any new features, suggestions and bugs, create an issue on GitHub. If you have any questions, check and ask on Stack Overflow.

Clone

• HTTPS: https://github.com/edenhill/kafkacat.git
• GitHub CLI: gh repo clone edenhill/kafkacat
• SSH: git@github.com:edenhill/kafkacat.git



            Consider Popular Pub Sub Libraries

• EventBus by greenrobot
• kafka by apache
• celery by celery
• rocketmq by apache
• pulsar by apache

            Try Top Libraries by edenhill

• librdkafka (C)
• kcat (C)
• librd (C)
• mklove (Shell)
• trivup (Python)