spring-cloud-stream-binder-kafka | Spring Cloud Stream binders for Apache Kafka | Pub Sub library

by spring-cloud | Java | Version: 4.0.4 | License: Apache-2.0

kandi X-RAY | spring-cloud-stream-binder-kafka Summary

spring-cloud-stream-binder-kafka is a Java library typically used in Messaging, Pub Sub, and Kafka applications. It has no reported vulnerabilities, a build file, a permissive license, and high support; however, it has 8 bugs. You can download it from GitHub or Maven.

Spring Cloud Stream binders for Apache Kafka and Kafka Streams

            kandi-support Support

              spring-cloud-stream-binder-kafka has a highly active ecosystem.
              It has 316 star(s) with 295 fork(s). There are 53 watchers for this library.
              It had no major release in the last 12 months.
              There are 11 open issues and 752 have been closed. On average issues are closed in 154 days. There are no pull requests.
              It has a negative sentiment in the developer community.
              The latest version of spring-cloud-stream-binder-kafka is 4.0.4.

            kandi-Quality Quality

              spring-cloud-stream-binder-kafka has 8 bugs (0 blocker, 0 critical, 3 major, 5 minor) and 555 code smells.

            kandi-Security Security

              spring-cloud-stream-binder-kafka has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
              spring-cloud-stream-binder-kafka code analysis shows 0 unresolved vulnerabilities.
              There are 11 security hotspots that need review.

            kandi-License License

              spring-cloud-stream-binder-kafka is licensed under the Apache-2.0 License. This license is Permissive.
              Permissive licenses have the least restrictions, and you can use them in most projects.

            kandi-Reuse Reuse

              spring-cloud-stream-binder-kafka releases are not available on GitHub; you will need to build and install from source, or use the deployable package available in Maven.
              Build file is available. You can build the component from source.
              Installation instructions are not available. Examples and code snippets are available.
              It has 20065 lines of code, 1188 functions and 133 files.
              It has medium code complexity. Code complexity directly impacts maintainability of the code.

            Top functions reviewed by kandi - BETA

            kandi has reviewed spring-cloud-stream-binder-kafka and discovered the below as its top functions. This is intended to give you an instant insight into spring-cloud-stream-binder-kafka implemented functionality, and help decide if they suit your requirements.
            • Create a Kafka consumer producer
            • Reset offsets for auto rebalance
            • Create back off for rollback processor
            • Setup rebalance listener
            • Creates a message handler for the producer
            • Add header patterns that should never be mapped
            • Remove never-mapped headers from a set of headers
            • Initialize the bean factories
            • Builds list of input bindings
            • Initializes the function factory
            • Evaluates the function components
            • Create event type processor
            • Gets the partition information for a given topic
            • Gets a queryable store
            • Builds the health check
            • Extract headers from message headers
            • Filters the bean registry
            • Create a polled consumer resources
            • Sets the Kafka binder's global properties
            • Perform the health check
            • Binds a Kafka consumer
            • Bind producer
            • Checks and binds metrics to the given registry
            • Binding consumer
            • Extract resolvable types
            • Start the downloader

            spring-cloud-stream-binder-kafka Key Features

            No Key Features are available at this moment for spring-cloud-stream-binder-kafka.

            spring-cloud-stream-binder-kafka Examples and Code Snippets

            No Code Snippets are available at this moment for spring-cloud-stream-binder-kafka.

            Community Discussions

            QUESTION

            Example on handling processing exception in Spring Cloud Streams with Kafka Streams Binder and the functional style processor
            Asked 2022-Feb-28 at 06:29

            I am using Spring Cloud Streams with the Kafka Streams Binder, the functional style processor API and also multiple processors.

            It's really cool to configure a processing application with multiple processors and multiple Kafka topics in this way and staying in the Spring Boot universe with /actuator, WebClient and so on. Actually I like it more than using plain Apache Kafka Streams.

            BUT: I would like to integrate exception handling for exceptions occurring within the processors and send these unprocessable messages to a DLQ. I have already set up DLQs for deserialization errors, but I found no good advice on achieving this besides sobychacko's answer on a similar question, and that is only a snippet. Does anybody have a more detailed example? I am asking because the Spring Cloud Stream documentation on branching looks quite different.

            ...

            ANSWER

            Answered 2021-Sep-29 at 18:30

            Glad to hear about your usage of Spring Cloud Stream with Kafka Streams.

            The reference docs you mentioned are from an older release. Please navigate to the newer docs from this page: https://spring.io/projects/spring-cloud-stream#learn

            This question has come up before. See if these could help with your use case:

            Error handling in Spring Cloud Kafka Streams

            How to stop sending to kafka topic when control goes to catch block Functional kafka spring

            Source https://stackoverflow.com/questions/69380821
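            Since the full example is elided on this page, below is a minimal plain-Java sketch of the catch-and-route pattern those links describe. The dlqSink list is a stand-in assumption; in a real Spring Cloud Stream app the catch block would typically send to a DLQ topic via StreamBridge instead.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Function;

public class DlqRoutingSketch {
    // Stand-in for a DLQ destination. In Spring Cloud Stream this would be
    // something like streamBridge.send("my-dlq-topic", payload) in the catch block.
    static final List<String> dlqSink = new ArrayList<>();

    // The processor catches its own exceptions and routes the bad record to
    // the DLQ instead of letting the exception propagate out of the function.
    static Function<String, String> processor() {
        return payload -> {
            try {
                if (payload.isBlank()) {
                    throw new IllegalArgumentException("empty payload");
                }
                return payload.toUpperCase();
            } catch (RuntimeException ex) {
                dlqSink.add(payload); // unprocessable -> DLQ
                return null;          // returning null drops the record downstream
            }
        };
    }

    public static void main(String[] args) {
        Function<String, String> f = processor();
        System.out.println(f.apply("hello")); // HELLO
        System.out.println(f.apply("  "));    // null; payload routed to dlqSink
    }
}
```

Returning null from a function in Spring Cloud Stream suppresses the downstream send, which is why the catch block can route the payload elsewhere and then drop it.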

            QUESTION

            Spring Cloud Stream connect to multiple hosts for single binder (RabbitMQ)
            Asked 2022-Feb-22 at 19:37

            We are using Spring Cloud Stream to listen to multiple RabbitMQ queues, specifically with the SCF (Spring Cloud Function) model.

            • The spring-cloud-stream-reactive module is deprecated in favor of native support via Spring Cloud Function programming model.

            When there was a single node/host, it was working well (application.yml snippet shared below).

            However, the moment we try to connect multiple nodes it fails. Can someone guide us on how to connect them, or point to a related sample in the Spring Cloud documentation?

            Following Code is working as expected

            ...

            ANSWER

            Answered 2022-Feb-22 at 19:37

            Adding the binders config for both rabbit1 and rabbit2 resolved the issue:

            Below is the sample config which I tried and was able to consume messages successfully
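            The sample config itself is elided on this page; as a hedged sketch based on the standard Spring Cloud Stream multi-binder pattern (binder names, hosts, and binding names below are placeholders), a two-binder RabbitMQ application.yml typically looks like:

```yaml
spring:
  cloud:
    stream:
      binders:
        rabbit1:
          type: rabbit
          environment:
            spring:
              rabbitmq:
                host: host-1.example.com
                port: 5672
        rabbit2:
          type: rabbit
          environment:
            spring:
              rabbitmq:
                host: host-2.example.com
                port: 5672
      bindings:
        consumeA-in-0:
          destination: queueA
          binder: rabbit1
        consumeB-in-0:
          destination: queueB
          binder: rabbit2
```

Each binder gets its own environment block, and each binding selects which binder it uses via the binder property.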

            Source https://stackoverflow.com/questions/71216982

            QUESTION

            Binding GlobalStateStore into Processor with spring-cloud-stream-binder-kafka
            Asked 2022-Feb-17 at 17:01

            Initial Question: I have a question about how I can bind my GlobalStateStore to a processor. My application has a GlobalStateStore with its own processor ("GlobalConfigProcessor") to keep the store up to date. Also, I have another processor ("MyClassProcessor") which is called in my consumer function. Now I try to access the store from MyClassProcessor, but I get an exception saying: Invalid topology: StateStore config_statestore is not added yet.

            Update on current situation: I setup a test repository to give a better overview over my situation. This can be found here: https://github.com/fx42/store-example

            As you can see in the repo, I have two consumers which consume different topics. The Config-Topic provides an event which I want to write to a GlobalStateStore. StateStoreUpdateConsumer.java and StateStoreProcessor.java are involved here. With MyClassEventConsumer.java I process another input topic and want to read values from the GlobalStateStore. As provided in this doc, I can't initialize GlobalStateStores just as a StateStoreBean; instead I have to add them explicitly with a StreamsBuilderFactoryBeanCustomizer bean. This code is currently commented out in StreamConfig.java. Without it I get the exception

            ...

            ANSWER

            Answered 2022-Feb-17 at 17:01

            I figured out my problem. For me it was the @EnableKafkaStreams annotation which I used. I assume this was the reason I had two different contexts running in parallel, and they collided. Also, I needed to use StreamsBuilderFactoryBeanConfigurer instead of StreamsBuilderFactoryBeanCustomizer to get the GlobalStateStore registered correctly. These changes are in the linked test repo, which can now start the application context properly.

            Source https://stackoverflow.com/questions/71145107

            QUESTION

            Set key Serde in spring cloud streams for kafka
            Asked 2022-Jan-27 at 14:22

            I'm trying to set up a Spring Cloud Stream project with Kafka. Everything is working as expected except the key de/serialization. The files are the pom.xml:

            ...

            ANSWER

            Answered 2022-Jan-27 at 14:22

            You can override the default serializer at the binder or binding level. e.g. for a specific binding:
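            The binding-level snippet is elided above; a hedged application.properties sketch of what such an override usually looks like with the Kafka binder (the binding names process-in-0/process-out-0 are placeholders for your own bindings):

```properties
# Per-binding override of the key deserializer on the consumer side
spring.cloud.stream.kafka.bindings.process-in-0.consumer.configuration.key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
# Per-binding override of the key serializer on the producer side
spring.cloud.stream.kafka.bindings.process-out-0.producer.configuration.key.serializer=org.apache.kafka.common.serialization.StringSerializer
```

The configuration map under each binding passes properties straight through to the underlying Kafka client for that binding only; a binder-wide default would go under spring.cloud.stream.kafka.binder.configuration instead.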

            Source https://stackoverflow.com/questions/70871226

            QUESTION

            Dynamic destination in Spring Cloud Stream from Azure Event Hub to Kafka
            Asked 2022-Jan-21 at 17:07

            I'm trying to use Spring Cloud Stream to process messages sent to an Azure Event Hub instance. Those messages should be routed to a tenant-specific topic determined at runtime, based on message content, on a Kafka cluster. For development purposes, I'm running Kafka locally via Docker. I've done some research about bindings not known at configuration time and have found that dynamic destination resolution might be exactly what I need for this scenario.

            However, the only way I can get my solution working is to use StreamBridge. I would rather use the dynamic destination header spring.cloud.stream.sendto.destination, so that the processor could be written as a Function<> instead of a Consumer<> (it is not properly a sink). The main concern with this approach is that, since the final solution will be deployed with Spring Cloud Data Flow, I'm afraid I will have trouble configuring the streams if I use StreamBridge.

            Moving on to the code, this is the processor function, I stripped away the unrelated parts

            ...

            ANSWER

            Answered 2022-Jan-20 at 21:56

            Not sure what exactly is causing the issues you have. I just created a basic sample app demonstrating the sendto.destination header and verified that the app works as expected. It is a multi-binder application with two Kafka clusters connected. The function will consume from the first cluster and then using the sendto header, produce the output to the second cluster. Compare the code/config in this sample with your app and see what is missing.

            I see references to StreamBridge in the stacktrace you shared. However, when using the sendto.destination header, it shouldn't go through StreamBridge.
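            To make the header mechanism concrete, here is a plain-Java sketch. The Msg record is a simplified stand-in for Spring's Message type, and the tenant-topic naming is an assumption; only the header name spring.cloud.stream.sendto.destination comes from the Spring Cloud Stream docs.

```java
import java.util.Map;
import java.util.function.Function;

public class SendtoHeaderSketch {
    // Simplified stand-in for a Spring Message: payload plus headers.
    record Msg(String payload, Map<String, Object> headers) {}

    static final String SENDTO = "spring.cloud.stream.sendto.destination";

    // The processor stays a Function: it derives the tenant-specific topic
    // from the message content and sets it as the sendto header. The binder
    // (not shown) reads that header and publishes to the named destination.
    static Function<Msg, Msg> router() {
        return in -> {
            String tenant = in.payload().split(":", 2)[0];
            return new Msg(in.payload(), Map.of(SENDTO, "tenant-" + tenant));
        };
    }

    public static void main(String[] args) {
        Msg out = router().apply(new Msg("acme:order-42", Map.of()));
        System.out.println(out.headers().get(SENDTO)); // tenant-acme
    }
}
```

The point of the sketch is that the routing decision lives in the returned message's headers rather than in an explicit StreamBridge call, which keeps the processor a plain Function.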

            Source https://stackoverflow.com/questions/70785204

            QUESTION

            spring cloud stream kafka: Error creating bean with name 'supplierInitializer' defined in class path resource
            Asked 2021-Dec-07 at 15:33

            In the event you only have a single bean of type java.util.function.[Supplier/Function/Consumer], you can skip the spring.cloud.function.definition property, since such a functional bean will be auto-discovered. However, it is considered best practice to use the property to avoid any confusion.

            So I do have multiple beans in my project but still only one bean of type supplier.

            Not sure exactly what I am missing.

            EXCEPTION TRACE:

            ...

            ANSWER

            Answered 2021-Dec-07 at 15:33

            For 2021.0.0 you have to use:

            Source https://stackoverflow.com/questions/70260291

            QUESTION

            What is the difference between spring-kafka and Apache-Kafka-Streams-Binder regarding the interaction with Kafka Stream API?
            Asked 2021-Nov-11 at 00:31

            My understanding was that spring-kafka was created to interact with Kafka Client APIs, and later on, spring-cloud-stream project was created for "building highly scalable event-driven microservices connected with shared messaging systems", and this project includes a couple of binders, one of them is a binder that allows the interaction with Kafka Stream API:

            ...

            ANSWER

            Answered 2021-Nov-11 at 00:31

            As Gary pointed out in the comments above, spring-kafka is the lower-level library that provides the building blocks for the Spring Cloud Stream Kafka Streams binder (spring-cloud-stream-binder-kafka-streams). The binder provides a programming model with which you can write your Kafka Streams processor as a java.util.function.Function or java.util.function.Consumer. You can have multiple such functions, and each of them will build its own Kafka Streams topology. Behind the scenes, the binder uses spring-kafka to build the Kafka Streams StreamsBuilder object using the StreamsBuilderFactoryBean. The binder also allows you to compose various functions. The functional model comes largely from Spring Cloud Function, but it is adapted for Kafka Streams in the binder implementation. The short answer is that both spring-kafka and the Spring Cloud Stream Kafka Streams binder will work, but the binder gives a programming model and extra features consistent with Spring Cloud Stream, whereas spring-kafka gives various low-level building blocks.

            Source https://stackoverflow.com/questions/69909019

            QUESTION

            Spring Cloud Stream database transaction does not roll back
            Asked 2021-Aug-26 at 16:38

            I am trying to write a spring-cloud-stream function (spring-starter-parent 2.5.3, java 11, spring-cloud-version 2020.0.3) which has both a Kafka and Postgres transaction. The function will raise a simulated error whenever the consumed message starts with the string "fail," which I expect to cause the database transaction to roll back, then cause the kafka transaction to roll back. (I am aware that the Kafka transaction is not XA, which is fine.) So far I have not gotten the database transaction to work, but the kafka transaction does.

            Currently I am using a @Transactional annotation, which does not appear to start a database transaction. (The Kafka binder documentation recommends synchronizing database + kafka transactions using the ChainedTransactionManager, but the Spring Kafka documentation states it is deprecated in favor of using the @Transactional annotation, and the S.C.S. example for this problem uses the @Transactional annotation and the default transaction manager created by the start-jpa library (I think)). I can see in my debugger that regardless of whether or not I @EnableTransactionManagement and use a @Transactional on my consumer, the consumer is executed in a kafka transaction using a transaction template higher in the stack, but I do not see a database transaction anywhere.

            I have a few questions I want to understand:

            • Am I correct that the Kafka Listener Container runs my consumers in the context of a Kafka transaction regardless of whether or not I have a @Transactional annotation? And if so, is there a way to only run specific functions in a Kafka transaction?
            • Would the above change for producers, since the container doesn't have a way to intercept calls to the producers (as far as I know)?
            • What should I do to synchronize the Kafka and database transactions so that the DB commit happens before the Kafka commit?

            I have the following Crud Repository, collection of handlers, and application.yml:

            ...

            ANSWER

            Answered 2021-Aug-26 at 16:38
              @Bean
              @Transactional
              public Consumer persistAndSplit(
                  StreamBridge bridge,
                  AuditLogRepository repository
              ) {
                  // ...
              }

            Source https://stackoverflow.com/questions/68941306

            QUESTION

            Failed sending to DLQ using Spring Cloud Stream in batch mode
            Asked 2021-Aug-16 at 14:05

            Trying to configure Spring to send bad messages to a dead letter queue while using batch mode, but nothing ends up in the DLQ topic.

            I use Spring Boot 2.5.3 and Spring Cloud 2020.0.3, which automatically resolves the version of spring-cloud-stream-binder-kafka-parent as 3.1.3.

            Here is application.properties:

            ...

            ANSWER

            Answered 2021-Aug-09 at 15:35

            When using spring-cloud-stream, the container is not created by Boot's container factory, it is created by the binder; the error handler @Bean won't be automatically wired in.

            You have to configure a ListenerContainerCustomizer @Bean instead.

            Example here: Can I apply graceful shutdown when using Spring Cloud Stream Kafka 3.0.3.RELEASE?

            Source https://stackoverflow.com/questions/68634379

            QUESTION

            spring version 2.6.7 doesn't support REPLACE_THREAD option in KStream
            Asked 2021-Aug-16 at 07:45

            I use spring boot version 2.5.3, spring-cloud-stream-binder-kafka-stream version 3.1.3 and kafka-clients version 2.8.0. I want to use REPLACE_THREAD option for uncaught exception handler in kafka streams.

            But I'm not able to use it, since StreamsBuilderFactoryBeanConfigurer (version 2.6.7) doesn't support fb.setUncaughtExceptionHandler(ex -> { log.error("Uncaught exception: ", ex); snsService.publish("UncaughtException thrown"); return StreamsUncaughtExceptionHandler.StreamThreadExceptionResponse.REPLACE_THREAD; });

            Is it possible to replace the streams thread with fb.setUncaughtExceptionHandler(new Thread.UncaughtExceptionHandler())?

            Thanks in Advance!

            ...

            ANSWER

            Answered 2021-Aug-16 at 07:45

            Community Discussions, Code Snippets contain sources that include Stack Exchange Network

            Vulnerabilities

            No vulnerabilities reported

            Install spring-cloud-stream-binder-kafka

            You can download it from GitHub, Maven.
            You can use spring-cloud-stream-binder-kafka like any standard Java library. Please include the jar files in your classpath. You can also use any IDE to run and debug the spring-cloud-stream-binder-kafka component as you would any other Java program. Best practice is to use a build tool that supports dependency management, such as Maven or Gradle. For Maven installation, please refer to maven.apache.org; for Gradle installation, please refer to gradle.org.
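            Since the page notes a deployable package in Maven but omits the coordinates, here is a hedged Maven dependency sketch (the version is taken from the 4.0.4 shown above; check Maven Central for the current release):

```xml
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-stream-binder-kafka</artifactId>
    <version>4.0.4</version>
</dependency>
```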

            Support

            There is a "full" profile that will generate documentation.
            Find more information at https://github.com/spring-cloud/spring-cloud-stream-binder-kafka.
            Clone

          • HTTPS: https://github.com/spring-cloud/spring-cloud-stream-binder-kafka.git

          • GitHub CLI: gh repo clone spring-cloud/spring-cloud-stream-binder-kafka

          • SSH: git@github.com:spring-cloud/spring-cloud-stream-binder-kafka.git


            Consider Popular Pub Sub Libraries

            • EventBus by greenrobot
            • kafka by apache
            • celery by celery
            • rocketmq by apache
            • pulsar by apache

            Try Top Libraries by spring-cloud

            • spring-cloud-netflix by spring-cloud (Java)
            • spring-cloud-gateway by spring-cloud (Java)
            • spring-cloud-kubernetes by spring-cloud (Java)
            • spring-cloud-config by spring-cloud (Java)
            • spring-cloud-sleuth by spring-cloud (Java)