spring-cloud-stream-samples | Samples for Spring Cloud Stream | Continuous Deployment library
kandi X-RAY | spring-cloud-stream-samples Summary
This repository contains a collection of applications written using Spring Cloud Stream. All the applications are self-contained and can be run against either Kafka or RabbitMQ. You can run the samples against local or Docker-containerized versions of Kafka and RabbitMQ; for convenience, a docker-compose.yml file is provided with each application where applicable. For this reason, Docker Compose is required, and it is recommended to use the latest version. These compose files bring up the middleware (Kafka or RabbitMQ) and the other components needed to run each app. If you bring up Kafka or RabbitMQ in Docker containers, please make sure that you bring them down from the same sample directory.

Each sample has a README you can follow to run it. You can build all the samples by going to the root of the repository and running:

./mvnw clean package

However, the recommended approach is to pick the sample you are interested in, go to that particular app, and follow the instructions in its README.
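As a sketch, a typical per-sample workflow looks like the following (the sample directory shown is one that appears later on this page; always check the sample's own README, since the exact files and commands vary per app):

```shell
# From the repository root: build every sample (slow)
./mvnw clean package

# Recommended: work on one sample at a time
cd source-samples/dynamic-destination-source-kafka

# Bring up Kafka/RabbitMQ and supporting containers (where a compose file exists)
docker-compose up -d

./mvnw clean package
java -jar target/*.jar

# When done, bring the containers down from the SAME sample directory
docker-compose down
```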
Top functions reviewed by kandi - BETA
- Downloads a file from a URL.
- Creates a thumbnail for the given URL.
- Consumer consumer.
- Create a function that invokes a couchbase consumer.
- Process an order request.
- The source1 bean.
- Factory method for timer messages.
- Sends a sensor.
- Handles a POST request.
- Generate a random message.
spring-cloud-stream-samples Key Features
spring-cloud-stream-samples Examples and Code Snippets
Community Discussions
Trending Discussions on spring-cloud-stream-samples
QUESTION
I'm trying to create a reactive Spring Cloud Stream application with Kafka following the functional approach (Spring Boot 2.3.4, SC Hoxton.SR9, SC Stream 3.0.9, SC Function 3.0.11). Problem: the automatically deserialized object has empty field values.
JSON payload of the Kafka message:
ANSWER
Answered 2020-Dec-15 at 11:07
According to https://github.com/spring-cloud/spring-cloud-stream/issues/2056, this issue is related to the 3.0.11-RELEASE of spring-cloud-function.
Downgrading spring-cloud-function to 3.0.10-RELEASE solved the issue for now:
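One way to apply the downgrade is via Maven dependency management. The following is a sketch using the standard spring-cloud-function-dependencies BOM; verify the coordinates and version against your own build before relying on it:

```xml
<!-- pom.xml (sketch): pin spring-cloud-function to 3.0.10.RELEASE -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.springframework.cloud</groupId>
      <artifactId>spring-cloud-function-dependencies</artifactId>
      <version>3.0.10.RELEASE</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Because this BOM is imported before the Spring Cloud release train BOM resolves its own version, the 3.0.10.RELEASE artifacts win.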
QUESTION
I have a working application that uses the latest update for Producers that came with Hoxton. Now I'm trying to add some integration tests, asserting that the Producer is actually producing a message as expected. The problem is, the consumer I use in the test never reads anything from the topic.
In order to make this issue reproducible, I've reused a project (spring-cloud-stream-samples/source-samples/dynamic-destination-source-kafka) from the Spring Cloud Stream samples, adapting it as follows:
DynamicDestinationSourceApplication (The EmitterProcessor is now a bean)
ANSWER
Answered 2020-Feb-03 at 01:07
I had testImplementation("org.springframework.cloud:spring-cloud-stream-test-support") incorrectly added as a dependency. This uses a test binder that is not meant to be used with integration tests.
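A sketch of the fix in a Gradle build, assuming the embedded broker comes from spring-kafka-test (adapt to your dependency setup):

```groovy
// build.gradle (sketch)

// The test binder replaces the real Kafka binder, so integration tests
// never touch the broker. Remove it:
// testImplementation("org.springframework.cloud:spring-cloud-stream-test-support")

// Keep the embedded Kafka broker for integration tests instead:
testImplementation("org.springframework.kafka:spring-kafka-test")
```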
QUESTION
I'm trying to consume Confluent Avro messages from a Kafka topic as a KStream with Spring Boot 2.0. I was able to consume the messages as a MessageChannel, but not as a KStream.
ANSWER
Answered 2019-Nov-25 at 15:11
Try spring.cloud.stream.kafka.streams.binder.configuration.schema.registry.url: ...
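Spelled out in an application.yml, the suggested property looks roughly like this (the localhost URL is an assumption for a local Confluent Schema Registry; substitute your own):

```yaml
# application.yml (sketch): pass schema.registry.url through the
# Kafka Streams binder's configuration to the underlying clients
spring:
  cloud:
    stream:
      kafka:
        streams:
          binder:
            configuration:
              schema.registry.url: http://localhost:8081
```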
QUESTION
I am trying to build out a simple streams app based on Kafka Streams using this example.
However, when I start the app, I get the error below. Can someone please point out what I am missing here? Here is the code, config & error.
ANSWER
Answered 2018-Jun-05 at 07:05
Try moving the EnableBinding annotation to the DemoApplication class. I believe it should be put on a @Configuration class, not on an arbitrary @Component.
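A minimal sketch of the suggested placement (the class name comes from the answer; the Processor binding is an assumption, since the asker's actual binding interface isn't shown):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.stream.annotation.EnableBinding;
import org.springframework.cloud.stream.messaging.Processor;

// @SpringBootApplication is itself meta-annotated with @Configuration,
// so @EnableBinding belongs here rather than on an arbitrary @Component.
@SpringBootApplication
@EnableBinding(Processor.class)
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}
```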
QUESTION
I have a Spring Boot application where I am using spring-cloud-stream to consume from a Kafka topic, do some processing, and publish to another Kafka topic. The application works fine, and I've written unit tests (using the TestBinder) which run fine as well.
I am now trying to write an integration test with an embedded Kafka broker to test the end-to-end functionality. I have followed the sample here: https://github.com/spring-cloud/spring-cloud-stream-samples/blob/master/testing-samples/test-embedded-kafka/src/test/java/demo/EmbeddedKafkaApplicationTests.java. However, this is not working - I am unable to receive any message on the output topic.
application.yml
ANSWER
Answered 2019-Oct-12 at 13:24
ConsumerRecords records = consumer.poll(0);
You need to wait for the subscription to occur; 0 won't do it; the sample waits for up to 10 seconds.
However, it's safer to use embeddedKafkaRule().getEmbeddedKafka().consumeFromAnEmbeddedTopic(...); because it reliably waits for assignment using a ConsumerRebalanceListener.
Once subscribed, you can also use
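A hedged sketch of the safer variant described above, using spring-kafka-test helpers (the field name, group id, and topic name are assumptions; adapt them to the sample's test class):

```java
// Inside a test using an embedded Kafka broker (e.g. an EmbeddedKafkaRule)
Map<String, Object> props = KafkaTestUtils.consumerProps(
        "testGroup", "false", embeddedKafka.getEmbeddedKafka());
Consumer<String, String> consumer =
        new DefaultKafkaConsumerFactory<String, String>(props).createConsumer();

// Subscribes and reliably waits for partition assignment
// via a ConsumerRebalanceListener
embeddedKafka.getEmbeddedKafka()
        .consumeFromAnEmbeddedTopic(consumer, "output-topic");

// Now polling with a real timeout will see the records
ConsumerRecords<String, String> records =
        KafkaTestUtils.getRecords(consumer, 10_000); // wait up to 10 seconds
```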
QUESTION
I'm trying to modify one of the Spring Cloud Stream samples, and the results I'm getting are confusing - even though I registered only a single stream listener for my channel, I'm getting only every second message. I suspect this is caused by default load balancing for a single Kafka partition, but I can't figure out how to confirm this.
docker ps shows only a single instance of the Kafka broker being up.
ANSWER
Answered 2019-Aug-12 at 13:43
You have two consumers on the output channel - the binding to the topic and your receive() service activator.
The default round robin processing sends messages alternately to your service activator and the topic.
QUESTION
Is it possible to use interactive queries (InteractiveQueryService) in Spring Cloud Stream within a class annotated with @EnableBinding or a method annotated with @StreamListener? I tried instantiating ReadOnlyKeyValueStore within the provided KStreamMusicSampleApplication class and its process method, but it is always null.
My @StreamListener method is listening to a bunch of KTables and KStreams, and during the processing topology (e.g. filtering), I have to check whether the key from a KStream already exists in a particular KTable.
I tried to figure out how to scan an incoming KTable to check if a key already exists, but no luck. Then I came across InteractiveQueryService, whose get() method can be used to check if a key exists inside a state store materialized from a KTable. The problem is that I can't access it from within the processing topology (@EnableBinding or @StreamListener). It can only be accessed outside these annotations, e.g. from a RestController.
Is there a way to scan an incoming KTable to check for the existence of a key or value? If not, can we access InteractiveQueryService within the processing topology?
ANSWER
Answered 2019-May-01 at 14:16
InteractiveQueryService in Spring Cloud Stream is not available to be used within the actual topology in your StreamListener. As you mentioned, it is supposed to be used outside of your main topology. However, with the use case you described, you can still use the state store from your main flow. For example, if you have an incoming KStream and a KTable which is materialized as a state store, then you can call process on the KStream and access the state store that way. Here is some rough code to achieve that. You need to adapt this to fit your specific use case, but here is the idea.
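A rough sketch of that idea, using the Kafka Streams Processor API (the store name "my-store" and the key/value types are assumptions; the store name must match what the KTable is materialized as):

```java
// Access the KTable's backing state store from the KStream's topology.
// The second argument to process() connects the named store to this processor.
input.process(() -> new Processor<String, Long>() {

    private KeyValueStore<String, Long> store;

    @Override
    @SuppressWarnings("unchecked")
    public void init(ProcessorContext context) {
        // "my-store" is the name the KTable was materialized as
        this.store = (KeyValueStore<String, Long>) context.getStateStore("my-store");
    }

    @Override
    public void process(String key, Long value) {
        if (this.store.get(key) != null) {
            // the key already exists in the KTable's state store
        }
    }

    @Override
    public void close() {
    }
}, "my-store");
```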
QUESTION
Spring Kafka, and thus Spring Cloud Stream, allow us to create transactional Producers and Processors. We can see that functionality in action in one of the sample projects: https://github.com/spring-cloud/spring-cloud-stream-samples/tree/master/transaction-kafka-samples:
ANSWER
Answered 2019-Apr-26 at 14:11
There is no guarantee, only within Kafka itself.
Spring provides transaction synchronization so that the commits are close together, but it is possible for the DB to commit while Kafka does not. So you have to deal with the possibility of duplicates.
The correct way to do this, when using spring-kafka directly, is NOT with @Transactional but with a ChainedKafkaTransactionManager in the listener container.
See Transaction Synchronization.
Also see Distributed transactions in Spring, with and without XA and the "Best Efforts 1PC pattern" for background.
However, with Stream, there is no support for the chained transaction manager, so @Transactional is required (with the DB transaction manager). This will provide results similar to the chained transaction manager, with the DB committing first, just before Kafka.
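In a Stream processor, that arrangement looks roughly like this (method, type, and repository names are assumptions, not taken from the sample):

```java
// Sketch: DB work and Kafka publishing in one listener method.
// @Transactional binds the DB transaction manager (e.g. a JPA one);
// the DB commits first, just before the Kafka transaction commits,
// so duplicate handling is still the consumer's responsibility.
@Transactional
@StreamListener(Processor.INPUT)
@SendTo(Processor.OUTPUT)
public OrderEvent handle(OrderEvent event) {
    orderRepository.save(toEntity(event)); // committed by the DB tx manager
    return event;                          // published within the Kafka transaction
}
```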
QUESTION
I have microservices built based on Spring Cloud Stream. The testing team needs to create integration tests for these services. What are the best practices?
Based on the sample below, the Sink/Source/Processor interfaces from different applications need to be on the classpath of the testing project. Is the expectation to package each service and include it in the testing project?
Thanks
ANSWER
Answered 2019-Apr-18 at 10:53
When it comes to "integration" testing of spring-cloud-stream, the scope starts and stops within a single stream. What you are asking about is testing a flow where several streams are connected via remote queues/topics, etc. That is out of scope of spring-cloud-stream testing.
However, there is another framework which is specifically designed to create, manage, monitor, and control, as well as test, these flows. I am talking about Spring Cloud Data Flow, where with a simple set of commands and/or using the GUI you can assemble your stream apps into a flow.
QUESTION
I am trying to configure sending messages out to two output streams, like the following.
ANSWER
Answered 2018-Nov-14 at 17:55
It is not possible to send to multiple destinations as you describe from a StreamListener method when using the regular MessageChannel-based binders. It is possible to send to multiple topics using the Kafka Streams binder's branching feature, which you are referring to in the link provided above. If you want to send to multiple destinations in your application, one option is to use the dynamic destination feature of Spring Cloud Stream. Here is an example of how dynamic destinations work.
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install spring-cloud-stream-samples
You can use spring-cloud-stream-samples like any standard Java library. Include the jar files in your classpath. You can also use any IDE, and you can run and debug the spring-cloud-stream-samples components as you would any other Java program. Best practice is to use a build tool that supports dependency management, such as Maven or Gradle. For Maven installation, refer to maven.apache.org. For Gradle installation, refer to gradle.org.