flume-kafka | A kafka source & sink for flume | Stream Processing library
kandi X-RAY | flume-kafka Summary
Since this plugin for flume is going to be merged into flume, I've split it into two independent plugins, [flume-ng-kafka-source] and [flume-ng-kafka-sink]. The ASFv2 branch is still okay, but I advise using the new plugins. This project is used by [flume-ng] to communicate with [kafka 0.7.2]. As of v0.2, the parameters passed to flume-kafka are handled by users, not by code. Before this version, I hard-coded many Kafka parameters and their default values. That is to say, whatever parameters you write in the conf file will now be passed straight to the Kafka producer or consumer. I cannot control whether the parameters you wrote will take effect; the responsibility for using correct parameters, or finding out which parameters to use, is in my opinion yours. On the other hand, this keeps things simple if Kafka adds new parameters :). Configuration of Kafka Sink.
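As a sketch of what a sink configuration for this plugin might look like (the sink class name, broker address, topic, and channel name below are illustrative assumptions; check the plugin's own README for the exact property names):

```properties
# Hypothetical agent "producer" with sink "r"; this prefix matches the
# property style used in the discussions below, but adjust to your agent.
producer.sinks.r.type = org.apache.flume.plugins.KafkaSink   # assumed class name
producer.sinks.r.metadata.broker.list = 127.0.0.1:9092       # Kafka 0.7.x broker
producer.sinks.r.serializer.class = kafka.serializer.StringEncoder
producer.sinks.r.request.required.acks = 1
producer.sinks.r.custom.topic.name = test                    # assumed property name
producer.sinks.r.channel = c1
```

As the paragraph above explains, any other properties you put under the sink's prefix are passed through to the Kafka producer unchecked, so their correctness is up to you.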
Top functions reviewed by kandi - BETA
- Set Kafka topic
- Gets the Kafka config properties
- Creates a producer
- Returns the value of a Kafka config parameter
- Configure the consumer
- Creates a consumer
- Process the incoming messages
- Opens the event
- Stop the producer
- Stops the consumer
flume-kafka Key Features
flume-kafka Examples and Code Snippets
Community Discussions
Trending Discussions on flume-kafka
QUESTION
I am trying to set up a KafkaChannel (or KafkaSource) in Flume, and I constantly receive the following exception:
Caused by: javax.security.auth.login.LoginException: Could not login: the client is being asked for a password, but the Kafka client code does not currently support obtaining a password from the user. Make sure -Djava.security.auth.login.config property passed to JVM and the client is configured to use a ticket cache (using the JAAS configuration setting 'useTicketCache=true)'. Make sure you are using FQDN of the Kafka broker you are trying to connect to. not available to garner authentication information from the user
My jaas.conf is the following:
ANSWER
Answered 2019-Jun-11 at 11:29
Thanks to this post (original), I noticed that the KafkaClient config specified in the Flume 1.6 documentation provided by Cloudera was missing some options. I then looked at the official Apache Flume 1.7 documentation and noticed that I was missing the following properties:
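The actual property list is not reproduced in this copy. As a sketch of what such a fix typically looks like, the Flume 1.7 documentation names Kerberos-related Kafka consumer properties along these lines (the agent and source names here are placeholders):

```properties
# Placeholder agent "a1" / source "r1"; property names per Flume 1.7 docs.
a1.sources.r1.kafka.consumer.security.protocol = SASL_PLAINTEXT
a1.sources.r1.kafka.consumer.sasl.mechanism = GSSAPI
a1.sources.r1.kafka.consumer.sasl.kerberos.service.name = kafka
```

together with a KafkaClient section in jaas.conf that logs in from a keytab (or sets useTicketCache=true, as the exception message suggests) instead of prompting for a password; the keytab path and principal below are assumptions:

```
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/path/to/flume.keytab"
  principal="flume/host@EXAMPLE.COM";
};
```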
QUESTION
We've met a strange problem with flume-kafka-sink: the kafka broker failed multiple times and duplicate messages were produced (every 50 records are the same), even though we set producer.sinks.r.request.required.acks = 1. Quoting the kafka documentation, "This option provides the lowest latency but the weakest durability guarantees (some data will be lost when a server fails)", so shouldn't it lose data rather than produce duplicates? Does that mean the problem is caused by flume or by flume-kafka-sink?
ANSWER
Answered 2018-Jan-30 at 09:13
Flume-Kafka-Sink produces messages batch by batch and retries after a failed write. While a broker is down, some partition leaders are unreachable, so a batch write succeeds for some partitions but fails for others. When Flume-Kafka-Sink retries the whole batch, the part that already succeeded is written again and shows up as duplicates.
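This failure mode can be sketched with a toy model (all names here are hypothetical, not Flume or Kafka APIs): a simulated broker where one partition's leader is down on the first attempt, and a sink that resends the entire batch on any failure:

```java
import java.util.ArrayList;
import java.util.List;

// Toy model of whole-batch retry causing duplicates: records that landed on
// the healthy partition during the failed attempt are written a second time.
public class BatchRetryDuplicates {
    static List<String> brokerLog = new ArrayList<>(); // durably written records
    static boolean partition1Up = false;               // partition 1 leader down at first

    // Returns true only if every record in the batch was accepted.
    static boolean sendBatch(List<String> batch) {
        boolean allOk = true;
        for (int i = 0; i < batch.size(); i++) {
            int partition = i % 2;                 // simple round-robin partitioning
            if (partition == 1 && !partition1Up) {
                allOk = false;                     // leader unreachable, write fails
            } else {
                brokerLog.add(batch.get(i));       // write succeeds and is durable
            }
        }
        return allOk;
    }

    public static void main(String[] args) {
        List<String> batch = List.of("e1", "e2", "e3", "e4");
        if (!sendBatch(batch)) {  // first attempt: partition 1 is down
            partition1Up = true;  // broker recovers
            sendBatch(batch);     // sink retries the WHOLE batch, not just failures
        }
        System.out.println(brokerLog); // prints [e1, e3, e1, e2, e3, e4]
    }
}
```

Note that e1 and e3 (partition 0) appear twice: they succeeded on the first attempt and were resent with the retry. With acks = 1 the sink only knows the batch as a whole failed, so retrying trades possible data loss for possible duplication.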
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install flume-kafka
You can use flume-kafka like any standard Java library. Please include the jar files in your classpath. You can also use any IDE to run and debug the flume-kafka component as you would any other Java program. Best practice is to use a build tool that supports dependency management, such as Maven or Gradle. For Maven installation, please refer to maven.apache.org; for Gradle installation, please refer to gradle.org.