connect-cassandra | A Cassandra session store for connect
kandi X-RAY | connect-cassandra Summary
Community Discussions: Trending on connect-cassandra
QUESTION
I went through the Cassandra Sink doc but I don't see how to specify the partition and clustering keys.
The doc says this:
You can configure this connector to manage the schema on the Cassandra cluster. When altering an existing table the key is ignored. This is to avoid the potential issues around changing a primary key on an existing table. The key schema is used to generate a primary key for the table when it is created.
If it is a new table, the Connector will use the Key schema (from the KStream, I suppose) to create the primary key. That might be OK for the partition key, but not for the clustering key.
So are we forced to create all the tables with the right keys before running the streaming app, or is there a way to adjust things?
ANSWER
Answered 2020-Apr-04 at 14:11
Confluent's connector requires that every column in the primary key be present in the key of the topic (as a struct, if I remember correctly). This is one of its limitations, as it may not match what your application produces. In that case you'll need to transform the topic to meet this requirement.
Instead of Confluent's connector, I recommend the DataStax Kafka Connector, which is carefully designed for efficient loading of data into Cassandra/DSE. It has the following features (more information is in the following blog post):
- stores data from one topic into one or multiple Cassandra tables (to support data denormalization);
- mapping of the topic's data onto Cassandra columns is defined by a configuration file, so you can take any piece of the message's key or value and map it to a column;
- very efficient, using unlogged batches where possible, and lightweight;
- supports the different security features of Cassandra/DSE.
The connector is free to use with DSE starting from DSE 4.8, and with Cassandra starting from 2.1.
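As an illustration of the mapping feature, here is a hedged sketch of a DataStax Kafka Connector sink configuration. The topic, keyspace, table, and field names are hypothetical, and the exact connector class name varies between connector versions:

```properties
# DataStax Kafka Connector sink configuration (sketch -- all names hypothetical)
connector.class=com.datastax.kafkaconnector.DseSinkConnector
topics=orders
contactPoints=127.0.0.1

# Route pieces of the record's key and value onto columns of shop.orders:
# order_id is taken from the message key, the other columns from the value.
topic.orders.shop.orders.mapping=order_id=key.id, customer=value.customer, total=value.total
```

Because the mapping names each column's source explicitly, the table (with its partition and clustering keys) is designed in Cassandra first, and the connector is then told how to fill it.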
QUESTION
I am trying to set up kafka-connect-cassandra on an AWS instance.
I have set plugin.path in the connect-avro-distributed.properties file:
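(The actual property value was omitted from the question. As a hedged illustration only, a typical worker setting might look like the sketch below; the directories are hypothetical, and each listed directory should contain the connector's plugin folder rather than individual jars.)

```properties
# connect-avro-distributed.properties (sketch -- paths are hypothetical)
# Comma-separated list of directories the Connect worker scans for plugins;
# each listed directory holds one subdirectory per installed connector.
plugin.path=/usr/share/java,/home/ec2-user/connect-plugins
```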
ANSWER
Answered 2018-Feb-18 at 00:59
This is a classpath issue. It looks like you may have an incompatible version of Guava on the classpath. If none of the jars on your plugin path contains this method, that's a connector packaging issue. If one does, then you probably have two versions hanging around. As a first step, double-check the plugin path with a find command and inspect every jar for the class named in the error message. Ultimately, you'll need to figure out which version of the dependency the connector expects and get that version, and only that version, onto the plugin path.
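As a sketch of that inspection step: entry names inside a jar are stored as plain text, so grep alone can flag every jar on the plugin path that bundles the class from the error message. The block below fabricates two dummy "jars" containing the Guava class name so it is self-contained; the plugin path and class are hypothetical stand-ins for your own.

```shell
# Hypothetical plugin path -- substitute the value from your worker config.
PLUGIN_PATH=/tmp/demo-plugins
mkdir -p "$PLUGIN_PATH/kafka-connect-cassandra"

# Simulate the duplicate-dependency case: two bundled "jars" whose entry
# listings mention the same Guava class, plus one unrelated jar.
printf 'com/google/common/base/Preconditions.class' \
  > "$PLUGIN_PATH/kafka-connect-cassandra/guava-18.0.jar"
printf 'com/google/common/base/Preconditions.class' \
  > "$PLUGIN_PATH/kafka-connect-cassandra/guava-25.1-jre.jar"
printf 'some/other/Thing.class' \
  > "$PLUGIN_PATH/kafka-connect-cassandra/kafka-connect-cassandra-1.0.jar"

# Print every jar under the plugin path that mentions the offending class.
grep -rl --include='*.jar' 'com/google/common/base/Preconditions' "$PLUGIN_PATH"
```

If this prints more than one jar, two versions of the dependency are likely on the plugin path, and all but the one the connector expects should be removed.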
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported