yb-kafka-connector | Kafka Connect YugabyteDB Connector | Change Data Capture library

by yugabyte | Java | Version: Current | License: Apache-2.0

kandi X-RAY | yb-kafka-connector Summary

yb-kafka-connector is a Java library typically used in Utilities, Change Data Capture, and Kafka applications. It has no reported bugs or vulnerabilities, has a build file available, carries a permissive license, and has low support. You can download it from GitHub.

In this approach, the source connector streams table updates in YugabyteDB to Kafka topics. It is based on YugabyteDB's Change Data Capture (CDC) feature. CDC allows the connector to simply subscribe to these table changes and then publish the changes to selected Kafka topics. More documentation to follow.
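
For illustration, a downstream application could read the change events that the connector publishes using the standard Kafka consumer API. The following is a minimal sketch, assuming a local broker at localhost:9092 and a placeholder topic name; the actual record format depends on how the connector is configured.

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ChangeEventReader {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Broker address and group id are assumptions for a local test setup.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "cdc-reader");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Topic name is a placeholder; use whatever topic the connector publishes to.
            consumer.subscribe(Collections.singletonList("test_topic"));
            for (int i = 0; i < 10; i++) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("key=%s value=%s%n", record.key(), record.value());
                }
            }
        }
    }
}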

kandi X-RAY | Support

              yb-kafka-connector has a low active ecosystem.
It has 13 stars, 6 forks, and 8 watchers.
It had no major release in the last 6 months.
There are 8 open issues and 1 closed issue. There is 1 open pull request and 0 closed pull requests.
              It has a neutral sentiment in the developer community.
              The latest version of yb-kafka-connector is current.

kandi X-RAY | Quality

              yb-kafka-connector has 0 bugs and 0 code smells.

kandi X-RAY | Security

              yb-kafka-connector has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
              yb-kafka-connector code analysis shows 0 unresolved vulnerabilities.
              There are 0 security hotspots that need review.

kandi X-RAY | License

              yb-kafka-connector is licensed under the Apache-2.0 License. This license is Permissive.
              Permissive licenses have the least restrictions, and you can use them in most projects.

kandi X-RAY | Reuse

              yb-kafka-connector releases are not available. You will need to build from source code and install.
              Build file is available. You can build the component from source.
              Installation instructions, examples and code snippets are available.
              It has 541 lines of code, 29 functions and 3 files.
              It has high code complexity. Code complexity directly impacts maintainability of the code.

            Top functions reviewed by kandi - BETA

            kandi has reviewed yb-kafka-connector and discovered the below as its top functions. This is intended to give you an instant insight into yb-kafka-connector implemented functionality, and help decide if they suit your requirements.
            • Process a set of sink records
            • Bind column by column type
• Binds values to a bound statement
            • Gets the prepared statements from the database
            • Parse a timestamp
            • Sanitize and get column names
            • Generate INSERT statement
            • Gets columns from a table
            • Creates and binds the columns of a sink record
            • Starts the Cassandra session
            • Creates a Cassandra client
            • Returns a list of contact points from a comma separated list of host ports
            • Construct a ContactPoint from a string
            • Get Cassandra Session for contact points
            • Creates a list of configs
            • Stops the cassandra session
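
For context, the contact-point helpers listed above appear to turn a comma separated list of host:port entries into connection endpoints. The following standalone sketch (not the library's actual code) illustrates the general idea:

import java.net.InetSocketAddress;
import java.util.ArrayList;
import java.util.List;

public class ContactPointParser {
    // Parse a string like "127.0.0.1:9042,127.0.0.2:9042" into socket addresses.
    public static List<InetSocketAddress> parse(String hostPorts) {
        List<InetSocketAddress> points = new ArrayList<>();
        for (String entry : hostPorts.split(",")) {
            String[] parts = entry.trim().split(":");
            String host = parts[0];
            // Assume the default CQL port 9042 if no port is given.
            int port = parts.length > 1 ? Integer.parseInt(parts[1]) : 9042;
            points.add(new InetSocketAddress(host, port));
        }
        return points;
    }

    public static void main(String[] args) {
        System.out.println(parse("127.0.0.1:9042,127.0.0.2:9042"));
    }
}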

            yb-kafka-connector Key Features

            No Key Features are available at this moment for yb-kafka-connector.

            yb-kafka-connector Examples and Code Snippets

            No Code Snippets are available at this moment for yb-kafka-connector.

            Community Discussions

            QUESTION

            How to configure yb-cdc-connector to resume printing logs upon yugabyte db restart (stop and start)?
            Asked 2020-Jan-31 at 13:30

Following "How to capture data change in yugabyte db?" to publish the change logs.

It works properly. However, yb-connector.jar does not resume publishing change data logs after YugabyteDB is restarted.

            Commands Executed:

            ...

            ANSWER

            Answered 2020-Jan-31 at 13:30

This isn't supported yet. Can you create an issue on GitHub at https://github.com/yugabyte/yugabyte-db/issues?

            Source https://stackoverflow.com/questions/59995811

            QUESTION

            How to print change data logs at stdout in yugabyte?
            Asked 2020-Jan-22 at 06:55

            I am using yugabyte db 1.3.0.0 and following https://docs.yugabyte.com/latest/deploy/cdc/use-cdc/ to learn yugabyte db cdc.

            Procedure Followed:

            ...

            ANSWER

            Answered 2020-Jan-21 at 18:05

The error shows that the jar file cannot be accessed in the system directory you moved it to, most likely a permission issue. Note that you don't even need to move this jar into a system directory, so keeping the jar in a normal directory is the simplest option.

Also, you don't need the stream_id either, since you want the default behavior most of the time. So try the following command in a user directory where you have the jar file.

            Source https://stackoverflow.com/questions/59798260

Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.

            Vulnerabilities

            No vulnerabilities reported

            Install yb-kafka-connector

To install and try the connector: set up and start Kafka, install YugabyteDB and create the keyspace/table, set up and run the Kafka Connect sink, and finally confirm that the rows are in the target table in the YugabyteDB cluster using cqlsh.
Setup and start Kafka

Download the Apache Kafka tarball (any recent version can be chosen; 2.0.0 is used here as a sample):

$ mkdir -p ~/yb-kafka
$ cd ~/yb-kafka
$ wget http://apache.cs.utah.edu/kafka/2.0.0/kafka_2.11-2.0.0.tgz
$ tar -xzf kafka_2.11-2.0.0.tgz

Start Zookeeper and the Kafka server:

$ ~/yb-kafka/kafka_2.11-2.0.0/bin/zookeeper-server-start.sh config/zookeeper.properties &
$ ~/yb-kafka/kafka_2.11-2.0.0/bin/kafka-server-start.sh config/server.properties &

Create a Kafka topic (this needs to be done only once):

$ ~/yb-kafka/kafka_2.11-2.0.0/bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test_topic

Run the following to produce data in that topic:

$ ~/yb-kafka/kafka_2.11-2.0.0/bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test_topic

Just cut-and-paste the following lines at the prompt:

{"key" : "A", "value" : 1, "ts" : 1541559411000}
{"key" : "B", "value" : 2, "ts" : 1541559412000}
{"key" : "C", "value" : 3, "ts" : 1541559413000}

Feel free to Ctrl-C this process or switch to a different shell; more values can be added to the same topic later.
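
As an alternative to the console producer, the same sample records could be published programmatically. This is a minimal sketch, assuming the standard kafka-clients library is on the classpath and the broker is at localhost:9092:

import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SampleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // The JSON payloads match the sample records pasted into the console producer above.
        String[] values = {
            "{\"key\" : \"A\", \"value\" : 1, \"ts\" : 1541559411000}",
            "{\"key\" : \"B\", \"value\" : 2, \"ts\" : 1541559412000}",
            "{\"key\" : \"C\", \"value\" : 3, \"ts\" : 1541559413000}"
        };

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            for (String value : values) {
                producer.send(new ProducerRecord<>("test_topic", value));
            }
            producer.flush();
        }
    }
}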
Install YugabyteDB and create the keyspace/table

Install YugabyteDB and start a local cluster. Create a keyspace and table by running the following commands. You can find cqlsh in the bin subdirectory inside the YugabyteDB installation folder.

$> cqlsh
cqlsh> CREATE KEYSPACE IF NOT EXISTS demo;
cqlsh> CREATE TABLE demo.test_table (key text, value bigint, ts timestamp, PRIMARY KEY (key));
Set up and run the Kafka Connect Sink

Set up the required jars needed by Connect:

$ cd ~/yb-kafka/yb-kafka-connector/
$ mvn clean install -DskipTests
$ cp ~/yb-kafka/yb-kafka-connector/target/yb-kafka-connnector-1.0.0.jar ~/yb-kafka/kafka_2.11-2.0.0/libs/
$ cd ~/yb-kafka/kafka_2.11-2.0.0/libs/
$ wget http://central.maven.org/maven2/io/netty/netty-all/4.1.25.Final/netty-all-4.1.25.Final.jar
$ wget http://central.maven.org/maven2/com/yugabyte/cassandra-driver-core/3.2.0-yb-18/cassandra-driver-core-3.2.0-yb-18.jar
$ wget http://central.maven.org/maven2/com/codahale/metrics/metrics-core/3.0.1/metrics-core-3.0.1.jar

Finally, run the Connect sink in standalone mode:

$ ~/yb-kafka/kafka_2.11-2.0.0/bin/connect-standalone.sh ~/yb-kafka/yb-kafka-connector/resources/examples/kafka.connect.properties ~/yb-kafka/yb-kafka-connector/resources/examples/yugabyte.sink.properties

Notes:
• Setting bootstrap.servers to a remote host/port in the kafka.connect.properties file lets you connect to any accessible existing Kafka cluster.
• The keyspace and tablename values in the yugabyte.sink.properties file should match the values used in the cqlsh commands above, and the topics value should match the topic name used by the producer above (see the sketch after this step).
• Setting yugabyte.cql.contact.points to a non-local list of host/ports lets you connect to any remote accessible existing YugabyteDB cluster.

Check the console output (optional). You should see something like this (relevant lines from YBSinkTask.java):

[2018-10-28 16:24:16,037] INFO Start with keyspace=demo, table=test_table (com.yb.connect.sink.YBSinkTask:79)
[2018-10-28 16:24:16,054] INFO Connecting to nodes: /127.0.0.1:9042,/127.0.0.2:9042,/127.0.0.3:9042 (com.yb.connect.sink.YBSinkTask:189)
[2018-10-28 16:24:16,517] INFO Connected to cluster: cluster1 (com.yb.connect.sink.YBSinkTask:155)
[2018-10-28 16:24:16,594] INFO Processing 3 records from Kafka. (com.yb.connect.sink.YBSinkTask:95)
[2018-10-28 16:24:16,602] INFO Insert INSERT INTO demo.test_table(key,ts,value) VALUES (?,?,?) (com.yb.connect.sink.YBSinkTask:439)
[2018-10-28 16:24:16,612] INFO Prepare SinkRecord ...
[2018-10-28 16:24:16,618] INFO Bind 'ts' of type timestamp (com.yb.connect.sink.YBSinkTask:255)
...
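
For reference, here is a minimal sketch of what yugabyte.sink.properties might contain, based on the property names mentioned in the notes above. The connector class name and the exact keyspace/tablename key names are assumptions (the class is guessed from the com.yb.connect.sink package seen in the logs); check resources/examples/yugabyte.sink.properties in the repository for the authoritative version.

# Sketch of a sink configuration; verify key names against the bundled example file.
name=yugabyte-sink
# Assumption: connector class inferred from the com.yb.connect.sink package.
connector.class=com.yb.connect.sink.YBSinkConnector
tasks.max=1
# Must match the topic the producer writes to.
topics=test_topic
# Target keyspace/table must match the cqlsh commands above; exact key names may differ.
yugabyte.cql.keyspace=demo
yugabyte.cql.tablename=test_table
# Point this at the YugabyteDB cluster's CQL host:port list.
yugabyte.cql.contact.points=127.0.0.1:9042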
Confirm that the rows are in the target table in the YugabyteDB cluster, using cqlsh:

cqlsh> select * from demo.test_table;

 key | value | ts
-----+-------+---------------------------------
   A |     1 | 2018-11-07 02:56:51.000000+0000
   C |     3 | 2018-11-07 02:56:53.000000+0000
   B |     2 | 2018-11-07 02:56:52.000000+0000

Note that the timestamp value gets printed in a human-readable date format automatically.
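
Besides cqlsh, the inserted rows can also be verified programmatically with the same cassandra-driver-core 3.x artifact downloaded above. A minimal sketch, assuming the local cluster from the earlier steps is reachable at 127.0.0.1:9042:

import com.datastax.driver.core.Cluster;
import com.datastax.driver.core.ResultSet;
import com.datastax.driver.core.Row;
import com.datastax.driver.core.Session;

public class VerifyRows {
    public static void main(String[] args) {
        // Connect to the local YugabyteDB cluster over the CQL port.
        try (Cluster cluster = Cluster.builder()
                .addContactPoint("127.0.0.1")
                .withPort(9042)
                .build();
             Session session = cluster.connect()) {
            ResultSet rs = session.execute("SELECT key, value, ts FROM demo.test_table");
            for (Row row : rs) {
                System.out.printf("key=%s value=%d ts=%s%n",
                        row.getString("key"), row.getLong("value"), row.getTimestamp("ts"));
            }
        }
    }
}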

            Support

For any new features, suggestions, and bugs, create an issue on GitHub. If you have any questions, check and ask them on the Stack Overflow community page.
            CLONE
          • HTTPS

            https://github.com/yugabyte/yb-kafka-connector.git

          • CLI

            gh repo clone yugabyte/yb-kafka-connector

• SSH

            git@github.com:yugabyte/yb-kafka-connector.git


Consider Popular Change Data Capture Libraries

debezium by debezium
libusb by libusb
tinyusb by hathach
bottledwater-pg by confluentinc
WHID by whid-injector

Try Top Libraries by yugabyte

yugabyte-db by yugabyte (C)
yugastore-java by yugabyte (Java)
yugabyte-operator by yugabyte (Go)
yugastore by yugabyte (JavaScript)
orm-examples by yugabyte (PHP)