kafka-connect-mongodb | MongoDB sink connector for Kafka Connect | Change Data Capture library

by startappdev | Scala | Version: Current | License: Apache-2.0

kandi X-RAY | kafka-connect-mongodb Summary

kafka-connect-mongodb is a Scala library typically used in Utilities, Change Data Capture, MongoDB, and Kafka applications. kafka-connect-mongodb has no bugs, no reported vulnerabilities, a permissive license, and low support. You can download it from GitHub.

MongoDB sink connector for Kafka Connect
Support

kafka-connect-mongodb has a low-activity ecosystem.
It has 12 stars, 8 forks, and 15 watchers.
              It had no major release in the last 6 months.
              There are 4 open issues and 8 have been closed. On average issues are closed in 118 days. There are no pull requests.
              It has a neutral sentiment in the developer community.
              The latest version of kafka-connect-mongodb is current.

Quality

              kafka-connect-mongodb has no bugs reported.

Security

              kafka-connect-mongodb has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.

License

              kafka-connect-mongodb is licensed under the Apache-2.0 License. This license is Permissive.
              Permissive licenses have the least restrictions, and you can use them in most projects.

Reuse

              kafka-connect-mongodb releases are not available. You will need to build from source code and install.
              Installation instructions, examples and code snippets are available.
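Since no releases are published, the connector jar has to be assembled locally before following the installation steps below. A minimal sketch, assuming the repository builds with sbt and the sbt-assembly plugin (the output path is an assumption):

# clone the repository and build the assembly jar used in the install steps
git clone https://github.com/startappdev/kafka-connect-mongodb.git
cd kafka-connect-mongodb
sbt assembly    # assumed to produce target/scala-*/kafka-connect-mongodb-assembly-1.0.jar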


            kafka-connect-mongodb Key Features

            No Key Features are available at this moment for kafka-connect-mongodb.

            kafka-connect-mongodb Examples and Code Snippets

            No Code Snippets are available at this moment for kafka-connect-mongodb.

            Community Discussions

            QUESTION

            Docker image for Confluent - adding Confluent Hub connectors
            Asked 2020-Sep-19 at 21:58

I wanted to slightly modify the Dockerfile from Confluent's Git repo so that the MongoDB and Snowflake connectors appear on my Confluent Connect page. Everything runs OK, but I don't see them in the portal.

            Should docker-compose.yml be modified as well?

            Original code:

            ...

            ANSWER

            Answered 2020-Sep-19 at 21:58

            I think you can try to do the following.

            1. Modify your Dockerfile:

            Source https://stackoverflow.com/questions/63964087

            QUESTION

            How to get full document when using kafka mongodb source connector when tracking update operations on a collection?
            Asked 2020-Jul-29 at 15:14

I am using the Kafka MongoDB Source Connector [https://www.confluent.io/hub/mongodb/kafka-connect-mongodb] with Confluent Platform v5.4. Below is my MongoDB Source Connector config

            ...

            ANSWER

            Answered 2020-Jul-29 at 15:14

Use the property "publish.full.document.only": "true" in the MongoDB connector config to get the full document whenever a create or update operation is performed on the MongoDB collection. Delete operations cannot be tracked this way, which follows from the CDC (change data capture) concept: only the changes (create/update) in the data can be captured.
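For reference, a minimal sketch of a source connector config using that property (names and connection details are placeholders; the property keys follow the official MongoDB source connector's documented options):

{
  "name": "mongo-source",
  "config": {
    "connector.class": "com.mongodb.kafka.connect.MongoSourceConnector",
    "connection.uri": "mongodb://localhost:27017",
    "database": "testdb",
    "collection": "testcollection",
    "publish.full.document.only": "true"
  }
}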

            Source https://stackoverflow.com/questions/62966428

            QUESTION

            "The $changeStream stage is only supported on replica sets" error while using mongodb-source-connect
            Asked 2020-Jan-03 at 01:47

            Happy new year

I am here because I faced an error while running kafka-mongodb-source-connect. I was trying to run connect-standalone with connect-avro-standalone.properties and MongoSourceConnector.properties so that Connect writes data from MongoDB to a Kafka topic.

While doing that I hit this error and could not find an answer, so I am writing here.

This is what I wanted to do

            ...

            ANSWER

            Answered 2020-Jan-03 at 01:47
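The error quoted in the title already points at the fix: the $changeStream stage used by the source connector only works when MongoDB runs as a replica set. A minimal sketch for turning a standalone mongod into a single-node replica set (data path and port are examples):

# start mongod with a replica set name
mongod --replSet rs0 --dbpath /data/db --port 27017

# in another terminal, initiate the replica set once
mongo --eval "rs.initiate()"

After this, the connector's connection.uri should point at the replica set (for example mongodb://localhost:27017/?replicaSet=rs0).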

            QUESTION

            kafka mongodb sink connector not starting
            Asked 2019-Sep-01 at 07:58

I've installed confluent_3.3.0 and started ZooKeeper, Schema Registry, and the Kafka broker. I have also downloaded the MongoDB connector from this link.

            Description: I'm running sink connector using the following command:

            ./bin/connect-standalone etc/kafka/connect-standalone.properties /home/username/mongo-connect-test/kafka-connect-mongodb/quickstart-couchbase-sink.properties

            Problem: I'm getting the following error:

            ...

            ANSWER

            Answered 2017-Nov-08 at 23:17

At its latest version, this connector uses an old version of the kafka-clients API. Specifically, it depends on a constructor of the class org.apache.kafka.common.config.AbstractConfig that does not exist in Apache Kafka versions >= 0.11.0.0.

Confluent Platform version 3.3.0 uses Apache Kafka 0.11.0.0.

To fix this issue, the recommended approach would be to update the connector code to use the most recent versions of the Apache Kafka APIs.
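If you rebuild the connector yourself, the dependency side of that update in build.sbt might look roughly like the following (versions and scope are assumptions; source changes around the removed AbstractConfig constructor may also be needed):

// sketch: align the connector's Kafka dependencies with the broker's Kafka version
libraryDependencies ++= Seq(
  "org.apache.kafka" % "connect-api"   % "0.11.0.0" % "provided",
  "org.apache.kafka" % "kafka-clients" % "0.11.0.0" % "provided"
)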

            Source https://stackoverflow.com/questions/47175455

            QUESTION

            How to stream data from Kafka to MongoDB by Kafka Connector
            Asked 2019-Jul-09 at 07:07

I want to stream data from Kafka to MongoDB using a Kafka connector. I found this one: https://github.com/hpgrahsl/kafka-connect-mongodb. But there are no step-by-step instructions.

After googling, everything seems to lead to Confluent Platform, which I don't want to use.

Could anyone share a document/guideline on how to use kafka-connect-mongodb without Confluent Platform, or another Kafka connector to stream data from Kafka to MongoDB?

            Thank you in advance.

            What I tried

Step 1: I downloaded mongo-kafka-connect-0.1-all.jar from Maven Central

Step 2: copied the jar file to a new plugins folder inside Kafka (I use Kafka on Windows, so the directory is D:\git\1.libraries\kafka_2.12-2.2.0\plugins)

Step 3: edited connect-standalone.properties by adding a new line plugin.path=/git/1.libraries/kafka_2.12-2.2.0/plugins

Step 4: added a new config file for the MongoDB sink, MongoSinkConnector.properties

            ...

            ANSWER

            Answered 2019-Jul-04 at 10:59

            There is an official source and sink connector from MongoDB themselves. It is available on Confluent Hub: https://www.confluent.io/hub/mongodb/kafka-connect-mongodb

            If you don't want to use Confluent Platform you can deploy Apache Kafka yourself - it includes Kafka Connect already. Which plugins (connectors) you use with it is up to you. In this case you would be using Kafka Connect (part of Apache Kafka) plus kafka-connect-mongodb (provided by MongoDB).

            Documentation on how to use it is here: https://github.com/mongodb/mongo-kafka/blob/master/docs/sink.md
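As a rough sketch (not taken from the answer; the property keys follow the official MongoDB sink connector's documented options, the values are placeholders), a standalone-mode MongoSinkConnector.properties for plain JSON messages could look like this:

name=mongo-sink
connector.class=com.mongodb.kafka.connect.MongoSinkConnector
topics=testTopic
connection.uri=mongodb://localhost:27017
database=testdb
collection=testcollection
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false

It can then be run with plain Apache Kafka's standalone worker, e.g. bin\windows\connect-standalone.bat config\connect-standalone.properties MongoSinkConnector.properties on Windows, or bin/connect-standalone.sh config/connect-standalone.properties MongoSinkConnector.properties on Linux.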

            Source https://stackoverflow.com/questions/56880527

Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.

            Vulnerabilities

            No vulnerabilities reported

            Install kafka-connect-mongodb

In the following example we will produce JSON data without a schema to a Kafka topic, and insert it into a test collection in our MongoDB database with the connector in distributed mode.
            Download Kafka 0.9.0.0 or later.
Create a new database named "testdb" in your MongoDB, and in that database create a new collection named "testcollection".
            Start Zookeeper: $./bin/zookeeper-server-start.sh config/zookeeper.properties
            Start Kafka broker: $./bin/kafka-server-start.sh config/server.properties
            Create a test topic: $./bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 1 --partitions 5 --topic testTopic
            Copy the jar file of the connector to your workspace folder: $cp /your-jar-location/kafka-connect-mongodb-assembly-1.0.jar /tmp/
            Add the jar to the classpath: $export CLASSPATH=/tmp/kafka-connect-mongodb-assembly-1.0.jar
            Copy the worker configuration file to your workspace directory: $cp config/connect-distributed.properties /tmp/
Modify the properties file to:
bootstrap.servers=localhost:9092
group.id=testGroup
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter.schemas.enable=false
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
offset.storage.topic=connectoffsets
offset.flush.interval.ms=10000
config.storage.topic=connectconfigs
Notice that if your topic has high throughput, you may suffer from timeouts and rebalance issues due to fetching too many records at once, which MongoDB cannot keep up with. To deal with this, set the maximum fetch size via the consumer.max.partition.fetch.bytes parameter in your worker configs, choosing a value low enough not to trigger the timeout, but high enough to avoid effectively synchronous message processing. Tuning will be required in that case.
Create topics for connector offsets and configs:
$./bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 5 --partitions 5 --topic connectoffsets
$./bin/kafka-topics.sh --create --zookeeper localhost:2181 --replication-factor 5 --partitions 1 --topic connectconfigs
            Run the worker: $./bin/connect-distributed.sh /tmp/connect-distributed.properties
Create a JSON configuration file (mongo_connector_configs.json):
{
  "name": "mongo-connector-testTopic",
  "config": {
    "connector.class": "com.startapp.data.MongoSinkConnector",
    "tasks.max": "5",
    "db.host": "localhost",
    "db.port": "27017",
    "db.name": "testdb",
    "db.collections": "testcollection",
    "write.batch.enabled": "true",
    "write.batch.size": "200",
    "connect.use_schema": "false",
    "topics": "testTopic"
  }
}
            Register the connector: $curl -X POST -H "Content-Type: application/json" --data @/tmp/mongo_connector_configs.json http://localhost:8083/connectors
            Run Kafka producer: $./bin/kafka-console-producer.sh --broker-list localhost:9092 --topic testTopic
            Produce some data: {"field1":"value1", "field2":"value2", "field3":"value3"}
Make sure the data was inserted into the collection (a quick check from the mongo shell is sketched after this list).
            Unregister the connector: curl -i -X DELETE "http://localhost:8083/connectors/mongo-connector-testTopic"
Modify the connector configurations (mongo_connector_configs.json):
{
  "name": "mongo-connector-testTopic",
  "config": {
    "connector.class": "com.startapp.data.MongoSinkConnector",
    "tasks.max": "5",
    "db.host": "localhost",
    "db.port": "27017",
    "db.name": "testdb",
    "db.collections": "testcollection",
    "write.batch.enabled": "true",
    "write.batch.size": "200",
    "connect.use_schema": "false",
    "record.fields.rename": "field1=>myKey1,field2=>testField2",
    "record.keys": "myKey1",
    "record.fields": "myKey1,testField2,field3",
    "topics": "testTopic"
  }
}
            Create index on myKey1 in mongo: db.testcollection.createIndex( { myKey1: 1} )
            Register the connector: $curl -X POST -H "Content-Type: application/json" --data @/tmp/mongo_connector_configs.json http://localhost:8083/connectors
            Run Kafka producer: $./bin/kafka-console-producer.sh --broker-list localhost:9092 --topic testTopic
            Produce some data: {"field1":"a", "field2":"b", "field3":"c"} {"field1":"d", "field2":"e", "field3":"f"} {"field1":"a", "field2":"e", "field3":"f"}
            Make sure the data in the collection looks like this: {"myKey1":"d", "testField2":"e", "field3":"f"} {"myKey1":"a", "testField2":"e", "field3":"f"}
            Unregister the connector
            Modify connectors configurations (mongo_connector_configs.json) by adding "record.timestamp.name":"updateDate" to the configs
            Register the connector:
            Run Kafka producer:
            Produce some data: {"field1":"a", "field2":"b", "field3":"c"} {"field1":"d", "field2":"e", "field3":"f"} {"field1":"a", "field2":"e", "field3":"f"}
            Make sure the data in the collection looks like this: {"myKey1":"d", "testField2":"e", "field3":"f", "updateDate" : 1485788932525} {"myKey1":"a", "testField2":"e", "field3":"f", "updateDate" : 1485789932528}
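As referenced in the verification steps above, a quick way to check both sides (the mongo shell command and the Kafka Connect REST status endpoint are standard; the names match the example configuration):

# inspect the documents the sink wrote (mongo shell)
mongo testdb --eval "db.testcollection.find().pretty()"

# check that the connector and its tasks are RUNNING (Connect REST API, available on recent Kafka versions)
curl http://localhost:8083/connectors/mongo-connector-testTopic/status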

            Support

For any new features, suggestions, and bugs, create an issue on GitHub. If you have any questions, check and ask on Stack Overflow.
            CLONE
          • HTTPS

            https://github.com/startappdev/kafka-connect-mongodb.git

          • CLI

            gh repo clone startappdev/kafka-connect-mongodb

• SSH

            git@github.com:startappdev/kafka-connect-mongodb.git


Consider Popular Change Data Capture Libraries

debezium by debezium
libusb by libusb
tinyusb by hathach
bottledwater-pg by confluentinc
WHID by whid-injector

Try Top Libraries by startappdev

express-context by startappdev (JavaScript)