pubsubplus-connector-kafka-source | Source connector to send data | Pub Sub library
kandi X-RAY | pubsubplus-connector-kafka-source Summary
pubsubplus-connector-kafka-source is a Java library typically used in Messaging, Pub Sub, and Kafka applications. It has no reported bugs or vulnerabilities, a build file is available, and support is low. However, it has a non-SPDX license. You can download it from GitHub or Maven.
The PubSub+ Kafka Source Connector consumes PubSub+ event broker real-time queue or topic data events and streams them to a Kafka topic as Source Records. The connector was created using PubSub+ high performance Java API to move PubSub+ data events to the Kafka Broker.
Support
pubsubplus-connector-kafka-source has a low-activity ecosystem.
It has 15 stars and 10 forks. There are 4 watchers for this library.
It had no major release in the last 12 months.
There are 4 open issues and 9 closed issues. On average, issues are closed in 104 days. There are 2 open pull requests and 0 closed ones.
It has a neutral sentiment in the developer community.
The latest version of pubsubplus-connector-kafka-source is 3.1.0.
Quality
pubsubplus-connector-kafka-source has no bugs reported.
Security
pubsubplus-connector-kafka-source has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
License
pubsubplus-connector-kafka-source has a Non-SPDX License.
A non-SPDX license can be an open-source license that is not SPDX-compliant, or a non-open-source license; review it closely before use.
Reuse
pubsubplus-connector-kafka-source releases are available to install and integrate.
Deployable package is available in Maven.
Build file is available. You can build the component from source.
Installation instructions, examples and code snippets are available.
Top functions reviewed by kandi - BETA
kandi has reviewed pubsubplus-connector-kafka-source and discovered the below as its top functions. This is intended to give you an instant insight into pubsubplus-connector-kafka-source implemented functionality, and help decide if they suit your requirements.
- Connect to Solace
- Method to create the session
- Initializes the Solace session
- Connects the JCSMP session
- Polls from the producer
- Schedules a message to be sent
- Stops the Solace session
- Print session stats
- Process incoming message
- Extract key schema from message header
- Process a received message
- Get the version
- Shutdown the SolaceSourceConnector
- Post-reconnect handling after Solace reconnection failed
- Handle flow event
- Sets the listener exception
- From interface Queue
- Starts the SolaceSourceConnector
- Called when the client connection is re-established
- Get configuration for the Solace source
- Gets the SolaceSourceTask class
- Construct the SolaceConfigDef
- Returns the source records
- Handle incoming session event
- Commits all records to disk
- Get a list of task configurations
pubsubplus-connector-kafka-source Key Features
No Key Features are available at this moment for pubsubplus-connector-kafka-source.
pubsubplus-connector-kafka-source Examples and Code Snippets
curl http://18.218.82.209:8083/connector-plugins | jq
{
"class": "com.solace.connector.kafka.connect.source.SolaceSourceConnector",
"type": "source",
"version": "2.1.0"
},
curl -X POST -H "Content-Type: application/json" \
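The POST command above is cut off in the snippet. For illustration only, a complete request to create a connector instance through the Kafka Connect REST API might look like the following sketch; the connector name and all connection values are placeholders, and the sol.* keys are the ones described in the Install section below:

curl -X POST -H "Content-Type: application/json" \
  --data '{
    "name": "solaceSourceConnector",
    "config": {
      "connector.class": "com.solace.connector.kafka.connect.source.SolaceSourceConnector",
      "tasks.max": "1",
      "kafka.topic": "test",
      "sol.host": "tcp://<broker-host>:55555",
      "sol.username": "<username>",
      "sol.password": "<password>",
      "sol.vpn_name": "<message-vpn>",
      "sol.topics": "sourcetest"
    }
  }' \
  http://18.218.82.209:8083/connectors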
log4j.rootLogger=INFO, file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=/var/log/kafka/connect.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=[%d] %p %m
<dependency>
  <groupId>com.solace.connector.kafka.connect</groupId>
  <artifactId>pubsubplus-connector-kafka-source</artifactId>
  <version>2.1.0</version>
</dependency>
compile "com.solace.connector.kafka.connect:pubsubplus-connector-kafka-source:2.1.0"
Community Discussions
Trending Discussions on pubsubplus-connector-kafka-source
QUESTION
Solace integration with Kafka over TCPS failing
Asked 2021-Mar-15 at 21:08
I am trying to connect the Solace Cloud broker with Kafka. I have a topic in Solace Cloud and want to subscribe to it through the PubSub+ Source Connector.
Here is my Source Connector configuration:
...
ANSWER
Answered 2021-Mar-15 at 21:08
The answer to this question can be found here: https://solace.community/discussion/646/solace-integration-with-kafka-over-tcps-failing
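For readers hitting the same TCPS failure: TLS connections generally need a tcps:// host URI plus truststore settings in the connector properties. The property names below are assumptions modeled on the connector's JCSMP-style configuration and must be verified against its documentation; all values are placeholders.

sol.host=tcps://<your-service-host>:55443
# Assumed key names - verify against the connector's configuration reference:
sol.ssl_trust_store=/path/to/truststore.jks
sol.ssl_trust_store_password=<truststore-password>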
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install pubsubplus-connector-kafka-source
This example demonstrates an end-to-end scenario similar to the Protocol and API messaging transformations use case, using the WebSocket API to publish a message to the PubSub+ event broker. It builds on the open source Apache Kafka Quickstart tutorial and walks through getting started in a standalone environment for development purposes. For setting up a distributed environment for production purposes, refer to the User Guide section. Note: The steps are similar if using Confluent Kafka; there may be differences in the root directory where the Kafka binaries (bin) and properties (etc/kafka) are located.
Install Kafka. Follow the Apache tutorial to download the Kafka release code, start the Zookeeper and Kafka servers in separate command line sessions, then create a topic named test and verify it exists.
Install PubSub+ Source Connector. Designate and create a directory for the PubSub+ Source Connector - assuming it is named connectors. Edit config/connect-standalone.properties and ensure the plugin.path parameter value includes the absolute path of the connectors directory. Download and extract the PubSub+ Source Connector into the connectors directory.
Acquire access to a PubSub+ message broker. If you don't already have one available, the easiest option is to get a free-tier service in a few minutes in PubSub+ Cloud, following the instructions in Creating Your First Messaging Service.
Configure the PubSub+ Source Connector: a) Locate the following connection information of your messaging service for the "Solace Java API" (this is what the connector uses internally): Username, Password, Message VPN, and one of the Host URIs. b) Edit the PubSub+ Source Connector properties file located at connectors/pubsubplus-connector-kafka-source-<version>/etc/solace_source.properties, updating the following parameters so the connector can access the PubSub+ event broker: sol.username, sol.password, sol.vpn_name, sol.host. Note: In the configured source and destination information, the sol.topics parameter specifies the ingress topic on PubSub+ (sourcetest) and kafka.topic is the Kafka destination topic (test), created in Step 1. An illustrative properties sketch follows these steps.
Start the connector in standalone mode. In a command line session run:
bin/connect-standalone.sh \
config/connect-standalone.properties \
connectors/pubsubplus-connector-kafka-source-<version>/etc/solace_source.properties
After startup, the logs will eventually contain the following line: ================Session is Connected
Start watching for messages arriving in Kafka. See the instructions in the Kafka tutorial to start a consumer on the test topic.
Demo time! To generate an event into PubSub+, we use the "Try Me!" test service of the browser-based administration console to publish test messages to the sourcetest topic. Behind the scenes, "Try Me!" uses the JavaScript WebSocket API. If you are using PubSub+ Cloud for your messaging service, follow the instructions in Trying Out Your Messaging Service. If you are using an existing event broker, log in to its PubSub+ Manager admin console and follow the instructions in How to Send and Receive Test Messages. In both cases, make sure to set the topic to sourcetest, which the connector is listening to. The Kafka consumer from Step 6 should now display the new message arriving in Kafka through the PubSub+ Kafka Source Connector: Hello world!
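For reference, a minimal solace_source.properties for this standalone run might look like the sketch below. All values are illustrative placeholders; name, connector.class, and tasks.max are standard Kafka Connect connector properties, and the sol.* keys are the ones listed in Step 4:

name=solaceSourceConnector
connector.class=com.solace.connector.kafka.connect.source.SolaceSourceConnector
tasks.max=1
kafka.topic=test
sol.host=tcp://<broker-host>:55555
sol.username=<username>
sol.password=<password>
sol.vpn_name=<message-vpn>
sol.topics=sourcetest

The Step 6 consumer can be started with the stock console consumer shipped with Apache Kafka:

bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning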
Building the project from source requires JDK 8 or higher. The build script creates artifacts in the build directory, including the deployable packaged PubSub+ Source Connector archives under build/distributions.
First, clone this GitHub repo:
git clone https://github.com/SolaceProducts/pubsubplus-connector-kafka-source.git
cd pubsubplus-connector-kafka-source
Install the test support module:
git submodule update --init --recursive
cd solace-integration-test-support
./mvnw clean install -DskipTests
cd ..
Then run the build script:
./gradlew clean build
The processing of a Solace message into a Kafka source record is handled by an implementation of SolaceMessageProcessorIF, a simple interface that creates Kafka source records from PubSub+ messages. Two sample implementations are included (a hedged sketch of a custom processor follows below):
SolSampleSimpleMessageProcessor
SolaceSampleKeyedMessageProcessor
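As a rough illustration of what a custom processor does, the sketch below builds a raw-bytes SourceRecord from a message payload. The class is hypothetical and the method shapes are assumptions inferred from the description above, not the connector's verified API; consult the two bundled sample classes for the real contract.

import java.util.Collections;
import org.apache.kafka.connect.data.Schema;
import org.apache.kafka.connect.source.SourceRecord;

// Hypothetical shape of a custom processor. A real one would implement the
// connector's SolaceMessageProcessorIF; the method names here are assumptions,
// not the connector's verified API.
public class SampleBytesMessageProcessor {

  private byte[] payload; // latest PubSub+ message payload, buffered for getRecords()

  // Assumed hook: called for each received PubSub+ message.
  public SampleBytesMessageProcessor process(String key, byte[] messagePayload) {
    this.payload = messagePayload;
    return this;
  }

  // Assumed hook: turn the buffered payload into Kafka Connect source records.
  public SourceRecord[] getRecords(String kafkaTopic) {
    return new SourceRecord[] {
        new SourceRecord(
            Collections.emptyMap(), // source partition (none tracked in this sketch)
            Collections.emptyMap(), // source offset (none tracked in this sketch)
            kafkaTopic,             // destination Kafka topic
            Schema.BYTES_SCHEMA,    // value schema: raw bytes
            payload)                // value: the PubSub+ message payload
    };
  }
}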
Apache Kafka Connect
Confluent Kafka Connect
Support
In standalone mode, the Connect logs are written to the console. If you do not want to send the output to the console, simply add the "-daemon" option to have all output directed to the logs directory. In distributed mode, the log location is determined by connect-log4j.properties, located in the config directory in the Apache Kafka distribution or under etc/kafka/ in the Confluent distribution.
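For example, the standalone worker from the install steps could be started as a background process as follows; Kafka's wrapper scripts accept -daemon as the first argument, and the rest of the command is the same as in Step 5 (treat this as a sketch):

bin/connect-standalone.sh -daemon \
config/connect-standalone.properties \
connectors/pubsubplus-connector-kafka-source-<version>/etc/solace_source.properties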