kappa-architecture.com | examples around the Kappa Architecture | Pub Sub library
kandi X-RAY | kappa-architecture.com Summary
A repository of information, implementations and examples around the Kappa Architecture
Community Discussions
Trending Discussions on kappa-architecture.com
QUESTION
I am new to Lagom and the concepts behind Persistent Entities.

I am building a streaming analytics engine. Each analysis runs as an independent microservice, and in line with that design philosophy each microservice saves its results in its own database (in my case Cassandra). I use Flink and Spark for the streaming analysis; the results are then sunk to Cassandra from Flink using Phantom (a Scala driver for Cassandra). I cannot work out the following challenges in the Lagom framework:

1. To store an analytic result, do I still need to implement a Persistent Entity (P.E.), or can I bypass it and write directly to Cassandra? My application supports neither delete nor update, only inserts for visualizing results, and Flink and Spark already provide fault tolerance.
2. How can I get access to a CassandraSession without Persistent Entities?
3. If I use the Phantom driver in Lagom, it conflicts with Lagom's embedded Cassandra, and I am not able to register the service in the Service Locator.

Can you please suggest how I should proceed in this situation? In other words, each microservice's architecture is based on the Kappa Architecture.

Thanks
ANSWER
Answered 2017-Apr-19 at 09:37

If you have a stream of events, then each microservice consuming from it can either keep a copy of all events or maintain a materialized view. An example of such a microservice can be seen in the search-service of the online-auction sample app. In the linked code there is a class consuming two different streams (in this case Kafka topics) and storing data into an ElasticSearch index. The same could be achieved with Cassandra or another database.
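To illustrate the materialized-view idea in plain Scala: a consumer folds a stream of events into its own queryable state. The event types and view shape below are hypothetical stand-ins (loosely inspired by the auction example); in a real Lagom service you would subscribe to the Kafka topics and write to ElasticSearch or Cassandra instead of an in-memory Map.

```scala
// A toy event model standing in for the topics the microservice consumes.
sealed trait Event
final case class ItemCreated(id: String, title: String) extends Event
final case class BidPlaced(id: String, amount: BigDecimal) extends Event

object MaterializedViewSketch {
  // The "view": item id -> (title, highest bid seen so far).
  type View = Map[String, (String, BigDecimal)]

  // Folding the event stream into the view; each event updates local state
  // the same way a read-side processor would update its database table.
  def apply(events: Seq[Event]): View =
    events.foldLeft(Map.empty: View) {
      case (view, ItemCreated(id, title)) =>
        view.updated(id, (title, BigDecimal(0)))
      case (view, BidPlaced(id, amount)) =>
        view.get(id) match {
          case Some((title, best)) if amount > best =>
            view.updated(id, (title, amount))
          case _ => view // unknown item or lower bid: view unchanged
        }
    }
}
```

The key property is that the view is derived entirely from the event stream, so it can be rebuilt at any time by replaying the events, which is exactly what makes bypassing a Persistent Entity safe for insert-only analytics.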
You may face further problems if you try to import a Cassandra driver on top of what Lagom provides. In that case I would suggest that you either (1) do not depend on any lagom-persistence-xxx module, so that only your driver is used, or (2) use the CassandraSession provided by Lagom's lagomScaladslPersistenceCassandra module (see the Lagom Persistence docs).

If you choose the second option, add CassandraSession to the constructor of your class and the dependency injection in your Loader will make sure the appropriate instance is provided. See how, in the linked code, the constructor takes three arguments and the Loader uses macwire to inject them. Note that you will have to mix in the ReadSideCassandraPersistenceComponents trait so that CassandraSession can be injected.
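The constructor-injection pattern described above can be sketched in plain Scala. SessionLike, InMemorySession, and AnalyticsRepository below are illustrative stand-ins: the real CassandraSession comes from mixing ReadSideCassandraPersistenceComponents into your Loader, where macwire's `wire[...]` fills the constructor for you.

```scala
// Hypothetical stand-in for the session abstraction a repository depends on.
trait SessionLike {
  def executeWrite(statement: String): Unit
  def writes: Seq[String]
}

// In-memory fake that records statements instead of talking to Cassandra.
final class InMemorySession extends SessionLike {
  private var log = Vector.empty[String]
  def executeWrite(statement: String): Unit = log :+= statement
  def writes: Seq[String] = log
}

// The analytics sink receives the session through its constructor, exactly
// as a Lagom class would receive CassandraSession.
final class AnalyticsRepository(session: SessionLike) {
  def insertResult(id: String, value: Double): Unit =
    session.executeWrite(
      s"INSERT INTO results (id, value) VALUES ('$id', $value)")
}

object Wiring {
  // In a Lagom Loader, macwire's `wire[AnalyticsRepository]` performs this
  // step automatically from the components mixed into the application.
  val session: SessionLike = new InMemorySession
  val repo = new AnalyticsRepository(session)
}
```

Because the dependency arrives via the constructor, swapping the in-memory fake for the real CassandraSession (or a Phantom-backed implementation) requires no change to the repository itself.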
Community Discussions and Code Snippets contain sources from the Stack Exchange Network.