monix | Asynchronous, Reactive Programming for Scala and Scala.js | Reactive Programming library
kandi X-RAY | monix Summary
Monix is a high-performance Scala / Scala.js library for composing asynchronous, event-based programs. It started as a proper implementation of ReactiveX, with stronger functional programming influences, designed from the ground up for back-pressure, made to interact cleanly with Scala's standard library, and compatible out of the box with the Reactive Streams protocol. It then expanded to include abstractions for suspending side effects and for resource handling, and it is one of the parents and implementors of Cats Effect. A Typelevel project, Monix proudly exemplifies pure, typeful, functional programming in Scala, while being pragmatic and making no compromise on performance.
Community Discussions
Trending Discussions on monix
QUESTION
Let us use Scala.
I'm trying to find the best possible way to do an opportunistic, partial, and asynchronous pre-computation of some of the elements of an iterator that is otherwise processed synchronously.
The below image illustrates the problem.
There is a lead thread (blue) that takes an iterator and a state. The state contains mutable data that must be protected from concurrent access, and it must be updated while the iterator is processed from the beginning, sequentially, and in order, because the elements of the iterator depend on previous elements. Moreover, the nature of the dependency is not known in advance.
Processing some elements may incur substantial overhead (two orders of magnitude) compared to others, meaning that some elements take 1 ms to compute and some take 300 ms. It would lead to significant improvements in running time if I could speculatively pre-process the next k elements. Speculative pre-processing on asynchronous threads is possible (while the blue thread is processing synchronously), but the pre-processed data must be validated by the blue thread to check whether the result of the pre-computation is still valid at that time. Usually (90% of the time) it is valid, so launching separate asynchronous threads to speculatively pre-process the remaining portion of the iterator would save many of those 300 ms computations.
I have studied comparisons of asynchronous and functional libraries for Scala to better understand which model of computation, or in other words which description of computation (which library), could be a better fit for this processing problem. I was thinking about communication patterns and came up with the following ideas:
- AKKA
Use an AKKA actor Blue for the blue thread that takes the iterator, and for each step it sends a Step message to itself. On a Step message, before it starts processing the next i-th element, it sends a PleasePreprocess(i+k) message with the (i+k)-th element to one of the k pre-processor actors in place. Blue would Step to i+1 if and only if PreprocessingKindlyDone(i+1) is received.
- AKKA Streams
AFAIK Akka Streams also supports the previous two-way backpressure mechanism; therefore, it could be a good candidate to implement what the actors do without actually using actors.
- Scala Futures
While the blue thread processes elements processElement(e) in iterator.map(processElement(_)), it would also spawn Futures for the pre-processing. However, maintaining these pre-processing Futures and awaiting their states would require a semi-blocking implementation in pure Scala as far as I can see, so I would not go in this direction to the best of my current knowledge.
- Use Monix
I have some knowledge of Monix but could not wrap my head around how this problem could be elegantly solved with Monix. I'm not seeing how the blue thread could wait for the result of i+1 and then continue. For this, I was thinking of using something like a sliding window with foldLeft(blueThreadAsZero){ (blue, preProc1, preProc2, notYetPreProc) => ... }, but could not find a similar construction.
Possibly, there could be libraries I did not mention that could better express computational patterns for this.
I hope I have described my problem adequately. Thank you for the hints/ideas or code snippets!
...ANSWER
Answered 2022-Mar-24 at 13:30 I would split the processing into two steps: the pre-processing, which can run in parallel, and the dependent part, which has to be serial.
Then you can just create a stream of data from the iterator, do a parallel map applying the pre-processing step, and finish with a fold.
Personally I would use fs2, but the same approach can be expressed with any streaming solution like Akka Streams, Monix Observables, or ZIO ZStreams.
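A minimal Monix sketch of that split (a parallel map for the pre-processing, followed by a serial fold for the state updates), where Element, State, preprocess and applySequentially are hypothetical stand-ins for the question's domain:

```scala
import monix.eval.Task
import monix.execution.Scheduler.Implicits.global
import monix.reactive.Observable
import scala.concurrent.duration._

object SpeculativePipeline extends App {
  // Hypothetical stand-ins: `preprocess` is the expensive part that needs no
  // shared state, `applySequentially` is the serial, order-dependent part.
  final case class Element(i: Int)
  final case class State(sum: Int)

  def preprocess(e: Element): Task[Int] = Task(e.i * 2)
  def applySequentially(s: State, pre: Int): State = State(s.sum + pre)

  val elements: Iterator[Element] = Iterator.tabulate(100)(Element(_))

  val pipeline: Task[State] =
    Observable
      .fromIterator(Task(elements))
      // speculatively run up to 4 preprocess steps ahead, preserving order
      .mapParallelOrdered(4)(preprocess)
      // consume the pre-processed results serially, in order
      .foldLeftL(State(0))(applySequentially)

  println(pipeline.runSyncUnsafe(10.seconds))
}
```

The validation step from the question would live inside the fold, re-running the pre-processing synchronously for the (rare) elements whose speculative result turns out to be stale.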
QUESTION
I am trying to use Monix to parallelize certain operations and then perform error handling.
Let's say I am trying to parse and validate a couple of objects like this:
...ANSWER
Answered 2021-Sep-16 at 08:01 With parMap2 alone, it is not possible to accomplish what you want to do.
The documentation says:
In case one of the tasks fails, then all other tasks get cancelled and the final result will be a failure.
However, it is possible to accomplish what you want by exposing the errors instead of hiding them behind the monadic handling. This is possible via the materialize method.
So, for instance, you can implement your method as:
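A minimal sketch of that approach, with hypothetical parseA/parseB tasks standing in for the objects being parsed and validated:

```scala
import monix.eval.Task
import scala.util.{Success, Try}

object ExposeParallelErrors {
  // Hypothetical parse/validate tasks standing in for the question's objects.
  def parseA: Task[Int]    = Task(42)
  def parseB: Task[String] = Task(throw new IllegalArgumentException("bad input"))

  // materialize turns each Task[A] into a Task[Try[A]] that always succeeds,
  // so parMap2 no longer cancels the sibling task when one of them fails.
  val both: Task[(Try[Int], Try[String])] =
    Task.parMap2(parseA.materialize, parseB.materialize)((a, b) => (a, b))

  // The caller can then inspect each result independently instead of only
  // seeing the first failure.
  val report: Task[String] = both.map {
    case (Success(a), Success(b)) => s"both valid: $a, $b"
    case (ra, rb)                 => s"errors exposed: $ra / $rb"
  }
}
```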
QUESTION
I'm reading a very large number of records sequentially from database API one page at a time (with unknown number of records per page) via call to def readPage(pageNumber: Int): Iterator[Record]
I'm trying to wrap this API in something like either Stream[Iterator[Record]] or Iterator[Iterator[Record]] lazily, in a functional way, ideally with no mutable state and a constant memory footprint, so that I can treat it as an infinite stream of pages or a sequence of Iterators and abstract the pagination away from the client. The client can iterate on the result; by calling next() it will retrieve the next page (Iterator[Record]).
What is the most idiomatic and efficient way to implement this in Scala?
Edit: I need to fetch and process the records one page at a time and cannot keep all the records from all pages in memory. If one page fails, an exception should be thrown. The large number of pages/records means infinite for all practical purposes. I want to treat it as an infinite stream (or iterator) of pages, with each page being an iterator over a finite number of records (e.g. fewer than 1000, but the exact number is unknown ahead of time).
I looked at BatchCursor in Monix but it serves a different purpose.
Edit 2: this is the current version using Tomer's answer below as a starting point, but using Stream instead of Iterator.
This allows eliminating the need for tail recursion as per https://stackoverflow.com/a/10525539/165130, and gives O(1) time for the stream prepend (#::) operation (whereas concatenating iterators via the ++ operation would be O(n)).
Note: While streams are lazily evaluated, Stream memoization may still cause memory to blow up, and memory management gets tricky. Changing from val to def to define the Stream in def pages = readAllPages below doesn't seem to have any effect.
ANSWER
Answered 2021-Jan-17 at 19:04 You can try to implement such logic:
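A minimal sketch of such logic along the lines of the question's Edit 2, with a hypothetical Record type and a stubbed readPage that returns an empty iterator past the last page:

```scala
object PagedReads extends App {
  final case class Record(id: Int)

  // Hypothetical stand-in for the question's database API: three non-empty
  // pages, then empty pages forever.
  def readPage(pageNumber: Int): Iterator[Record] =
    if (pageNumber < 3) Iterator.tabulate(5)(i => Record(pageNumber * 5 + i))
    else Iterator.empty

  // Lazily build a Stream of pages: each page is fetched only when the
  // client reaches it, and an empty page terminates the stream.
  def readAllPages(from: Int = 0): Stream[Iterator[Record]] = {
    val page = readPage(from)
    if (page.hasNext) page #:: readAllPages(from + 1)
    else Stream.empty
  }

  // The client can treat the result as an (effectively) infinite stream of pages.
  readAllPages().foreach(page => println(page.toList))
}
```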
QUESTION
I have the below code:
...ANSWER
Answered 2020-Sep-26 at 21:16 Unfortunately, EitherT and Task are different monads, and monads don't compose, so you can't use them directly in the same for comprehension.
What you could do is lift Task into EitherT, but in this case the type parameter F of EitherT would have to be Task, while in your case it's Future.
So you have to do 2 things:
- Transform Task into Future
- Lift Future to EitherT
Let's say your other method looks like this:
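A minimal sketch of those two steps, assuming a hypothetical anotherMethod returning a Monix Task and a hypothetical AppError type for the EitherT error channel:

```scala
import cats.data.EitherT
import cats.implicits._
import monix.eval.Task
import monix.execution.Scheduler.Implicits.global
import scala.concurrent.Future

object LiftTaskIntoEitherT {
  // Hypothetical "other method" returning a Monix Task, plus a hypothetical
  // error type used by the surrounding EitherT[Future, AppError, *] code.
  def anotherMethod: Task[Int] = Task(42)
  final case class AppError(msg: String)

  // 1. Transform the Task into a Future (needs a Scheduler in scope).
  val asFuture: Future[Int] = anotherMethod.runToFuture

  // 2. Lift the Future into EitherT so it composes in the same for comprehension.
  val program: EitherT[Future, AppError, Int] =
    for {
      a <- EitherT.liftF[Future, AppError, Int](asFuture)
      b <- EitherT.rightT[Future, AppError](a + 1)
    } yield b
}
```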
QUESTION
Just try some simple Task examples. The following code works fine
...ANSWER
Answered 2020-Sep-28 at 07:27 When run as a stand-alone program, the program exits before the task completes, so you don't get any output. You need to wait for the task to complete.
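A minimal sketch of waiting for the task, with a hypothetical task standing in for the question's example:

```scala
import monix.eval.Task
import monix.execution.Scheduler.Implicits.global
import scala.concurrent.Await
import scala.concurrent.duration._

object WaitForTask extends App {
  // Hypothetical task standing in for the question's example.
  val task: Task[Unit] = Task(println("hello from Task"))

  // Block the main thread until the task finishes, so the program does not
  // exit before the side effect runs.
  Await.result(task.runToFuture, 5.seconds)
}
```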
QUESTION
I am currently trying to use Monix to throttle API GET requests. I have tried using sttp's Monix backend, and it worked fine until I couldn't shut down the Monix backend after I was done... As this seems more like an sttp issue than a Monix one, I tried to re-approach the problem by using sttp's default backend while still using Monix to throttle.
I am mainly struggling with closing the Monix backend once I am done consuming the observable.
I have tried to simplify the problem through:
...ANSWER
Answered 2020-Aug-10 at 20:16 You can create a Promise, complete it when the Observable is completed via .doOnComplete, and await it in the main thread.
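A minimal sketch of that pattern, with a hypothetical observable standing in for the throttled requests:

```scala
import monix.eval.Task
import monix.execution.Scheduler.Implicits.global
import monix.reactive.Observable
import scala.concurrent.{Await, Promise}
import scala.concurrent.duration._

object AwaitObservableCompletion extends App {
  val done = Promise[Unit]()

  // Hypothetical stream standing in for the throttled GET requests.
  val requests: Observable[Long] =
    Observable.intervalAtFixedRate(200.millis).take(5)

  requests
    .doOnComplete(Task { done.success(()); () }) // complete the promise at the end
    .foreach(i => println(s"request $i done"))

  // Block the main thread until the observable completes; only then is it
  // safe to shut down the backend.
  Await.result(done.future, 1.minute)
}
```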
QUESTION
I am currently learning and playing around with sttp using the Monix backend. I am mainly stuck on closing the backend after all my requests (each request is a task) have been processed.
I have created sample/mock code to resemble my issue (to my understanding, my problem is more general rather than specific to my code):
...ANSWER
Answered 2020-Aug-11 at 07:42 The problem comes from the fact that, with the sttp backend open, you are computing a list of tasks to be performed - the List[Task[Response[Either[String, String]]]] - but you are not running them. Hence, we need to sequence running these tasks before the backend closes.
The key thing to do here is to create a single description of a task, that runs all of these requests while the backend is still open.
Once you compute data (which is itself a task - a description of a computation - which, when run, yields a list of tasks, themselves also descriptions of computations), we need to convert this into a single, non-nested Task. This can be done in a variety of ways (e.g. using simple sequencing), but in your case this will be using the Observable:
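A minimal sketch of that flattening, with a hypothetical Response alias and a stubbed data value standing in for the question's request tasks:

```scala
import monix.eval.Task
import monix.reactive.Observable

object RunAllRequests {
  // Hypothetical stand-in for sttp's Response[Either[String, String]].
  type Response = Either[String, String]

  // `data` plays the role of the computed Task that yields a list of request tasks.
  val data: Task[List[Task[Response]]] =
    Task(List(Task(Right("a")), Task(Right("b")), Task(Left("boom"))))

  // Flatten the nesting into one Task that actually runs every request
  // (here with a parallelism of 2) while the backend is still open.
  val all: Task[List[Response]] =
    data.flatMap { tasks =>
      Observable
        .fromIterable(tasks)
        .mapParallelUnordered(2)(task => task)
        .toListL
    }
}
```

In the real code, `all` would be sequenced before (or guaranteed by) closing the backend, so that every request runs while the backend is still available.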
QUESTION
ANSWER
Answered 2020-Aug-09 at 09:22 So if I have understood correctly, you have types like this:
QUESTION
Disclaimer: I am new to sttp and Monix, and this is my attempt to learn more about these libraries. My goal is to fetch data (client-side) from a given API via HTTP GET requests -> parse the JSON responses -> write this information to a database. My question pertains to the first part only. My objective is to run GET requests in an asynchronous (hopefully fast) way while having a way to either avoid or handle rate limits.
Below is a snippet of what I have already tried, and it seems to work for a single request:
...ANSWER
Answered 2020-Aug-05 at 22:48 You can use monix.reactive.Observable like this:
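A minimal sketch of that idea, with a hypothetical fetch function standing in for the sttp GET request:

```scala
import monix.eval.Task
import monix.reactive.Observable
import scala.concurrent.duration._

object RateLimitedRequests {
  // Hypothetical fetch standing in for the sttp GET request in the question.
  def fetch(url: String): Task[String] = Task(s"response for $url")

  val urls: List[String] =
    List("a", "b", "c", "d").map(p => s"https://api.example.com/$p")

  // Emit at most 2 URLs per second and run up to 2 requests in parallel,
  // which keeps the client under a (hypothetical) rate limit.
  val responses: Task[List[String]] =
    Observable
      .fromIterable(urls)
      .throttle(1.second, 2)
      .mapParallelUnordered(2)(fetch)
      .toListL
}
```

Running `responses` on a Scheduler (e.g. with runToFuture) then executes all requests while respecting the chosen rate.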
QUESTION
I am currently working on implementing client-side HTTP requests to an API and decided to explore sttp & Monix for this task. As I am new to Monix, I am still not sure how to run tasks and retrieve their results. My objective is to have a sequence of HTTP request results, which I can run in parallel -> parse -> load.
Below is a snippet of what I have tried so far:
...ANSWER
Answered 2020-Aug-05 at 13:33 Thanks to Oleg Pyzhcov and the Monix Gitter community for helping me figure this one out.
Quoting Oleg here:
Since you're using a backend with Monix support already, the type of r1 is Task[Response[Either[String, String]]]. So when you're doing Seq(r1).map(i => Task(i)), you make it a sequence of tasks that don't do anything except give you other tasks that give you the result (the type would be Seq[Task[Task[Response[...]]]]). Your code then parallelizes the outer layer, tasks-that-give-tasks, and you get the tasks you started with as the result. You only need to process a Seq(r1) for it to run requests in parallel.
If you're using IntelliJ, you can press Alt + = to see the type of the selection - it helps if you can't tell the type from the code alone (but it gets better with experience).
As for rate limiting, we have parSequenceN, which lets you set a limit on parallelism. Note that "unordered" only means that you get a slight performance advantage at the cost of results being in random order in the output; they are executed non-deterministically anyway.
I ended up with a (simplified) implementation that looks something like this:
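A minimal sketch in that spirit, using Task.parSequenceN with hypothetical get/parse/load steps standing in for the real pipeline:

```scala
import monix.eval.Task
import monix.execution.Scheduler.Implicits.global
import scala.concurrent.duration._

object ParallelGetParseLoad extends App {
  // Hypothetical GET / parse / load steps standing in for the question's pipeline.
  def get(url: String): Task[String]    = Task(s"""{"from":"$url"}""")
  def parse(json: String): Task[String] = Task(json.toUpperCase)
  def load(row: String): Task[Unit]     = Task(println(s"loaded $row"))

  val urls = List("a", "b", "c", "d").map(p => s"https://api.example.com/$p")

  // One Task per URL describing the whole GET -> parse -> load pipeline.
  val pipelines: List[Task[Unit]] =
    urls.map(url => get(url).flatMap(parse).flatMap(load))

  // parSequenceN caps the parallelism, which doubles as a crude rate limit.
  val program: Task[List[Unit]] = Task.parSequenceN(2)(pipelines)

  program.runSyncUnsafe(1.minute)
}
```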
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported