spring-cloud-task | spring-cloud-task demo | Microservice library
Community Discussions
Trending Discussions on spring-cloud-task
QUESTION
TL;DR
Spring Cloud Data Flow does not allow multiple executions of the same task, even though the documentation says that this is the default behavior. How can we allow SCDF to run multiple instances of the same task at the same time when using the Java DSL to launch tasks? To make things more interesting, launching the same task multiple times works fine when hitting the REST endpoints directly, for example with curl.
Background :
I have a Spring Cloud Data Flow Task that I have pre-registered in the Spring Cloud Data Flow UI Dashboard
...ANSWER
Answered 2021-May-12 at 16:57
In this case it looks like you are trying to recreate the task definition. You only need to create the task definition once; from that definition you can launch multiple times. For example:
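A sketch of that flow with the Data Flow Java DSL (the server URL, task name, and registered app name are assumptions) might look like this:

```java
import java.net.URI;

import org.springframework.cloud.dataflow.rest.client.DataFlowTemplate;
import org.springframework.cloud.dataflow.rest.client.dsl.task.Task;

public class TaskLaunchSketch {
    public static void main(String[] args) {
        // Connect to a locally running Data Flow server (URL is an assumption)
        DataFlowTemplate dataFlowOperations =
                new DataFlowTemplate(URI.create("http://localhost:9393"));

        // Create the task definition ONCE...
        Task task = Task.builder(dataFlowOperations)
                .name("my-task")          // hypothetical definition name
                .definition("timestamp")  // a registered task application
                .description("demo task")
                .build();

        // ...then launch that one definition as many times as needed
        long firstExecutionId = task.launch();
        long secondExecutionId = task.launch();
    }
}
```

Each launch() call creates a new task execution against the same definition, which is the behavior the question is after.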
QUESTION
I'm trying out a simple Spring Cloud Task application. When I run it, I end up with a java.lang.ClassNotFoundException: com.example.sampletask.demotask.DemoTaskApplication.java. I've been trying to tweak the pom.xml file but am getting nowhere so far. Here's what I have as of now:
The application class:
...ANSWER
Answered 2020-Sep-25 at 19:18
I think you need to drop the .java from this line in your pom.xml:
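For instance, if the main class is set via the start-class property, it must reference the class name, not the source file (a sketch; your pom may configure the main class elsewhere, e.g. in the spring-boot-maven-plugin):

```xml
<properties>
    <!-- class name only: no .java suffix -->
    <start-class>com.example.sampletask.demotask.DemoTaskApplication</start-class>
</properties>
```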
QUESTION
Data Flow is deployed in Docker, and a Spring Batch application is deployed as a task.
I am trying to provide a year job parameter for my task. I have tried using a properties class with the @ConfigurationProperties annotation, as in the timestamp example, and then turning it into a job parameter via a JobParametersIncrementer.
ANSWER
Answered 2020-Aug-16 at 02:27
The job parameters are passed as Arguments, where you don't need the -- prefix. For instance, in this case you need to specify year=2011 as an argument in the Arguments section.
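If you use the Data Flow shell rather than the dashboard (an assumption; the task name here is hypothetical), the equivalent would be passing the parameter via --arguments:

```shell
task launch year-batch-task --arguments "year=2011"
```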
QUESTION
I'm developing a task to run on Spring Cloud Data Flow, but I can't see the execution parameters of my task (which is a Spring Boot application) or how to execute it.
I am using spring-cloud-starter-task-timestamp as the base for my application.
This is my pom; I generated it with Spring Initializr and then tweaked it with the timestamp-task application as a reference.
...ANSWER
Answered 2020-Jun-25 at 04:56
You need to add your task application's properties to the application metadata that Spring Cloud Data Flow knows how to extract and make available.
You can refer to this page, which walks you through how to add metadata.
Also, you can check this sample configuration as a reference on how to add such metadata.
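As a first step (a sketch of the build side only; packaging the metadata for Data Flow is what the linked pages cover), a @ConfigurationProperties class only yields configuration metadata if the Spring Boot configuration processor is on the build path:

```xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-configuration-processor</artifactId>
    <optional>true</optional>
</dependency>
```

With this in place, the build generates META-INF/spring-configuration-metadata.json describing your properties.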
QUESTION
I went through Introducing Spring Cloud Task, but things are not clear to me on the following questions.
I'm using Spring Batch.
1) What's the use of Spring Cloud Task when we already have the metadata provided by Spring Batch?
2) We're planning to use Spring Cloud Data Flow to monitor Spring Batch. All the batch jobs can be imported into SCDF as tasks and scheduled there, but I don't see support for MongoDB. I hope MySQL works well.
What is the difference between Spring Cloud Task and Spring Batch?
...ANSWER
Answered 2020-May-04 at 10:40
Spring Cloud Task has a broader scope than Spring Batch. It is designed for any short-lived task, including but not limited to (Spring) Batch jobs. A short-lived task could be a Java process, a shell script, a Docker image, etc. Spring Cloud Task has its own metadata tables to track the progress/status/stats of tasks.
In the context of Spring Batch, Spring Cloud Task provides a number of additional features:
- Batch informational messages: the ability to emit messages based on Spring Batch listener events. Those messages can be consumed by streaming apps, making it possible to bridge tasks and streaming apps.
- DeployerPartitionHandler: an additional partition handler, suited to cloud environments, that dynamically deploys workers in a remote partitioning setup.
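The short-lived task model described above can be sketched as a minimal application (package, class, and output are illustrative):

```java
package com.example.demo;

import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableTask
public class ShortLivedTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(ShortLivedTaskApplication.class, args);
    }

    // The task body: runs once; when it finishes, the context shuts down
    // and Spring Cloud Task records start/end time and exit code
    // in its own metadata tables (TASK_EXECUTION etc.).
    @Bean
    public CommandLineRunner work() {
        return args -> System.out.println("short-lived work done");
    }
}
```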
QUESTION
I am working on Spring Cloud Data Flow and Spring Batch, taking https://github.com/spring-cloud/spring-cloud-task/tree/master/spring-cloud-task-samples as a reference.
I'm executing the batch-job example; the first run works fine, but on the second execution I observe the error below.
I started the spring-cloud-dataflow-server-local server using the commands below, and it created all the metadata for me (highlighted in yellow).
Error:
...ANSWER
Answered 2020-Apr-18 at 15:50
To resolve this issue, I passed a unique key:value pair in the task arguments; as long as I keep passing different key:value combinations, I can run the same task many times.
However, I wanted to do this programmatically. Any quick solution?
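A programmatic take on the same idea, as a sketch: append a generated key:value (the run.id key is an arbitrary choice, not anything the framework requires) so every launch gets a distinct argument list.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.UUID;

public class UniqueTaskArgs {

    // Returns the base arguments plus a unique run.id=<uuid> entry,
    // so two launches never share the same argument list.
    static List<String> withUniqueRunId(List<String> baseArgs) {
        List<String> args = new ArrayList<>(baseArgs);
        args.add("run.id=" + UUID.randomUUID());
        return args;
    }

    public static void main(String[] args) {
        // Two launches of the "same" task now carry distinct arguments
        System.out.println(withUniqueRunId(List.of("year=2011")));
        System.out.println(withUniqueRunId(List.of("year=2011")));
    }
}
```

Whatever mechanism you use to launch the task (REST call, Java DSL, shell) can then pass this list instead of a fixed one.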
QUESTION
I have deployed a Spring Batch project as a task on Spring Cloud Data Flow (using @EnableTask).
I have set spring.cloud.task.singleInstanceEnabled=true to avoid relaunching the task while its status is still running on Spring Cloud Data Flow. This property is placed in the application.properties of the Spring Batch project (following this link: https://docs.spring.io/spring-cloud-task/docs/2.0.0.RELEASE/reference/htmlsingle/#_restricting_spring_cloud_task_instances).
However, the results are not as I expected while testing:
1st run: job A is launched successfully (about 15 minutes).
2nd run: job A is launched again during the 1st run, so it fails with the error:
...ANSWER
Answered 2019-Nov-12 at 14:23
The task instance restriction is a feature of Spring Cloud Task (your applications), not Spring Cloud Data Flow. Because of this, what you are seeing is expected behavior. You can open up an issue with the Spring Cloud Data Flow project for us to discuss adding that feature at the Spring Cloud Data Flow level.
QUESTION
My current understanding is that both of these projects are under Spring Cloud Data Flow and serve as components of the pipeline. However, both can be made recurring (a stream is by definition recurring, while a task can be run at a regular time interval). In addition, both can be configured to communicate with the rest of the pipeline through the message broker. There is currently this unanswered question, so I've yet to find a clear answer.
...ANSWER
Answered 2019-Sep-23 at 06:02
Please see my responses below:
My current understanding is that both of these projects are under Spring Cloud Dataflow, and serve as components of the pipeline.
Spring Cloud Stream and Spring Cloud Task are not under Spring Cloud Data Flow; rather, they can be used as standalone projects, and Spring Cloud Data Flow just uses them.
Spring Cloud Stream lets you bind your event-driven, long-running applications to a messaging middleware or a streaming platform. As a developer, you have to choose your binder (binder implementations exist for RabbitMQ, Apache Kafka, etc.) to stream your events or data from/to the messaging middleware you bind to.
Spring Cloud Task doesn't bind your application to a messaging middleware. Instead, it provides abstractions and lifecycle management to run your ephemeral or finite-duration applications (tasks). It also provides the foundation for developing Spring Batch applications.
However, both can be made recurring (a stream is by definition recurring, where a task can run every certain time interval)
A task application can be triggered/scheduled to make it recurring, whereas a streaming application is long-running, not recurring.
In addition, both can be configured to communicate with the rest of the pipeline through the message broker.
Though a task application can be configured to communicate with a messaging middleware, the concept of a pipeline differs between stream and task (batch). For streaming applications, the pipeline refers to communication via the messaging middleware, while for task applications, the concept of composed tasks lets you create a conditional workflow of multiple task applications. For more information on composed tasks, you can refer to the documentation.
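For example, a composed task chaining two registered task apps sequentially can be defined from the Data Flow shell like this (the composed-task and app names are hypothetical):

```shell
task create my-composed-task --definition "taskA && taskB"
```

In the composed-task DSL, && means taskB runs only after taskA completes successfully; the DSL also supports conditional transitions for building branching workflows.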
QUESTION
My Spring Batch application is not inserting the relationship between the task and the job into the TASK_TASK_BATCH table.
The Spring doc says:
Associating a Job Execution to the Task in Which It Was Executed
"Spring Boot provides facilities for the execution of batch jobs easily within an über-jar. Spring Boot's support of this functionality allows for a developer to execute multiple batch jobs within that execution. Spring Cloud Task provides the ability to associate the execution of a job (a job execution) with a task's execution so that one can be traced back to the other.
This functionality is accomplished by using the TaskBatchExecutionListener. By default, this listener is auto configured in any context that has both a Spring Batch Job configured (via having a bean of type Job defined in the context) and the spring-cloud-task-batch jar is available within the classpath. The listener will be injected into all jobs."
I have all the required jars on my classpath. It's just that I am creating jobs and tasklets dynamically, so I'm not using any annotations. As per the doc, TaskBatchExecutionListener is responsible for creating the mapping in the TASK_TASK_BATCH table by calling the taskBatchDao's saveRelationship method.
I am just not able to figure out how to configure TaskBatchExecutionListener explicitly in my Spring Batch application.
ANSWER
Answered 2019-Sep-04 at 19:02
If you have the org.springframework.cloud:spring-cloud-task-batch dependency and the @EnableTask annotation is present, then your application context contains a TaskBatchExecutionListener bean that you can inject into the class that dynamically creates the jobs and tasklets.
That might look similar to this:
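A sketch of that injection (class and method names are illustrative; the builder API shown is the JobBuilderFactory style current when this was answered):

```java
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.cloud.task.batch.listener.TaskBatchExecutionListener;
import org.springframework.stereotype.Component;

@Component
public class DynamicJobFactory {

    private final JobBuilderFactory jobBuilderFactory;
    private final TaskBatchExecutionListener taskBatchExecutionListener;

    public DynamicJobFactory(JobBuilderFactory jobBuilderFactory,
                             TaskBatchExecutionListener taskBatchExecutionListener) {
        this.jobBuilderFactory = jobBuilderFactory;
        this.taskBatchExecutionListener = taskBatchExecutionListener;
    }

    // Registering the listener on each dynamically built job is what
    // links the job execution to the task execution in TASK_TASK_BATCH.
    public Job createJob(String name, Step step) {
        return jobBuilderFactory.get(name)
                .listener(taskBatchExecutionListener)
                .start(step)
                .build();
    }
}
```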
QUESTION
When I run my task without CommandLineRunner implemented and add the @Scheduled annotation, it appears the context is being closed. How can I keep the context open so that @Scheduled can run properly?
DataTransferTask.java
...ANSWER
Answered 2018-Feb-22 at 20:51
Since you have @EnableTask, Spring will close the context once everything is done running. From the code it looks like you are not running anything explicitly, so Spring closes the context before your @Scheduled annotation kicks in.
The fix is to tell Spring not to close the context at all with spring.cloud.task.closecontext_enable=false. This will keep the context open for your scheduled task.
Some documentation here: https://docs.spring.io/spring-cloud-task/docs/1.2.2.RELEASE/reference/htmlsingle/#features-lifecycle
One more note about the property: the documentation says closecontext_enable, but after inspecting the logs and the jar, that property has been deprecated and replaced by close_context_enabled.
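So, per the note above, the working setting in the task's application.properties for that version would be:

```properties
# keep the application context open so @Scheduled methods continue to run
spring.cloud.task.close_context_enabled=false
```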
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.