argo-workflows | Workflow engine for Kubernetes | BPM library

 by argoproj | Go | Version: v3.4.8 | License: Apache-2.0

kandi X-RAY | argo-workflows Summary

argo-workflows is a Go library typically used in Automation and BPM applications. argo-workflows has a Permissive License and medium support. However, argo-workflows has 1276 bugs and 1 reported vulnerability. You can download it from GitHub.

Argo Workflows is an open source container-native workflow engine for orchestrating parallel jobs on Kubernetes. Argo Workflows is implemented as a Kubernetes CRD (Custom Resource Definition). Argo is a Cloud Native Computing Foundation (CNCF) hosted project.

            kandi-support Support

              argo-workflows has a medium active ecosystem.
              It has 13015 stars and 2830 forks. There are 202 watchers for this library.
              There was 1 major release in the last 12 months.
              There are 829 open issues and 4341 closed issues. On average, issues are closed in 54 days. There are 53 open pull requests and 0 closed pull requests.
              It has a neutral sentiment in the developer community.
              The latest version of argo-workflows is v3.4.8.

            kandi-Quality Quality

              argo-workflows has 1276 bugs (1275 blocker, 0 critical, 1 major, 0 minor) and 2880 code smells.

            kandi-Security Security

              argo-workflows has 1 vulnerability issues reported (0 critical, 1 high, 0 medium, 0 low).
              argo-workflows code analysis shows 0 unresolved vulnerabilities.
              There are 0 security hotspots that need review.

            kandi-License License

              argo-workflows is licensed under the Apache-2.0 License. This license is Permissive.
              Permissive licenses have the least restrictions, and you can use them in most projects.

            kandi-Reuse Reuse

              argo-workflows releases are available to install and integrate.
              Installation instructions are not available. Examples and code snippets are available.
              It has 237874 lines of code, 10271 functions and 1414 files.
              It has high code complexity. Code complexity directly impacts maintainability of the code.

            Top functions reviewed by kandi - BETA

            kandi has reviewed argo-workflows and identified the functions below as its top functions. This is intended to give you instant insight into the functionality argo-workflows implements, and to help you decide if it suits your requirements.
            • Checks the validity of the input value.
            • Calls the API.
            • Validates and converts input_value to the required types.
            • Instantiates a new instance of the given class.
            • Validates all the properties of this schema.
            • Makes an API call.
            • Returns a OneOfSchema instance from the model.
            • Initializes an OpenApiModel from keyword arguments.
            • Converts a model instance to a Python dictionary.
            • Attempts to convert the given value to an appropriate object.

            argo-workflows Key Features

            No Key Features are available at this moment for argo-workflows.

            argo-workflows Examples and Code Snippets

            No Code Snippets are available at this moment for argo-workflows.

            Community Discussions


            Argo workflows is trying to save the artifact to /var/run/argo/outputs/artifacts. Where is this specified?
            Asked 2022-Mar-18 at 11:45

            I found Argo lint today. Thank you to the Argo team!!! This is a very useful tool and has saved me tons of time. The following yaml checks out with no errors, but when I try to run it, I get the following error. How can I track down what is happening?



            Answered 2022-Mar-18 at 11:45

            The complete fix is detailed here

            For the purposes of this discussion, the output must be an explicit location (not a placeholder), e.g. /tmp/output.

            I think the standard is that you do not put the .tgz suffix in the output location, but that is not yet confirmed as there was another fix involved. Perhaps someone from the Argo team can confirm this.
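            As a minimal sketch (the artifact name is illustrative), an output artifact using an explicit location rather than a placeholder might look like:

```yaml
# Hypothetical template fragment: the path is a literal location the
# container writes to, not a {{...}} templated placeholder.
outputs:
  artifacts:
    - name: result        # illustrative name
      path: /tmp/output   # explicit path, no placeholder, no .tgz suffix
```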



            How do I use Argo Workflows Using Previous Step Outputs As Inputs?
            Asked 2022-Mar-02 at 20:49

            I am trying to format my workflow per these instructions, but cannot seem to get it right. Specifically, I am trying to imitate "Using Previous Step Outputs As Inputs".

            I have included my workflow below. In this version, I have added a path to the inputs.artifacts because the error requests one. The error I am now receiving is:



            Answered 2022-Mar-01 at 15:26

            A very similar workflow from the Argo developers/maintainers can be found here:
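            The pattern in question, sketched with illustrative template and artifact names (this is not the linked workflow itself): the producing template declares an output artifact with a path, and the consuming template declares an input artifact with its own path, wired together via `from:`.

```yaml
# Hedged sketch of "Using Previous Step Outputs As Inputs";
# images, names, and paths are assumptions.
templates:
  - name: generate
    container:
      image: alpine:3.18
      command: [sh, -c]
      args: ["echo hello > /tmp/message.txt"]
    outputs:
      artifacts:
        - name: message
          path: /tmp/message.txt
  - name: consume
    inputs:
      artifacts:
        - name: message
          path: /tmp/message.txt   # a path is required on the input side too
    container:
      image: alpine:3.18
      command: [cat, /tmp/message.txt]
  - name: main
    steps:
      - - name: generate-step
          template: generate
      - - name: consume-step
          template: consume
          arguments:
            artifacts:
              - name: message
                from: "{{steps.generate-step.outputs.artifacts.message}}"
```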




            Difference between namespace install vs managed namespace install in Argo Workflows?
            Asked 2022-Feb-09 at 15:13

            I am trying to install Argo Workflows, and looking at the documentation I can see 3 different types of installation.

            Can anybody give some clarity on the namespace install vs. the managed namespace install? If it's a managed namespace install, how do I tell it the managed namespace? Should I edit the Kubernetes manifests for the deployment? What benefit does it provide compared to a simple namespace install?



            Answered 2022-Feb-09 at 15:13

            A namespace install allows Workflows to run only in the namespace where Argo Workflows is installed.

            A managed namespace install allows Workflows to run only in one namespace besides the one where Argo Workflows is installed.

            Using a managed namespace install might make sense if you want some users/processes to be able to run Workflows without granting them any privileges in the namespace where Argo Workflows is installed.

            For example, if I only run CI/CD-related Workflows that are maintained by the same team that manages the Argo Workflows installation, it's probably reasonable to use a namespace install. But if all the Workflows are run by a separate data science team, it probably makes sense to give them a data-science-workflows namespace and run a "managed namespace install" of Argo Workflows from another namespace.

            To configure a managed namespace install, edit the workflow-controller and argo-server Deployments to pass the --managed-namespace argument.
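            A minimal sketch of that Deployment edit (the managed namespace name here is illustrative); the argo-server Deployment takes the same argument:

```yaml
# Fragment of the workflow-controller Deployment spec with the flag added.
spec:
  template:
    spec:
      containers:
        - name: workflow-controller
          args:
            - --managed-namespace
            - data-science-workflows   # namespace the controller should manage (example)
```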

            You can currently only configure one managed namespace, but in the future it may be possible to manage more than one.



            Argo Workflow to continue processing during fan-out
            Asked 2022-Feb-03 at 14:15

            General question here, wondering if anyone has any ideas or experience with something I am trying to achieve right now. I'm not entirely sure it's even possible in the Argo workflow system...

            I'm wondering if it is possible to continue a workflow before a dynamic fan-out has fully finished. By dynamic fan-out I mean that B1/B2/B3 could potentially grow to B30.

            I want to see if C1 can start when B1 has finished. The B stage creates a small file, which the C stage then needs to upload, along with an API request signalling that it has finished. But in this scenario B2/B3 are still processing.

            And finally, D1 would have to wait for all of C1/C2/C3...Cn to finish before completing.

            Diagram of what I'm trying to achieve:



            Answered 2022-Feb-03 at 14:15

            Something like this should work:
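            As a hedged sketch of one common way to do this (template and parameter names are assumptions, and the b-step/c-step/d-step templates are assumed to exist): wrap each B→C pair in its own sub-DAG so that each C starts as soon as its own B finishes, while D depends on the whole fan-out.

```yaml
# Each fan-out item runs the b-then-c sub-DAG independently;
# D waits for all of them.
templates:
  - name: main
    dag:
      tasks:
        - name: fanout
          template: b-then-c
          withParam: "{{workflow.parameters.items}}"  # dynamic JSON list (assumed)
          arguments:
            parameters:
              - name: item
                value: "{{item}}"
        - name: d
          template: d-step
          dependencies: [fanout]   # runs after every B+C pair completes
  - name: b-then-c
    inputs:
      parameters:
        - name: item
    dag:
      tasks:
        - name: b
          template: b-step
          arguments:
            parameters:
              - name: item
                value: "{{inputs.parameters.item}}"
        - name: c
          template: c-step
          dependencies: [b]        # C_i starts when its own B_i finishes
          arguments:
            parameters:
              - name: item
                value: "{{inputs.parameters.item}}"
```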



            How does the Argo Workflows CLI get permissions?
            Asked 2022-Feb-01 at 19:57

            I am new to the Argo universe and was trying to set up Argo Workflows.

            I have installed the argo CLI. I was trying it in my minikube setup, and my kubectl is already configured for the minikube cluster. After putting it in my local bin folder, I am able to run argo commands without any issues.

            How does it work? Where does the argo CLI connect to in order to operate?



            Answered 2022-Feb-01 at 18:04

            The argo CLI manages two API clients. The first client connects to the Argo Workflows API server. The second connects to the Kubernetes API. Depending on what you're doing, the CLI might connect just to one API or the other.

            To connect to the Kubernetes API, the CLI just uses your kube config.

            To connect to the Argo server, the CLI first checks for an ARGO_TOKEN environment variable. If it's not available, the CLI falls back to using the kube config.

            ARGO_TOKEN is only necessary when the Argo Server is configured to require client auth and then only if you're doing things which require access to the Argo API instead of just the Kubernetes API.



            Dynamically provide an image name for the container template
            Asked 2022-Jan-18 at 15:02

            Is there a way to provide an image name for the container template dynamically, based on its input parameters?

            We have more than 30 different tasks, each with its own image, that should be invoked identically in a workflow. The number may vary each run depending on the output of a previous task, so we don't want to (or even can't) just hardcode them inside the workflow YAML.

            An easy solution would be to set the image field for the container based on an input parameter and use the same template for each of these tasks. But it looks like that's impossible. This workflow doesn't work:



            Answered 2022-Jan-14 at 11:29

            This is a possible workaround: use `when` and conditional runs of tasks. We do need to list all possible tasks with their container images, though:
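            A hedged sketch of that workaround (task, template, and parameter names are illustrative): one task per known image, each gated by a `when` expression on the input parameter, with each template hardcoding its own image.

```yaml
# One conditional task per known image; only the matching one runs.
dag:
  tasks:
    - name: run-task-a
      template: task-a    # template hardcodes image-a
      when: "'{{workflow.parameters.task}}' == 'task-a'"
    - name: run-task-b
      template: task-b    # template hardcodes image-b
      when: "'{{workflow.parameters.task}}' == 'task-b'"
```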



            Retrying after a settable delay in Argo Workflows
            Asked 2022-Jan-12 at 23:47

            One of our Argo Workflow steps may hit a rate limit and I want to be able to tell argo how long it should wait until the next retry.

            Is there a way to do it?

            I've seen Retries in the documentation, but it only talks about retry counts and backoff strategies, and it doesn't look like the delay can be parameterized.



            Answered 2022-Jan-12 at 23:47

            As far as I know there's no built-in way to add a pause before the next retry.

            However, you could build your own with Argo's exit handler feature.
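            For reference, the retryStrategy the documentation describes is declarative and fixed at spec time (values below are illustrative), rather than a delay computed at runtime:

```yaml
# Fixed, declarative backoff; cannot be set per-attempt from a task's output.
retryStrategy:
  limit: "5"
  backoff:
    duration: "30s"    # initial delay between retries
    factor: "2"        # exponential multiplier
    maxDuration: "10m" # cap on total retry time
```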



            Dynamic "Fan-In" for artifact outputs in Argo?
            Asked 2022-Jan-10 at 14:20

            I have an Argo workflow with dynamic fan-out tasks that perform some map operation (in the MapReduce sense). I want to create a reducer that aggregates their results. It's possible to do that when the outputs of each mapper are small and can be passed as output parameters. See this SO question and answer for a description of how to do it.

            But how can output artifacts be aggregated with Argo without writing custom logic to save them to some storage in each mapper and read them back in the reducer?



            Answered 2022-Jan-10 at 14:20

            Artifacts are more difficult to aggregate than parameters.

            Parameters are always text and are generally small. This makes it easy for Argo Workflows to aggregate them into a single JSON object which can then be consumed by a "reduce" step.

            Artifacts, on the other hand, may be any type or size. So Argo Workflows is limited in how much it can help with aggregation.

            The main relevant feature it provides is declarative repository write/read operations. You can specify, for example, an S3 prefix to write each mapper's output artifact to. Then, in the reduce step, you can load everything from that prefix and perform your aggregation logic.
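            As a sketch (bucket and key layout are assumptions, and S3 credentials are assumed to be configured), each mapper could write its artifact under a shared, workflow-scoped prefix that the reduce step later lists and reads:

```yaml
# Output artifact fragment for a mapper template.
outputs:
  artifacts:
    - name: part
      path: /tmp/part.json
      s3:
        bucket: my-bucket   # assumed bucket
        key: "{{workflow.name}}/parts/{{pod.name}}.json"  # shared prefix per workflow
```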

            Argo Workflows provides a generic map/reduce example. But besides artifact writing/reading, you pretty much have to do the aggregation logic yourself.



            How can I delete the Argo events launched in the Argo namespace?
            Asked 2021-Dec-12 at 15:27

            I am trying to delete (and recreate) the argo namespace, but it won't fully delete because I launched an EventSource and EventBus there. Now these will not delete.

            I have tried to delete them via YAML and individually, with no success yet.

            The frustrating result is that I cannot re-launch Argo.



            Answered 2021-Dec-12 at 15:27

            For anyone who stumbles onto this question, it is a permissions issue. Make certain your service account has permissions to work in both namespaces (argo and argo-events).
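            As a hedged sketch of what such a grant could look like (the Role and ServiceAccount names are illustrative and assumed to exist), a RoleBinding in the argo-events namespace pointing at the service account from the argo namespace:

```yaml
# Grants the argo namespace's service account access in argo-events.
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: argo-sa-events-access
  namespace: argo-events
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: Role
  name: argo-events-role   # assumed Role in argo-events
subjects:
  - kind: ServiceAccount
    name: argo             # illustrative service account name
    namespace: argo
```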



            Is there a way to gracefully end a pod with the Kubernetes client-go?
            Asked 2021-Oct-29 at 12:01

            The main question is whether there is a way to finish a pod from the client-go SDK. I'm not trying to delete a pod; I just want to finish it with a phase/status of Completed.

            In the code, I'm trying to update the pod phase, but it doesn't work: it does not return an error or panic, but the pod does not finish. My code:



            Answered 2021-Oct-29 at 12:01

            You cannot set the phase or anything else in the Pod status field; it is read-only. According to the Pod Lifecycle documentation, your pod will have a phase of Succeeded after "All containers in the Pod have terminated in success, and will not be restarted." So this will only happen if you can cause all of your pod's containers to exit with status code 0, and if the pod's restartPolicy is set to OnFailure or Never. If it is set to Always (the default), the containers will eventually restart and your pod will eventually return to the Running phase.

            In summary, you cannot do what you are attempting to do via the Kube API directly. You must:

            1. Ensure your pod has a restartPolicy that can support the Succeeded phase.
            2. Cause your application to terminate, possibly by sending it SIGINT or SIGTERM, or possibly by commanding it via its own API.


            Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.



            Install argo-workflows

            You can download it from GitHub.


            Find more information at:
          • Get started here
          • How to write Argo Workflow specs
          • How to configure your artifact repository

          • HTTPS

            https://github.com/argoproj/argo-workflows.git
          • CLI

            gh repo clone argoproj/argo-workflows

          • sshUrl

            git@github.com:argoproj/argo-workflows.git

