I am exploring Argo to orchestrate big-data processing. I would like to kick off a workflow via a REST call that divides a large data set among a number of machines with the desired resources for processing. From an architectural perspective, how would I accomplish this? Is there a REST API, or maybe some Node.js libraries I could use?
2 Answers
Argo 2.5 introduces its own API.
There are currently officially supported Golang and Java clients, as well as a community-supported Python client. Updates will be available here: https://github.com/argoproj-labs/argo-client-gen
Argo provides Swagger API specs, so it should be reasonably easy to generate clients for other languages.
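For instance, here is a minimal sketch of submitting a workflow with the generated Go clientset, adapted from the example in the Argo repo. The import paths follow the v2.x layout (github.com/argoproj/argo); later releases relocate these packages and add a context.Context argument to Create, so treat this as illustrative rather than definitive:

```go
package main

import (
	"fmt"

	wfv1 "github.com/argoproj/argo/pkg/apis/workflow/v1alpha1"
	wfclientset "github.com/argoproj/argo/pkg/client/clientset/versioned"
	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Load the local kubeconfig; in-cluster config would work too.
	config, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}

	// Typed client for Workflow resources in the "default" namespace.
	wfClient := wfclientset.NewForConfigOrDie(config).ArgoprojV1alpha1().Workflows("default")

	wf := wfv1.Workflow{
		ObjectMeta: metav1.ObjectMeta{GenerateName: "hello-world-"},
		Spec: wfv1.WorkflowSpec{
			Entrypoint: "whalesay",
			Templates: []wfv1.Template{{
				Name: "whalesay",
				Container: &corev1.Container{
					Image:   "docker/whalesay:latest",
					Command: []string{"cowsay", "hello world"},
				},
			}},
		},
	}

	created, err := wfClient.Create(&wf)
	if err != nil {
		panic(err)
	}
	fmt.Printf("Workflow %s submitted\n", created.Name)
}
```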

- Are there docs for the Argo Golang SDK? – shuiqiang Apr 26 '20 at 09:22
- @shuiqiang I haven't located any docs, but there is an example: https://github.com/argoproj/argo/blob/master/examples/example-golang/main.go; argoproj.slack.com is also pretty active. – crenshaw-dev Apr 26 '20 at 14:36
According to the Argo documentation:
Argo is implemented as a Kubernetes controller and Workflow Custom Resource. Argo itself does not run an API server; as with all CRDs, it extends the Kubernetes API server by introducing a new API Group/Version (argoproj.io/v1alpha1) and Kind (Workflow). When CRDs are registered in a cluster, access to those resources is made available by exposing new endpoints in the Kubernetes API server.
For example, to list workflows in the default namespace, a client would make an HTTP GET request to:
https://<k8s-api-server>/apis/argoproj.io/v1alpha1/namespaces/default/workflows
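As a rough illustration, a Go program can issue that request directly with the standard library. The API server address and the K8S_TOKEN environment variable below are placeholders you would replace with your cluster's values:

```go
package main

import (
	"crypto/tls"
	"fmt"
	"io"
	"net/http"
	"os"
)

func main() {
	// Placeholder values: substitute your API server address and a
	// service-account token with permission to read Workflow resources.
	apiServer := "https://k8s-api-server:6443"
	token := os.Getenv("K8S_TOKEN")

	url := apiServer + "/apis/argoproj.io/v1alpha1/namespaces/default/workflows"
	req, err := http.NewRequest(http.MethodGet, url, nil)
	if err != nil {
		panic(err)
	}
	req.Header.Set("Authorization", "Bearer "+token)

	// InsecureSkipVerify is for illustration only; in production, load
	// the cluster CA certificate instead.
	client := &http.Client{Transport: &http.Transport{
		TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
	}}

	resp, err := client.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, _ := io.ReadAll(resp.Body)
	fmt.Println(resp.Status)
	fmt.Println(string(body)) // a WorkflowList object in JSON
}
```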
You can find examples for Golang, Python, Java, Ruby, and OpenAPI by following this link.
So, you can generate a YAML file describing Argo Workflow resources and send it to the Kubernetes API as described in the examples.
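A minimal sketch of that submission step, under the same placeholder assumptions as the GET example above:

```go
package main

import (
	"bytes"
	"crypto/tls"
	"fmt"
	"net/http"
	"os"
)

func main() {
	// Placeholder values, as in the GET example above.
	apiServer := "https://k8s-api-server:6443"
	token := os.Getenv("K8S_TOKEN")

	// workflow.yaml holds an ordinary Argo Workflow manifest.
	manifest, err := os.ReadFile("workflow.yaml")
	if err != nil {
		panic(err)
	}

	url := apiServer + "/apis/argoproj.io/v1alpha1/namespaces/default/workflows"
	req, err := http.NewRequest(http.MethodPost, url, bytes.NewReader(manifest))
	if err != nil {
		panic(err)
	}
	// The Kubernetes API server accepts YAML request bodies when the
	// Content-Type says so; JSON works as well.
	req.Header.Set("Content-Type", "application/yaml")
	req.Header.Set("Authorization", "Bearer "+token)

	// For illustration only; in production, trust the cluster CA instead.
	client := &http.Client{Transport: &http.Transport{
		TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
	}}

	resp, err := client.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println(resp.Status) // expect "201 Created" on success
}
```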
I hope it helps.
