I've been trying to deploy a workflow in Argo on Kubernetes and I'm getting this error.
Can someone help me find the root cause? I've tried several things, but so far I've been unsuccessful.
The way Argo solves that problem is by compressing the stored entity, but the real question is whether you actually need all 3 MB of that data in a single object at once, or whether that is merely more convenient and the data could be decomposed into separate objects with references between them. The Kubernetes API is not blob storage and shouldn't be treated as one.
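For illustration only, here is a minimal sketch of that kind of compression (gzip the serialized payload and base64-encode it into a single string field). It shows the general technique, not Argo's actual implementation; the helper name and sample payload are made up:

```go
package main

import (
	"bytes"
	"compress/gzip"
	"encoding/base64"
	"fmt"
)

// compressBlob gzips and base64-encodes a serialized payload so it can be
// stored in a single string field, similar in spirit to how Argo shrinks
// oversized workflow status before writing it back to the API server.
func compressBlob(raw []byte) (string, error) {
	var buf bytes.Buffer
	zw := gzip.NewWriter(&buf)
	if _, err := zw.Write(raw); err != nil {
		return "", err
	}
	if err := zw.Close(); err != nil {
		return "", err
	}
	return base64.StdEncoding.EncodeToString(buf.Bytes()), nil
}

func main() {
	// Repetitive JSON compresses well; real workflow status behaves similarly.
	blob := bytes.Repeat([]byte(`{"phase":"Succeeded","message":"ok"}`), 20000)
	encoded, err := compressBlob(blob)
	if err != nil {
		panic(err)
	}
	fmt.Printf("original: %d bytes, compressed+encoded: %d bytes\n", len(blob), len(encoded))
}
```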
For reference, these are the errors you can expect as the payload grows:

expectedMsgFor1MB := `etcdserver: request is too large`
expectedMsgFor2MB := `rpc error: code = ResourceExhausted desc = trying to send message larger than max`
expectedMsgFor3MB := `Request entity too large: limit is 3145728`
expectedMsgForLargeAnnotation := `metadata.annotations: Too long: must have at most 262144 bytes`
But even if you did need all of it at once, it would not be ideal, because usually this error means that you are inlining data into your objects instead of referencing it, which degrades performance.
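As a rough pre-flight check, you can serialize the object yourself and compare it against the 3145728-byte limit from the messages above before submitting it. Keep in mind that the etcd and gRPC limits shown there can reject a request well before that cap; the sketch below uses a placeholder manifest:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// apiServerLimit is the request-body cap from the error message above
// ("Request entity too large: limit is 3145728").
const apiServerLimit = 3145728

// fitsAPIServer reports whether the JSON-serialized object stays under the cap,
// and returns its serialized size.
func fitsAPIServer(obj interface{}) (bool, int, error) {
	raw, err := json.Marshal(obj)
	if err != nil {
		return false, 0, err
	}
	return len(raw) <= apiServerLimit, len(raw), nil
}

func main() {
	// Placeholder manifest; in practice you would marshal your actual Workflow object.
	manifest := map[string]interface{}{
		"apiVersion": "argoproj.io/v1alpha1",
		"kind":       "Workflow",
		"spec":       map[string]interface{}{"entrypoint": "main"},
	}
	ok, size, err := fitsAPIServer(manifest)
	if err != nil {
		panic(err)
	}
	fmt.Printf("serialized size: %d bytes, under API server limit: %v\n", size, ok)
}
```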
I highly recommend that you consider these options instead:
- Check whether your object carries references or data that aren't actually used
- Break the resource up into smaller objects
- Use a volume mount instead of embedding the data (see the sketch after this list)
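To illustrate the volume-mount option, here is a sketch that builds a Pod reading its large input from a mounted PersistentVolumeClaim instead of carrying the data inside the manifest. The names, image, command, and claim are placeholders, and Argo has its own artifact and volume mechanisms that you would normally reach for first:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"sigs.k8s.io/yaml"
)

// podWithDataVolume builds a Pod that reads its large input from a mounted
// PersistentVolumeClaim instead of carrying the data inside the manifest.
func podWithDataVolume() *corev1.Pod {
	return &corev1.Pod{
		TypeMeta:   metav1.TypeMeta{APIVersion: "v1", Kind: "Pod"},
		ObjectMeta: metav1.ObjectMeta{Name: "workflow-step"},
		Spec: corev1.PodSpec{
			Containers: []corev1.Container{{
				Name:    "main",
				Image:   "busybox",
				Command: []string{"sh", "-c", "wc -c /data/payload.json"},
				VolumeMounts: []corev1.VolumeMount{{
					Name:      "payload",
					MountPath: "/data",
					ReadOnly:  true,
				}},
			}},
			Volumes: []corev1.Volume{{
				Name: "payload",
				VolumeSource: corev1.VolumeSource{
					PersistentVolumeClaim: &corev1.PersistentVolumeClaimVolumeSource{
						ClaimName: "large-payload",
					},
				},
			}},
		},
	}
}

func main() {
	out, err := yaml.Marshal(podWithDataVolume())
	if err != nil {
		panic(err)
	}
	fmt.Println(string(out))
}
```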
There's an open feature request for a new API resource, File (or BinaryData), that could apply to your case. It's still very fresh, but it's worth keeping an eye on.
Partial source for this answer: https://stackoverflow.com/a/60492986/12153576