When I try to execute the following command to get the submission status:
$SPARK_HOME/bin/spark-submit --master spark://master:7077 --status driver-20210608160650-0012
Or:
curl http://master:7077/v1/submissions/status/20210608160650-0012
I receive a strange error on the master node:
21/06/08 16:12:41 WARN TransportChannelHandler: Exception in connection from /172.19.0.5:46734
master_1 | java.lang.IllegalArgumentException: Too large frame: 5135603447297880359
master_1 | at org.sparkproject.guava.base.Preconditions.checkArgument(Preconditions.java:119)
master_1 | at org.apache.spark.network.util.TransportFrameDecoder.decodeNext(TransportFrameDecoder.java:148)
master_1 | at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:98)
master_1 | at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
master_1 | at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
master_1 | at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357)
master_1 | at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
master_1 | at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379)
master_1 | at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365)
master_1 | at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
master_1 | at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:163)
master_1 | at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:714)
master_1 | at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:650)
master_1 | at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:576)
master_1 | at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:493)
master_1 | at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989)
master_1 | at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
master_1 | at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
master_1 | at java.base/java.lang.Thread.run(Thread.java:834)
The job itself was submitted and executed successfully:
$SPARK_HOME/bin/spark-submit --master spark://master:7077 \
--conf spark.standalone.submit.waitAppCompletion=true \
--name arrow-spark \
--class com.tasks.Task \
--deploy-mode cluster \
/path/to/my.jar
The only clue I've found was the question "java.lang.IllegalArgumentException: Too large frame: 5211883372140375593", but its suggestions didn't help in my case.
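For comparison, a status query shaped to Spark's documented standalone REST submission API would look like the sketch below. The host name is taken from the commands above; the port (spark.master.rest.port defaults to 6066) and the full driver ID format are taken from the Spark standalone-mode docs, and whether the REST server is actually enabled on this cluster is an assumption:

```shell
# Sketch only, assuming the standalone REST submission server is enabled.
# Port 7077 speaks Spark's internal RPC protocol, not HTTP, so an HTTP
# request sent there gets decoded as a bogus frame length ("Too large frame").
# The REST endpoint defaults to port 6066 and expects the full driver ID,
# including the "driver-" prefix:
curl http://master:6066/v1/submissions/status/driver-20210608160650-0012
```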