
This Dockerfile sets a Spark download URL, derives the file name from it, and is meant to install it. However, the build fails at the first echo command. It looks like the echo is being passed to /bin/sh as a single token instead of being run via /bin/sh -c.

How can I get this echo command to run under /bin/sh -c? And is this the correct way to implement it? I plan to use the same logic for other installations such as Mongo and Node.

FROM ubuntu:18.04

ARG SPARK_FILE_LOCATION="http://www.us.apache.org/dist/spark/spark-2.4.4/spark-2.4.4-bin-hadoop2.7.tgz"
RUN CHAR_COUNT=`echo "${SPARK_FILE_LOCATION}" | awk -F"${DELIMITER}" '{print NF-1}'`
RUN echo $CHAR_COUNT
RUN CHAR_COUNT=`expr $CHAR_COUNT + 1`
RUN SPARK_FILE_NAME=`echo ${SPARK_FILE_LOCATION} | cut -f${CHAR_COUNT} -d"/"`
RUN Dir_name=`tar -tzf $SPARK_FILE_NAME | head -1 | cut -f1 -d"/"`
RUN echo Dir_name
/bin/sh: 1: 'echo http://www.us.apache.org/dist/spark/spark-2.4.4/spark-2.4.4-bin-hadoop2.7.tgz | awk -F/ "{print NF-1}"': not found
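A separate issue with the Dockerfile above: each RUN instruction starts a fresh shell, so a variable such as CHAR_COUNT set in one RUN is gone by the next. For context, here is a minimal sketch of extracting the file name from the URL in a single shell invocation, using POSIX parameter expansion in place of the awk/cut pipeline (the URL is taken from the question; everything else is illustrative):

```shell
#!/bin/sh
# URL from the question
SPARK_FILE_LOCATION="http://www.us.apache.org/dist/spark/spark-2.4.4/spark-2.4.4-bin-hadoop2.7.tgz"

# ${VAR##*/} strips the longest prefix ending in "/", leaving the file name
SPARK_FILE_NAME="${SPARK_FILE_LOCATION##*/}"
echo "$SPARK_FILE_NAME"   # spark-2.4.4-bin-hadoop2.7.tgz
```

In a Dockerfile this would all live inside one RUN (commands chained with &&), so the intermediate variables survive from one step to the next.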
John Kugelman
user3616977
  • https://docs.docker.com/engine/reference/builder/ `Note that each instruction is run independently, and causes a new image to be created - so RUN cd /tmp will not have any effect on the next instructions.` and https://stackoverflow.com/questions/10067266/when-to-wrap-quotes-around-a-shell-variable/27701642 – KamilCuk Apr 06 '20 at 16:10
    The easiest thing to do here is to move all of this logic into a shell script (which you can run and test independently), then `COPY` it into your image and `RUN` it. – David Maze Apr 06 '20 at 16:40
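Following that suggestion, a minimal sketch of moving the logic into a standalone script (the script name and the .tgz-stripping step are assumptions, not from the question):

```shell
#!/bin/sh
# install_spark.sh -- hypothetical helper script; because it runs in ONE
# shell process, variables persist across all the steps below.
set -e

# URL from the question
SPARK_FILE_LOCATION="http://www.us.apache.org/dist/spark/spark-2.4.4/spark-2.4.4-bin-hadoop2.7.tgz"

# basename extracts the file name, replacing the CHAR_COUNT/cut arithmetic
SPARK_FILE_NAME=$(basename "$SPARK_FILE_LOCATION")

# Assumed: the archive's top-level directory matches the file name minus .tgz
DIR_NAME="${SPARK_FILE_NAME%.tgz}"

echo "would install $SPARK_FILE_NAME into $DIR_NAME"
```

The Dockerfile then only needs two instructions, e.g. `COPY install_spark.sh /tmp/` followed by `RUN /tmp/install_spark.sh`, and the script can be run and tested outside Docker first.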

0 Answers