
I'm in a totally disconnected network developing a VueJS2 app.

I've been lucky to be allowed to bring everything I need in on a CD to be scanned and blessed, and eventually put in a spot where I can go get it and develop from there. Now I'm trying to Dockerize this VueJS2 app with NodeJS. (It works great in a connected environment.)

Note: I brought node:lts-alpine in as a tar file and then turned that into an image

FROM node:lts-alpine

# make the 'app' folder the current working directory
WORKDIR /app

# copy both 'package.json' and 'package-lock.json' (if available)
COPY package*.json /app/

# install project dependencies
RUN npm install

# copy project files and folders to the current working directory (i.e. 'app' folder)
COPY . /app

# build app for production with minification
RUN npm run build

EXPOSE 8060
CMD ["node", "server.prod.js"]

This works great when connected, but in the restricted environment it complains because some of the dependencies still want to reach out to the internet. I've taken care to ensure that everything I need is already in the node_modules folder.

What do I need to do to tell Docker, npm, or whatever not to go and look on the internet?
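For reference, one possible approach (a sketch, assuming an npm cache directory was exported while connected and brought in alongside the sources; the `npm-cache/` name is my own): npm resolves installs from its cache metadata, not from an existing node_modules folder, so packages being present in node_modules doesn't by itself stop npm from fetching. The `--offline` flag makes npm fail fast instead of contacting the registry:

```dockerfile
FROM node:lts-alpine

WORKDIR /app

COPY package*.json /app/

# Assumption: while connected, the npm cache (~/.npm) was populated by a
# normal install and brought in on the CD as `npm-cache/`
COPY npm-cache/ /root/.npm/

# --offline: error out rather than reach the registry
# --no-audit: skip the audit step, which also phones home
RUN npm install --offline --no-audit

COPY . /app

RUN npm run build

EXPOSE 8060
CMD ["node", "server.prod.js"]
```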

The error I'm getting is on the `RUN npm install` step: a network request for yargs-parser.

Both yargs and yargs-parser are in the node_modules folder.
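Since node_modules is already complete, another option is to skip `npm install` inside the image entirely and copy the working node_modules in. This is a sketch with two assumptions: node_modules is not excluded by a .dockerignore file, and any native modules were built against musl (a node_modules built on a glibc host may not run inside node:lts-alpine):

```dockerfile
FROM node:lts-alpine

WORKDIR /app

# Copy everything, including the already-populated node_modules,
# so no npm install (and no network access) is needed
COPY . /app

# The build step only runs locally installed tooling; no registry access
RUN npm run build

EXPOSE 8060
CMD ["node", "server.prod.js"]
```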

Thanks!

Duane Haworth
    I'd suggest building your image while connected to the public Internet, and then bringing that built image (and not its Dockerfile) into the restricted environment. This is the one case where `docker save` your image makes sense. – David Maze Mar 16 '22 at 17:13
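The workflow in the comment above, sketched as shell commands (the image tag and tarball name are placeholders, not from the original):

```shell
# On a connected machine: build the image, then export it to a tarball
docker build -t vue-app:prod .
docker save vue-app:prod -o vue-app-prod.tar

# Burn vue-app-prod.tar to the CD; then, on the disconnected machine:
docker load -i vue-app-prod.tar
docker run -d -p 8060:8060 vue-app:prod
```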
  • Thanks for the response!! I ended up doing that and then made required changes on the restricted side (created temp container, updated temp container, and commit back to image). Worked like a champ! https://stackoverflow.com/questions/56670437/add-a-file-in-a-docker-image – Duane Haworth Mar 16 '22 at 18:32
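The temp-container round trip described in the comment above, sketched (container and image names are placeholders):

```shell
# Start a throwaway container from the imported image
docker run -d --name temp vue-app:prod

# Make the required changes on the restricted side, e.g. copy a file in
docker cp server.prod.js temp:/app/server.prod.js

# Commit the modified container back to an image tag, then clean up
docker commit temp vue-app:prod-patched
docker rm -f temp
```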

0 Answers