I want to run a web API in a container that can run a script in another container.

# docker-compose.yml

version: '3.9'
services:
  api:
    build: 
      context: .
      dockerfile: Dockerfile
    ports:
      - 3000:3000
    volumes:
      - converter_data:/path-to-files
  converter:
    image: my-converter-image:latest
    volumes:
      - converter_data:/path-to-files

volumes:
  converter_data:

Nest service:

import { Injectable } from '@nestjs/common';
import { execSync } from 'child_process';

@Injectable()
export class ConverterService {
  convert(file: Express.Multer.File) {
    // The compose service name resolves through Docker's internal DNS:
    const result = execSync('ping -c 2 converter'); // it works
    // execSync('convert path/file') // it should work something like that...
    console.log('Result: ', result.toString('utf8'));
    console.log(file);
  }
}

The flow should look like this:

  • User calls an API endpoint (api service)
  • The API (api service) calls a script available on the converter service with a param (the file)
  • The API (api service) has access to the converter output (the converted file); this should be done via the shared volume (see the sketch below)

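For the last step, here is a minimal sketch of how the api service could read a converted file back out of the shared volume. It assumes both containers mount converter_data at /path-to-files; the output filename converted.pdf is purely hypothetical:

import { readFile } from 'fs/promises';

// Both services mount the converter_data volume at /path-to-files,
// so a file written by the converter container is visible here.
const SHARED_DIR = '/path-to-files';

// name is e.g. 'converted.pdf' (hypothetical output filename)
async function readConvertedFile(name: string): Promise<Buffer> {
  return readFile(`${SHARED_DIR}/${name}`);
}
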
My question is: what is the best way to establish such a connection, and how do I achieve it?

mike927
  • Typically either by making an HTTP request directly or by inserting an intermediate messaging system like RabbitMQ. A container cannot directly run a command in another container (without being given the ability to escape its container entirely and take over the entire host system). I'd also try to avoid sharing files if you can, though the model you describe maps well to the body of HTTP POST requests and responses. – David Maze Jan 07 '22 at 12:08
  • Well, service B does not expose an HTTP interface. Adding one would be a bit of overhead. What about an SSH connection? – mike927 Jan 07 '22 at 12:12
  • The typical ssh setups you see floating around SO have major security issues. It's very hard to set up properly. – David Maze Jan 07 '22 at 12:14
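
Following David Maze's comment, here is a minimal sketch of the HTTP approach. It assumes the converter image can run Node 18+ (for the built-in fetch on the API side) and that the script is invoked as convert <path>; the port 8080 and the /convert route are arbitrary choices, not part of any existing image:

// converter side: a tiny HTTP wrapper around the script
import { createServer } from 'http';
import { execFile } from 'child_process';

createServer((req, res) => {
  if (req.method === 'POST' && req.url === '/convert') {
    let body = '';
    req.on('data', (chunk) => (body += chunk));
    req.on('end', () => {
      const { path } = JSON.parse(body); // a path inside the shared volume
      execFile('convert', [path], (err) => {
        res.statusCode = err ? 500 : 200;
        res.end(err ? err.message : 'done');
      });
    });
  } else {
    res.statusCode = 404;
    res.end();
  }
}).listen(8080);

// api side: call the wrapper by compose service name, which Docker's
// internal DNS resolves (as the ping in ConverterService already shows)
const response = await fetch('http://converter:8080/convert', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ path: '/path-to-files/input.pdf' }), // hypothetical input
});

With this shape, the file itself can either travel through the shared volume (only the path goes over HTTP, as above) or be sent as the POST body and returned in the response, which avoids sharing files entirely, as the comment suggests.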

0 Answers