
I am currently running Ubuntu 20.10 on a Raspberry Pi 3. I have already installed Docker and the MySQL server, which runs as a service on Ubuntu. Both of the installations are working properly. Now I am trying to run this Node.js API:

'use strict';

const express = require('express');
const mysql = require('mysql');

const PORT = 8080;
const HOST = '0.0.0.0';

const app = express();
app.get('/', (req, res) => {
  res.send('Hello World');
});

app.get('/persons', (req, res) => {
  connection.query('SELECT * FROM persons', (err, result) => {
    if (err) return res.send(`Some error occurred: ${err}`);
    res.send(result);
  });
});

app.listen(PORT, HOST);
console.log(`Running API on http://${HOST}:${PORT}`);

const connection = mysql.createConnection({
  host: 'localhost',
  user: 'myuser',
  password: 'mypwd',
  database: 'mydb'
});

connection.connect(err => {
  if (err) throw err;
  console.log('Successfully connected to MySQL');
});

This works perfectly fine until I wrap this app into a Docker container. The Dockerfile looks like this:

FROM node:14

WORKDIR /usr/src/app

COPY package*.json ./

RUN npm install

COPY . .

EXPOSE 8080
CMD [ "node", "server.js"]

Then I get this error:

if (err) throw err;
           ^

Error: connect ECONNREFUSED 127.0.0.1:3306
    at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1135:16)
    --------------------
    at Protocol._enqueue (/home/felix/workspaces/node-api/node_modules/mysql/lib/protocol/Protocol.js:144:48)
    at Protocol.handshake (/home/felix/workspaces/node-api/node_modules/mysql/lib/protocol/Protocol.js:51:23)
    at Connection.connect (/home/felix/workspaces/node-api/node_modules/mysql/lib/Connection.js:116:18)
    at Object.<anonymous> (/home/felix/workspaces/node-api/server.js:31:12)
    at Module._compile (internal/modules/cjs/loader.js:1138:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:1158:10)
    at Module.load (internal/modules/cjs/loader.js:986:32)
    at Function.Module._load (internal/modules/cjs/loader.js:879:14)
    at Function.executeUserEntryPoint [as runMain] (internal/modules/run_main.js:71:12)
    at internal/main/run_main_module.js:17:47 {
  errno: 'ECONNREFUSED',
  code: 'ECONNREFUSED',
  syscall: 'connect',
  address: '127.0.0.1',
  port: 3306,
  fatal: true
}

Does anyone know how to fix this? Moreover, I wanted to know what's the difference between running MySQL as a service on Linux and running it in a container… what are the advantages/disadvantages? Or can someone explain if it makes sense to run the app and the database in two different containers and connect them with Docker Compose?

Felix
  • the error message shows a different port, `3306`, rather than `8080`; why is that? You can check whether the Docker container is listening on port `8080` by running `sudo ss -tnlp | grep 8080` – Mheni Feb 09 '21 at 18:29

2 Answers


If I understood correctly, what you want to do is connect the app in a container to the mysqld you installed on the host?

First of all: why not put a MySQL instance in another container (see the sketch below)? Second: 127.0.0.1 is the loopback address of the container itself; it has nothing to do with the host where your MySQL server is installed...
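For example, a rough sketch of that approach; the network name, container names, credentials, and image tag below are placeholders (on a Raspberry Pi, pick a MySQL/MariaDB image built for your architecture):

# Create a user-defined network so the containers can reach each other by name
docker network create appnet

# Run MySQL in its own container on that network
docker run -d --name mysql --network appnet \
  -e MYSQL_ROOT_PASSWORD=secret \
  -e MYSQL_DATABASE=mydb \
  mysql:8

# Run the API container on the same network and publish its port;
# in the Node app, use host: 'mysql' (the container name) instead of 'localhost'
docker run -d --name node-api --network appnet -p 8080:8080 node-api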

I think you should try to use a MySQL container, but if you need the host setup for whatever reason, you could look at this post: https://medium.com/@sirajul.anik/docker-for-linux-localhost-docker-connect-to-host-machine-from-a-docker-container-in-linux-fa42b00f161e

<...>
const connection = mysql.createConnection({
      host: 'localhost',
<...>

The localhost part is incorrect. Inside the container, localhost refers to the container itself, and MySQL isn't running there; it's on your Ubuntu host, and that's what host needs to point to. If you still want to go down this route, one option is sketched below.
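The sketch assumes Docker 20.10 or newer, which supports the special host-gateway value for --add-host; the node-api image name is just a placeholder. Note that MySQL on the host must also listen on an address the Docker bridge can reach (Ubuntu's default bind-address is 127.0.0.1), and the MySQL user must be allowed to connect from a non-localhost address.

// Start the container with an extra hosts entry that points at the host machine:
//   docker run --add-host=host.docker.internal:host-gateway -p 8080:8080 node-api

const connection = mysql.createConnection({
  host: 'host.docker.internal', // the Ubuntu host, as seen from inside the container
  user: 'myuser',
  password: 'mypwd',
  database: 'mydb'
});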


The above isn't the standard way of doing things with Docker, though. Unless you have a good reason to run MySQL on the host, just have it in a container too and run the pair of containers (or more, if you ever need them) with docker-compose. To answer your questions about this setup:

can someone explain if it makes sense to run the app and the database in two different containers and connect them with Docker Compose?

Yes, that's the standard way of doing it. You can link containers in docker-compose in various ways (links, depends_on), but there will also be a default, implicitly created network for the services in docker-compose. You can then refer to the database container just by the name of its service, which might be db, or whatever you call it. You'd normally pass that name into the Node app as an environment variable or as part of .env.
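For illustration, a minimal docker-compose.yml along those lines; the service names, credentials, and volume name are placeholders, and the app would read the hostname with something like host: process.env.DB_HOST || 'db':

version: "3.8"
services:
  api:
    build: .                   # builds the image from your Dockerfile
    ports:
      - "8080:8080"
    environment:
      DB_HOST: db              # the service name doubles as the hostname
    depends_on:
      - db
  db:
    image: mysql:8             # placeholder; pick an image that suits your platform
    environment:
      MYSQL_ROOT_PASSWORD: secret
      MYSQL_DATABASE: mydb
    volumes:
      - db-data:/var/lib/mysql # named volume so data survives container re-creation
volumes:
  db-data: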

I wanted to know what's the difference between running MySQL as a service on Linux and running it in a container… what are the advantages/disadvantages?

That depends on what you are after; there are cases for both. But for a basic case where you just need a database for one Node app, the docker-compose route with the database in a container is the way to go because:

  • it's easy to manage (start/stop/update/configure) a container via docker-compose
  • things are more portable in containers
  • services are self-contained. No risk of one affecting the other and so on
  • they are (arguably) more secure: a container holds one thing and one thing only, rather than a bunch of stuff running alongside your database, and the database also sits behind NAT and benefits from other container-level isolation

The list of container benefits goes on. There's a flip side to most of them, but for a small app, MySQL in a container might be just right and easy. Make sure you manage persistence (i.e. where MySQL stores its data); that's usually done with volumes, like the db-data volume in the sketch above.

apo