
While building a Dockerfile I'm trying to set up a TimescaleDB database and fill it from SQLAlchemy code. I used the official TimescaleDB image and then installed Python in the same Dockerfile so that the SQLAlchemy code is started by the CMD instruction. In my SQLAlchemy code I create an engine and open a session to insert data into the created database, as follows:

sqlalchemy.py

from sqlalchemy import create_engine, Column, Integer
from sqlalchemy.orm import sessionmaker
from sqlalchemy.ext.declarative import declarative_base
# note: a script named sqlalchemy.py can shadow the installed sqlalchemy
# package when run directly, so the imports above may fail under that name

data = X  # placeholder value, as in the question
engine = create_engine("postgresql://username:password@server:port/database")
Session = sessionmaker(bind=engine)
Base = declarative_base()

class Query(Base):
    __tablename__ = "X"
    X = Column(Integer, primary_key=True)

    def __init__(self, X):
        self.X = X

Base.metadata.create_all(engine)
session = Session()
session.add(Query(data))
session.commit()
session.close()

sqlalchemy.sh

#!/bin/bash
python3 ./sqlalchemy.py

Dockerfile:

FROM timescale/timescaledb:latest-pg12

ENV POSTGRES_PASSWORD=X
ENV POSTGRES_USER=X

RUN apk add --update --no-cache python3 && ln -sf python3 /usr/bin/python
RUN python3 -m ensurepip
RUN pip3 install --no-cache --upgrade pip setuptools

COPY ./pathcopyfrom ./pathcopyto

WORKDIR ./pathcopyto
CMD ["./sqlalchemy.sh"]

But I get the error message:

    sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) server closed the connection 
    unexpectedly
            This probably means the server terminated abnormally
            before or while processing the request.

I then removed the CMD instruction from my Dockerfile, got into the running container with docker exec -it, connected with psql -U postgres, and ran the SQLAlchemy code manually. It worked fine, so I don't understand what is causing the problem. I also tried creating the database engine in my Python code only after a time delay and rebuilt the image with CMD ["./sqlalchemy.sh"], but that did not work either.

Does anyone know how to solve this problem?

bgr
  • Postgres isn't running when you're building the image. It only starts when you create a container from the image. Typically, you would handle database initialization at runtime, possibly using the extension mechanisms provided by the `postgres` and `timescaledb` images (e.g., see "Initialization scripts" in https://hub.docker.com/_/postgres; a rough sketch of that approach follows after these comments). – larsks Feb 11 '22 at 22:11
  • Thank you for your answer. I already added a user and a database with env variables in the Dockerfile to initialize the default database, and I thought I could add an additional database and user using initialization scripts. Is there any other aspect to working with init scripts? I changed sqlalchemy.sh so that it first starts the timescaledb service and then runs sqlalchemy.py in parallel in the same shell script, and I changed sqlalchemy.py so that it only creates an engine after the container is created. It didn't work. Probably I can't run this py-script while building the image. – bgr Feb 12 '22 at 07:19
  • Do you think there is any way to make this process automatic? I don't really want to go inside the container and run that py-file manually every time. – bgr Feb 12 '22 at 07:20
  • The documentation to which I linked shows how to automate database initialization as part of the container startup. That should allow you to do whatever you need to do. – larsks Feb 12 '22 at 17:43
  • Before running your script, you also need to check if PostgreSQL/TimescaleDB is ready to serve requests (a sketch of such a check follows below): https://stackoverflow.com/a/63011266/315168 – Mikko Ohtamaa Feb 13 '22 at 11:59
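
For reference, a minimal sketch of the initialization-script mechanism larsks mentions, assuming the behaviour documented for the official postgres/timescaledb images: any *.sh or *.sql file copied into /docker-entrypoint-initdb.d/ is executed once, on the first start of a container with an empty data directory, after the server begins accepting local connections. The file name init-db.sh and the database name extra_db are placeholders for illustration:

init-db.sh

#!/bin/bash
# Hypothetical init script: the image's entrypoint runs it on first container
# start, after the server is up, so it is safe to create extra databases,
# users, or tables here. POSTGRES_USER is provided by the image's environment,
# and psql connects over the local socket without a password at this stage.
set -e
psql -v ON_ERROR_STOP=1 -U "$POSTGRES_USER" -c "CREATE DATABASE extra_db;"

# In the Dockerfile, copy the script into the init directory instead of
# overriding CMD:
# COPY ./init-db.sh /docker-entrypoint-initdb.d/init-db.sh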

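A minimal sketch of the readiness check Mikko Ohtamaa suggests, assuming the database server is actually being started in the container (for example by the image's default command or the init-script mechanism above), that it listens on localhost:5432, and that pg_isready from the PostgreSQL client tools is on the PATH:

sqlalchemy.sh

#!/bin/bash
# Wait until the server accepts connections before running the SQLAlchemy code.
until pg_isready -h localhost -p 5432 -U "$POSTGRES_USER"; do
    echo "waiting for TimescaleDB..."
    sleep 1
done
python3 ./sqlalchemy.py
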
0 Answers