I have a small Node script that uses bunyan to handle application logging. The logger writes to local storage. I'd like to send the logs to Elasticsearch using Filebeat (both of which are new tech to me).
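For a concrete picture, the logging is nothing fancy: a bunyan logger with a file stream, roughly like this (the name and path here are illustrative, not my exact setup):

// logging setup (sketch): bunyan writes newline-delimited JSON to a local file
const bunyan = require('bunyan');

const log = bunyan.createLogger({
  name: 'app',
  streams: [
    { level: 'info', path: '/app/logs/app.log' } // one JSON object per line
  ]
});

log.info({ url: 'https://example.com' }, 'starting scrape');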
I've made a Dockerfile that containerizes the app (below), but I'm unsure how/where to insert the necessary instructions. Is there a way to send my logs to Elasticsearch from within the Docker container? And while I'm at it, can I also send whatever logs the container and the OS emit?
# dockerfile.
# installations and entrypoint are to run nightmarejs headless
FROM node:latest
RUN apt-get update &&\
apt-get install -y libgtk2.0-0 libgconf-2-4 \
libasound2 libxtst6 libxss1 libnss3 xvfb
WORKDIR /app
COPY ./dist .
# enable installation of private npm modules
ARG NPM_TOKEN
COPY .npmrc .npmrc
COPY package.json .
RUN npm i
RUN rm -f .npmrc
COPY entrypoint /
RUN chmod +x /entrypoint
ENTRYPOINT ["/entrypoint"]
CMD DEBUG=nightmare node ./app.js
And for completeness' sake, the entrypoint script:
#!/usr/bin/env bash
set -e
# Start Xvfb on display :9 so Electron/Nightmare has a framebuffer to render to
Xvfb :9 -ac -screen 0 1280x2000x24 &
export DISPLAY=:9.0
exec "$@"
If it matters, I'll be deploying the container to a service where I won't have SSH access to the OS.
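Concretely, the kind of change I'm picturing (untested; it assumes Elastic's 6.x apt repository and a filebeat.yml sitting next to the Dockerfile) is a couple of extra layers in the Dockerfile:

# additions to the dockerfile (sketch): bake Filebeat into the app image
RUN apt-get install -y apt-transport-https wget && \
    wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | apt-key add - && \
    echo "deb https://artifacts.elastic.co/packages/6.x/apt stable main" \
      > /etc/apt/sources.list.d/elastic-6.x.list && \
    apt-get update && apt-get install -y filebeat
COPY filebeat.yml /etc/filebeat/filebeat.yml

plus one line in the entrypoint, before the final exec, to run it in the background alongside the app:

# start Filebeat next to the app; -e logs to stderr, -c points at the config
filebeat -e -c /etc/filebeat/filebeat.yml &

I realize running a second process in one container is unorthodox; it's the no-SSH constraint that pushes me that way.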
[ETA: a lot of answers already cover how to grab Docker logs from outside the container, e.g. by running Filebeat in a separate container. I'm hoping to run it within the same container, though.]