I am hosting my project on GitLab. I have a CI/CD pipeline, scheduled to run every 24 hours, which runs a certain Python script. That script generates a CSV file, which is stored as an artifact. I need to access the CSV file from other Python files in the project, which then parse the CSV into dataframes and so on. I managed to run the Python script in the pipeline and to generate the artifact, but I don't know how to access it from another Python script.
Could I pass the artifact to the script as a parameter, or what is the best way to handle this?
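For the parameter idea, one straightforward option is to pass the artifact's path to the consuming script as a command-line argument. A minimal sketch, assuming a helper named `load_rows` of my own invention (in the real project you would likely use `pandas.read_csv` instead of the stdlib `csv` module):

```python
import csv

def load_rows(path):
    """Read a CSV artifact into a list of dicts, one per row."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

# In the consuming script, take the artifact path from the command line,
# e.g. invoked in a later pipeline job as:  python parse_csv.py test.csv
#   import sys
#   rows = load_rows(sys.argv[1])
```

Since later jobs in the same pipeline receive earlier jobs' artifacts in their working directory, the path passed here can simply be `test.csv`.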
This is my .gitlab-ci.yml file:
stages:  # List of stages for jobs, and their order of execution
  - build
  - test
  - deploy

build-job:
  stage: build
  script:
    - echo "Running python script to generate csv artifact..."
    - python to_del.py
  artifacts:
    paths:
      - test.csv

expose-job:
  stage: test
  script:
    - echo "Exposing the artifact..."
  artifacts:
    expose_as: 'artifact_file_txt'
    paths: ['test.csv']

deploy-job:
  stage: deploy
  script:
    - echo "Run update of csv files based on artifacts??"
In build-job I run the Python script that generates the artifact, then I expose it in expose-job, and in deploy-job I would need to save it to a certain location in the GitLab project. Is this possible?
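Within a single pipeline this should already work without extra steps: artifacts from earlier stages are downloaded automatically into later jobs of the same pipeline, so deploy-job can read `test.csv` directly from its working directory. To access the CSV from a Python script running outside the pipeline, one option is GitLab's job-artifacts API endpoint (`GET /projects/:id/jobs/artifacts/:ref_name/raw/*artifact_path?job=name`). A sketch, where the project ID, branch, token variable, and both function names are placeholders of my own, not anything defined in this project:

```python
import os
import urllib.request

GITLAB_API = "https://gitlab.com/api/v4"

def artifact_url(project_id, ref, artifact_path, job_name):
    """Build the GitLab API URL that serves one file from a job's artifacts."""
    return (f"{GITLAB_API}/projects/{project_id}/jobs/artifacts/"
            f"{ref}/raw/{artifact_path}?job={job_name}")

def download_artifact(project_id, ref, artifact_path, job_name, token, dest):
    # PRIVATE-TOKEN authenticates the request; the token here is assumed to be
    # a project access token kept in an environment variable, not hard-coded.
    req = urllib.request.Request(
        artifact_url(project_id, ref, artifact_path, job_name),
        headers={"PRIVATE-TOKEN": token},
    )
    with urllib.request.urlopen(req) as resp, open(dest, "wb") as f:
        f.write(resp.read())

# Example (placeholder IDs): fetch test.csv produced by build-job on main
# download_artifact("12345", "main", "test.csv", "build-job",
#                   os.environ["GITLAB_TOKEN"], "test.csv")
```

This fetches the artifact from the latest successful pipeline on the given ref, so a downstream script can refresh its local copy of the CSV without re-running the pipeline.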