I have some Python code in a Jupyter notebook that I need to run automatically every day, so I would like to know whether there is a way to set this up. I'd appreciate any advice on this.
-
Can't you convert the notebook into a Python file and just run that periodically? Does it have to be enclosed in a notebook? Another important question is what is the code supposed to do? – rayryeng Feb 12 '18 at 15:38
-
Just put it in a regular `.py` file and then have a built-in scheduler program (e.g. Windows Task Scheduler) run it whenever you need. – SuperStew Feb 12 '18 at 15:39
-
I need to run it as-is because it calls several notebooks, so it's basically the motor of a bigger engine – Betsy Curbelo Feb 12 '18 at 15:45
-
If you really want to do it this way, take a look at Jupyter's Execute API: http://nbconvert.readthedocs.io/en/latest/execute_api.html#executing-notebooks-from-the-command-line. You can run Jupyter notebooks in the command line, and you can combine this with a scheduling program or an automated script that will run what you need and how often you need to run it. – rayryeng Feb 12 '18 at 15:53
-
Is there a particular reason for running the code in Jupyter notebook? – relay Feb 12 '18 at 15:56
-
yes, it's ETL code – Betsy Curbelo Feb 12 '18 at 15:56
-
@rayryeng has a good solution. Figure out the command needed to run it, stick that in a batch file, then call the batch file from the scheduler program. – SuperStew Feb 12 '18 at 15:57
-
I have that, and even though it runs every day, sometimes it runs like a ghost, I mean it doesn't pull the info that I need – Betsy Curbelo Feb 12 '18 at 16:00
-
@BetsyCurbelo can you clarify ? – SuperStew Feb 12 '18 at 16:02
-
well, I have a cmd file that is supposed to run every day; my code takes daily info from one place to a database, and for example I have the info for 02/10 but not the info for 02/11 – Betsy Curbelo Feb 12 '18 at 16:05
-
@BetsyCurbelo So there's an issue with your code? If not, just create a scheduled task that runs that file. – SuperStew Feb 12 '18 at 17:51
12 Answers
Update
Recently I came across papermill, a tool for executing and parameterizing notebooks.
https://github.com/nteract/papermill
papermill local/input.ipynb s3://bkt/output.ipynb -p alpha 0.6 -p l1_ratio 0.1
This seems better than nbconvert because you can pass parameters. You still have to trigger this command with a scheduler; below is an example with cron on Ubuntu.
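A crontab sketch for scheduling the papermill command itself (the environment path, input path, bucket, and schedule are placeholders to adapt; note that `-p` values are injected into the notebook after the cell tagged `parameters`):

```shell
# crontab entry: run the parameterized notebook every day at 05:10
# (conda env path, notebook path, and S3 bucket are placeholders)
10 5 * * * /opt/anaconda/envs/yourenv/bin/papermill local/input.ipynb s3://bkt/output.ipynb -p alpha 0.6 -p l1_ratio 0.1
```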
Old Answer
jupyter nbconvert --execute
can execute a Jupyter notebook. Embedded in a cronjob, this will do what you want.
Example setup on Ubuntu:
Create yourscript.sh with the following content:
/opt/anaconda/envs/yourenv/bin/jupyter nbconvert \
--execute \
--to notebook /path/to/yournotebook.ipynb \
--output /path/to/yournotebook-output.ipynb
There are more output options besides --to notebook. I like this one because afterwards you have a fully executed notebook that serves as a log file.
I recommend running your notebook in a virtual environment, so that future updates don't break your script. Do not forget to install nbconvert into that environment.
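The script above can be created and marked executable in one go; a minimal sketch (the notebook and environment paths are the same placeholders as above):

```shell
# create the wrapper script and make it executable (paths are placeholders)
cat > yourscript.sh <<'EOF'
#!/bin/sh
/opt/anaconda/envs/yourenv/bin/jupyter nbconvert \
  --execute \
  --to notebook /path/to/yournotebook.ipynb \
  --output /path/to/yournotebook-output.ipynb
EOF
chmod +x yourscript.sh
```

cron silently skips scripts that are not executable, so the `chmod +x` step matters.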
Now create a cronjob that runs every day, e.g. at 5:10 AM, by typing crontab -e
in your terminal and adding this line:
10 5 * * * /path/to/yourscript.sh
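If the job seems to run "like a ghost" (as the question's comments describe) and silently fails to pull data, redirecting the script's output to a log file makes the failures visible; a variant of the entry above (log path is a placeholder):

```shell
# same schedule, but append stdout and stderr to a log for debugging
10 5 * * * /path/to/yourscript.sh >> /path/to/yourscript.log 2>&1
```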

Try the SeekWell Chrome Extension. It lets you schedule notebooks to run weekly, daily, hourly or every 5 minutes, right from Jupyter Notebooks. You can also send DataFrames directly to Sheets or Slack if you like.
Here's a demo video, and there is more info in the Chrome Web Store link above as well.
**Disclosure:** I'm a SeekWell co-founder

-
Right now, the starter plan costs 50 dollars per month: https://www.seekwell.io/pricing – Awais Mirza Mar 30 '21 at 10:55
It's better to combine it with Airflow if you want higher quality. I packaged them in a Docker image: https://github.com/michaelchanwahyan/datalab.
It works by modifying the open source package nbparameterize and injecting arguments such as execution_date. Graphs can be generated on the fly, and the output can be updated and saved inside the notebook.
When it is executed:
- the notebook is read and the parameters are injected
- the notebook is executed and the output overwrites the original path
Besides, it also installs and configures common tools such as Spark, Keras, TensorFlow, etc.

You can add a Jupyter notebook command to a cronjob:
0 * * * * /home/ec2-user/anaconda3/bin/python /home/ec2-user/anaconda3/bin/jupyter-notebook
Replace /home/ec2-user/anaconda3 with your Anaconda install location, and adjust the schedule in cron to your requirements.
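Note that jupyter-notebook starts the notebook server rather than executing a particular notebook. If the goal is to run one notebook on a schedule, an nbconvert-based entry (as in other answers here) is likely what you want; a sketch with a placeholder notebook path:

```shell
# execute the notebook headlessly every hour, writing the output back in place
0 * * * * /home/ec2-user/anaconda3/bin/jupyter nbconvert --to notebook --execute --inplace /home/ec2-user/notebooks/etl.ipynb
```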

Executing Jupyter notebooks with parameters is conveniently done with Papermill. I also find it convenient to share/version control the notebook either as a Markdown file or a Python script with Jupytext. Then I convert the notebook to an HTML file with nbconvert. Typically my workflow looks like this:
cat world_facts.md \
| jupytext --from md --to ipynb --set-kernel - \
| papermill -p year 2017 \
| jupyter nbconvert --no-input --stdin --output world_facts_2017_report.html
To learn more about the above, including how to specify the Python environment in which the notebook is expected to run and how to use continuous integration on notebooks, have a look at my article Automated reports with Jupyter Notebooks (using Jupytext and Papermill), which you can read on Medium, GitHub, or Binder. Use the Binder link if you want to test the commands in the article interactively.

As others have mentioned, papermill is the way to go. Papermill is just nbconvert with a few extra features.
If you want to handle a workflow of multiple notebooks that depend on one another, you can try Airflow's integration with papermill. If you are looking for something simpler that does not need a scheduler to run, you can try ploomber which also integrates with papermill (Disclaimer: I'm the author).

To run your notebook manually:
jupyter nbconvert --to notebook --execute /home/username/scripts/mynotebook.ipynb
Create a simple shell script and paste the command above into it:
/home/username/scripts/mynotebook.sh
Make the file executable:
chmod +x /home/username/scripts/mynotebook.sh
To schedule your notebook, use cron or Airflow, depending on your needs vs. complexity. If you want to use cron, you can simply run crontab -e and add an entry:
00 11 * * * /home/username/scripts/mynotebook.sh
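One caveat: cron runs jobs with a minimal environment, so PATH may not include jupyter. Using an absolute path inside mynotebook.sh avoids this; a sketch (the jupyter path is a placeholder, find yours with `which jupyter`):

```shell
#!/bin/sh
# use an absolute path to jupyter, since cron's PATH is minimal
/usr/local/bin/jupyter nbconvert --to notebook --execute /home/username/scripts/mynotebook.ipynb
```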

There are several ways to execute a Jupyter Notebook daily.
Cron or Windows Task Scheduler
You can use your operating system's scheduler to execute the notebook. There are two command line tools for executing notebooks: nbconvert and papermill. Both are great; I personally use nbconvert, but papermill offers a handful of extensions, such as input parameters for notebooks or automatic export to cloud storage.
Mercury
The open source framework Mercury is a web-based application that:
- can execute a notebook in the background,
- can share a notebook as a website,
- can send an executed notebook as an email with a PDF or HTML attachment,
- can restrict access to notebooks to authenticated users.
(Screenshots in the original answer: notebooks available in the web app, a scheduled notebook, and a PDF notebook sent by email.)
Notebooker
Notebooker is an open source web app for scheduling and sharing notebooks.
(Screenshots in the original answer: the list of notebooks and an executed notebook.)

You can download the notebook as a .py file, then create a batch file to execute the .py script and schedule the batch file in Task Scheduler.
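The download step can also be done from the command line with nbconvert's --to script option (mynotebook.ipynb is a placeholder name):

```shell
jupyter nbconvert --to script mynotebook.ipynb   # writes mynotebook.py next to the notebook
python mynotebook.py                             # run the converted script
```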

Creating a BAT file and running it through Task Scheduler worked for me. Below is the code:
call C:\Users\...user...\Anaconda3\condabin\conda activate
python notebook_file.py
pause
call conda deactivate

Simply put this line in crontab -e
0 1 * * * jupyter nbconvert --to html --execute /path/to/main.ipynb
This will execute your Jupyter notebook every day at 1:00 AM.

You may want to use the Google AI Platform Notebooks Scheduler service, currently in EAP (Early Access Program).
