
I want to create a project in my IBM Cloud account consisting of a Jupyter notebook.
I later want to create a job that runs this notebook daily, and to share a link to it.

The data is available in a third-party GitHub repo, and I reckon I can only link the
cloud storage service with repos that I own.

As a workaround I thought of:
- forking the third-party repo to my GitHub account
- linking my local fork to the remote master (following How do I update a GitHub forked repository?)
- linking the IBM resource to my fork
- creating a cron job on my machine to fetch changes from the remote master

The problem is that this needs a local machine that is always running at the cron job's scheduled time.

Is there a way to automate all this inside the IBM Cloud?

I was thinking of a job spinning up a container via Docker that updates the repo. Can this work?

Can anybody else think of a simpler and smarter way of doing it?

Thanks

andrea

1 Answer


Have you tried setting up a Continuous Delivery pipeline that works from your forked repo, automatically builds the image in IBM Cloud Container Registry, and pushes it to a Kubernetes cluster? After you set it up, each commit in GitHub will automatically trigger a build and deployment of the new image.
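For the notebook use case, the image that pipeline builds could execute the notebook on start-up. A minimal Dockerfile sketch, assuming the notebook is named `notebook.ipynb` in the repo root and using papermill to run it headlessly (both the file name and the papermill choice are assumptions, not from the original post):

```dockerfile
# Sketch only: notebook name and papermill are assumptions.
FROM python:3.10-slim
WORKDIR /app
RUN pip install --no-cache-dir papermill jupyter
COPY . .
# Execute the notebook once per container run; the daily schedule would
# come from the cluster side (e.g. a Kubernetes CronJob running this image).
CMD ["papermill", "notebook.ipynb", "output.ipynb"]
```

Because the image is rebuilt from the fork on every commit, the data in the repo is baked in fresh each time, which removes the need for an always-on local machine.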

Art Berger