
I am following the simple instructions given in https://datascience.ibm.com/docs/content/analyze-data/schedule-task.html, but unfortunately the scheduled task never runs. I have only two Spark instances running. The notebook is a simple code snippet that creates a Cloudant database and exits, and it works fine when run manually.

The job details tell me the following: "The notebook "testSchedule" is scheduled to run daily starting on Mon, 4 June 2018, 11:40 PM until Sun, 17 June 2018, 12:40 AM." Any general suggestions without looking into the code? It's quite a black box considering...

Sumit Goyal

2 Answers


There's a post explaining how to debug a scheduled notebook. Could you please follow the instructions in that post and report back whether it helps shed some light on the issue you are describing?

How to troubleshoot a DSX scheduled notebook?

  • Still not resolved. The good news is that the job shows up in the incomplete applications list, but it's not showing any specific error messages, nor is it completing. Any thoughts? – arun raghavan Jun 05 '18 at 05:35
  • Also, I have a few dependencies that need to be pip-installed before the code can run, for example `!pip install --user cloudant`. I do this in the first cell itself, and it works fine when run manually. – arun raghavan Jun 05 '18 at 10:16

Well, it looks like I was able to schedule daily jobs at least. I would suggest that the product owners of DSX/Data Platform add some guidance on installing dependencies when scheduling Python jobs. Debugging an incomplete job is still an open mystery; perhaps a cleaner interface is needed in the front end itself?
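For anyone hitting the same dependency problem: one pattern that may help is installing packages programmatically at the top of the notebook instead of relying on the `!pip` shell magic, which may not behave the same way in a scheduled, non-interactive kernel. A minimal sketch (the `ensure_package` helper name is mine, not part of DSX):

```python
import importlib
import subprocess
import sys

def ensure_package(package, import_name=None):
    """Import a package, pip-installing it into the user site first if missing.

    Unlike the ``!pip install`` notebook magic, this plain-Python approach
    also works when the notebook runs non-interactively (e.g. on a schedule).
    """
    name = import_name or package
    try:
        return importlib.import_module(name)
    except ImportError:
        # Use the same interpreter that runs the notebook kernel.
        subprocess.check_call(
            [sys.executable, "-m", "pip", "install", "--user", package]
        )
        return importlib.import_module(name)

# Demo with a stdlib module (already importable, so it is returned directly):
json_mod = ensure_package("json")

# In the scheduled notebook you would instead call, for example:
# cloudant = ensure_package("cloudant")
```

This is only a sketch under the assumption that the scheduled run uses the same Python environment as `sys.executable`; if the scheduler runs in a different environment, the package would need to be installed there instead.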

Thanks, am closing this thread.

AR