
I'm currently using GCP to run Jupyter notebooks on the notebook server provided by Google. Every time I start the notebook server from the command line, it shuts down when there is a network interruption or power outage on my end. I'm also quite new to GCP.

Is there any way that I could run the IPython notebooks on the server and later collect the results, without having to worry about my local connection?

Thanks in advance!

– maninekkalapudi
    Have you seen this article? https://stackoverflow.com/questions/45835971/persistent-use-of-jupyter-notebook-from-remote-server It looks like it may contain a solution? – Kolban Jan 31 '19 at 16:29
  • @Kolban I'm not getting this part: `ssh -L xxxx:localhost:yyyy server`. I think it is a legit solution. Do you have any idea what's happening here, or can you point me to any resources to understand this? – maninekkalapudi Jan 31 '19 at 17:52
  • It might help to look at the `ssh -L` command (there's a sketch of the full workflow just after these comments). I found this one pretty good: https://linux.die.net/man/1/ssh – Kolban Jan 31 '19 at 19:59
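
To unpack the `ssh -L xxxx:localhost:yyyy server` command from the comments: it sets up local port forwarding, so a port on your own machine is tunnelled to the Jupyter port on the remote VM, and the notebook process itself keeps running on the VM even if the tunnel drops. A minimal sketch of that workflow is below; the port numbers, `<user>`, and `<vm-external-ip>` are illustrative placeholders, not values from this thread.

```bash
# On the GCP VM: start Jupyter detached from the SSH session so a dropped
# connection on your side does not kill it (nohup shown here; tmux or screen
# work the same way).
nohup jupyter notebook --no-browser --port=8888 > jupyter.log 2>&1 &

# On your local machine: forward a local port to the notebook's port on the VM.
ssh -N -L 8888:localhost:8888 <user>@<vm-external-ip>

# Then open http://localhost:8888 in your local browser. If the tunnel drops,
# just re-run the ssh command; the notebook keeps running on the VM.
```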

1 Answer


Have you tried using GCP's AI Platform Notebooks? https://cloud.google.com/ai-platform-notebooks/

Unlike the older Datalab notebooks, you can open these notebooks directly in your browser (no need to SSH). That should solve your network interruption issues.

– Zain Rizvi