
I'm playing around with some Python deep learning packages (Theano/Lasagne/Keras). I've been running them on the CPU of my laptop, which takes a very long time to train the models.

For a while I was also using Amazon GPU instances with an IPython notebook server running, which was obviously much faster for full runs, but pretty expensive to use for prototyping.

Is there any way to set things up so that I can prototype in IPython on my local machine, and then, when I have a large model to train, spin up a GPU instance, do all the processing/training on it, and shut the instance down afterwards?

Is a setup like this possible, or does anyone have any suggestions to combine the convenience of the local machine with temporary processing on AWS?

My thoughts so far were along the lines of

  1. Prototype in a local IPython notebook.

  2. Set up a cell that runs the long process from start to finish.

  3. Use boto to start up an EC2 instance, then SSH into the instance using boto's sshclient_from_instance:

    ssh_client = sshclient_from_instance(instance,
                                         ssh_key_file='<path to SSH keyfile>',
                                         user_name='ec2-user')
    
  4. Get the contents of the cell I've set up (using the solution here); say the script is in cell 13. Execute that script with:

    ssh_client.run('python -c "' + _i13 + '"')
    
  5. Shut down the instance using boto.
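A rough sketch of steps 3–5 put together, assuming boto 2.x, configured AWS credentials, and an AMI with your Python dependencies pre-installed (the AMI ID, region, and key path below are placeholders). One note on step 4: naively wrapping the cell source in double quotes breaks as soon as the cell itself contains a quote, so this sketch escapes it with `shlex.quote` instead:

```python
import shlex
import time


def build_remote_command(cell_source):
    """Wrap notebook cell source in a safely quoted `python -c` invocation."""
    return 'python -c ' + shlex.quote(cell_source)


def run_cell_on_ec2(cell_source, ami_id, key_name, key_path,
                    instance_type='g2.2xlarge', user_name='ec2-user'):
    # boto is imported lazily so build_remote_command works without it installed
    import boto.ec2
    from boto.manage.cmdshell import sshclient_from_instance

    conn = boto.ec2.connect_to_region('us-east-1')  # placeholder region
    reservation = conn.run_instances(ami_id, key_name=key_name,
                                     instance_type=instance_type)
    instance = reservation.instances[0]
    try:
        while instance.state != 'running':  # wait for the instance to boot
            time.sleep(15)
            instance.update()
        ssh_client = sshclient_from_instance(instance,
                                             ssh_key_file=key_path,
                                             user_name=user_name)
        status, stdout, stderr = ssh_client.run(build_remote_command(cell_source))
        return stdout
    finally:
        conn.stop_instances(instance_ids=[instance.id])  # stop, not terminate
```

The `finally` block stops (rather than terminates) the instance, so an EBS-backed instance keeps its disk and you only pay for storage while it's down.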

This just seems a bit convoluted; is there a proper way to do this?

Ger

1 Answer


So when it comes to EC2 you don't have to terminate the instance every time. The beauty of AWS is that you can stop and start your instance as needed, and you only pay for the time it's up and running. You can also try your code on a smaller, cheaper instance first, and if it's too slow for your liking, scale up to a larger instance.
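For example, stopping and later restarting an instance can be scripted with boto as well. A minimal sketch, assuming boto 2.x and configured credentials; the instance ID in the usage comment is hypothetical:

```python
def stop_instance(instance_id, region='us-east-1'):
    """Stop (not terminate) an EC2 instance; an EBS-backed instance keeps its disk."""
    import boto.ec2  # imported lazily so the helpers can be defined without boto installed
    conn = boto.ec2.connect_to_region(region)
    conn.stop_instances(instance_ids=[instance_id])


def start_instance(instance_id, region='us-east-1'):
    """Start a previously stopped instance; note it may come back with a new public IP."""
    import boto.ec2
    conn = boto.ec2.connect_to_region(region)
    conn.start_instances(instance_ids=[instance_id])


# stop_instance('i-0123456789abcdef0')  # hypothetical instance ID
```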

BreKru212
  • I might have mixed up my terms. By shutting down the instance at the end I meant merely stopping it. I had been scaling up and down by prototyping on a smaller instance; I was just wondering if there's an integrated way to farm out specific processes (or IPython cells) to EC2. – Ger Sep 19 '15 at 10:53