
I am running the translate demo in IPython using:

>> %run translate.py --data_dir data --train_dir data --max_train_data_size 100000

The process gets killed automatically for some reason. Here is the output of the run.

Preparing WMT data in data/
I tensorflow/core/common_runtime/local_device.cc:25] Local device intra op parallelism threads: 3
I tensorflow/core/common_runtime/local_session.cc:45] Local session inter op parallelism threads: 3
Creating 3 layers of 1024 units.
Created model with fresh parameters.
Reading development and training data (limit: 100000).
    reading data line 100000
Killed

I am running it in a Vagrant box with Ubuntu 14.04 and no GPU. What might be happening here?

Anurag Ranjan

1 Answer


That's not IPython- or TensorFlow-specific.

Kills like that come from the Linux kernel's out-of-memory (OOM) killer: when the system runs low on memory, the kernel picks a process that is using a lot of it and kills it outright, which shows up as exactly that bare "Killed" message.
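
You can usually confirm this from the kernel log. A quick check (the syslog path assumes a stock Ubuntu/Debian setup):

# kernel ring buffer
dmesg | grep -iE 'killed process|out of memory'
# persistent log; path assumes Ubuntu/Debian
grep -i 'killed process' /var/log/syslog

If the OOM killer fired, you'll see a line naming your python process and how much memory it had mapped.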

Who "Killed" my process and why?

mdaoust