When training using TF Slim's train_image_classifier.py I would like to tell Slim to only allocate what GPU memory it needs, rather than allocating all the memory.
If I were using straight-up TF rather than Slim, I could write this:
config = tf.ConfigProto()
config.gpu_options.allow_growth = True
sess = tf.Session(config=config)
Or even just this to put a hard cap on GPU memory use:
gpu_options = tf.GPUOptions(per_process_gpu_memory_fraction=0.333)
sess = tf.Session(config=tf.ConfigProto(gpu_options=gpu_options))
How can I tell Slim the same thing(s)?
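Skimming Slim's learning.py, my best guess (untested) is that slim.learning.train accepts a session_config argument that could carry the same ConfigProto, something like the fragment below inside train_image_classifier.py. The session_config parameter name and the surrounding call are assumptions on my part, not something I've confirmed works with this script:

```python
import tensorflow as tf

slim = tf.contrib.slim

# Build the same ConfigProto I'd use with a plain tf.Session.
session_config = tf.ConfigProto()
session_config.gpu_options.allow_growth = True

# Assumption: pass it through slim.learning.train's session_config
# parameter where train_image_classifier.py starts the training loop.
slim.learning.train(
    train_op,                      # the train op the script already builds
    logdir=FLAGS.train_dir,
    session_config=session_config)
```

Is that the right hook, or does the script expose a flag for this that I've missed?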
What I'm failing to understand is that Slim seems to use its own training loop, and I can't find docs on the nitty-gritty of configuring that loop. So even if someone could just point me to good Slim docs, that would be fantastic.
Thanks in advance!