
I'm looking for a way to change floatx in Keras directly in Python. floatx is the default float type (`float16`, `float32`, ...).

The config is stored in a json file at:

$HOME/.keras/keras.json

But I'm looking for a way to change the config inside my Python program without touching the config file itself.

There is a similar question in which someone asks the same for changing the backend, which is also stored in keras.json. The accepted answer involves setting the environment variable KERAS_BACKEND and reloading the keras module, but I haven't found a similar environment variable for floatx.
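For reference, the backend trick from that question looks roughly like this (the variable name KERAS_BACKEND is real; `"theano"` is just an example value, and the reload step only matters if keras was already imported):

```python
import os

# KERAS_BACKEND is read when keras is imported, so it must be set beforehand:
os.environ["KERAS_BACKEND"] = "theano"

# If keras was already imported, the backend module has to be reloaded:
# import importlib
# import keras.backend
# importlib.reload(keras.backend)
```

There is no analogous environment variable for floatx, which is what prompted this question.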

akshat
dennis-w

2 Answers


Turns out keras.backend has functions for getting and setting the floatx value (scroll down in the link):

>>> keras.backend.floatx()
'float32'
>>> keras.backend.set_floatx('float16')
>>> keras.backend.floatx()
'float16'

Also, you must not reload the keras module after calling set_floatx (unlike when changing the backend), because keras will then simply reread the config file and revert to its previous value:

>>> keras.backend.floatx()
'float32'
>>> keras.backend.set_floatx('float16')
>>> keras.backend.floatx()
'float16'
>>> importlib.reload(keras.backend)
>>> keras.backend.floatx()
'float32'
dennis-w

Well, the floatx value should certainly be set in keras.json, as described in the documentation.

The least buggy way to do it is indeed to edit the file and reload the module.
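For reference, the file looks something like this (a sketch; the exact backend and image_data_format values depend on your setup):

```json
{
    "floatx": "float16",
    "epsilon": 1e-07,
    "backend": "tensorflow",
    "image_data_format": "channels_last"
}
```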

Using K.set_floatx, at least for me, left parts of the models unchanged (even when set_floatx was the very first thing I did after loading the keras module in a new Python kernel).

On top of that, I ran into another problem when setting the precision to float16: all my loss functions very quickly became nan. Unfortunately, I had to go back to float32 (the default) to be able to train at all.
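The nan losses are plausibly just a symptom of float16's narrow numeric range rather than a keras bug; a minimal numpy illustration (independent of keras):

```python
import numpy as np

# float16 can only represent magnitudes up to ~65504,
# so intermediate loss values easily overflow to inf (and then nan):
big = np.float16(60000.0)
print(big * np.float16(2.0))  # inf

# It also underflows below ~6e-8, silently zeroing out small gradients:
small = np.float16(1e-8)
print(small)  # 0.0
```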

Daniel Möller
  • I really don't like reloading a module. Depending on the use case this can also turn out very buggy. Does your case involve using `set_floatx` and then loading an already created model? Then it would make sense that the layers already have a specific float type which overrides the default floatx type. – dennis-w May 29 '18 at 11:48
  • Do you intend to change it more than once? I don't see why you wouldn't reload the module, unless you're going to change it during execution. (Changing this during execution sounds the buggiest of all options.) – Daniel Möller May 29 '18 at 11:51
  • No, my case was "create a new model". – Daniel Möller May 29 '18 at 11:54
  • I have a general working environment where I might change it at every training/program start. Sure, I could check the value before every training, but that doesn't seem like the right approach for a config file, which stores default values. – dennis-w May 29 '18 at 12:01
  • I'm not sure I understood you. If you change it in config file it's done. (You only need to reload the module if it's already loaded/imported). – Daniel Möller May 29 '18 at 12:09
  • Yes, but imagine you sometimes want to train model A with floatx set to 'float32' and sometimes model B with floatx set to 'float16'. Then you would have to set it every time you start a training to make sure it's the right value. In my opinion, such a config file is for storing a preferred default. If you want to change the value for only one training, you shouldn't change the config file itself, because you might forget to change it back for the next training. – dennis-w May 29 '18 at 12:13