
I've been using TensorFlow to create feed-forward neural nets with customized initial weights and biases (I cannot share the code for proprietary reasons, but another algorithm outputs initial matrices that are then plugged into TensorFlow for optimization). Everything has been working beautifully for classification; however, when I started testing on regression datasets I was getting enormously large errors. I ran the skflow Boston housing data example (DNN regressor), and it gets much better results. Like others (tensorflow model has different results than the same model in skflow (optimizer)), I tried reproducing the skflow code in plain TensorFlow (abandoning my weight-initialization idea) and am getting drastically worse performance.

So I'd like ideas on one of two things:

1. How can you initialize with custom weight matrices in skflow? Digging into the source code has yet to reveal a simple solution.
2. What is the key difference between the skflow DNN regressor and the same model written in plain TensorFlow (i.e., the answer to the question linked above)?

  • Unfortunately I didn't control my experimentation very well, so I'm not sure what exactly did the trick... I can say, however, that weight initialization is very important (skflow defaults to Xavier), as is the choice of optimizer/learning rate. If you're having the same issues I did, start by changing the weight initialization to match skflow's, and then start playing with the optimizer and learning rate. For the Boston data I believe I had the best success with the Adam optimizer and a learning rate of 0.008, but don't ask me why. ;) I was able to get MSE comparable to skflow in the end. – Kelli Humbird Feb 22 '17 at 16:09
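To make the comment above concrete, here is a minimal NumPy sketch of the Xavier (Glorot) uniform initialization that skflow defaults to, assuming the standard formulation where weights are drawn from U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)); the layer sizes are illustrative, not taken from skflow's defaults:

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, rng=None):
    """Xavier/Glorot uniform initialization: draw weights from
    U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out)),
    which keeps activation variance roughly constant across layers."""
    rng = np.random.default_rng() if rng is None else rng
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

# Example: 13 input features (Boston housing) feeding a 10-unit hidden layer.
W = xavier_uniform(13, 10)
print(W.shape)  # (13, 10)
```

Matching this scheme (rather than, say, a plain standard-normal initialization) and then using Adam with a small learning rate, as the comment suggests, is the kind of change that can close a large MSE gap between a hand-written TensorFlow model and the skflow regressor.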
