
Pylearn2 is often suggested as a Python resource for neural networks.

I would like to create a single-hidden-layer neural network and train it with the backpropagation algorithm.

This should be something basic, but I do not understand how to do it with pylearn2. I have found this tutorial on the multilayer perceptron (http://nbviewer.ipython.org/github/lisa-lab/pylearn2/blob/master/pylearn2/scripts/tutorials/multilayer_perceptron/multilayer_perceptron.ipynb), but despite that I am still lost. Here is how I generate the data:

import numpy as np

n = 200
p = 20
X = np.random.normal(0, 1, (n, p))
y = X[:, 0] * X[:, 1] + np.random.normal(0, .1, n)

I would like to create a single layer neural network with 40 hidden nodes and a sigmoid activation function.

Can someone help me?

EDIT:

I have been able to write this code, but it is still not working:

from pylearn2.datasets.dense_design_matrix import DenseDesignMatrix
from pylearn2.models import mlp
from pylearn2.training_algorithms import sgd
from pylearn2.termination_criteria import EpochCounter

ds = DenseDesignMatrix(X=X, y=y)

hidden_layer = mlp.Sigmoid(layer_name='hidden', dim=10, irange=.1, init_bias=1.)
output_layer = mlp.Linear(1, 'output', irange=.1)
trainer = sgd.SGD(learning_rate=.05, batch_size=10, 
                  termination_criterion=EpochCounter(200))

layers = [hidden_layer, output_layer]
ann = mlp.MLP(layers, nvis=1)
trainer.setup(ann, ds)

while True:
    trainer.train(dataset=ds)
    ann.monitor.report_epoch()
    ann.monitor()
    if not trainer.continue_learning(ann):
        break
Donbeo

2 Answers


This is my current solution. The main changes from the code in the question are reshaping y to a 2-D array of shape (n, 1) and setting nvis to match the number of input features:

import numpy as np
import theano
from pylearn2.datasets.dense_design_matrix import DenseDesignMatrix
from pylearn2.models import mlp
from pylearn2.training_algorithms import sgd
from pylearn2.termination_criteria import EpochCounter

n = 200
p = 2
X = np.random.normal(0, 1, (n, p))
y = X[:, 0] * X[:, 1] + np.random.normal(0, .1, n)
y.shape = (n, 1)  # targets must be 2-D for DenseDesignMatrix

ds = DenseDesignMatrix(X=X, y=y)


hidden_layer = mlp.Sigmoid(layer_name='hidden', dim=10, irange=.1, init_bias=1.)
output_layer = mlp.Linear(dim=1, layer_name='y', irange=.1)
trainer = sgd.SGD(learning_rate=.05, batch_size=10, 
                  termination_criterion=EpochCounter(200))
layers = [hidden_layer, output_layer]
ann = mlp.MLP(layers, nvis=2)
trainer.setup(ann, ds)

while True:
    trainer.train(dataset=ds)
    ann.monitor.report_epoch()
    ann.monitor()
    if not trainer.continue_learning(ann):
        break

inputs = X 
y_est = ann.fprop(theano.shared(inputs, name='inputs')).eval()
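To sanity-check the fit, you can compare y_est against the targets with a simple mean squared error. Here is a minimal, self-contained sketch; the small arrays are placeholder stand-ins for the real y and y_est produced by the script above:

```python
import numpy as np

# Placeholder stand-ins for the script's real arrays: y from the dataset
# and y_est from ann.fprop(...).eval(). Shapes are (n, 1) as in the script.
y = np.array([[0.5], [-0.2], [1.1]])
y_est = np.array([[0.4], [-0.1], [1.2]])

# Mean squared error between predictions and targets;
# it should shrink toward the noise variance as training improves.
mse = np.mean((y - y_est) ** 2)
print(mse)  # 0.01 for these placeholder values
```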
Donbeo

pylearn2 can be used either by instantiating the objects directly, as you would normally do, or by defining the topology of the network and the training parameters in a YAML configuration file and letting pylearn2 take care of the rest. A good way to understand how things work is to look at pylearn2/scripts/train.py to see the operations that are performed. Also, in pylearn2/train.py (an unfortunate choice of names, I guess) you will find the Train object, which contains all the information about training. Basically, when you use a configuration file, the YAML parser builds a Train object from the information in the file and then starts the training. There are a bunch of examples in pylearn2/scripts/papers that you can look at if you want.
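For instance, a configuration for a network like the one in the question might look roughly like this. This is an untested sketch; the class paths and parameter names follow pylearn2's usual conventions but should be checked against the tutorials before use:

```yaml
!obj:pylearn2.train.Train {
    dataset: &train !pkl: "train_data.pkl",  # a pickled DenseDesignMatrix (hypothetical path)
    model: !obj:pylearn2.models.mlp.MLP {
        nvis: 2,
        layers: [
            !obj:pylearn2.models.mlp.Sigmoid {
                layer_name: 'hidden',
                dim: 10,
                irange: .1,
            },
            !obj:pylearn2.models.mlp.Linear {
                layer_name: 'y',
                dim: 1,
                irange: .1,
            }
        ],
    },
    algorithm: !obj:pylearn2.training_algorithms.sgd.SGD {
        learning_rate: .05,
        batch_size: 10,
        termination_criterion: !obj:pylearn2.termination_criteria.EpochCounter {
            max_epochs: 200,
        },
    },
}
```

You would then pass this file to pylearn2/scripts/train.py, which parses it and runs the training loop for you.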

I also suggest reading this article to get a better understanding of how pylearn2 works: Your models in Pylearn2

Finally, you might also want to check out Blocks, a new framework for neural networks being developed by the same lab as pylearn2. It is under very active development and has fewer features than pylearn2, but you might like it better, especially if you already know some Theano.

Francesco
  • Thanks for the answer. I took the code from an online example: http://www.arngarden.com/2013/07/29/neural-network-example-using-pylearn2/ . In the example they built a neural network in 4 lines of code. I assume the same should be possible for a regression problem. Do you know how I can do that, or why my code is wrong? – Donbeo Apr 02 '15 at 17:54
  • Can you provide more information about the problem? Are you getting an error, or is it running but not learning? Also, can you provide a full script with all the imports and whatever else is needed to run it? It's been a while since I used pylearn2, so I am not sure I can help you, but I'll try. – Francesco Apr 06 '15 at 23:07