
I have written a program to train my network using PyBrain. There are 104 inputs and 7 outputs in each line of the training file, and I have created one hidden layer with 50 units. The network is written to an .xml file, but I don't know how to write the final weights and biases to a file so that I can calculate precision and recall. Can anyone help?

from pybrain.datasets import SupervisedDataSet
from pybrain.datasets            import ClassificationDataSet
from pybrain.utilities           import percentError
from pybrain.tools.shortcuts     import buildNetwork
from pybrain.supervised.trainers import BackpropTrainer
from pybrain.structure.modules   import SigmoidLayer
from pybrain.tools.xml.networkwriter import NetworkWriter
from pybrain.tools.xml.networkreader import NetworkReader

ds = SupervisedDataSet(104,7)

tf = open('neural_net_feature.txt', 'r')
for line in tf.readlines():
    data = [float(x) for x in line.strip().split(',') if x != '']
    indata = tuple(data[:104])    # first 104 values are the inputs
    outdata = tuple(data[104:])   # remaining 7 values are the targets
    ds.addSample(indata, outdata)
tf.close()

n = buildNetwork(ds.indim, 50, ds.outdim, hiddenclass=SigmoidLayer, outclass=SigmoidLayer)
NetworkWriter.writeToFile(n, 'filename.xml')
n = NetworkReader.readFrom('filename.xml')

t = BackpropTrainer(n, dataset=ds, learningrate=0.01, momentum=0.5, verbose=True)
t.trainUntilConvergence(dataset=ds, maxEpochs=None, verbose=False, continueEpochs=10, validationProportion=0.10)
t.testOnData(verbose=True)
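
For the precision and recall part, something along these lines is what I am aiming for (only a rough sketch, assuming each of the 7 sigmoid outputs is thresholded at 0.5 to give a binary label):

tp = fp = fn = 0
for indata, target in ds:
    # Threshold each sigmoid output at 0.5 to get a binary prediction.
    predicted = [1 if o > 0.5 else 0 for o in n.activate(indata)]
    expected = [1 if t > 0.5 else 0 for t in target]
    for p, e in zip(predicted, expected):
        if p == 1 and e == 1:
            tp += 1
        elif p == 1 and e == 0:
            fp += 1
        elif p == 0 and e == 1:
            fn += 1

precision = tp / float(tp + fp) if (tp + fp) > 0 else 0.0
recall = tp / float(tp + fn) if (tp + fn) > 0 else 0.0
print('precision: %.3f  recall: %.3f' % (precision, recall))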

Thanks

As @Chase Roberts answered, I used http://stackoverflow.com/questions/8150772/pybrain-how-to-print-a-network-nodes-and-weights. The number of **in_to_hidden** weights in my network is 5200 (a lot), and it is printed like this: in_to_hidden [ 1.55300577 -0.62533809 -0.08147982 ..., 1.29706926 0.50138988 ]. But I need the real weights, not an array truncated to **some dots**. Can anyone help, please? – m.khodakarami Apr 08 '17 at 04:02
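
(The dots in that printout are only NumPy's display truncation for long arrays, not missing values. A minimal sketch of one way to print every entry, assuming the trained network n from the question:)

import numpy

# Raise NumPy's print threshold so long arrays are printed in full instead
# of being elided with '...'.
numpy.set_printoptions(threshold=1000000)

for mod in n.modules:
    for conn in n.connections[mod]:
        print(conn)          # e.g. <FullConnection 'in' -> 'hidden0'>
        print(conn.params)   # the full weight array for that connection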

1 Answer


Finally I used this answer, [pybrain: how to print a network (nodes and weights)](http://stackoverflow.com/questions/8150772/pybrain-how-to-print-a-network-nodes-and-weights), to get all of the weights written out:

final_weight = open('final_weight.txt', 'w')   # any output file name works here
i = 0
for mod in n.modules:
    # n.modules:
    # set([<BiasUnit 'bias'>, <LinearLayer 'in'>, <SigmoidLayer 'hidden0'>,
    #      <SigmoidLayer 'out'>])
    for conn in n.connections[mod]:
        # n.connections:
        # {<BiasUnit 'bias'>: [<FullConnection 'FullConnection-4': 'bias' -> 'out'>,
        #                      <FullConnection 'FullConnection-5': 'bias' -> 'hidden0'>],
        #  <LinearLayer 'in'>: [<FullConnection 'FullConnection-6': 'in' -> 'hidden0'>],
        #  <SigmoidLayer 'hidden0'>: [<FullConnection 'FullConnection-7': 'hidden0' -> 'out'>],
        #  <SigmoidLayer 'out'>: []}
        final_weight.write('\n' + "connection:" + str(conn) + '\n')
        final_weight.write('[')
        for cc in range(len(conn.params)):
            final_weight.write(str(conn.params[cc]) + ',')
            i += 1
            if i == 7:              # start a new line after every seven weights
                final_weight.write('\n')
                i = 0
        final_weight.write(']')
final_weight.close()

In this way I wrote all of the weights (not just some of them) to a file, and because there are so many of them, the program writes seven weights per line.
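
A shorter alternative, sketched here on the assumption that NumPy is available (PyBrain stores conn.params and n.params as NumPy arrays, and the file names below are just placeholders), is to let numpy.savetxt do the formatting:

import numpy as np

# Flat dump of every weight and bias in the network, one value per line.
np.savetxt('all_params.txt', n.params)

# Or grouped per connection, keeping the connection name as a header line.
with open('weights_by_connection.txt', 'w') as f:
    for mod in n.modules:
        for conn in n.connections[mod]:
            f.write('connection: %s\n' % conn)
            np.savetxt(f, conn.params)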
