
I've had a go at implementing the Nguyen-Widrow algorithm (below) and it appears to function correctly, but I have some follow-on questions:

  • Does this look like a correct implementation?

  • Does Nguyen-Widrow initialization apply to any network topology/size? (e.g. a 5-layer autoencoder)

  • Is Nguyen-Widrow initialization valid for any input range? (0 to 1, -1 to +1, etc.)

  • Is Nguyen-Widrow initialization valid for any activation function? (e.g. logistic, tanh, linear)

The code below assumes that the network's weights have already been randomized to the range -1/+1:

        ' Calculate the number of hidden neurons (total minus input and output neurons)
        Dim HiddenNeuronsCount As Integer = Me.TotalNeuronsCount - (Me.InputsCount + Me.OutputsCount)

        ' Calculate the Beta value for all hidden layers
        Dim Beta As Double = (0.7 * Math.Pow(HiddenNeuronsCount, (1.0 / Me.InputsCount)))

        ' Loop through each layer in neural network, skipping input layer
        For i As Integer = 1 To Layers.GetUpperBound(0)

            ' Loop through each neuron in layer
            For j As Integer = 0 To Layers(i).Neurons.GetUpperBound(0)

                Dim InputsNorm As Double = 0

                ' Loop through each weight in neuron inputs, add weight value to InputsNorm
                For k As Integer = 0 To Layers(i).Neurons(j).ConnectionWeights.GetUpperBound(0)
                    InputsNorm += Layers(i).Neurons(j).ConnectionWeights(k) * Layers(i).Neurons(j).ConnectionWeights(k)
                Next

                ' Add bias value to InputsNorm
                InputsNorm += Layers(i).Neurons(j).Bias * Layers(i).Neurons(j).Bias

                ' Finalize euclidean norm calculation
                InputsNorm = Math.Sqrt(InputsNorm)

                ' Loop through each weight in neuron inputs, scale the weight based on euclidean norm and beta
                For k As Integer = 0 To Layers(i).Neurons(j).ConnectionWeights.GetUpperBound(0)
                    Layers(i).Neurons(j).ConnectionWeights(k) = (Beta * Layers(i).Neurons(j).ConnectionWeights(k)) / InputsNorm
                Next

                ' Scale the bias based on euclidean norm and beta
                Layers(i).Neurons(j).Bias = (Beta * Layers(i).Neurons(j).Bias) / InputsNorm

            Next

        Next

1 Answer


Nguyen & Widrow assume in their paper that the inputs lie between -1 and +1. Nguyen-Widrow initialization is valid for any activation function whose active region is finite in length (e.g. logistic or tanh, but not a purely linear activation). Again, their paper only deals with a two-layer network (a single hidden layer), so I'm not sure about a five-layer one.
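
For reference, the β the paper derives for a single hidden layer is β = 0.7 · H^(1/N), where H is the number of hidden neurons and N the number of inputs. For example, H = 10 hidden neurons fed by N = 2 inputs gives β = 0.7 · 10^(1/2) ≈ 2.21.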


  • If you have more layers you just apply the algorithm to each of them. Check [this question](https://stackoverflow.com/questions/13689765/weight-initialisation). – Luis Jun 27 '16 at 17:49
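
To illustrate the per-layer application mentioned in the comment above, here is a minimal sketch; it reuses the hypothetical Layers/Neurons/ConnectionWeights structure from the question's code (an assumption, not a real API) and recomputes β from each layer's own neuron count and fan-in instead of using one global value:

        ' Sketch: apply Nguyen-Widrow per layer, recomputing Beta from that
        ' layer's own size (H) and fan-in (N). Assumes weights are already
        ' randomized to -1/+1, as in the question.
        For i As Integer = 1 To Layers.GetUpperBound(0)

            Dim H As Integer = Layers(i).Neurons.Length
            Dim N As Integer = Layers(i).Neurons(0).ConnectionWeights.Length
            Dim LayerBeta As Double = 0.7 * Math.Pow(H, 1.0 / N)

            For j As Integer = 0 To Layers(i).Neurons.GetUpperBound(0)

                ' Euclidean norm of the neuron's weights and bias
                Dim Norm As Double = 0
                For k As Integer = 0 To Layers(i).Neurons(j).ConnectionWeights.GetUpperBound(0)
                    Norm += Layers(i).Neurons(j).ConnectionWeights(k) ^ 2
                Next
                Norm = Math.Sqrt(Norm + Layers(i).Neurons(j).Bias ^ 2)

                ' Rescale so the neuron's weight vector has magnitude LayerBeta
                For k As Integer = 0 To Layers(i).Neurons(j).ConnectionWeights.GetUpperBound(0)
                    Layers(i).Neurons(j).ConnectionWeights(k) *= LayerBeta / Norm
                Next
                Layers(i).Neurons(j).Bias *= LayerBeta / Norm

            Next

        Next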