3

I'm a newbie. I don't understand why we use a threshold and a bias in an MLP (Multi-Layer Perceptron). What is the role of the threshold and the bias? I also don't know how the output formula (the result after applying the activation function, such as sigmoid) involves the bias and the threshold. In the documents I'm reading, I see:

output_value = activation_function(summing_function + threshold)   (following Jeff Heaton)
output_value = activation_function(summing_function - threshold)   (following my teacher)
output_value = activation_function(summing_function + bias)        (no problem!)

Which one is correct? Please give me a response!

Also, can the bias and the threshold both exist at the same time in an MLP?

Ricky Tran

1 Answer


Bias and threshold in an MLP are the same concept, simply two different names for the same thing. The sign does not matter, since a bias can be either positive or negative (although writing + bias is the more common convention): a threshold θ used as summing_function - θ is just a bias b = -θ.
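For example (a minimal NumPy sketch, not from the original post; the numbers are made up for illustration), using a threshold of 0.7 with the "- threshold" formula gives exactly the same output as using a bias of -0.7 with the "+ bias" formula:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    x = np.array([0.5, -1.0, 2.0])   # example inputs
    w = np.array([0.3, 0.8, -0.2])   # example weights
    threshold = 0.7
    bias = -threshold                # the "- threshold" form is just a negative bias

    summing_function = np.dot(w, x)

    print(sigmoid(summing_function - threshold))  # teacher's formula
    print(sigmoid(summing_function + bias))       # bias formula -> identical value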

In the simplest terms: if there is no bias, then for an input of only 0's you get summing_function = 0, and as a result output_value is also fixed at 0 (for activation functions that cross the origin, such as tanh; with a sigmoid it would be stuck at 0.5 instead). Consequently, your network cannot learn any other behavior for this type of signal, because the weights are the only adjustable part of the model and they are multiplied by the zero input.
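For example (a hypothetical single tanh neuron in NumPy, just to illustrate the point), no choice of weights can change the output for an all-zero input, but a bias can:

    import numpy as np

    x = np.zeros(3)                    # input of only 0's
    w = np.array([5.0, -2.0, 0.3])     # weights can be anything

    print(np.tanh(np.dot(w, x)))        # no bias: always tanh(0) = 0.0
    print(np.tanh(np.dot(w, x) + 1.5))  # with bias 1.5: about 0.905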

From a more mathematical perspective, the bias is responsible for shifting the activation function along the input axis, which is part of what gives a neural network its universal-approximator capabilities.
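You can see the shift numerically (again just an illustrative sketch): adding a bias moves the point where the sigmoid crosses 0.5 away from summing_function = 0:

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    s = np.linspace(-3, 3, 7)
    print(sigmoid(s))        # crosses 0.5 at s = 0
    print(sigmoid(s + 2.0))  # same curve shifted, crosses 0.5 at s = -2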

lejlot