
What is the difference between back-propagation and feed-forward neural networks?

From googling and reading, I found that in feed-forward there is only a forward direction, while in back-propagation we first need to do a forward-propagation and then a back-propagation. I referred to this link

  1. Are there any differences other than the direction of flow? What about the weight calculation? The outcome?
  2. Say I am implementing back-propagation, i.e. it contains forward and backward flow. So is implementing back-propagation enough to demonstrate feed-forward?
– USB

4 Answers

  • A Feed-Forward Neural Network is a type of Neural Network architecture where the connections are "fed forward", i.e. do not form cycles (like in recurrent nets).

  • The term "Feed forward" is also used when you input something at the input layer and it travels from input to hidden and from hidden to output layer.
    The values are "fed forward".

Both of these uses of the phrase "feed forward" are in a context that has nothing to do with training per se.

  • Backpropagation is a training algorithm consisting of two steps: 1) feed the values forward; 2) calculate the error and propagate it back to the earlier layers. So, to be precise, forward-propagation is part of the backpropagation algorithm but comes before back-propagating.
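A minimal NumPy sketch of those two steps (the layer sizes, sigmoid activation, and learning rate here are illustrative assumptions, not from the answer):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

# Tiny network: 2 inputs -> 3 hidden units -> 1 output (shapes are arbitrary)
W1 = rng.normal(size=(2, 3))
W2 = rng.normal(size=(3, 1))

x = np.array([[0.5, -0.2]])   # one training example
t = np.array([[1.0]])         # its target output

# Step 1: feed the values forward
h = sigmoid(x @ W1)           # hidden activations
y = sigmoid(h @ W2)           # network output

# Step 2: calculate the error and propagate it back to the earlier layers
delta_out = (y - t) * y * (1 - y)             # output-layer error term
delta_hid = (delta_out @ W2.T) * h * (1 - h)  # error pushed back to the hidden layer

lr = 0.1                      # learning rate (assumed)
W2 -= lr * h.T @ delta_out    # gradient-descent weight updates
W1 -= lr * x.T @ delta_hid
```

After the update, feeding the same example forward again gives an output slightly closer to the target, which is all one step of backpropagation promises.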
– runDOSrun

There is no pure backpropagation or pure feed-forward neural network.

Backpropagation is an algorithm to train (adjust the weights of) a neural network. The inputs to backpropagation are output_vector and target_output_vector; the output is adjusted_weight_vector.

Feed-forward is an algorithm to calculate an output vector from an input vector. The input to feed-forward is input_vector; the output is output_vector.

When you are training a neural network, you need to use both algorithms.

When you are using a neural network (which has already been trained), you are using only feed-forward.
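That split can be sketched as two functions (a minimal sketch; the network shape, tanh activation, and learning rate are illustrative assumptions):

```python
import numpy as np

def feed_forward(W1, W2, input_vector):
    """Using the network: input_vector -> output_vector. Weights are read, never changed."""
    hidden = np.tanh(input_vector @ W1)
    return np.tanh(hidden @ W2)

def backprop_step(W1, W2, input_vector, target_output_vector, lr=0.1):
    """Training the network: one feed-forward pass, then the error is
    propagated back. Returns the adjusted weights."""
    hidden = np.tanh(input_vector @ W1)
    output_vector = np.tanh(hidden @ W2)
    # Error terms (the derivative of tanh(z) is 1 - tanh(z)**2)
    delta_out = (output_vector - target_output_vector) * (1 - output_vector**2)
    delta_hid = (delta_out @ W2.T) * (1 - hidden**2)
    # Gradient-descent weight adjustment
    return W1 - lr * input_vector.T @ delta_hid, W2 - lr * hidden.T @ delta_out
```

A training loop calls backprop_step repeatedly (each call starts with its own feed-forward pass); a trained network is then used through feed_forward alone.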

A basic type of neural network is the multi-layer perceptron, which is a feed-forward backpropagation neural network.

There are also more advanced types of neural networks, using modified algorithms.

A good source to study is ftp://ftp.sas.com/pub/neural/FAQ.html. The best way to understand the principle is to program it (tutorial in this video: https://www.youtube.com/watch?v=KkwX7FkLfug).

– user4545181

To put it simply:

Feed-forward is an architecture. Its counterpart is the recurrent neural network.

Back-propagation (BP) is a training method. BP can train both feed-forward and recurrent neural networks.

– S Z

Neural networks can have different architectures. The connections between their neurons decide the direction of the flow of information. Depending on the network connections, they are categorised as feed-forward and recurrent (back-propagating).

Feed Forward Neural Networks

In these types of neural networks, information flows in only one direction, i.e. from the input layer to the output layer. Once the weights are decided, they are not usually changed. One either decides the weights explicitly or uses functions such as the radial basis function to decide them. The nodes here do their job without being aware of whether the results produced are accurate (i.e. they don't re-adjust according to the results produced). There is no communication back from the layers ahead.

Recurrent Neural Networks (Back-Propagating)

Information passes from the input layer to the output layer to produce a result. The error in the result is then communicated back to the previous layers. Nodes get to know how much they contributed to the answer being wrong. Weights are re-adjusted. The neural network is improved; it learns. There is a bi-directional flow of information. This basically has both algorithms implemented: feed-forward and back-propagation.

– Ananth

  • There is some confusion here. Feed-forward NNs and recurrent NNs are types of neural nets, not types of training algorithms. Training algorithms such as backprop and gradient descent are used to train the networks. In a FFNN, the output of one layer does not affect itself, whereas in an RNN it does. – Varad Bhatnagar Dec 11 '18 at 01:51
  • Thank you @VaradBhatnagar. The word "algorithm" was placed oddly; that indeed caused confusion. I have tried to put forth my view more appropriately now. – Ananth Dec 28 '18 at 14:48
  • Remark: a feed-forward neural network can also be trained with the process described here for recurrent neural networks. – Quastiat Aug 03 '19 at 15:08