I'm learning machine learning, and some materials I've read indicate that an MLP (multilayer perceptron) may get stuck at local minima. I want to learn by running an experiment, but I searched the web for a concrete example and couldn't find any. Can anyone point me to such an example that I can reproduce myself?

1 Answer

You can simply build an XOR neural network and train it via back-propagation with stochastic gradient descent. A very simple way to generate a dataset for this experiment is:

from random import randint
import numpy as np

# 1000 random input pairs drawn from {0, 1}
X = [[randint(0, 1), randint(0, 1)] for _ in range(1000)]
y = []
for a, b in X:
    if a + b == 1:          # XOR is true exactly when the inputs differ
        y.append([0, 1])    # one-hot label for class "1"
    else:
        y.append([1, 0])    # one-hot label for class "0"
X = np.array(X)
y = np.array(y)
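
To actually observe the "stuck" behaviour, you can train a deliberately small MLP on this data from several random initializations. Below is a minimal sketch in plain NumPy (the 2-2-2 architecture, learning rate, and epoch count are my own illustrative choices, not from a specific source): with only two sigmoid hidden units and a squared-error loss, some seeds typically drive the error near zero while others stall at a higher plateau.

# Minimal 2-2-2 MLP trained with SGD on the X, y generated above.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_xor(X, y, seed, hidden=2, lr=0.5, epochs=100):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(0, 1, (2, hidden))      # input -> hidden weights
    b1 = np.zeros(hidden)
    W2 = rng.normal(0, 1, (hidden, 2))      # hidden -> output weights
    b2 = np.zeros(2)
    for _ in range(epochs):
        for i in rng.permutation(len(X)):   # one SGD step per sample
            h = sigmoid(X[i] @ W1 + b1)     # forward pass
            o = sigmoid(h @ W2 + b2)
            d_o = (o - y[i]) * o * (1 - o)  # backprop through MSE loss
            d_h = (d_o @ W2.T) * h * (1 - h)
            W2 -= lr * np.outer(h, d_o)
            b2 -= lr * d_o
            W1 -= lr * np.outer(X[i], d_h)
            b1 -= lr * d_h
    out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
    return np.mean((out - y) ** 2)          # final mean squared error

# Some seeds reach ~0 error; others stall at a visibly higher value.
for seed in range(10):
    print(f"seed {seed}: final MSE = {train_xor(X, y, seed):.4f}")

As a side experiment, increasing hidden to, say, 8 usually makes the stalling disappear, which illustrates why wider networks are less prone to this problem.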

You may find this SO post (XOR neural network error stops decreasing during training) helpful, as it deals with the same problem.

Other resources: If you are familiar with torch, you can look at this GitHub repository for a comprehensive example, and you can generate the dataset for your experiment in torch as described there. If you are not familiar with torch but are interested in learning it, you may consider going through this post. A rough sketch of the same experiment in PyTorch is given below.
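
The following sketch is not taken from the linked repository; the layer sizes and hyperparameters are again illustrative. Re-running it with different seeds shows how the final loss varies with initialization.

import torch

torch.manual_seed(0)                          # change this and re-run
X_t = torch.randint(0, 2, (1000, 2)).float()  # random 0/1 input pairs
y_t = (X_t.sum(dim=1) == 1).long()            # XOR label as a class index

model = torch.nn.Sequential(
    torch.nn.Linear(2, 2),                    # tiny hidden layer
    torch.nn.Sigmoid(),
    torch.nn.Linear(2, 2),                    # two-class output logits
)
optimizer = torch.optim.SGD(model.parameters(), lr=0.5)
loss_fn = torch.nn.CrossEntropyLoss()

for epoch in range(2000):
    optimizer.zero_grad()
    loss = loss_fn(model(X_t), y_t)
    loss.backward()
    optimizer.step()
print(f"final loss: {loss.item():.4f}")       # varies by random init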

Helpful Tutorials: (1) The XOR problem and solution
