
I'm very new to machine learning. I tried to create a model that predicts whether a number is even.

I used this code https://machinelearningmastery.com/tutorial-first-neural-network-python-keras/ which I adapted to my needs.

The problem is that accuracy hovers around 50%, which is no better than random guessing.

Do you know what to do to make it work?

from keras.models import Sequential
from keras.layers import Dense
import numpy
# fix random seed for reproducibility
seed = 7
numpy.random.seed(seed)

X = list(range(1000))
Y = [1,0]*500
# create model
model = Sequential()
model.add(Dense(12, input_dim=1, init='uniform', activation='relu'))
model.add(Dense(8, init='uniform', activation='relu'))
model.add(Dense(1, init='uniform', activation='sigmoid'))
# Compile model
model.compile(loss='mean_squared_error', optimizer='adam', metrics=['accuracy'])
# Fit the model
model.fit(X, Y, epochs=150, batch_size=10,  verbose=2)
# calculate predictions
predictions = model.predict(X)
# round predictions
rounded = [round(x[0])for x in predictions]
print(rounded)


>>> [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, ...]  # all 1000 predictions round to 1.0
Milano
  • Seems like an odd thing to do with a neural net :/ And I don't think your code is all that correct. The model you're building expects 12 inputs; what are those inputs if it is just one number? Try the following: [MNIST Tensorflow](https://www.tensorflow.org/tutorials/); as Keras is built on TensorFlow, this should be possible. – SBylemans Dec 07 '18 at 14:40
  • You can try a simple feature transformation: ```x -> (-1)^x```, where by ```^``` I mean raising -1 to the power x. – eozd Dec 07 '18 at 15:02
  • Very good question; anyone working in the AI field should solve and understand it. – prosti Jan 16 '19 at 20:51

6 Answers


Neural networks aren't good at deciding whether a number is even, at least not when the input representation is just the integer itself. Neural networks work by learning and combining linear decision boundaries, and over all the natural numbers you would need an unbounded number of such boundaries to separate even from odd. You could, however, make it work on a finite subset of numbers, but then you would essentially need one input neuron per number you want to test (a one-hot encoding). For 0 <= n < 1000 that means a thousand neurons in your input layer, which is not a great showcase for a neural network.

If you change the representation of the inputs to the binary representation of the number, the network has a much easier time detecting whether it is even. For three-bit numbers, e.g.:

X = [
  [0, 0, 0], # 0
  [0, 0, 1], # 1
  [0, 1, 0], # 2
  [0, 1, 1], # 3
  [1, 0, 0], # 4
  [1, 0, 1], # 5
  [1, 1, 0], # 6
  [1, 1, 1]  # 7
]

Y = [1, 0, 1, 0, 1, 0, 1, 0]

As you can see, this is now a rather simple problem to solve: the label is just the inverse of the last binary digit. This is an example of preprocessing your inputs to create a problem that is easier for the neural net to solve.
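The preprocessing above can be sketched in plain Python; the helper name `to_bits` is my own, made up for illustration:

```python
def to_bits(n, width=3):
    """Return the binary representation of n as a list of 0/1 ints,
    most significant bit first."""
    return [(n >> i) & 1 for i in reversed(range(width))]

# Inputs are bit vectors, the label is 1 for even numbers and 0 for odd ones.
X = [to_bits(n) for n in range(8)]
Y = [1 - bits[-1] for bits in X]  # even iff the last bit is 0
```

With this encoding, the target depends on a single input coordinate, so even a very small network can learn it.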

Dunes
    Excellent answer. Confirms the importance of the representation. I basically used your model to create my answer. – prosti Jan 16 '19 at 20:50

Here is how I created the model in Keras to classify odd/even numbers in Python 3.

It just uses 1 neuron in the first hidden layer with 32 inputs. The output layer has just 2 neurons for one-hot encoding 0 and 1.

from keras.models import Sequential
from keras.layers import Dense
from keras.utils import to_categorical
import numpy as np


# Helper function to convert a number
# to its fixed-width 32-bit binary representation
def conv(x):
  bits = format(x, '032b')                 # e.g. 5 -> '000...00101'
  return np.array([int(b) for b in bits])

# input data
data = [conv(i) for i in range(100000)]
X = np.array(data)


Y= list() # empty list of results
for v in range(100000):
  Y.append( to_categorical(v%2, 2) )

Y = np.array(Y) # we need np.array


# Sequential is a linear stack of layers (all fully connected here)
model = Sequential()

# 32 inputs and 1 neuron in the first layer (hidden layer)
model.add(Dense(1, input_dim=32, activation='relu'))

# output layer: 2 neurons for the one-hot encoded classes
model.add(Dense(2, activation='sigmoid'))


model.compile(loss='binary_crossentropy', 
              optimizer='adam', 
              metrics=['accuracy'])

# epochs is the number of passes over the full data set
# batch_size is how many elements to process in one go
model.fit(X, Y, epochs=5, batch_size=100, verbose=1)
weights, biases = model.layers[0].get_weights()
print("weights",weights.size, weights, "biases", biases)
model.summary()

Epoch 1/5
100000/100000 [==============================] - 3s 26us/step - loss: 0.6111 - acc: 0.6668
Epoch 2/5
100000/100000 [==============================] - 1s 13us/step - loss: 0.2276 - acc: 1.0000
Epoch 3/5
100000/100000 [==============================] - 1s 13us/step - loss: 0.0882 - acc: 1.0000
Epoch 4/5
100000/100000 [==============================] - 1s 13us/step - loss: 0.0437 - acc: 1.0000
Epoch 5/5
100000/100000 [==============================] - 1s 13us/step - loss: 0.0246 - acc: 1.0000
weights 32 [[-4.07479703e-01]
 [ 2.29798079e-01]
 [ 4.12091196e-01]
 [-1.86401993e-01]
 [ 3.70162904e-01]
 [ 1.34553611e-02]
 [ 2.01252878e-01]
 [-1.00370705e-01]
 [-1.41752958e-01]
 [ 7.27931559e-02]
 [ 2.55639553e-01]
 [ 1.90407157e-01]
 [-2.42316410e-01]
 [ 2.43226111e-01]
 [ 2.22285628e-01]
 [-7.04377817e-05]
 [ 2.20522008e-04]
 [-1.48785894e-05]
 [-1.15533156e-04]
 [ 1.16850446e-04]
 [ 6.37861085e-05]
 [-9.74628711e-06]
 [ 3.84256418e-05]
 [-6.19597813e-06]
 [-7.05791535e-05]
 [-4.78575275e-05]
 [-3.07796836e-05]
 [ 3.26417139e-05]
 [-1.51580054e-04]
 [ 1.27965177e-05]
 [ 1.48101550e-04]
 [ 3.18456793e+00]] biases [-0.00016785]
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_49 (Dense)             (None, 1)                 33        
_________________________________________________________________
dense_50 (Dense)             (None, 2)                 4         
=================================================================
Total params: 37
Trainable params: 37
Non-trainable params: 0

Here are the predictions:

print(X[0:1])
scores = model.predict(X[0:1])
print(scores)
print(np.argmax(scores))

print(X[1:2])
scores = model.predict(X[1:2])
print(scores)
print(np.argmax(scores))

[[0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]]
[[0.9687797  0.03584918]]
0
[[0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 1]]
[[0.00130448 0.9949934 ]]
1
prosti

I think it's a good idea for you to read about the perceptron XOR problem to understand how a single perceptron works and what its limitations are.

Predicting whether a number is even is a binary classification problem with one-dimensional input. In a classification problem, a neural network is trained to separate the classes by a boundary. One way of thinking about this problem is to map the one-dimensional input into two dimensions by duplicating the number in the second dimension (e.g. map 7 to [7, 7]) and then look at how the even and odd vectors fall in a scatter plot.

If you run the following code in Jupyter notebook

%matplotlib inline
import matplotlib.pyplot as plt

X = list(range(-20, 20))
evens = [x for x in X if x % 2 == 0]
odds = [x for x in X if x % 2 != 0]
data = (evens, odds)
colors = ("green", "blue")
groups = ("Even", "Odd") 

fig = plt.figure()
ax = fig.add_subplot(1, 1, 1)
for data, color, group in zip(data, colors, groups):
    x = data
    ax.scatter(x, x, alpha=0.8, c=color, edgecolors='none', s=30, label=group)
plt.title('Even/Odd numbers')
plt.legend(loc=2)
plt.show()

data = (evens, odds)
fig2 = plt.figure()
ax = fig2.add_subplot(1, 1, 1)
for data, color, group in zip(data, colors, groups):
    x = data
    y = [abs(i) if i%2==0 else -abs(i) for i in data]
    ax.scatter(x, y, alpha=0.8, c=color, edgecolors='none', s=30, label=group)
plt.title('Even/Odd numbers (Separatable)')
plt.legend(loc=2)
plt.show()

You will see something like the following image:

[Figure: scatter plots of even and odd numbers, before and after the transformation]

You can see in the first figure that it's not really possible to draw a boundary between the even and odd vectors, but if you map the second coordinate to its negative for odd numbers, then drawing a boundary between the two classes becomes easy. As a result, if you transform your input to two dimensions and negate the second coordinate whenever the number is odd, the neural network can learn to separate the even and odd classes.

You can try something like the following code, and you will see the network will learn and converge to almost 100% accuracy.

import numpy
from keras.models import Sequential
from keras.layers import Dense

# fix random seed for reproducibility
seed = 7
numpy.random.seed(seed)

X = numpy.array([[x, x if x % 2 == 0 else -x] for x in range(1000)])
Y = numpy.array([1, 0] * 500)

# create model
model = Sequential()
model.add(Dense(12, input_dim=2, kernel_initializer='uniform', activation='relu'))
model.add(Dense(8, kernel_initializer='uniform', activation='relu'))
model.add(Dense(1, kernel_initializer='uniform', activation='sigmoid'))
# Compile model
model.compile(loss='mean_squared_error', optimizer='adam', metrics=['accuracy'])
# Fit the model
model.fit(X, Y, epochs=50, batch_size=10,  verbose=2)
# Calculate predictions
predictions = model.predict(X)

Note that negating the number based on parity would work in one dimension as well, but it is easier to demonstrate with a scatter plot of two-dimensional vectors.
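The transformation itself can be checked without Keras; the function name `transform` here is made up for illustration:

```python
def transform(x):
    """Map an integer to 2-D, negating the second coordinate when x is odd."""
    return [x, x if x % 2 == 0 else -x]

# After the transform, the sign of the second coordinate separates the classes
# for positive inputs: it is positive for even numbers, negative for odd ones.
points = [transform(x) for x in range(1, 20)]
```

A linear boundary (the x-axis) now separates the two classes, which is exactly what the hidden layers can represent.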

HojjatK

I'm not surprised that it doesn't work: neural networks don't work like that at all.

You need a better feel for what you are passing to the neural network as input.

When you pass a number, it has to carry some meaning: if one number is greater than another, that should imply something, as with age -> income, where there is a real dependency.

But when you are looking for odd numbers, that meaning is more abstract. Honestly, you should think of your inputs as independent categorical values.

Instead, try taking as input:

X = [[math.floor(item/2), item/2 - math.floor(item/2)] for item in range(1000)]

and check whether the network learns that "if the second value is greater than zero, the number is odd".
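A quick check of the suggested feature in plain Python (no network needed):

```python
import math

# Each number becomes [half, remainder flag]; the second feature is
# exactly 0.0 for even numbers and 0.5 for odd ones.
X = [[math.floor(item / 2), item / 2 - math.floor(item / 2)] for item in range(1000)]
```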

Keep reading to get a better feeling for it :)

EDIT:

@Milano, the full code will look like this:

from keras.models import Sequential
from keras.layers import Dense
import numpy
import math
# fix random seed for reproducibility
seed = 7
numpy.random.seed(seed)

X = numpy.array([[math.floor(item/2), item/2 - math.floor(item/2)] for item in range(1000)])

Y = [1, 0]*500
# create model
model = Sequential()
model.add(Dense(12, input_shape=(2,), kernel_initializer='uniform', activation='relu'))
model.add(Dense(8, kernel_initializer='uniform', activation='relu'))
model.add(Dense(1, kernel_initializer='uniform', activation='sigmoid'))
# Compile model
model.compile(loss='mean_squared_error', optimizer='adam', metrics=['accuracy'])
# Fit the model
model.fit(X, Y, epochs=150, batch_size=10,  verbose=2)
# calculate predictions
predictions = model.predict(X)
# round predictions
rounded = [round(x[0])for x in predictions]
print(rounded)
Koral
  • I tried it but it returns: ValueError: Error when checking model input: the list of Numpy arrays that you are passing to your model is not the size the model expected. Expected to see 1 array(s), but instead got the following list of 1000 arrays: [array([[0.], – Milano Dec 07 '18 at 14:48

This isn't the strangest application of neural networks I've ever seen. The closest example is A Compositional Neural-network Solution to Prime-number Testing, from back in 2006, which used neural networks on a harder number-theory problem.

The result of that research was that the network could indeed be trained, and I'd suggest you try a similar construction; but as the paper concludes, there are better solutions to this kind of problem.

Tom

The goal of machine learning is to predict labels (your Y) from data with features or patterns (your X).

The issue here is that your X is just an increasing list of integers, carrying no feature that exposes the pattern you care about.

So, from the model's point of view, you are asking a statistical algorithm to explain pure randomness, which is impossible.

Try the classic beginner exercise in machine learning, the Titanic dataset on Kaggle, the reference platform for ML:

https://www.kaggle.com/upendr/titanic-machine-learning-from-disaster/data

Download it, load it with pandas, and try the same algorithm.

Your X will be the features such as Class, Age, Sex, etc., and your Y is Survived, which is 1 if the passenger lived and 0 if not. You will try to predict survival from the patterns in Age, Sex, and so on.
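As a sketch of what that looks like with pandas (using a tiny made-up sample in place of the downloaded CSV; the column names follow the Kaggle dataset):

```python
import pandas as pd

# Tiny made-up sample standing in for the downloaded train.csv
df = pd.DataFrame({
    'Pclass':   [3, 1, 3, 1],
    'Sex':      ['male', 'female', 'female', 'male'],
    'Age':      [22, 38, 26, 35],
    'Survived': [0, 1, 1, 0],
})

# Features and label, as described above
X = df[['Pclass', 'Sex', 'Age']].copy()
X['Sex'] = (X['Sex'] == 'female').astype(int)  # encode Sex numerically
Y = df['Survived']
```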

I can also recommend Andrew Ng's Machine Learning course, which explains everything and is really accessible for beginners.

Have fun ! :)

LaSul