I'm a total beginner with ANNs. I understand the general concept, but I haven't found a straightforward explanation of why the input is a series of 0s and 1s and the output is also a series of 0s and 1s.
I read here, on Neural networks - input values, that you can encode the input with a data normalization function so that each value is converted to a number between 0 and 1.
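For example, is something like this min-max scaling what's meant? (Just a sketch I wrote to check my understanding; the numbers are made up.)

```python
import numpy as np

# Made-up raw feature values, e.g. ages ranging from 18 to 90.
ages = np.array([18.0, 25.0, 40.0, 65.0, 90.0])

# Min-max normalization: rescale each value into the range [0, 1].
normalized = (ages - ages.min()) / (ages.max() - ages.min())

print(normalized)  # [0.         0.09722222 0.30555556 0.65277778 1.        ]
```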
Is this the case or am I misunderstanding things?
Also, could you point me in the right direction regarding which articles or lecture material I should pick up to clear things up?