I'm writing a neural network in Haskell, basing my code on this post: http://www-cs-students.stanford.edu/~blynn/haskell/brain.html . I adapted the feed-forward function in the following way:
feedForward :: [Float] -> [([Float], [[Float]])] -> [Float]
feedForward = foldl ((fmap tanh . ) . previousWeights)
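
If I try to expand the point-free style by hand, I think it is equivalent to this explicit lambda (feedForwardExpanded is just my own name for it):

feedForwardExpanded :: [Float] -> [([Float], [[Float]])] -> [Float]
feedForwardExpanded =
  -- should be the same as feedForward above, only written with an explicit
  -- lambda instead of the (.) sections
  foldl (\acc layer -> fmap tanh (previousWeights acc layer))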
Where previousWeights is:
previousWeights :: [Float] -> ([Float], [[Float]]) -> [Float]
-- for one layer: each neuron's weighted sum of the inputs, plus that neuron's bias
previousWeights actual_value (bias, weights) = zipWith (+) bias (map (sum . zipWith (*) actual_value) weights)
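
Just to check my understanding of previousWeights, here is a tiny example I worked through by hand (the numbers are made up; each inner list of weights is one neuron's incoming weights):

previousWeights [1, 2] ([0.5, -0.5], [[1, 0], [0, 1]])
-- = zipWith (+) [0.5, -0.5] [1*1 + 2*0, 1*0 + 2*1]
-- = [1.5, 1.5]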
I don't really understand what (fmap tanh .) is doing here. From what I've read, fmap applied to two functions acts like function composition, but if I change the fmap to map I get exactly the same result. What is fmap actually doing in this expression, and why doesn't swapping it for map change anything? The map version I tried is below.
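
This is the variant with map that gives me identical output (feedForwardMap is just my name for it):

feedForwardMap :: [Float] -> [([Float], [[Float]])] -> [Float]
-- identical to feedForward above, just with map instead of fmap
feedForwardMap = foldl ((map tanh . ) . previousWeights)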