
I'm writing a neural network in Haskell. I'm basing my code on http://www-cs-students.stanford.edu/~blynn/haskell/brain.html . I adapted the feedForward function in the following way:

feedForward :: [Float] -> [([Float], [[Float]])] -> [Float]
feedForward = foldl ((fmap tanh . ) . previousWeights)

Where previousWeights is:

previousWeights :: [Float] -> ([Float], [[Float]]) -> [Float]
previousWeights actual_value (bias, weights) = zipWith (+) bias (map (sum.(zipWith (*) actual_value)) weights)
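
For concreteness, with made-up numbers (one layer of two neurons, three inputs), it takes each neuron's weighted sum of the inputs and adds that neuron's bias:

-- previousWeights [1,2,3] ([0.5,-0.5], [[1,0,0],[0,1,1]])
--   == zipWith (+) [0.5,-0.5] [1*1+2*0+3*0, 1*0+2*1+3*1]
--   == [1.5,4.5]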

I don't really understand what `fmap tanh .` does. From what I read, `fmap` applied to two functions is like a composition. If I change the `fmap` to `map` I get the same result.

nat
  • For lists `fmap = map`. `fmap` is a generalization of `map`. – Willem Van Onsem Jul 15 '18 at 16:52
  • `fmap` is equal to `map` for List, see http://learnyouahaskell.com/making-our-own-types-and-typeclasses#the-functor-typeclass, look for `fmap = map` – geckos Jul 15 '18 at 16:55
  • `(fmap tanh .)` gives you `(Floating b, Functor f) => (a -> f b) -> a -> f b`; if you restrict it to lists by using `map`, you get `(map tanh .) :: Floating b => (a -> [b]) -> a -> [b]` – geckos Jul 15 '18 at 17:00
  • 3
    For the `(.) .` have a look at [this question](https://stackoverflow.com/questions/20279306/what-does-f-g-mean-in-haskell) – mschmidt Jul 15 '18 at 17:42
  • 4
    `(fmap tanh . ) . previousWeights` is a complicated way to write `\x y -> map tanh (previousWeights x y)`. Hence, it generates the previous weights and then takes the tanh of each of them. – chi Jul 15 '18 at 18:28
  • Possible duplicate of [What does (f .) . g mean in Haskell?](https://stackoverflow.com/questions/20279306/what-does-f-g-mean-in-haskell) – Mark Seemann Jul 24 '18 at 12:58

1 Answer


It is much easier to read if we give the parameters names and remove the consecutive `.`s:

feedForward :: [Float] -> [([Float], [[Float]])] -> [Float]
feedForward actual_value bias_and_weights =
  foldl
    (\accumulator       -- the accumulator; it is initialized as actual_value
      bias_and_weight   -- a single (bias, weights) pair from bias_and_weights
        -> map tanh $ previousWeights accumulator bias_and_weight)
    actual_value        -- initialization value
    bias_and_weights    -- the list we are folding over

It might also help to know that the type signature of `foldl` in this case will be `([Float] -> ([Float], [[Float]]) -> [Float]) -> [Float] -> [([Float], [[Float]])] -> [Float]`.
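
Spelled out with comments, that is just `foldl` restricted to this specific type (the name `feedForwardFoldl` below is mine, purely for illustration):

feedForwardFoldl
  :: ([Float] -> ([Float], [[Float]]) -> [Float])  -- push activations through one layer
  -> [Float]                                       -- starting activations (actual_value)
  -> [([Float], [[Float]])]                        -- one (bias, weights) pair per layer
  -> [Float]                                       -- activations of the last layer
feedForwardFoldl = foldl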

Note: This style of code you have found, while fun to write, can be a challenge for others to read, and I generally do not recommend writing this way except for fun.
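
If you want to keep the fold but avoid the point-free step, one more readable option (just a sketch; the helper name `step` is mine, not from the original code) is to pull the step function out into a `where` clause:

feedForward :: [Float] -> [([Float], [[Float]])] -> [Float]
feedForward actual_value bias_and_weights =
  foldl step actual_value bias_and_weights
  where
    -- step :: [Float] -> ([Float], [[Float]]) -> [Float]
    -- feed the previous activations through one layer:
    -- weighted sums plus bias, then the tanh activation
    step accumulator layer = map tanh (previousWeights accumulator layer)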

MCH