
I'm coding a game where, if the user provides a word, the game uses that word; otherwise, the program should pick a random word from the dictionary. The dictionary is a global list. Currently, the default random value always evaluates to 0, but the same expression evaluates correctly anywhere else.

Code:

import random

def game(i = random.randint(0, len(dictionary))):
    print 'default random:', i
    print 'random inside function:', random.randint(0, len(dictionary))

Output:

default random: 0
random inside function: 40403

Can Python not evaluate len inside parameters?

  • It can, but you should really put application logic inside the function instead of its signature. – Klaus D. Jan 13 '16 at 01:17
  • Possible duplicate of ["Least Astonishment" in Python: The Mutable Default Argument](http://stackoverflow.com/questions/1132941/least-astonishment-in-python-the-mutable-default-argument) – Yaroslav Admin Jan 13 '16 at 01:22
  • No, it cannot. It's by design. Default values are evaluated at function definition time, not at function call time. – Yaroslav Admin Jan 13 '16 at 01:24

1 Answer


If you tell the random number generator "pick a number from 0 to 100" and it picks 0, that doesn't mean the random number library is broken. Python did evaluate the statement in the function signature.

Everything in the code above works just like you expect it to. http://dilbert.com/strip/2001-10-25
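
Since the intent is presumably to pick a fresh random word whenever no word is supplied, the usual workaround is a None sentinel so the choice happens at call time rather than at definition time. A minimal sketch, with made-up dictionary contents and a hypothetical word parameter:

import random

dictionary = ['apple', 'banana', 'cherry']   # illustrative contents

def game(word=None):
    # None is a sentinel: the random choice now happens at call time,
    # not once when the def statement runs.
    if word is None:
        word = random.choice(dictionary)     # also avoids the off-by-one of randint(0, len(dictionary))
    print(word)

game()           # random word each call
game('zebra')    # uses the word the user supplied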

Back2Basics
  • Did you set a random seed before the code you posted (that makes sure people can generate the same sequence time after time)? Or is your dictionary length 0 when you start? I'm guessing it's the latter; make a second variable of just len(dictionary) to prove it. – Back2Basics Jan 13 '16 at 01:36
  • I'm not sure what you mean by a random seed before the code; this is the first place it's called. The dictionary length is not 0, and as the body of the method shows, the line of code including len(dictionary) works fine inside the method, just not in the parameter. – SeeMeRaze Jan 13 '16 at 02:44
  • There are 2 points of time. One is when the function is defined, the other is when the function is run. – Back2Basics Jan 13 '16 at 06:29
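
If the list happened to be empty when the def statement ran, the frozen default explains the behavior exactly; a small reproduction of those two points in time (the list contents here are illustrative):

import random

dictionary = []                                    # empty when the function is defined

def game(i=random.randint(0, len(dictionary))):    # evaluated once, right here: randint(0, 0) -> 0
    print(i)

dictionary.extend(['apple', 'banana', 'cherry'])   # filling the list later does not touch the default

game()    # prints 0 every time
game(7)   # prints 7; only an explicit argument overrides the frozen default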