Here is the code:

```python
def canSum(targetSum, numbers, m={}):
    if targetSum in m:
        return m[targetSum]
    if targetSum == 0:
        return True
    if targetSum < 0:
        return False
    for number in numbers:
        remainder = targetSum - number
        if canSum(remainder, numbers, m):
            m[targetSum] = True
            return True
    m[targetSum] = False
    return False

print(f'Answer: {canSum(7, [2, 3])}')        # expected: True
print(f'Answer: {canSum(7, [5, 3, 4, 7])}')  # expected: True
print(f'Answer: {canSum(7, [2, 4])}')        # expected: False
print(f'Answer: {canSum(8, [2, 3, 5])}')     # expected: True
print(f'Answer: {canSum(300, [7, 14])}')     # expected: False
```
The problem is this: when I remove the default value from the function arguments and instead pass an empty dictionary explicitly on every call, everything works fine. But I don't want to pass the empty dictionary at each call site; I just want to rely on the default value.

In JavaScript this pattern works fine, but in Python I don't understand why it behaves this way. Can someone explain? Thanks!
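For reference, this is the variant I described that does work, where I pass a fresh dictionary explicitly on each call instead of using a default value:

```python
def canSum(targetSum, numbers, m):
    # same memoized logic, but the cache dict is supplied by the caller
    if targetSum in m:
        return m[targetSum]
    if targetSum == 0:
        return True
    if targetSum < 0:
        return False
    for number in numbers:
        remainder = targetSum - number
        if canSum(remainder, numbers, m):
            m[targetSum] = True
            return True
    m[targetSum] = False
    return False

# each top-level call gets its own fresh cache, so answers don't leak between calls
print(canSum(7, [2, 3], {}))   # True
print(canSum(7, [2, 4], {}))   # False
```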
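To show what I mean with a smaller test case (this is my own sketch, not part of the original code), the default dictionary seems to keep its contents between separate calls:

```python
def remember(key, cache={}):
    # the same dict object appears to be reused on every call
    cache[key] = True
    return sorted(cache)

print(remember('a'))  # ['a']
print(remember('b'))  # ['a', 'b'] -- 'a' is still in the cache
```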