I am learning scientific computing with Python. In an exercise, I am supposed to generate a polynomial from its roots with this formula:

p(x) = a*(x - r_1)*(x - r_2)*...*(x - r_n)
Here is my implementation:
def poly(x, roots):  # pass real and/or complex roots
    x = symbols(x)
    f = 1
    for r in roots:
        f = f * (x - r)
    return expand(f)
When I test it:
from math import sqrt
from sympy import symbols, expand

poly('x', [-1/2, 5, 21/5, -7/2 + sqrt(73)/2, -7/2 - sqrt(73)/2])
I get:
x**5 - 1.7*x**4 - 50.5*x**3 + 177.5*x**2 - 24.8999999999999*x - 63.0
But I should get:
10*x**5 - 17.0*x**4 - 505.0*x**3 + 1775.0*x**2 - 248.999999999999*x - 630.0
Hence, everything is off by a factor of 10. If I initialize f = 10 instead of f = 1, it works, but I don't see why I should do that. Am I making an obvious mistake? Thank you!
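For what it's worth, here is what I mean by that workaround, written as an explicit leading-coefficient parameter (the `a` keyword argument is my own addition, not part of the exercise). Passing a=10 does reproduce the expected output:

```python
from math import sqrt
from sympy import symbols, expand

def poly(x, roots, a=1):
    """Build a polynomial from its roots; a is the leading coefficient."""
    x = symbols(x)
    f = a  # start from the leading coefficient instead of 1
    for r in roots:
        f = f * (x - r)
    return expand(f)

roots = [-1/2, 5, 21/5, -7/2 + sqrt(73)/2, -7/2 - sqrt(73)/2]
p = poly('x', roots, a=10)  # leading coefficient 10, constant term ~ -630
```

With a=1 (the default) the result is monic, which is exactly my original output divided by 10.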