I am currently writing a program to determine the roots of a cubic equation. This is what I have written so far:
import math

def sols(a, b, c, d):
    # Discriminant of a*x**3 + b*x**2 + c*x + d
    disc = 18*a*b*c*d - 4*(b**3)*d + (b**2)*(c**2) - 4*a*(c**3) - 27*(a**2)*(d**2)
    x = b**2.0 - 3.0*a*c
    y = 2.0*(b**3.0) - 9.0*a*b*c + 27.0*(a**2.0)*d
    alpha = complex(-0.5, 0.5*math.sqrt(3))  # primitive cube root of unity
    if disc >= 0:
        C = ((y + complex(0, math.sqrt(27.0*(a**2)*disc)))/2)**(1.0/3)
        sol1 = -(1/(3.0*a))*(b + C + x/C)    # note: 1/(3*a), not 1/3*a, which divides by 3 and multiplies by a
        sol2 = -(1/(3.0*a))*(b + C*alpha + x/(C*alpha))
        sol3 = -(1/(3.0*a))*(b + C*(alpha**2) + x/(C*(alpha**2)))  # the x/... term belongs inside the bracket
        return sol1, sol2, sol3
The coefficients a, b, c, d are real. If the discriminant is greater than zero, the cubic has three distinct real roots. C is a complex number, but wherever C is used in sol1 the imaginary parts should cancel, leaving me with a real solution.
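As a check, here is the same computation repeated outside the function for the sample coefficients I use below (a=2, b=3, c=-11, d=-6), showing that the imaginary parts of C and x/C nearly, but not exactly, cancel:

```python
import math

# Repeat the computation from sols() for the sample coefficients
a, b, c, d = 2.0, 3.0, -11.0, -6.0
disc = 18*a*b*c*d - 4*(b**3)*d + (b**2)*(c**2) - 4*a*(c**3) - 27*(a**2)*(d**2)
x = b**2 - 3*a*c
y = 2*(b**3) - 9*a*b*c + 27*(a**2)*d
C = ((y + complex(0, math.sqrt(27.0*(a**2)*disc)))/2)**(1.0/3)

s = C + x/C
print(s)  # real part ~15, imaginary part tiny but generally not exactly 0
```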
Given the values a=2, b=3, c=-11, d=-6, I obtain for C and x/C:
In[11]: C
Out[11]: (7.5+4.330127018922192j)
In[13]: x/C
Out[13]: (7.500000000000002-4.330127018922193j)
As I said before, the two imaginary parts should cancel, and they do except in their very last decimal place. So instead of C + x/C = 15, I get:
In[26]: C + x/C
Out[26]: (15.000000000000002-8.881784197001252e-16j)
I don't understand how to fix this. I don't know what is wrong with my program that makes the imaginary parts of C and x/C differ in their last decimal place. Can someone please offer any help?
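For comparison, a leftover of about 1e-16 is the size of ordinary double-precision rounding error; the same kind of residue shows up even in the simplest float arithmetic, unrelated to my cubic program:

```python
import math

# Plain double-precision arithmetic leaves residues on this order too:
residue = 0.1 + 0.2 - 0.3
print(residue)  # tiny but nonzero

# so equality checks on floats need a tolerance rather than ==
print(math.isclose(0.1 + 0.2, 0.3))  # True
```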