I recently took an algorithms and data structures exam. One of the questions was to write a list of steps and a flowchart for an algorithm that calculates the roots of a quadratic equation. I was also asked to give the Big O complexity of the algorithm I provided.
Basically, my algorithm was like the one presented here:
Step 1. Start
Step 2. Read the coefficients of the equation, a, b and c from the user.
Step 3. Calculate discriminant = (b * b) - (4 * a * c)
Step 4. If discriminant > 0:
4.1: Calculate root1 = ( -b + sqrt(discriminant)) / (2 * a)
4.2: Calculate root2 = ( -b - sqrt(discriminant)) / (2 * a)
4.3: Display “Roots are real and different”
4.4: Display root1 and root2
Step 5. Else if discriminant = 0:
5.1: Calculate root1 = -b / (2 * a)
5.2: root2 = root1
5.3: Display “Roots are real and equal”
5.4: Display root1 and root2
Step 6. Else:
6.1: Calculate real = -b / (2 * a)
6.2: Calculate imaginary = sqrt(-discriminant) / (2 * a)
6.3: Display “Roots are imaginary”
6.4: Display real, “±” , imaginary, “i”
Step 7. Stop
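For reference, here is a rough Python translation of the same steps (the function name is mine, and it assumes a ≠ 0 so the divisions are defined):

```python
import math

def quadratic_roots(a, b, c):
    # Assumes a != 0; otherwise the equation is not quadratic.
    discriminant = b * b - 4 * a * c
    if discriminant > 0:
        root1 = (-b + math.sqrt(discriminant)) / (2 * a)
        root2 = (-b - math.sqrt(discriminant)) / (2 * a)
        print("Roots are real and different:", root1, root2)
    elif discriminant == 0:
        root1 = -b / (2 * a)
        root2 = root1
        print("Roots are real and equal:", root1, root2)
    else:
        real = -b / (2 * a)
        imaginary = math.sqrt(-discriminant) / (2 * a)
        print(f"Roots are imaginary: {real} ± {imaginary}i")
```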
As my answer to the complexity question, I submitted O(1), because a constant number of steps is always sufficient to find the roots. However, my professor gave me feedback that this answer is incorrect, without providing the correct one.
I couldn't find an answer to this question anywhere, so I'm asking for help here. What is the complexity of such an algorithm?