I need help with this program and I'm not sure how to do something. The program is supposed to take 3 doubles to make a quadratic equation, then calculate and display all real roots. Here's the problem: If one or more of the roots is unreal or complex, it tries to display them anyway because I don't know how to test for that.

The equation you are solving is ax^2 + bx + c = 0, and you are using the quadratic formula x = (-b +/- sqrt(b^2 - 4ac)) / (2a) to solve it.

But wait: if a = 0 there is no x^2 term, so the equation reduces to the rather simpler linear equation bx + c = 0.

If you used the quadratic formula when a = 0, then 2a = 0 as well, and you'd be dividing by zero. This is naughty!

so when a=0 use:

bx + c = 0

or

bx = 0 - c

or

x = -c / b

to find the single real root (assuming b is not also zero; if both a and b are zero there is no x in the equation at all, so there is no root to report).

otherwise

calculate the discriminant, b^2 - 4ac — the bit of the formula you are going to take the square root of. (It's the discriminant, by the way, not the determinant; a determinant belongs to a matrix.) If the discriminant is zero the roots are coincident. Imagine the U-shaped curve just touching the x-axis at a single point.

If the discriminant is zero the formula is (-b +/- sqrt(0)) / (2a), which simplifies to -b / (2a), since +/- sqrt(0) is just zero.

So when the discriminant = 0, use

x = -b / (2a)

to find the single real root.

On the other hand, if the discriminant is less than zero then, because the square root of a negative number is imaginary, the roots will have an imaginary component: you will either have to use complex numbers in the formula, or simply detect this case and report that there are no real roots — which is exactly the test your program is currently missing.

But if the discriminant is positive then there are two distinct real roots, and you can use the formula in the way you are doing already.

Hope this helps.