The problem I am working on involves finding the zeros of a polynomial. I am working through a pre-calculus tutorial here:
Here is the problem I am given:
Factor the third-degree polynomial f ( x ) = x^3 – 4x^2 + x + 6 by finding the zeros of the function.
And here is the answer that is given:
f ( 1 ) = 4, f ( 2 ) = 0, so ( x – 2 ) is a factor. Knowing this factor, we can take a shortcut and divide this factor into f ( x ) to get x^2 – 2x – 3, which we can factor by conventional means as ( x – 3 ) ( x + 1 ). Thus,
f ( x ) = ( x – 2 ) ( x – 3 ) ( x + 1 ).
Now, the author finds the first zero (2) by guess and check. However, I am having trouble figuring out how he gets from the original function to the quadratic x^2 – 2x – 3. I understand that he divides the factor ( x – 2 ) into the original function, gets an intermediate quotient, then factors that quotient to get the remaining factors ( x – 3 ) and ( x + 1 ) of the original function. But how is that division accomplished? I tried to factor ( x – 2 ) out of the original function directly, but I think I am barking up the wrong tree. Is long division the only way to do this?
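For what it's worth, long division is not the only route: synthetic division does the same job with much less writing when the divisor has the form ( x – r ). A minimal sketch in Python, assuming the cubic is f ( x ) = x^3 – 4x^2 + x + 6 (reconstructed by expanding the factored answer, so treat the coefficients as an assumption):

```python
# Synthetic division of f(x) = x^3 - 4x^2 + x + 6 by (x - 2).
# Coefficients of f(x), highest degree first (assumed from the worked answer).
coeffs = [1, -4, 1, 6]
root = 2  # the zero found by guess and check

# Bring down the leading coefficient, then repeatedly
# multiply by the root and add the next coefficient.
result = [coeffs[0]]
for c in coeffs[1:]:
    result.append(c + root * result[-1])

remainder = result.pop()  # last value; 0 confirms (x - 2) divides f(x)
quotient = result         # coefficients of the quotient polynomial

print(quotient)   # [1, -2, -3]  i.e. x^2 - 2x - 3
print(remainder)  # 0
```

The quotient coefficients [1, -2, -3] read off as x^2 – 2x – 3, which then factors as ( x – 3 ) ( x + 1 ), matching the tutorial's answer.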