Helping students gain understanding and self-confidence in algebra.
Sandurz wrote:I'm new to these forums, so I'm not very familiar with the board layout. If this is in the wrong section, I'm sorry!
I've been looking at tons of proofs about dividing by zero, and one of them seems to be incorrect. I've seen it in so many resources, it's not even funny, and I haven't found an answer to it ANYWHERE. Any help, or a pointer in the direction of the answer, would be much appreciated! Thanks in advance!
This is where it seems to go wrong. Isn't any number divided by itself equal to 1? Or does that not apply to 0?
The rest of the proof:
Since 0/0 is multiplied on both sides of the equation, you can cancel it from both sides, ending with 1 = 2.
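The original algebra isn't quoted in the thread, but from the description the disputed step presumably has a shape something like this (a reconstruction, not the exact proof):

```latex
1 \cdot \frac{0}{0} = 2 \cdot \frac{0}{0}
\quad\overset{?}{\Longrightarrow}\quad
1 = 2
```

The cancellation is only valid if 0/0 is a well-defined number, which is exactly the point under dispute.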
This is most certainly incorrect, blah blah blah.
Even if 0/0 were equal to 1, the proof would still falsely show 1 = 2. I'm just confused as to why it's said to equal 0, not 1.
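One standard way to see why 0/0 can't be assigned any value at all (not 1, and not 0 either): if it had some value c, then by the definition of division,

```latex
\frac{0}{0} = c
\quad\Longleftrightarrow\quad
0 = 0 \cdot c
```

and the right-hand equation holds for every number c, so no single value is picked out. That's why 0/0 is left undefined ("indeterminate"), and why it can't be canceled like an ordinary nonzero factor.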
Sandurz wrote:Well, that's my question. There are a ton of resources (including one of my textbooks) that use that as an example. I'm getting so many contradictory answers, it's not even funny.