Did you know that 1 = 2? Think that sounds ridiculous? OK, I'll prove it to you. Then I'll show you why this "proof" is indeed, as you suspected, ridiculous. And we'll see what it all has to do with the number zero.
Quick, what's 1 + 1? It's obviously 2, right? Not so fast!
What if I was to tell you that I could prove that 1 + 1 is actually equal to 1. And that, therefore, 2 is equal to 1. Would you think I was kind of nuts? More like completely nuts? Probably. But nuts or not, these are exactly the things we'll be talking about today.
Of course, there will be a trick involved because 1 + 1 is certainly equal to 2…thank goodness! And, as it turns out, that trick is related to a very interesting fact about the number zero.
How does it all work? And what's the big ruse that the sneaky number zero is attempting to pull off? Keep on reading to find out!
How to "Prove" That 2 = 1
Let's begin our journey into the bizarre world of apparently correct, yet obviously absurd, mathematical proofs by convincing ourselves that 1 + 1 = 1. And therefore that 2 = 1. I know this sounds crazy, but if you follow the logic (and don't already know the trick), I think you'll find that the "proof" is pretty convincing.
Here's how it works:

Assume that we have two variables a and b, and that: a = b

Multiply both sides by a to get: a² = ab

Subtract b² from both sides to get: a² - b² = ab - b²

This is the tricky part: Factor the left side (using the difference of squares pattern from algebra) to get (a + b)(a - b) and factor b out of the right side to get b(a - b). If you're not sure how this factoring works, don't worry—you can check that it all works by multiplying everything back out (that's where FOIL comes in) to see that it matches. The end result is that our equation has become: (a + b)(a - b) = b(a - b)

Since (a - b) appears on both sides, we can cancel it (in other words, divide both sides by a - b) to get: a + b = b

Since a = b (that's the assumption we started with), we can substitute b in for a to get: b + b = b

Combining the two terms on the left gives us: 2b = b

Since b appears on both sides, we can divide through by b to get: 2 = 1
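If you'd like to convince yourself that the arithmetic in each step really does check out, here's a quick sketch in Python (my own illustration, not part of the original article) that plugs in the concrete values a = b = 3 and tests every equation up through the factoring step:

```python
# Sanity-check the "proof" with concrete numbers: a = b = 3
# (any equal pair behaves the same way). Every assertion passes,
# so the arithmetic in steps 1 through 4 really does hold.
a = b = 3

assert a == b                              # step 1: a = b
assert a ** 2 == a * b                     # step 2: a^2 = ab
assert a ** 2 - b ** 2 == a * b - b ** 2   # step 3: a^2 - b^2 = ab - b^2
assert (a + b) * (a - b) == b * (a - b)    # step 4: (a+b)(a-b) = b(a-b)

# Step 5 "cancels" (a - b) to conclude that a + b = b. Does that hold?
print(a + b == b)  # -> False
```

The print shows False: the two sides stop being equal at exactly the cancellation step, which is a big clue about where the trick is hiding.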
Wait, what?! Everything we did there looked totally reasonable. How in the world did we end up proving that 2 = 1?
What Are Mathematical Fallacies?
The truth is we didn't actually prove that 2 = 1. Which, good news, means you can relax—we haven't shattered all that you know and love about math. Somewhere buried in that "proof" is a mistake. Actually, "mistake" isn't the right word because it wasn't an error in how we did the arithmetic manipulations, it was a much more subtle kind of whoopsiedaisy known as a "mathematical fallacy."
It's never OK to divide by zero!
What was the fallacy in the famous faux proof we looked at? Like many other mathematical fallacies, our proof relies upon the subtle trick of dividing by zero. And I say subtle because this proof is structured in such a way that you might never even notice that division by zero is happening. Where does it occur? Take a minute and see if you can figure it out…
OK, got it?
It happened when we divided both sides by a - b in the fifth step. But, you say, that's not dividing by zero—it's dividing by a - b. That's true, but we started with the assumption that a is equal to b, which means that a - b is the same thing as zero! And while it's perfectly fine to divide both sides of an equation by the same expression, it's not fine to do that if the expression is zero. Because, as we've been taught forever, it's never OK to divide by zero!
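To see that step with actual numbers, here's a small sketch (again mine, not from the article) using a = b = 3:

```python
# With a = b, the factor we "cancelled" is literally zero.
a = b = 3
print(a - b)  # -> 0

# Doing the cancellation as an explicit division makes Python object:
try:
    (a + b) * (a - b) / (a - b)
except ZeroDivisionError as err:
    print("cancelling (a - b) fails:", err)
```

Python raises ZeroDivisionError because the "cancellation" is really a division by a - b = 0 in disguise.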
Why Can't You Divide By Zero?
Which might get you wondering: Why exactly is it that we can't divide by zero? We've all been warned about such things since we were little lads and ladies, but have you ever stopped to think about why division by zero is such an offensive thing to do? There are many ways to think about this. We'll talk about two reasons today.
The first has to do with how division is related to multiplication. Let's imagine for a second that division by zero is fine and dandy. In that case, a problem like 10 / 0 would have some value, which we'll call x. We don't know what it is, but we'll just assume that x is some number. So 10 / 0 = x. We can also look at this division problem as a multiplication problem asking: what number x do we have to multiply by 0 to get 10? In other words, 0 × x = 10. Of course, there's no answer to this question since every number multiplied by zero is zero. Which means the operation of dividing by zero is what's dubbed "undefined."
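One way to see that no such x exists is simply to try a bunch of candidates. This little search (my own illustration) finds a solution for an ordinary division like 10 / 2 but comes up empty for 10 / 0:

```python
# "10 / d = x" really means "find an x with d * x == 10."
candidates = [-1000, -5, -1, 0, 1, 2, 5, 10, 1000, 1e100]

# Ordinary division: d = 2 has a solution among our candidates.
print([x for x in candidates if 2 * x == 10])  # -> [5]

# Division by zero: d = 0 has no solution, since 0 * x is always 0.
print([x for x in candidates if 0 * x == 10])  # -> []
```

Of course, a finite search can't check every number, but the comment in the second case is airtight: 0 times anything is 0, so nothing can ever reach 10.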
The second way to think about the screwiness of dividing by zero—and the reason we can't do it—is to imagine dividing a number like 1 by smaller and smaller numbers that get closer and closer to zero. For example:
- 1 / 1 = 1
- 1 / 0.1 = 10
- 1 / 0.01 = 100
- 1 / 0.001 = 1,000
- 1 / 0.0001 = 10,000
- ...
- 1 / 0.00000000001 = 100,000,000,000
and so on forever. In other words, as we divide 1 by increasingly small numbers—which are closer and closer to zero—we get a larger and larger result. In the limit where the denominator of this fraction actually becomes zero, the result would be infinitely large.
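Here's a short loop (my sketch, not the article's) that reproduces the experiment above and shows what Python does when the denominator finally reaches zero:

```python
# Divide 1 by smaller and smaller denominators: the results blow up.
for d in [1, 0.1, 0.01, 0.001, 0.0001]:
    print(f"1 / {d} = {1 / d}")

# At zero itself there is no finite answer, and Python refuses outright:
try:
    1 / 0
except ZeroDivisionError as err:
    print("1 / 0 is", err)
```

The quotients grow without bound as the denominator shrinks, and at exactly zero the division simply has no value to give back.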
Which is yet another very good reason that we can't divide by zero. And why 1 + 1 is indeed equal to 2…no matter what our screwy "proof" might say.
Wrap Up
OK, that's all the math we have time for today.
Please be sure to check out my book The Math Dude’s Quick and Dirty Guide to Algebra. And remember to become a fan of the Math Dude on Facebook where you’ll find lots of great math posted throughout the week. If you’re on Twitter, please follow me there, too.
Until next time, this is Jason Marshall with The Math Dude’s Quick and Dirty Tips to Make Math Easier. Thanks for reading, math fans!