Every math student knows that division by 0 is
forbidden. As a matter of fact, on the list of commandments in mathematics,
this is at the top. But why is division by 0 not permissible? We in mathematics
pride ourselves on the order and beauty in which everything in the realm of
mathematics falls neatly into place. When something arises that could spoil
that order, we simply define it to suit our needs. This is precisely what
happens with division by 0. So let’s give this “commandment” some meaning.
Consider the quotient n/0, with n ≠ 0. Setting aside the division-by-zero
commandment for a moment, let us speculate (i.e., guess) what the quotient might be. Say it
is p. In that case, we could check by multiplying 0 × p to
see if it equals n, as would have to be the case for the division to
be correct. We know that 0 × p ≠ n,
since 0 × p = 0. So there is no number p that
can serve as the quotient of this division. For that reason, we define division
by 0 to be invalid.
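The argument above can be sketched in a few lines of Python (the function name candidate_quotients is illustrative, not standard): no candidate p ever satisfies 0 × p = n, and the language itself enforces the prohibition by raising an error.

```python
def candidate_quotients(n, candidates):
    """Return every p among the candidates satisfying 0 * p == n."""
    return [p for p in candidates if 0 * p == n]

# For n = 5, no candidate checks out, since 0 * p is always 0:
print(candidate_quotients(5, range(-100, 101)))  # → []

# Python enforces the same rule directly:
try:
    5 / 0
except ZeroDivisionError as error:
    print("5 / 0 is undefined:", error)
```

However wide we make the pool of candidates, the list of valid quotients stays empty, which is exactly why the quotient is left undefined.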
A more convincing case for defining away division
by 0 is to see how it can lead to a contradiction of an accepted fact, namely,
that 1 ≠ 2.
If division by 0 is acceptable, then 1 = 2, clearly an absurdity!
Here is the “proof” that 1 = 2:
Let                a = b.
Then              a² = ab           (multiplying both sides by a)
             a² − b² = ab − b²      (subtracting b² from both sides)
      (a − b)(a + b) = b(a − b)     (factoring)
               a + b = b            [dividing by (a − b)]
                  2b = b            (replacing a by b)
                   2 = 1            (dividing both sides by b)
In the step where we divided by (a − b), we actually divided by 0, because a = b, so a − b = 0.
That ultimately led us to an absurd result, leaving us with no option other
than to prohibit division by 0.
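We can trace the "proof" numerically in Python with a concrete value, say a = b = 3, to watch exactly where it breaks. Every step up to the factoring holds (both sides are simply 0), and the fatal division is the one the interpreter refuses to perform:

```python
a = b = 3
assert a**2 == a * b                      # multiplying both sides by a
assert a**2 - b**2 == a * b - b**2        # subtracting b² from both sides
assert (a - b) * (a + b) == b * (a - b)   # factoring: both sides equal 0
print("a - b =", a - b)                   # → 0, so the next step divides by 0

try:
    ((a - b) * (a + b)) / (a - b)         # "dividing by (a − b)"
except ZeroDivisionError:
    print("dividing by (a - b) means dividing by 0")
```

The equalities before the division are all true only because both sides collapse to 0, which is precisely what makes cancelling (a − b) illegitimate.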