Although students are typically taught that "one cannot divide by 0," it can be argued that 1/∞ = 0 (read as "one divided by infinity equals zero"). How is this possible? Observe the following progression.

1/1 = 1    1/2 = 0.5    1/4 = 0.25    1/10 = 0.1

Note that as the denominator, or divisor, becomes larger, the value of the fraction (the "quotient") becomes smaller. What happens if the denominators become very large?

1/1,000 = 0.001    1/1,000,000 = 0.000001    1/1,000,000,000 = 0.000000001

One can see that as the denominator becomes extremely large, the fraction values approach 0. Indeed, if one thinks of infinity as "ultimately large," one can see that the value of the fraction will likewise be "ultimately small," or 0. Hence, one informal (but useful) way to define infinity is "the number that 1 can be divided by to get 0." Actually, there is no need to use the number 1 as the numerator here; any number divided by infinity will produce 0.
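This limiting behavior can be checked numerically. The sketch below (in Python, with illustrative denominators) shows 1/n shrinking toward 0; as it happens, standard floating-point arithmetic adopts the informal convention directly, so that 1 divided by the special value "infinity" is exactly 0.

```python
# As the denominator n grows, the value of 1/n approaches 0.
for n in [1, 10, 1_000, 1_000_000, 1_000_000_000]:
    print(f"1/{n} = {1 / n}")

# IEEE 754 floating point builds in the convention described above:
# dividing 1 by the special value "infinity" yields exactly 0.
print(1 / float("inf"))  # 0.0
```

The same holds for any finite numerator, matching the remark that the number 1 is not special here.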

Using algebra, one can arrive at another definition of infinity. By transforming the following equation, we see that infinity is what results when 1 is divided by 0.

If 1/∞ = 0

Then 1 = ∞ × 0

And ∞ = 1/0

Notice that this approach to informally defining infinity produces an equation (the middle of the three above) in which something times 0 does not equal 0! Because of this difficulty, and because the rules of algebra used to write and transform these equations apply only to numbers, some mathematicians claim that division by 0 should not be allowed, since ∞ may not be a well-defined number. They argue that dividing by 0 does *not* give infinity; rather, the result is undefined.
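Programming languages generally side with this cautious view. A minimal Python sketch: division by 0 is simply refused as undefined, rather than returning infinity.

```python
# Python treats division by zero as undefined and raises an error,
# rather than returning infinity.
try:
    result = 1 / 0
except ZeroDivisionError as exc:
    print(f"undefined: {exc}")  # undefined: division by zero
```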

Another method of attempting to define infinity is to examine sets and their elements. If, in counting the elements of a set one by one, the counting never ends, the set is said to be infinite.
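This counting idea can be sketched in code. The hypothetical helper below counts a collection's elements one by one, but with an upper bound, so that an endless enumeration (here, Python's `itertools.count`, standing in for an infinite set of natural numbers) is cut off and flagged rather than running forever.

```python
import itertools

def count_up_to(iterable, limit):
    """Count elements one by one, giving up after `limit` steps.

    Returns (count, finished). If finished is False, counting was cut
    off at the limit, suggesting the collection may be infinite.
    """
    count = 0
    for _ in iterable:
        count += 1
        if count >= limit:
            return count, False  # counting never ended on its own
    return count, True  # counting ended: the collection is finite

print(count_up_to({1, 2, 3}, limit=1000))          # counting ends: finite set
print(count_up_to(itertools.count(), limit=1000))  # counting cut off: infinite
```

Of course, no program can actually verify that counting "never ends"; the cutoff only distinguishes "finished" from "still counting," which is precisely why mathematicians define infinite sets abstractly rather than by exhaustive counting.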