charliem says...

This is patently wrong.
0.999 recurring != 1.
0.249 recurring != 0.25.

The value 0.999 is equivalent to 1, not equal.

The correct 'notation' to use is as follows:

0.999 recurring ≃ 1.

Note: This is not an equality sign, it is an equivalence sign.
In mathematics they are vastly different.

One denotes strict equality.
1 = 1, no ifs ands or buts.

1 ≃ 0.999 means that 0.999 and 1 are close enough that, given certain boundary conditions, you can claim they are equal, when in reality they are not.

This is why we have surds and fractions. Use those if you want to express a recurring decimal accurately.

GeeSussFreeK says...

>> ^charliem:

This is patently wrong.
0.999 recurring != 1.
0.249 recurring != 0.25.
The value 0.999 is equivalent to 1, not equal.
The correct 'notation' to use is as follows:
0.999 recurring ≃ 1.
Note: This is not an equality sign, it is an equivalence sign.
In mathematics they are vastly different.
One denotes strict equality.
1 = 1, no ifs ands or buts.
1 ≃ 0.999 means that 0.999 and 1 are close enough that, given certain boundary conditions, you can claim they are equal, when in reality they are not.
This is why we have surds and fractions. Use those if you want to express a recurring decimal accurately.


Ya, he is trying to solve irrational numbers...rationally, oops. He seems to bypass the notion that some irrational numbers can't be expressed by a surd or fraction, and you can't just go truncating them without changing the value from absolute to an estimation. Bad mathematician, bad, get your hands out of the science jar!

Ornthoron says...

@charliem No, you are wrong.

No one says that 0.999 = 1. What is true is that the number written as 0.(an infinite number of 9s), which we can write more prettily as 0.(9), is equal to 1. That means equal. Exactly equal. No equivalence needed.

Bear in mind that we are not talking about a number with a finite number of decimals. If we were, it would be true to say that we could get arbitrarily close to 1 without ever being exactly equal. But we are in fact talking about the infinite sum

9*(1/10) + 9*(1/10)^2 + 9*(1/10)^3 + ...

This is a geometric series of the form ar + ar^2 + ar^3 + ... with a = 9 and r = 1/10, which for |r| < 1 converges to

ar/(1-r) = (9*(1/10))/(1 - 1/10) = (9/10)/(9/10) = 1

There, I just proved the equality for you.
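A quick numerical sanity check of that series (an editor-added sketch, not part of the original comment; the function name is mine): exact rational arithmetic with Python's stdlib `fractions` shows the partial sums 0.9, 0.99, 0.999, ... closing in on 1, with the gap shrinking by a factor of 10 per term.

```python
from fractions import Fraction

def partial_sum(n):
    """Exact value of the first n terms of 9*(1/10)^k, i.e. 0.99...9 with n nines."""
    return sum(Fraction(9, 10 ** k) for k in range(1, n + 1))

# The gap to 1 after n terms is exactly 1/10^n, so the only number the
# full infinite series can converge to is 1 itself.
for n in (1, 5, 20):
    print(n, partial_sum(n), 1 - partial_sum(n))
```

Note this illustrates the convergence; the proof itself is the closed form ar/(1-r) above.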

GeeSussFreeK says...

>> ^Ornthoron:

@charliem No, you are wrong.
No one says that 0.999 = 1. What is true is that the number written as 0.(an infinite number of 9s), which we can write more prettily as 0.(9), is equal to 1. That means equal. Exactly equal. No equivalence needed.
Bear in mind that we are not talking about a number with a finite number of decimals. If we were, it would be true to say that we could get arbitrarily close to 1 without ever being exactly equal. But we are in fact talking about the infinite sum
9*(1/10) + 9*(1/10)^2 + 9*(1/10)^3 + ...
This is a geometric series of the form ar + ar^2 + ar^3 + ... with a = 9 and r = 1/10, which for |r| < 1 converges to
ar/(1-r) = (9*(1/10))/(1 - 1/10) = (9/10)/(9/10) = 1
There, I just proved the equality for you.


Something tending to something isn't the something itself. Something tending toward 1 isn't, yet, 1. We don't live in the land of convergent infinities, we live in today. If you can write down enough .99999's that eventually turn into a 1, then I will accept that proof; otherwise, it is an estimation or an assumption. Unless you don't believe in infinite precision, that is. But even then, you're left with something with a finite number of 9's that doesn't converge to a 1. Doing loop-de-loops with infinities, a reality which humans don't and can't inhabit, is trying to abstract away the real problem...the same problem that Zeno proposed long, long ago.

Ornthoron says...

@GeeSussFreeK

So you are denying the existence of infinities in mathematics? Good luck with that, there just might be a Fields Medal in it for you if you are right.

It's obvious that the equality doesn't hold if we truncate. That's why we don't do that. If you in the sum

9*(1/10) + 9*(1/10)^2 + 9*(1/10)^3 + ...

stop summing after a finite number of steps you will, as you say, get only an estimation. But calculating infinite sums is something mathematicians and other users of mathematics do all the time. For instance:

2 = 1 + 1/2 + 1/4 + 1/8 + 1/16 + ...

Just because you cannot physically write the sums out on paper doesn't mean they are less correct. Mathematics is abstract. That's what makes it so powerful.
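That last sum can be checked the same way with exact fractions (again an editor-added sketch using only Python's stdlib; the function name is mine): the remainder halves at every step, which is why the infinite sum is exactly 2 and nothing else.

```python
from fractions import Fraction

def halves_partial(n):
    """Exact partial sum 1 + 1/2 + 1/4 + ... + (1/2)**(n-1)."""
    return sum(Fraction(1, 2 ** k) for k in range(n))

# Closed form: halves_partial(n) == 2 - (1/2)**(n-1). The remainder
# shrinks geometrically and never oscillates, so the limit is exactly 2.
for n in (2, 5, 10):
    print(n, halves_partial(n), 2 - halves_partial(n))
```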

GeeSussFreeK says...

>> ^Ornthoron:

@GeeSussFreeK
So you are denying the existence of infinities in mathematics? Good luck with that, there just might be a Fields Medal in it for you if you are right.
It's obvious that the equality doesn't hold if we truncate. That's why we don't do that. If you in the sum
9*(1/10) + 9*(1/10)^2 + 9*(1/10)^3 + ...
stop summing after a finite number of steps you will, as you say, get only an estimation. But calculating infinite sums is something mathematicians and other users of mathematics do all the time. For instance:
2 = 1 + 1/2 + 1/4 + 1/8 + 1/16 + ...
Just because you cannot physically write the sums out on paper doesn't mean they are less correct. Mathematics is abstract. That's what makes it so powerful.


1 + 1 = 2 isn't abstract, or an estimation, though. That is my point: as far as maths are abstract, they aren't certain, and as far as they are certain, they aren't abstract. Infinities aren't tautologically certain. I don't think you can do math with infinite numbers without estimating. I am not denying the infinity of numbers; I am denying the use of logic based on infinite outcomes, does that make sense? I am denying that you can count to the largest number, and in that, I am denying that you can make predictions on the value of a number that is infinitely precise.

Ornthoron says...

>> ^GeeSussFreeK:
1 + 1 = 2 isn't abstract, or an estimation, though. That is my point: as far as maths are abstract, they aren't certain, and as far as they are certain, they aren't abstract. Infinities aren't tautologically certain. I don't think you can do math with infinite numbers without estimating. I am not denying the infinity of numbers; I am denying the use of logic based on infinite outcomes, does that make sense? I am denying that you can count to the largest number, and in that, I am denying that you can make predictions on the value of a number that is infinitely precise.


I agree that it can be difficult to wrap your head around, but that doesn't mean it's wrong. As far as mathematical logic is concerned, 1 + 1 = 2 is just as abstract as what I've just laid out. Using the most basic axioms of mathematics, the same ones used to prove 1 + 1 = 2, it is possible to prove (through a series of other increasingly complicated theorems) that infinite sums do in fact give exact answers.

Mikus_Aurelius says...

Try going up to someone who actually does math for a living and suggesting that "you can't do math with infinite (sets of) numbers". There is a well established, consistent, and applicable theory for dealing with infinite sums. It's called a limit. We've only been using them for 400 years.

Demarcations between the abstract and concrete are useless. Every mathematical theory is abstract. Numbers don't exist, but they can be applied in millions of useful ways. So can infinite sums.

GeeSussFreeK says...

>> ^dannym3141:

@GeeSussFreeK I don't like it either, but it's one of those things you have to accept is true - just like quantum mechanics. Your mind's desire to slot it into a jigsaw puzzle will have to go unsatiated.


You can't really talk about something unless you have an idea of it, just like you can't talk about circular squares or some other such construct that doesn't point to a real idea. I am still considering what @Ornthoron was saying and wondering if we are talking about the same thing or not. I'll get back to this later, have to ponder; I think this is a problem of ontology vs abstraction. @Mikus_Aurelius I am very familiar with the idea of a limit, took many years of calculus and diff EQ back in the day. My argument isn't about whether you can "use" things, but whether those things represent actual things in and of themselves. I do think that there is an inherent realness to numbers outside of complete abstraction. The idea of a single object relating to itself is always true, regardless of a formal number system to represent it. The relation of 2 objects against 1 object is also still a real distinction that exists outside of a formal numbering system. The realness of the elements of counting is, seemingly (and I need to think more on this), tautologically true; an analytically true statement in other words.

I need to ponder on this more though, perhaps I am mistaken. Or perhaps I was talking about a different aspect of it than everyone else. Time to grease down the mind with beer!

Mikus_Aurelius says...

Some ideas are intuitive to me and others are not. I don't think that makes my declarations of what's "real" or not anything more than personal opinion. The idea that we see 3 cats and 3 goats and associate the same number 3 to each is an abstract construction. If I have one piece of chalk and another piece of chalk, does that mean I have two pieces? What if I break one in half? Or could I place them so close together that they are one piece of chalk again? We can't talk about one of a thing being inherently different from two of a thing, since no two identical things exist in the universe.

You are comfortable with the idea that we can count things and that the numbers we assign to quantities of different objects are comparable. In some remote places of the world you'll find people, adults, who see this idea as unnatural. The idea that you would quantify any group bigger than 5 is alien to them.

1+1=2 because we have defined a number system in which it is so. Conveniently, we can understand real objects in terms of this system. We arrived at this system through a combination of intuition and abstract manipulation. No one has ever sat down with 1350 oranges in one pile and 6723 in another and counted the sum to see that they got 8073.

Similarly, the sum of 9*10^(-k) for k = 1 to infinity equals 1 because we have defined infinite decimals to work this way. Conveniently again, this allows us to understand physical phenomena. We also arrived at this system through a combination of intuition and abstract manipulation. The fact that it doesn't feel intuitive to you doesn't give you any real argument against it.

>> ^GeeSussFreeK:

>> ^dannym3141:
@GeeSussFreeK I don't like it either, but it's one of those things you have to accept is true - just like quantum mechanics. Your mind's desire to slot it into a jigsaw puzzle will have to go unsatiated.

You can't really talk about something unless you have an idea of it, just like you can't talk about circular squares or some other such construct that doesn't point to a real idea. I am still considering what @Ornthoron was saying and wondering if we are talking about the same thing or not. I'll get back to this later, have to ponder; I think this is a problem of ontology vs abstraction. @Mikus_Aurelius I am very familiar with the idea of a limit, took many years of calculus and diff EQ back in the day. My argument isn't about whether you can "use" things, but whether those things represent actual things in and of themselves. I do think that there is an inherent realness to numbers outside of complete abstraction. The idea of a single object relating to itself is always true, regardless of a formal number system to represent it. The relation of 2 objects against 1 object is also still a real distinction that exists outside of a formal numbering system. The realness of the elements of counting is, seemingly (and I need to think more on this), tautologically true; an analytically true statement in other words.
I need to ponder on this more though, perhaps I am mistaken. Or perhaps I was talking about a different aspect of it than everyone else. Time to grease down the mind with beer!

siftbot says...

Tags for this video have been changed from 'mathematics, math, numbers, notation, simple proof' to 'he is saying naught meaning zero, mathematics, math, numbers, notation, simple proof' - edited by chilaxe

Bidouleroux says...

>> ^dannym3141:

@GeeSussFreeK I don't like it either, but it's one of those things you have to accept is true - just like quantum mechanics. Your mind's desire to slot it into a jigsaw puzzle will have to go unsatiated.


Refusing to accept received "truths" is exactly how science advances. The Ancient Greeks thought infinities and infinitesimals were dumb and irrational numbers, well, properly irrational. Now we "know" they're just numbers like every other number. Same thing with the square root of minus one. Ultimately though, they are just tools and we will use them until they no longer suit our purposes. There are already many number systems in which 0.(9) doesn't equal 1. Who knows when they'll be useful.

But I think the real question is, what of the transfinites? No one ever thinks about the transfinites.

dannym3141 says...

>> ^GeeSussFreeK:

>> ^dannym3141:
@GeeSussFreeK I don't like it either, but it's one of those things you have to accept is true - just like quantum mechanics. Your mind's desire to slot it into a jigsaw puzzle will have to go unsatiated.

You can't really talk about something unless you have an idea of it, just like you can't talk about circular squares or some other such construct that doesn't point to a real idea. I am still considering what @Ornthoron was saying and wondering if we are talking about the same thing or not. I'll get back to this later, have to ponder; I think this is a problem of ontology vs abstraction. @Mikus_Aurelius I am very familiar with the idea of a limit, took many years of calculus and diff EQ back in the day. My argument isn't about whether you can "use" things, but whether those things represent actual things in and of themselves. I do think that there is an inherent realness to numbers outside of complete abstraction. The idea of a single object relating to itself is always true, regardless of a formal number system to represent it. The relation of 2 objects against 1 object is also still a real distinction that exists outside of a formal numbering system. The realness of the elements of counting is, seemingly (and I need to think more on this), tautologically true; an analytically true statement in other words.
I need to ponder on this more though, perhaps I am mistaken. Or perhaps I was talking about a different aspect of it than everyone else. Time to grease down the mind with beer!


But you DO have an "idea" of it. You know how it behaves. You may not understand why it does that but you can prove to yourself that it does. You might find that it crops up more often in nature than you like - as i said, quantum mechanics is counter-intuitive which unfortunately only goes to tell us that our intuition is wrong.

My lecturer's example was always to take electric charge - we have a name for it, we have a set of characteristic rules for electric charge and interaction between charges, but when it gets right down to it 'electric charge' is just a name. We have simply defined and described a set of rules for a phenomenon that we have observed. And the same goes for quarks - we have up, down, top, bottom, strange and charm. Those are just words too; we're just more familiar with electric charge as a term so we think we understand it.

I don't necessarily think you do need an idea of something to talk about it; anything more in-depth than "it exists, and here is how it behaves" comes after you analyse it, and presumably talk about it if only to yourself conceptually.

Ornthoron says...

@GeeSussFreeK

I believe where you go wrong is that you are trying to look for some deeper meaning behind this equality. There isn't one. As he says in the video, it's just notation. Two different ways of writing the same number using decimal notation.

The decimal system is in fact incapable of writing out all the different real numbers with a finite number of digits. It is for instance impossible to write out π with a finite number of digits, but that doesn't make π any less real.

Here's another example that might convince you: It is impossible to write out the fraction 1/3 with a finite number of digits. An approximation is to write it as 0.33, a better approximation is to write 0.333, and an even better one is to write 0.3333333333333333333333. But none of these are exact; to get an exact decimal representation (and mind you, it's no more than a representation) we would have to write out an infinite number of 3s after the decimal point. We therefore write 0.(3) to indicate that the 3s continue on forever. This is still only notation, a way to write the exact value 1/3 in decimal representation even though it's impossible with a finite amount of paper.

Still with me? Let us now do some simple calculations. Do you agree with me that I can divide this number by 3? I can after all do the calculations 0.33/3 = 0.11 and 0.3333333/3 = 0.1111111. If I want to divide 0.(3) by 3 I just have to repeat the same pattern so that 0.(3)/3 = 0.(1) = 0.1111111111... and so on.

Now let me use this to give you another proof of the concept:

0.(3) = 1/3
Now, instead of dividing, I multiply both sides by 3:
0.(9) = 3/3 = 1

There, I just proved again that 0.(9) = 1. Exactly equal. This is not as rigorous a proof as the one I provided above, but it serves to demonstrate the point.
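The 1/3 argument can be replayed with exact rationals (an editor-added sketch; the names are mine, not from the thread): 1/3 times 3 is exactly 1, while any finite truncation of 0.333... falls short of 1/3 by exactly 1/(3*10^n).

```python
from fractions import Fraction

ONE_THIRD = Fraction(1, 3)

def truncated_third(n):
    """The decimal 0.33...3 with n threes, as an exact fraction."""
    return Fraction(10 ** n // 3, 10 ** n)

# Multiplying the exact 1/3 by 3 gives exactly 1 - no rounding anywhere.
assert 3 * ONE_THIRD == 1

# But every finite truncation misses 1/3 by exactly 1/(3 * 10**n),
# which is the point: 0.(3) is the limit, not any truncation of it.
print(ONE_THIRD - truncated_third(5))
```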


Ontological discussions about whether mathematics exists only in our minds or in and of itself can be entertaining, but it's not what this video is about. At all.

SpaceDude says...

More to the point, does any of this really matter? Is there actually any real world application where it makes a difference? If not then you are just wasting your time arguing about it.

draak13 says...

I'll weigh in.

The first argument, which I saw made in the last class required for a math minor, is that there is no decimal number you can write that exactly equals pi. You can write more and more digits to get closer and closer to pi, but you can't write the decimal value for pi itself unless you had an infinitely long number written down. Infinite precision does indeed matter, so 0.9 with infinite nines is different from 1.0 with infinite zeros.

The only proof that begins to be relevant against this notion is the one presented by @Ornthoron, which is a geometric series. The geometric series he presented converges to 1 if you sum an infinite number of the series' elements together. He defined his infinite series as ar + ar^2 + ar^3 + ... However, the guy in this video didn't define any geometric series; he defined a static number: 0.9 with infinite nines.

The two concepts are explicitly different. If you wanted to take a calculus approach to the same explanation, the geometric series suggested by Onthoron would look like a line asymptotically approaching zero. Amazingly, integrating the area under that line approaches a value of 1 as you integrate more and more of the range along that number line, and equals 1 exactly if you integrate along the entire infinitely long number line. The value 0.9 with infinite nines would look like a discontinuous and flat line going from x = 0 (inclusive) to x = 1 (exclusive). The integrated area of that discontinuous line would not be equal to 1, and there is no infinitely more range to integrate along the number line. It has a definite & discrete value of the closest possible number to 1 that doesn't equal 1.

jmzero says...

>> ^draak13:
The first argument I've seen made in the last class I've seen required for a math minor is that there is no number that you can write which exactly equals pi.



In base pi, pi is 10. Exactly. And you can write pi exactly in a number of ways - just not as a single decimal number with finite digits. Similarly, we can't write out a finite number of nines and have one. But if we could write out an infinite number of digits, we could exactly specify pi (or 1 using 9s).

>> ^draak13:
Infinite precision does indeed matter, so 0.9 with infinite nines is different from 1.0 with infinite zeros.


By how much is it different, then? And, I'm curious, is .(3) also different from 1/3?

I've trolled the opposite side of this discussion for fun a couple of times, and you can build a convincing-looking (though wrong) case (though you haven't really made much of an attempt here). I assume you're trolling - but if you or anyone is actually not convinced, some quick Googling should fix that.

juliovega914 says...

I didn't read the arguments up till here because they bring TL;DR to a whole new level, but...

If there were no distinction between equality and equivalence, and the infinitesimally small value were indeed equal to zero, calculus wouldn't be a thing. The instantaneous velocity of any object would be equal to displacement/time = displacement/0 (since time is zero for instantaneous velocity) = infinity. This of course meaning that every time you move your car (or anything that has MASS) it would take an infinite amount of energy to do so.

So it's wrong.

dgandhi says...

>> ^Mikus_Aurelius:

Try going up to someone who actually does math for a living and suggesting that "you can't do math with infinite (sets of) numbers". There is a well established, consistent, and applicable theory for dealing with infinite sums. It's called a limit. We've only been using them for 400 years.
Demarcations between the abstract and concrete are useless. Every mathematical theory is abstract. Numbers don't exist, but they can be applied in millions of useful ways. So can infinite sums.


The physical universe has a range from the Planck length to ~15B light-years, or < 100 orders of magnitude. Once you get below 10^-100 you are just blowing smoke. We treat limits as equivalences, and within the context of our strictly bounded universe they might as well be, but there is no reason in a non-bounded universe that they must be, or would be; we simply have no way to check.

While math is useful it is not TRUE, and it is not only one set of rules, but many within different domains. When we need to set new rules to model something new, we do; to treat any of these conventions as necessary or real is to ignore the basic ad-hoc nature of the tool.

draak13 says...

@jmzero

You're trolling, instead of productively discussing? Not sure what you're trying for...you made a couple of assertions about how you can change the base numbering system, but that still doesn't change that a number almost equal to 1 (base 10) is greater than 1 (base 0.99999999).

If you have something productive to say, then say it.

dannym3141 says...

>> ^draak13:

@jmzero
You're trolling, instead of productively discussing? Not sure what you're trying for...you made a couple of assertions about how you can change the base numbering system, but that still doesn't change that a number almost equal to 1 (base 10) is greater than 1 (base 0.99999999).
If you have something productive to say, then say it.


Well he didn't solve anything anyway; may as well write the symbol for pi as "10 in base pi". But don't feed the trolls!

Mikus_Aurelius says...

That's twice now that someone has brought up this bizarre "equivalence" notion. Is someone teaching this nonsense somewhere? We already have an equivalence relation on the real numbers. It's called "equal."

When you define the value of a limit, you aren't defining a new hitherto unknown equivalence relation on the reals. You are defining a new function, whose output gives an actual number (where defined). These numbers are equal to other numbers, or they are not.

By the way, I've reverted to discussing the actual definitions of mathematics as they exist today. Some people still seem to want to discuss what math is real and what math isn't. Ask a mathematician, and they'll likely say "all of it" or "none of it" and direct you to the philosophy department.

gwiz665 says...

Lol, math.


jmzero says...

In case anyone is actually interested in this (and I assume most people are meta-trolling rather than actually not knowing): .(9) equals and is the same as 1. They are two ways to represent the same number (not infinitesimally different, not "tending towards" or "left limit" or any of that - they are the very same number).

Wikipedia has a good discussion of it here: .999...

For all the arguments (which I think are pretty clear) I will grant that this isn't an intuitive thing. Nonetheless, I think that Wikipedia discussion should be enough for anyone who still doubts.

MaxWilder says...

I just don't see how any of you deniers can get around the following math:

x = 0.999...
10x = 9.999...

10x - x = 9.999... - 0.999... = 9

9x = 9

x = 1

That was enough for me, and it was right there in the video. If you can show an error in the math, please do.
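The 10x - x manipulation above can also be watched happening on the truncations (an editor-added sketch with names of my own choosing, stdlib only): for x_n = 0.99...9 with n nines, 10*x_n - x_n falls short of 9 by exactly 9/10^n, and that deficit vanishes in the limit.

```python
from fractions import Fraction

def x_n(n):
    """The truncation 0.99...9 with n nines, exactly: (10**n - 1) / 10**n."""
    return Fraction(10 ** n - 1, 10 ** n)

# For every n: 10*x_n - x_n = 9*x_n = 9 - 9/10**n. The deficit 9/10**n
# goes to zero, which is the informal content of the 10x - x = 9 step.
for n in (1, 3, 8):
    print(n, 10 * x_n(n) - x_n(n))
```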

VoodooV says...

that's not really a proof though, you're just throwing up your hands and saying..."close enough"

If .9999... were 1, then it wouldn't be .9999..., it would be 1.

.999... and 1.000... are as close as you can get to 1 without being 1

you're just exploiting the weakness of converting fractions to decimal notation. Notation is the key word there. 1/3 is not .333... - it's an approximation - and by the same token 3 * (1/3) is not .99999...

<deal with it.jpg>

Ornthoron says...

>> ^VoodooV:

that's not really a proof though, you're just throwing up your hands and saying..."close enough"
If .9999... were 1, then it wouldn't be .9999..., it would be 1.
.999... and 1.000... are as close as you can get to 1 without being 1
you're just exploiting the weakness of converting fractions to decimal notation. Notation is the key word there. 1/3 is not .333... - it's an approximation - and by the same token 3 * (1/3) is not .99999...
<deal with it.jpg>


You are mistaken. No, it's not the most rigorous proof. But I provided a rigorous proof above using geometric series that should be enough to convince everyone who knows a modicum of math.

But you are right that notation is the key word here. Namely, in decimal representation there is no unique way to write out each number. We have been conditioned from grade school to believe that each number can only be written in one way, but that is false. If 0.9999... is not equal to 1, what then is the value of 1 - 0.9999..., pray tell?

Also, 0.3333... is exactly equal to 1/3. In this case, there is no other way of writing that number in decimal notation. It would have been an approximation if we had only written out a finite number of 3s after the decimal point. But we don't.

GeeSussFreeK says...

@Ornthoron @dannym3141 and @Mikus_Aurelius I promised a reply, and I will give one in time, but it might need more time than I thought. I think I might be onto something, big. If it pans out, I am going to call it "The Extended Gödel's Incompleteness theorem as it refers to infinite irrationality". I think I can show that all non-terminating sets either are undefined, and/or, the largest set of infinities in existence. I haven't yet found the significance of the equation I crafted to explain irrational repeaters. I will check in later with the complete proof when I am done crafting it.

jmzero says...

>> ^GeeSussFreeK:
I think I can show that all non-terminating sets either are undefined, and/or, the largest set of infinities in existence.


The infinities are already pretty well ordered...

A brief summary: the sets of integers, rationals, and "computable irrationals" (including pi and every other number that makes any kind of sense) are all the same size.

Moving up one level, there are incomprehensibly more real numbers, as reals include a bunch of nonsense which we can't "map to" (in the way we can map between integers and rationals): numbers that never terminate and for which there can't exist a pattern fully describing them. The larger cardinalities are difficult to properly consider.

This isn't explained well on any "pop-math" website I've seen (the relevant wikipedia article isn't terribly helpful), but it makes sense if you have a general understanding of computers. Clearly, any number that follows a proper pattern can be described by a computer program on some possible computer (and all computer programs clearly correspond to integers).
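The "same size" claim for integers and rationals can be made concrete (an editor-added sketch; the enumeration and names are mine): walk the diagonals of the p/q grid and every positive rational shows up at some finite index, which is all countability requires. Duplicates like 2/2 are harmless - a surjection from the integers onto the positive rationals is enough.

```python
from fractions import Fraction

def nth_pair(n):
    """n-th (p, q) pair when walking the diagonals p + q = const of the grid."""
    d = 1                  # diagonal lengths: 1, 2, 3, ...
    while n >= d:
        n -= d
        d += 1
    return (n + 1, d - n)  # p runs up the diagonal, q runs down

# The first few pairs hit: (1,1), (1,2), (2,1), (1,3), (2,2), (3,1), ...
# Every p/q appears at a finite index, so the positive rationals are countable.
print([Fraction(*nth_pair(i)) for i in range(6)])
```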
