Why can't 1 be divided into 3 equal parts mathematically?
1/3 = 0.3333333333333...
That means there is a leftover error of 0.0000000000001...!
Even though the error might seem insignificant, it can have drastic effects in sophisticated calculations!
Why is it not possible to divide 1 into 3 equal parts logically? Is it logic that is flawed, or is it the way we use numbers?
Or am I missing something?
2007-03-12 02:35:44 · 12 answers · asked by x 2 in Science & Mathematics ➔ Mathematics
indiana... I do understand what I'm asking! There will be a small value 0.00000000000000001... that will be missing! Think!
2007-03-12 02:44:41 · update #1
bruzer... but what about in decimals?
2007-03-12 02:54:33 · update #2
gjmb1960: good answer! But it's a representation... why can't it be precise?!
2007-03-12 03:16:59 · update #3
catarthur... I'm sorry, but 0.9999999999 repeated to infinity is not equal to one!
2007-03-12 03:20:16 · update #4
Because it never reaches 1!
2007-03-12 03:22:24 · update #5
Good answer, belinda! I guess the conclusion is that decimals are only a representation and can't be precise...
2007-03-12 03:30:46 · update #6
I think that your last addition sums it up. Some fractions cannot be represented by a finite decimal, and so errors can be introduced by trying to do so. I once saw a beautifully devised examination question: if done with fractions, an exact answer could be obtained; if done with a calculator, a very wrong answer was obtained, due to the cumulative effect of rounding errors. This particularly happens when you subtract two numbers that are not far apart. If each contains a rounding error, then this error can multiply enormously as the calculation proceeds.
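To make that concrete, here is a small Python sketch (my own illustration, not the exam question described above) of how an "insignificant" rounding error turns into a total loss of accuracy once two nearly equal numbers are subtracted:

```python
from fractions import Fraction

# Exact arithmetic with fractions: (10**16 + 1/3) - 10**16 is exactly 1/3.
exact = (Fraction(10)**16 + Fraction(1, 3)) - Fraction(10)**16
print(exact)      # 1/3

# With 64-bit floating point, 1/3 is rounded and then completely swallowed
# by the much larger 10**16, so subtracting the two nearly equal numbers
# leaves nothing at all -- a 100% error born from a "tiny" rounding.
approx = (1e16 + 1/3) - 1e16
print(approx)     # 0.0 on typical IEEE-754 doubles
```

Doing the whole calculation in fractions, as the fraction-based exam answer would, avoids the rounding entirely.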
2007-03-12 04:21:47 · answer #1 · answered by Anonymous · 1⤊ 0⤋
It is possible to divide 1 into three parts exactly.
The parts are 1/3, 1/3, and 1/3.
In decimal notation, it isn't possible to REPRESENT the number except using a shorthand - a dot over the three to indicate that the representation continues indefinitely with the 3 recurring.
A computer doesn't in principle have any difficulty dealing with it exactly either. The programmer just has to code it as a fraction, i.e. with two numbers, one divided by three, rather than representing it in floating point.
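As a small illustration of that last point (my sketch, using Python's standard fractions module, not any particular program the answer had in mind):

```python
from fractions import Fraction

# Store 1/3 as a pair of integers (numerator and denominator) instead of a float.
one_third = Fraction(1, 3)

print(one_third + one_third + one_third)   # 1 -- exact, no rounding anywhere
print(one_third * 3 == 1)                  # True

# The floating-point route stores a rounded binary approximation instead:
print(1 / 3)               # 0.3333333333333333
print(0.1 + 0.2 == 0.3)    # False -- the same kind of rounding error, now visible
```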
2007-03-12 02:43:51 · answer #2 · answered by hustolemyname 6 · 0⤊ 0⤋
One can be divided up equally. Each piece will be exactly 1/3.
1/3 + 1/3 + 1/3 = exactly 1
Example: take 3 pieces of play dough of exactly the same weight, one red, one green and one blue. Form them into a ball. Now you have ONE ball. Each constituent of the ball is exactly 1/3, and they are all exactly equal.
I think EXPRESSING 1/3 as a decimal is what is problematic. Your question is a good one. Nonetheless, 1/3 is exact.
If you really want to have some fun wondering why, think about why you cannot cut a circle perfectly in half, and its relation to an irrational number. Have fun!
2007-03-12 02:52:11 · answer #3 · answered by BRUZER 4 · 0⤊ 0⤋
You are right: in decimal notation this insignificant error never really disappears. However, I don't think it will have drastic effects in calculations, because if we notate it differently, then we get a perfect division. Say, take it as 360 degrees instead of the number 1; then we get 3 equal slices of a cake, 120 degrees each.
But what happens to it when we divide 1 by 3? To think about the smallest 0.000001 is like thinking about the smallest possible substance in the universe; it goes on to eternity. Protons, neutrons, atoms... these can be divided into smaller parts until 1 can be divided by 3 :)
2007-03-12 03:26:29 · answer #4 · answered by Anonymous · 2⤊ 1⤋
Well, the flaw isn't in the logic, it's in the numbering system. Or maybe it's God's fault: if we had twelve fingers instead of ten, we'd be counting in base 12. Then 1/3 written positionally would be the nice and simple 0.4. (As numbering systems go, 10 is kind of a crummy base. Either use a prime number or pick one that is divisible by the small integers.)
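To spell out why that works (my worked example, not part of the original answer): in base 12 the first digit after the point counts twelfths, so

$$(0.4)_{12} = \frac{4}{12} = \frac{1}{3}, \qquad 3 \times (0.4)_{12} = (1.0)_{12},$$

and in base 3 the same fraction is simply $(0.1)_3$. A fraction has a terminating expansion exactly when the primes in its denominator all divide the base, which is why 1/3 terminates in base 12 but not in base 10.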
2007-03-12 02:54:37 · answer #5 · answered by Anonymous · 1⤊ 0⤋
1 can be divided by 3 mathematically.
1/3 is the representation of this.
and also 0.333... (repeating 3's) is a representation of this.
So there is no error, unless you (yes, you) truncate 0.33... to, say, 10 decimals.
Net result: it is possible to divide 1 by 3, and it is not flawed.
Edit:
You say that 0.999999... (continuous 9's) is not the same as 1. But it is the same as 1,
just as 0.333... is the same as 1/3 .
1/3 - 0.333... = 0
0.333... means the number that you get when you divide one by three. That is the same as writing 1/3, or (i : iii).
Do you also think that the number pi is not exact?
Or that sqrt(2) is not exact?
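A worked equation (my addition, using the standard geometric-series sum) makes both claims precise:

$$0.333\ldots = \sum_{k=1}^{\infty}\frac{3}{10^{k}} = 3\cdot\frac{1/10}{1-1/10} = \frac{3}{9} = \frac{1}{3}, \qquad 0.999\ldots = \sum_{k=1}^{\infty}\frac{9}{10^{k}} = \frac{9}{9} = 1.$$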
2007-03-12 03:11:53 · answer #6 · answered by gjmb1960 7 · 2⤊ 0⤋
There is no flaw. 1/3 just cannot be written as a terminating decimal. If you do the math out by hand using fractions, one divided by three, times three, will be equal to one, not 0.9999999...
2007-03-12 03:02:00 · answer #7 · answered by Anonymous · 1⤊ 0⤋
1/3 = 0.3333333333333... is not flawed; rather, there is a misunderstanding on your part regarding the notation. The "..." means that the 3's continue indefinitely, which represents the fraction 1/3 with perfect accuracy. There is no error.
2007-03-12 02:42:09 · answer #8 · answered by indiana_jones_andthelastcrusade 3 · 0⤊ 0⤋
Actually you will notice that if you add
0.33333.... + 0.33333... + 0.33333... you obtain
0.99999....
i.e. 0 point 9 repeated infinitely.
Now this is confusing at first, until you realize that 0 point 9 repeated is exactly equal to 1.
Why? Because no matter how small the positive amount you add to this number, you will always exceed 1; so the gap between 0.9999... and 1 is smaller than every positive number, and a non-negative number smaller than every positive number can only be 0.
Ex:
0.9999999... + 0.1 = 1.0999999...
0.9999999... + 0.01 = 1.0099999...
...
0.9999999... + 0.0000001 = 1.00000009999999...
Therefore 0.9999... must be equal to 1.
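Another standard route to the same conclusion (my addition, not part of the original answer) is the algebraic one:

$$x = 0.999\ldots \;\Rightarrow\; 10x = 9.999\ldots \;\Rightarrow\; 10x - x = 9 \;\Rightarrow\; 9x = 9 \;\Rightarrow\; x = 1.$$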
2007-03-12 03:16:59 · answer #9 · answered by catarthur 6 · 1⤊ 1⤋
Look at it this way... what about 1/2?
1/2 = 0.500000000...
What about all the other decimal places? How can you be sure they're all zeroes?
2007-03-12 03:21:33 · answer #10 · answered by Anonymous · 0⤊ 1⤋