The Motionless Runner

A runner wants to run a certain distance, let us say 100 meters, in a finite time. But to reach the 100-meter mark, the runner must first reach the 50-meter mark; to reach that, the runner must first run 25 meters; and to do that, he or she must first run 12.5 meters.

Since space is infinitely divisible, we can repeat these 'requirements' forever. Thus the runner has to reach an infinite number of 'midpoints' in a finite time. This is impossible, so the runner can never reach his goal. In general, anyone who wants to move from one point to another must meet these requirements, and so motion is impossible, and what we perceive as motion is merely an illusion.

Where does the argument break down? Why?

2007-01-11 06:27:06 · 3 answers · asked by Anonymous in Science & Mathematics Mathematics

3 answers

In other words, this is Zeno's dichotomy paradox, a close relative of his Achilles and the Tortoise. To understand the answer, you need to be familiar with calculus and the concept of a "limit".

---------------------------------------
Proposed solutions both to Achilles and the tortoise, and to the dichotomy

Both the paradoxes of Achilles and the tortoise and that of the dichotomy depend on dividing distances into a sequence of distances that become progressively smaller, and so are subject to the same counter-arguments.

Aristotle pointed out that as the distances decrease, the time needed to cover them decreases as well, becoming arbitrarily small. Such an approach to solving the paradoxes amounts to denying that it must take an infinite amount of time to traverse an infinite sequence of distances.
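
To make Aristotle's observation concrete, here is a minimal sketch (the 100-meter distance comes from the question; the 10 m/s speed is an assumed value chosen for illustration) showing that each halved distance takes a correspondingly halved time:

speed = 10.0       # meters per second (assumed, for illustration)
remaining = 100.0  # meters left to run (from the question)

total_time = 0.0
for step in range(1, 11):
    interval = remaining / 2          # the next 'midpoint' interval
    total_time += interval / speed    # its time shrinks with its length
    remaining -= interval
    print(f"step {step:2d}: {interval:8.4f} m in {interval / speed:.4f} s")

print(f"time after 10 steps: {total_time:.4f} s (the full run takes 10 s)")

Each successive interval takes half as long as the one before it, so the running total of elapsed time stays below 10 seconds no matter how many steps we add.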

Before 212 BC, Archimedes had developed a method for finding a finite answer for the sum of infinitely many terms that get progressively smaller. Modern calculus achieves the same result with theorems that put the method on a rigorous footing. These methods show that, under suitable conditions (for example, when the successive distances shrink geometrically), the total time is finite.

Proposed solution using calculus notation:
d = total distance to be run
v = constant speed of the runner
t = total time

t = lim (as n --> infinity) of (d/2 + d/4 + ... + d/2^n) / v = d/v, which is finite.
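
As a numerical check of this limit (using assumed illustrative values d = 100 m and v = 10 m/s, which are not part of the original answer), the partial sums of the time series approach d/v:

# Partial sums t_n = sum over k = 1..n of (d / 2**k) / v.
d, v = 100.0, 10.0  # meters, meters per second (assumed values)

for n in (1, 2, 5, 10, 20, 50):
    t_n = sum((d / 2**k) / v for k in range(1, n + 1))
    print(f"n = {n:2d} intervals: {t_n:.12f} s")
# The output approaches d/v = 10 s: infinitely many intervals
# fit inside a finite total time.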

2007-01-11 07:00:01 · answer #1 · answered by Randy G 7 · 1 0

Ahhh... Zeno's Paradox.

I think it breaks down at the step "This is impossible": yes, the runner has to cover an infinite number of intervals, but nowhere was it shown that each interval must take at least some fixed amount of time. If each interval took at least 1 second, say, then the total time would indeed be infinite.

In this case, however, the runner's speed is constant, so the time for each interval is proportional to the interval's length. Even though there are infinitely many intervals, each takes half as long as the one before it, so the total time to cover all of them is finite.

To put it another way, the series: 1/2 + 1/4 + 1/8 + ... has infinitely many terms, but its sum (1) is still finite.
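
A quick way to see this convergence (a small sketch, not part of the original answer) is to compute the partial sums, which equal 1 - 1/2^n and approach 1:

# Partial sums of 1/2 + 1/4 + 1/8 + ... equal 1 - 1/2**n and tend to 1.
for n in (4, 8, 16, 32):
    partial = sum(1 / 2**k for k in range(1, n + 1))
    print(f"{n:2d} terms: {partial:.10f}")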

2007-01-11 06:59:48 · answer #2 · answered by Phineas Bogg 6 · 0 0

The fallacy is in the statement "This is impossible". Nothing in the preceding statements indicates why it should be impossible.

The original version of this question is the argument that you can never get anywhere, since you always have to cover half of the remaining distance first, so you never arrive. The simple answer is that you get "close enough". One answer to the syllogism is to set the original target just beyond where you really want to go, so that you do get there.

The real answer is that the argument is specious: the runner's velocity depends only on speed and direction, not on the destination or on how we choose to subdivide the distance.

2007-01-11 06:45:01 · answer #3 · answered by Michael H 2 · 0 0
