
A man is walking his dog on a leash towards home at a steady 4 mph. When they are nine miles from home, the man lets the dog off the leash. The dog immediately runs off towards home at 9 mph. When the dog reaches the house, it turns around and runs back to the man at the same speed. When it reaches the man, it turns back for the house. This is repeated until the man gets home and lets the dog in.
How many miles does the dog cover from being let off the leash to being let in the house?

2007-10-08 11:55:36 · 2 answers · asked by Anonymous in Science & Mathematics Mathematics

2 answers

20.25 miles -- it doesn't matter where the man is each time the dog reaches him after a trip home; the dog simply runs continuously until the man gets home. At 4 mph, it takes the man 2.25 hours to cover the 9 miles (9 divided by 4 equals 2.25). Running at 9 mph for those 2.25 hours, the dog covers 9 times 2.25 = 20.25 miles.

2007-10-08 12:02:58 · answer #1 · answered by James M 2 · 0 0
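The shortcut in answer #1 (total time times the dog's speed) can be checked by simulating the dog's individual back-and-forth legs and summing them. The sketch below is illustrative and not part of the original answers; it tracks positions as distance from home and stops once the man is effectively there.

```python
# Sanity check: simulate each leg of the dog's run and compare the
# summed distance with the shortcut 9 mph * 2.25 h = 20.25 miles.
MAN_SPEED = 4.0   # mph
DOG_SPEED = 9.0   # mph

man_pos = 9.0     # miles from home (dog starts beside the man)
total = 0.0       # miles the dog has run

while man_pos > 1e-12:
    # Leg 1: dog runs from the man's position to the house.
    t = man_pos / DOG_SPEED
    total += man_pos
    man_pos -= MAN_SPEED * t
    # Leg 2: dog runs back from the house until it meets the man.
    # They close the gap at a combined 9 + 4 = 13 mph.
    t = man_pos / (DOG_SPEED + MAN_SPEED)
    total += DOG_SPEED * t
    man_pos -= MAN_SPEED * t

print(round(total, 2))  # 20.25
```

Each round trip shrinks the man's remaining distance by a factor of 5/13, so the loop converges after a few dozen iterations and the sum agrees with the 20.25-mile answer.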

It takes the man 9/4 = 2 1/4 hours to get home. So the dog has run 9 * 2 1/4 = 81/4 = 20 1/4 miles.

2007-10-08 19:01:38 · answer #2 · answered by ironduke8159 7 · 0 0
