There was a gal who had had a couple of accidents, so her grandma gave her a large sum of money to buy a brand-new Volvo. The only condition was that when she got the car, grandma wanted to see it to make sure she didn't buy that Firebird. She drives to grandma's house, which is 120 miles away. Assume that she lives right on the highway, and so does grandma, so there's no time lost accelerating (you know).

Because she's not eager to get there, she sets the cruise control to 40 mph and drives the 120 miles to grandma's house. She doesn't want to spend too much time with grandma, so she shows grandma the car and leaves. On the way back she sets the cruise control to 60 mph: same road, same 120 miles. When she gets home she thinks, "I drove 120 miles up at 40 mph and 120 miles back at 60 mph. That's 240 miles, and my average speed was 50 mph, so it should have taken me 4.8 hours, right? That's 240 divided by 50 (my average speed), which comes out to 4.8. But it took 5 hours. How can that be??

2007-10-21 13:14:33 · 5 answers · asked by tuazeee 2 in Science & Mathematics Physics

5 answers

Because it took her longer to get there than to come back, she spent more time driving at 40 mph than at 60 mph. Therefore the average is less than 50 mph, because most of the total time was spent going 40.

2007-10-21 13:18:33 · answer #1 · answered by Anonymous · 2 0

Yes, it should take 5 hours: 3 one way, 2 the other. The average speed should actually be computed as the total distance (240 miles) divided by the total time traveling (5 hours), which gives 48 miles an hour, not 50. Suppose she stops halfway home to buy a CD. Does that mean her average speed is still 50 mph? No. You have to weight each speed by the time during which it was in effect.

2007-10-21 13:22:00 · answer #2 · answered by Scythian1950 7 · 2 0
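A quick sketch of the point in answer #2: when the two legs cover equal distances, the true average speed is the harmonic mean of the two leg speeds, not the arithmetic mean. Nothing here is from the original thread beyond the numbers in the question.

```python
def harmonic_mean(v1, v2):
    """Average speed over two legs of EQUAL DISTANCE driven at v1 and v2."""
    return 2 * v1 * v2 / (v1 + v2)

# True average for the trip in the question: 48.0 mph
print(harmonic_mean(40, 60))

# The naive arithmetic mean she used: 50.0 mph (wrong for equal distances)
print((40 + 60) / 2)
```

The harmonic mean is pulled toward the slower speed because the slower leg occupies more of the total time.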

First things first.

She went wrong because she assumed the average speed first and then worked out the total time from it. It should be the other way around.

To get the average speed, you total the distance and divide it by the total time taken. That is 240 / 5 = 48 mph. The average speed is 48 mph, not 50 mph.

To calculate speed, you have to know the distance and the time taken first. Period. Since she didn't start from the fact that it took 5 hours, she should have worked out how long the trip took first, then calculated the average speed.

2007-10-21 13:47:53 · answer #3 · answered by Pizzous 2 · 2 0

40 mph for 120 miles (going to grandma's house) = 120/40 = 3 hours

60 mph for 120 miles (coming back from grandma's) = 120/60 = 2 hours

Therefore the total is 5 hours.

2007-10-21 13:20:57 · answer #4 · answered by Anonymous · 2 0
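The arithmetic in answer #4 can be checked in a few lines. This is a minimal sketch using only the numbers given in the question:

```python
distance = 120  # miles each way

time_out = distance / 40   # leg at 40 mph: 3.0 hours
time_back = distance / 60  # leg at 60 mph: 2.0 hours

total_distance = 2 * distance      # 240 miles
total_time = time_out + time_back  # 5.0 hours

print(total_distance / total_time)  # true average speed: 48.0 mph
print(total_distance / 50)          # her predicted time at "50 mph": 4.8 hours
```

The 0.2-hour gap between 4.8 and 5 hours is exactly the discrepancy she noticed.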

....Her clock is bad?

2007-10-21 13:22:03 · answer #5 · answered by dfreeman321 2 · 0 0
