The plane travels a total of 2000 miles in 5 hours. 2000/5=400, so the average speed of the plane is 400 mph.
2007-10-29 08:50:27 · answer #1 · answered by jlao04 3 · 1⤊ 0⤋
OK: 1000 miles in 2 hours and then 1000 miles in 3 hours. That means 2000 miles in 5 hours.
Average speed = total miles/total time
x = 2000/5
x = 400 miles per hour is the average speed
Note that direction has no bearing on the average speed; it only affects the displacement, not the total distance traveled.
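If it helps to see the arithmetic spelled out, here is a minimal Python sketch of that formula (the function name avg_speed and the (miles, hours) pairs are just illustrative choices, not anything from the question):

def avg_speed(legs):
    # average speed = total distance / total time, regardless of direction
    total_distance = sum(d for d, t in legs)
    total_time = sum(t for d, t in legs)
    return total_distance / total_time

# 1000 miles in 2 hours, then 1000 miles in 3 hours
print(avg_speed([(1000, 2), (1000, 3)]))  # prints 400.0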
Hope this helps.
2007-10-29 15:52:10 · answer #2 · answered by pyz01 7 · 1⤊ 0⤋
There are 2 answers to this:
1. The average speed along the path is simply the two distances, 1000 and 1000, added and divided by the 5 hours. So, 400 mph.
2. The magnitude of the average velocity comes from the straight-line displacement, found with the Pythagorean theorem (taking the two 1000-mile legs as perpendicular): C^2 = A^2 + B^2 = 1000^2 + 1000^2 = 2,000,000, so C (the straight-line distance covered) ≈ 1414 miles.
This was done in 5 hours. So, 1414/5 ≈ 282.8 mph from start to finish.
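A minimal Python sketch of that second calculation, under the same assumption that the two 1000-mile legs are at right angles to each other:

import math

total_time = 2 + 3                      # 5 hours overall
displacement = math.hypot(1000, 1000)   # straight-line distance, ~1414.2 miles

print(displacement)               # ~1414.2 miles
print(displacement / total_time)  # ~282.8 mph, magnitude of the average velocity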
2007-10-29 15:55:25 · answer #3 · answered by ignoramus 7 · 0⤊ 0⤋
average speed = total distance / total time
total distance = 1000 + 1000 = 2000 miles
total time = 2 + 3 = 5 hrs
average speed = 2000 / 5 = 400 mi/hr <== answer
2007-10-29 15:52:59 · answer #4 · answered by Anonymous · 1⤊ 0⤋
2000 mi in 5 hr
2000/5 = 400 mph
2007-10-29 15:51:20 · answer #5 · answered by kindricko 7 · 1⤊ 0⤋
About 500 miles an hour... no, I'll be serious:
The plane flies two thousand miles in 5 hours.
It's just adding and dividing: two thousand divided by 5 hours.
Learn!
2007-10-29 15:50:02 · answer #6 · answered by melted cheese 4 · 0⤊ 2⤋
Add... divide... multiply...
Pie!
That works for me; math ain't exactly my strong point, lol!
2007-10-29 15:54:19 · answer #7 · answered by Anonymous · 0⤊ 0⤋