
A baseball player throws a ball from second base to home plate. He releases the ball from a height of 6 feet on a horizontal path with a velocity of 85 miles per hour. Will the ball reach home plate without bouncing (that is, on the "fly")? If not, how far from home plate will the ball bounce? If so, what will be the height of the ball when it reaches home plate?

2007-12-13 09:28:43 · 3 answers · asked by kellibelli7392 1 in Science & Mathematics Physics

3 answers

One way to solve this problem is to realize that a ball dropped from a height of 6 feet will reach the ground at exactly the same time as a ball that's thrown horizontally from a height of 6 feet (assuming level ground and that the curvature of the earth doesn't come into play, which it doesn't in this case). The dropped ball and the thrown ball both fall at exactly the same rate. So, let t equal the time it takes a ball to fall to the ground from 6 feet. Will a ball traveling 85 mph cover the distance from second base to home plate in less than time t? If not, it bounces before reaching home plate. If so, it arrives at home plate at the same height the dropped ball would have at that moment.

Hopefully, this should be enough to help you solve the problem.
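The comparison described above can be sketched in a few lines of Python. The 90 ft base paths (so 90·√2 ft from second base to home plate) and g = 32.2 ft/s² are standard values not stated in this answer:

```python
import math

g = 32.2                       # gravitational acceleration, ft/s^2
t_fall = math.sqrt(2 * 6 / g)  # time for a ball to drop 6 ft
d = 90 * math.sqrt(2)          # second base to home plate, ft
v = 85 * 5280 / 3600           # 85 mph in ft/s
t_travel = d / v               # time to cover that distance at 85 mph

# If the travel time exceeds the fall time, the ball hits the ground first.
print(t_travel > t_fall)       # True -> the ball bounces before home plate
```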

2007-12-13 09:44:49 · answer #1 · answered by Jeff D 7 · 0 0

This depends on how far home plate is from second base.

The ball initially has velocity only in the x component: 85 mph, or about 38 m/s. Since objects with no initial vertical velocity all fall at the same rate, you can solve for the time it takes the ball to fall.

vertical distance fallen = vi·t + (1/2)·a·t^2

Since there is no initial vertical velocity vi, the vi·t term is 0, and 6 feet is 1.83 meters:

1.83 = (1/2)(9.8)t^2

Solving for t gives about 0.61 seconds.

This is the same time the ball spends traveling horizontally. Apply the same equation in the x direction:

horizontal distance = vi·t + (1/2)·a·t^2

Since there is no horizontal acceleration, the (1/2)·a·t^2 term is 0, and vi = 38 m/s:

horizontal distance = 38 × 0.61 ≈ 23.2 m

So the ball travels about 23.2 meters (≈76 ft) before hitting the ground. Second base is 90·√2 ≈ 127.3 ft ≈ 38.8 m from home plate, so the ball bounces about 15.6 m (≈51 ft) in front of home plate.
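The arithmetic above can be checked with a short Python sketch in SI units; g = 9.8 m/s² and the 90 ft base-path length are standard values, not given in the answer:

```python
import math

g = 9.8      # m/s^2
h = 1.83     # release height: 6 ft in meters
v = 38.0     # 85 mph in m/s

t = math.sqrt(2 * h / g)        # time to fall 1.83 m, ~0.61 s
x = v * t                       # horizontal distance covered, ~23.2 m

d = 90 * math.sqrt(2) * 0.3048  # second base to home plate, in meters

print(round(t, 2))      # fall time, s
print(round(x, 1))      # distance traveled before bouncing, m
print(round(d - x, 1))  # how far short of home plate it bounces, m
```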

2007-12-13 17:46:15 · answer #2 · answered by Sowmya 3 · 0 0

85 mi/hr × 5280 ft/mi × 1 hr/3600 s = 124.66 ft/s

Distance from 2nd base to home plate:
a^2 + b^2 = c^2
(90 ft)^2 + (90 ft)^2 = c^2
c = 127.28 ft

gravity: 32.2 ft/s^2

gotta go - figure it out
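Finishing the arithmetic started above, using the same numbers (124.66 ft/s, 127.28 ft, 32.2 ft/s²) and the 6 ft release height from the question:

```python
import math

v = 124.66   # ball speed, ft/s
d = 127.28   # second base to home plate, ft
g = 32.2     # gravitational acceleration, ft/s^2
h = 6.0      # release height, ft

t = math.sqrt(2 * h / g)  # time to fall 6 ft, ~0.61 s
x = v * t                 # horizontal distance covered in that time, ft

print(round(x, 1))        # ~76.1 ft traveled before bouncing
print(round(d - x, 1))    # ~51.2 ft in front of home plate
```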

2007-12-13 17:41:33 · answer #3 · answered by Kent H 6 · 0 0
