
A major league pitcher can throw a baseball in excess of 46.7 m/s. If a ball is thrown horizontally at this speed, how much will it drop by the time it reaches the catcher, who is 19.2 m away from the point of release?

2007-09-15 15:41:04 · 3 answers · asked by Anonymous in Science & Mathematics Physics

3 answers

Yes, physics!
Ignoring air friction, the horizontal speed of the ball does not change, so it takes 19.2 m / (46.7 m/s) = 0.4111 s to reach the catcher. Due to gravity, during this 0.4111 s period it drops:
0.5*g*t^2 = 0.5*9.8*0.4111^2 = 0.828 m

2007-09-19 17:49:46 · answer #1 · answered by Hahaha 7 · 0 1
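A quick numerical check of the calculation above, as a minimal sketch (no air resistance assumed; variable names are my own):

```python
g = 9.8   # gravitational acceleration, m/s^2
v = 46.7  # horizontal release speed, m/s
d = 19.2  # horizontal distance to the catcher, m

t = d / v                # time of flight: ~0.411 s
drop = 0.5 * g * t**2    # vertical drop during that time: ~0.828 m

print(f"time of flight = {t:.3f} s, drop = {drop:.3f} m")
```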

You have to remember that the horizontal component of an object's velocity is unaffected by gravity. So if the ball is thrown horizontally (all the speed is horizontal), the 46.7 m/s is unchanged by gravity in the horizontal direction, and it hits the catcher's mitt with a horizontal speed of 46.7 m/s. What does happen is that the ball begins to accelerate (fall) downward, which is a different direction, so it gains speed in the vertical direction. The ball's total velocity when it hits the catcher's mitt is the VECTOR SUM of the two components (horizontal and vertical).

2007-09-23 15:48:08 · answer #2 · answered by Anonymous · 0 0
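A rough sketch of that vector sum, using the flight time from answer #1 (again assuming no air resistance; the numbers are illustrative):

```python
import math

g = 9.8        # gravitational acceleration, m/s^2
vx = 46.7      # horizontal speed, unchanged by gravity, m/s
t = 19.2 / vx  # time of flight, s (~0.411 s)

vy = g * t                                 # vertical speed gained: ~4.03 m/s
speed = math.hypot(vx, vy)                 # magnitude of the vector sum: ~46.87 m/s
angle = math.degrees(math.atan2(vy, vx))   # direction: ~4.9 degrees below horizontal

print(f"vy = {vy:.2f} m/s, total speed = {speed:.2f} m/s, {angle:.1f} deg below horizontal")
```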

What the others aren't taking into account is that the pitcher is throwing downhill.

The pitcher's mound is about 18 inches high.

2007-09-23 21:42:48 · answer #3 · answered by Floyd B 5 · 0 0
