
A major-league pitcher can throw a baseball in excess of 37.0 m/s. If a ball is thrown horizontally at this speed, how much will it drop by the time it reaches a catcher who is 17.0 m away from the point of release?

2007-12-19 14:56:05 · 2 answers · asked by Anonymous in Science & Mathematics Physics

2 answers

The three constant-acceleration kinematic equations are:

Vf = Vi + a(t)
Sf = Si + Vi(t) + 1/2a(t)^2
Vf^2 = Vi^2 + 2a(Sf - Si)

We will use the second equation here.
Sf = final vertical displacement, which is what we are looking for

Si = initial vertical displacement, which is 0 (taking the release point as the origin)

Vi = initial vertical velocity, which is 0 because the ball is thrown horizontally

a = the constant acceleration of the ball, which is the acceleration due to gravity: g = -9.81 m/s^2

and finally, t is the time of flight.

First we need to know the time of flight, which is how long it takes the ball to travel from the pitcher to the catcher's mitt.

Since the horizontal velocity is constant, this is easy.

v = d / t
t = d / v
t = 17.0 m / 37.0 m/s = 0.4595 s

Now we substitute all of that information into the second kinematic equation to get our answer.

Sf = Si + Vi(t) + 1/2a(t)^2
Sf = 0 + 0(0.4595) + 1/2(-9.81 m/s^2) * (0.4595 s)^2
Sf = -4.905 * 0.2111
Sf = -1.04 m

So, the ball drops 1.04 meters by the time it reaches the catcher.
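
If you want to double-check the arithmetic, here is a short Python sketch of the same calculation (the variable names are just for illustration, not part of the answer above):

# Time of flight from the constant horizontal velocity, then the vertical
# drop from the constant-acceleration kinematic equation (Si = 0, Vi = 0).
g = 9.81   # magnitude of gravitational acceleration, m/s^2
v = 37.0   # horizontal speed, m/s
d = 17.0   # horizontal distance to the catcher, m

t = d / v                # time of flight, s
drop = 0.5 * g * t**2    # vertical drop, m

print(f"t = {t:.3f} s, drop = {drop:.2f} m")   # t = 0.459 s, drop = 1.04 m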

2007-12-19 15:14:54 · answer #1 · answered by Anonymous · 0 0

First find out how long it takes the ball to travel 17 m

d = vt --> t = d/v = 17/37 sec

Now the ball falls:

y = (1/2) g t^2 = (1/2)(9.8)(17/37)^2

You can work out the numbers.
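
Working out the numbers with a quick Python check (using g = 9.8 m/s^2 as above):

# Vertical drop over the time of flight t = 17/37 s
y = 0.5 * 9.8 * (17 / 37) ** 2
print(round(y, 2))   # 1.03 (meters)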

2007-12-19 15:01:17 · answer #2 · answered by nyphdinmd 7 · 1 0
