
Assume a major league baseball pitcher can pitch at 85 miles per hour. If he could pitch that fast straight upward, from what height must a ball be dropped so that the ball pitched and the ball dropped hit the ground at the same time? (Hint: there are 5280 feet in 1 mile)

2007-11-07 09:36:28 · 3 answers · asked by NYcubsfan 1 in Science & Mathematics Physics

3 answers

first convert units

85 miles/hr * 5280 ft/mile * 1 hr/60 min * 1 min/60 s = 124.67 ft/s

time to reach apex (due to pitching)
g = (Vo - Vf)/t, Vf = 0 at apex, g = 32.17 ft/s^2, so t = Vo/g = 3.875 s

total time the ball is in flight due to pitching is 2* t = 7.75 s

solve for H if t = 7.75 s

H = 0.5 g t^2 = 0.5 * 32.17 ft/s^2 * (7.75 s)^2 = 966 ft
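
A minimal sketch in Python reproducing the arithmetic above (same numbers, g = 32.17 ft/s^2):

v0 = 85 * 5280 / 3600      # 85 mi/hr in ft/s -> about 124.67
g = 32.17                  # gravitational acceleration in ft/s^2
t_apex = v0 / g            # time for the pitched ball to reach its apex -> about 3.88 s
t_total = 2 * t_apex       # total flight time of the pitched ball -> about 7.75 s
H = 0.5 * g * t_total**2   # drop height that takes the same time to fall -> about 966 ft
print(t_apex, t_total, H)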

2007-11-07 09:48:06 · answer #1 · answered by Jan-Michael 2 · 0 0

V = g*t, so t = V/g
H = (1/2) g t^2
H = (1/2) g (V/g)^2 = V^2/(2g)
H = (85 * 5280 / 3600)^2 / (2 * 32) = 243 ft
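
A quick numeric check of this closed form (a Python sketch, using g = 32 ft/s^2 as in this answer):

v0 = 85 * 5280 / 3600   # 85 mi/hr in ft/s
g = 32.0                # ft/s^2, as used in this answer
H = v0**2 / (2 * g)     # V^2/(2g) -> about 243 ft
print(H)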

2007-11-07 17:42:30 · answer #2 · answered by Edward 7 · 0 0

I'm assuming that both balls are released at the same time.

V0(1) = 85 mi/1 hr * 5280 ft/1 mi * 1 hr/3600 sec = 124.67 ft/sec

Vf(1) = V0(1) + a*t
Vf(1) = -V0(1)
2*V0(1) = g*t   (taking magnitudes, with g = 32.1 ft/s^2)
2*124.67 = 32.1*t
t = 7.767 sec

d(2)=V0(2)t+(at^2)/2
d(2)=0+(32.1*7.767^2)/2
d(2)=968.34ft.
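
The same steps as a short Python check (a sketch, using this answer's g = 32.1 ft/s^2):

v0 = 85 * 5280 / 3600   # V0(1) in ft/s -> about 124.67
g = 32.1                # ft/s^2, as used in this answer
t = 2 * v0 / g          # flight time of the pitched ball -> about 7.77 s
d = 0.5 * g * t**2      # d(2), distance the dropped ball falls in that time -> about 968 ft
print(t, d)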

2007-11-07 17:41:29 · answer #3 · answered by er_i_m 4 · 0 0
