
An archer shoots an arrow with a velocity of 51.5 m/s at an angle of 54.5° with the horizontal. An assistant standing on the level ground 150 m downrange from the launch point throws an apple straight up with the minimum initial speed necessary to meet the path of the arrow.
(a) What is the initial speed of the apple?
(b) At what time after the arrow launch should the apple be thrown so that the arrow hits the apple?

Okay, I got the first part of the question, which comes out to 41.39 m/s. I can't solve part (b). Please help. Thanks in advance.

2007-10-20 14:47:50 · 1 answer · asked by Quang P 2 in Science & Mathematics Physics

1 answer

The horizontal position of the arrow is

x(t) = 51.5*cos(54.5°)*t

The apple must meet the arrow when x(t) = 150 m, which gives

t = 150/(51.5*cos(54.5°)) = 5.0157 seconds
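
If you want to double-check that flight time, here's a quick Python sketch (just the numbers already used above):

import math

v0 = 51.5                                  # launch speed, m/s
theta = math.radians(54.5)                 # launch angle
t_arrow = 150.0 / (v0 * math.cos(theta))   # time for the arrow to cover 150 m horizontally
print(t_arrow)                             # ~5.016 s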

The height of the arrow at that moment is

y(t) = 51.5*sin(54.5°)*t - 0.5*9.81*t^2
y(5.0157) = 86.897 m
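
Same check for the height, plugging t back into y(t) (using g = 9.81 m/s² as above):

import math

g = 9.81
v0 = 51.5
theta = math.radians(54.5)
t_arrow = 5.0157                           # from the step above
y_arrow = v0 * math.sin(theta) * t_arrow - 0.5 * g * t_arrow**2
print(y_arrow)                             # ~86.9 m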

The flight of the apple has

y(t) = v*t - 0.5*g*t^2

For the minimum throw speed, the apple should be exactly at its apogee when it meets the arrow, so at y = 86.897 m its vertical velocity is zero:

v(t) = 0 = v - g*t

Solve for v by substituting t = v/9.81 into the height equation:

86.897 = v*t - 4.905*t^2
86.897 = v^2/9.81 - 0.5*v^2/9.81 = 0.5*v^2/9.81
v = sqrt(86.897*2*9.81)
v = 41.3 m/s
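
A numeric check of that speed, if you want one (same g = 9.81 m/s²):

import math

g = 9.81
h = 86.897                      # arrow's height when it crosses x = 150 m, from above
v_apple = math.sqrt(2 * g * h)  # minimum speed: apogee lands exactly at h
print(v_apple)                  # ~41.3 m/s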

That's close to what you got. What did you use for g?

To find the time to throw the apple, first ask: how long does the apple take to reach apogee? Recall

0 = v - g*t

so

t = 41.3/9.81 = 4.21 seconds

So to meet the arrow, the throw time after arrow launch is

5.016 - 4.21 = 0.806 seconds
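
And one last sketch tying it together (same numbers as above):

g = 9.81
t_arrow = 5.0157           # arrow's time to reach x = 150 m
v_apple = 41.29            # minimum apple speed, m/s
t_apogee = v_apple / g     # apple's time from throw to apogee
print(t_arrow - t_apogee)  # ~0.81 s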


2007-10-21 06:45:46 · answer #1 · answered by odu83 7 · 0 0
