
An arrow is launched upwards at an angle of 55 degrees from the horizontal at a speed of 75 m/s. How do I calculate

1. the maximum height to which the arrow rises
2. how long it is in the air before it hits the ground
3. how far from the launching point it lands?

Thanks.

2007-10-09 15:43:25 · 1 answer · asked by labelapark 6 in Science & Mathematics Physics

1 answer

1) The arrow's vertical velocity is vy(t) = vy0 - g*t, where vy0 = 75*sin(55°).

When vy(t) = 0, the arrow reaches apogee,
so
t = vy0/g
or
t = 75*sin(55°)/9.8 ≈ 6.3 s

Now that you have t, calculate the height of apogee using
y(t) = vy0*t - 0.5*g*t^2
     = vy0^2/g - 0.5*vy0^2/g
     = 0.5*vy0^2/g
Plug in vy0:
     = 0.5*(75*sin(55°))^2/9.8 ≈ 193 m
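
If you want to check the arithmetic, here is a minimal sketch in Python (assuming g = 9.8 m/s^2 and no air resistance; math.radians is needed because Python's sin expects radians):

import math

v0 = 75.0                        # launch speed, m/s
theta = math.radians(55)         # launch angle in radians
g = 9.8                          # gravitational acceleration, m/s^2

vy0 = v0 * math.sin(theta)       # initial vertical speed
t_apogee = vy0 / g               # time to reach maximum height, ~6.3 s
h_max = 0.5 * vy0**2 / g         # maximum height, ~193 m
print(t_apogee, h_max)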

2) The flight time to apogee is 1/2 the total flight time, so the arrow is in the air for 2*vy0/g = 150*sin(55°)/9.8 ≈ 12.5 s. To check, plug 150*sin(55°)/9.8 into y(t); you will find y(t) = 0.
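
A quick numeric check of that claim (same assumptions as the sketch above):

import math

v0, g = 75.0, 9.8
vy0 = v0 * math.sin(math.radians(55))
t_total = 2 * vy0 / g                          # total flight time, ~12.5 s
y_end = vy0 * t_total - 0.5 * g * t_total**2   # height at landing
print(t_total, y_end)                          # y_end comes out to 0 (up to rounding)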

3) Knowing the total flight time, use
x(t) = vx0*t to find the range, with vx0 = 75*cos(55°):
x = 75*cos(55°)*150*sin(55°)/9.8 ≈ 539 m

Note, this is also equal to
75^2*sin(110°)/9.8, which is the standard range formula v0^2*sin(2θ)/g.
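
And a sketch confirming both range expressions agree (same assumptions, launch and landing at the same height):

import math

v0, g = 75.0, 9.8
theta = math.radians(55)
t_total = 2 * v0 * math.sin(theta) / g         # total flight time from part 2
x_range = v0 * math.cos(theta) * t_total       # range, ~539 m
x_check = v0**2 * math.sin(2 * theta) / g      # standard range formula, same value
print(x_range, x_check)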

j

2007-10-10 06:16:09 · answer #1 · answered by odu83 7 · 0 0
