The plane's altitude is 6 miles. Suppose that when the distance between the plane and the radar station is 10 miles, this distance is increasing at a rate of 500 miles per hour. Find the speed of the plane.
^ This is all that is given; the following is my own attempt.
I'm in calculus and this is a related-rates problem.
Also, I've gotten this far but can't seem to get the right answer, although I don't know what it is. I'm using an online homework system, so it only tells me whether my answer is right or wrong.
given: when g = 10 miles, dg/dt = 500 mi/h
goal: find d(angle)/dt
equation: tan(angle) = (1/6) * g
soln: sec^2(angle) * d(angle)/dt = d/dt[(1/6) * g]
(136/36) * d(angle)/dt = (1/6) * dg/dt
(136/36) * d(angle)/dt = 500/6
d(angle)/dt = (500/6) * (36/136)
d(angle)/dt = 3000/136
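A quick numeric check of the fraction arithmetic in the steps above (a sketch only; it takes the setup tan(angle) = g/6 with g = 10 exactly as written, and does not say whether that setup is the right one for the problem):

```python
from fractions import Fraction

# Values exactly as used in the attempt above.
sec_sq = Fraction(136, 36)   # sec^2(angle) = 1 + tan^2(angle) = 1 + (10/6)^2
dg_dt = Fraction(500)        # dg/dt in mi/h

# Solve sec^2(angle) * d(angle)/dt = (1/6) * dg/dt for d(angle)/dt.
dangle_dt = (Fraction(1, 6) * dg_dt) / sec_sq

print(dangle_dt)         # 375/17, i.e. 3000/136 in lowest terms
print(float(dangle_dt))  # roughly 22.06
```

So the arithmetic itself is consistent: (500/6) * (36/136) really is 3000/136.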
and that is not the answer, and I'm not understanding what I am doing wrong.
Please help!
2007-10-23 17:50:05 · 2 answers · asked by Anonymous in Science & Mathematics ➔ Mathematics