
If a ball is thrown into the air with a velocity of 60 ft/s, its height after t seconds is given by y=60t-16t^2. Find the average velocity in ft/s for the time period beginning when t=1 and lasting 2 seconds.

If anyone could explain this to me, I'd appreciate it.

2007-02-06 13:47:18 · 4 answers · asked by champers 5 in Science & Mathematics Mathematics

4 answers

At t = 1 the ball will have slowed to (60 - 32) = 28 ft/s.
V = at; 28 = 32t, so t = 28/32 = 0.875 s. That's how long it takes the ball to reach its highest point and momentarily stop. Average speed for that 0.875 s = (28 + 0)/2 = 14 ft/s.
Now it starts back down from rest, accelerating at 32 ft/s² for the remaining 1.125 s. At t = 3 it reaches a speed of 1.125 × 32 = 36 ft/s, so its average speed over that 1.125 s is (0 + 36)/2 = 18 ft/s.
Avg. speed = (0.875 × 14 ft/s + 1.125 × 18 ft/s) / 2 s
= (12.25 + 20.25) / 2
= 16.25 ft/s
Note that this is the average speed (total distance / time). The average velocity the question asks for is displacement / time: (y(3) - y(1))/2 = (36 - 44)/2 = -4 ft/s, negative because the ball ends the interval lower than it started.

2007-02-06 14:23:25 · answer #1 · answered by Richard S 6 · 0 0
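Editor's note: the arithmetic in the answer above can be checked numerically. This is a minimal sketch (not part of the original answer), assuming the height function y(t) = 60t - 16t² from the question; it computes both the average speed (path length / time) and the average velocity (displacement / time) over t = 1 to t = 3.

```python
# Height in feet after t seconds, from the question.
def y(t):
    return 60 * t - 16 * t ** 2

t0, t1 = 1.0, 3.0
t_apex = 60 / 32  # velocity 60 - 32t = 0 at t = 1.875 s

# Average velocity = displacement / elapsed time.
avg_velocity = (y(t1) - y(t0)) / (t1 - t0)

# Average speed = total path length / elapsed time
# (distance up to the apex plus distance back down).
dist_up = y(t_apex) - y(t0)
dist_down = y(t_apex) - y(t1)
avg_speed = (dist_up + dist_down) / (t1 - t0)

print(avg_velocity)  # -4.0 ft/s
print(avg_speed)     # 16.25 ft/s
```

Both of the answer's figures check out: 16.25 ft/s is the average speed, while the average velocity is -4 ft/s.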

Its position is given by the equation y = 60t - 16t^2.

I don't know what math class this is, so I'll pretend it's calculus.

Velocity is dy/dt = 60 - 32t.

Evaluate the velocity at t = 1 and at t = 3 (the interval begins at t = 1 and lasts 2 seconds) and take the average; because the acceleration is constant, that average equals the average velocity over the interval.

2007-02-06 21:55:40 · answer #2 · answered by Roadkill 6 · 0 0
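Editor's note: a quick sketch of the method above, assuming the derivative v(t) = 60 - 32t stated in the answer. It averages the endpoint velocities and confirms this matches displacement / time.

```python
# Velocity (ft/s) from differentiating y(t) = 60t - 16t^2.
def v(t):
    return 60 - 32 * t

# Height (ft) from the question.
def y(t):
    return 60 * t - 16 * t ** 2

# Constant acceleration, so the mean of the endpoint velocities
# equals displacement / elapsed time over [1, 3].
avg_from_endpoints = (v(1) + v(3)) / 2            # (28 + (-36)) / 2
avg_from_displacement = (y(3) - y(1)) / (3 - 1)   # (36 - 44) / 2

print(avg_from_endpoints, avg_from_displacement)  # -4.0 -4.0
```

Both routes give -4 ft/s: the ball is, on net, moving downward over the interval.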

Gravity slows the ball at a constant rate. You need to know the angle of ascent.

2007-02-06 21:50:54 · answer #3 · answered by jim m 5 · 0 0

http://www.ugrad.math.ubc.ca/coursedoc/math100/notes/derivative/ball1.html

2007-02-06 21:55:30 · answer #4 · answered by Eric L 5 · 0 0
