An item costs $500 at time t = 0 and costs $P in year t. When inflation is r% per year, the price is given by the following equation.
P = 500e^(rt/100)
(a) If r is a constant, at what rate (in dollars per year) is the price rising initially? My answer: 5r dollars per year
(b) At what rate is the price rising after 2 years? My answer: 5re^(r/50) dollars per year
(c) Now suppose that r is increasing at a rate of 0.3 per year when r = 3 and t = 2. At what rate (in dollars per year) is the price increasing at that time?
I just need help with part (c), but I've included my answers to the other parts, with the work below, in case they help in getting to the answer for (c). I've already tried (c) and I get an incorrect answer of 5.466356401 dollars per year.
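For reference, the answers to (a) and (b) come from differentiating P with r held constant:

\[
\frac{dP}{dt} = 500 \cdot \frac{r}{100}\, e^{rt/100} = 5r\, e^{rt/100},
\]

which gives 5r dollars per year at t = 0 for (a) and 5r e^(r/50) dollars per year at t = 2 for (b).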
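Is this the right setup for (c)? Treating r as a function of t, the product rule on the exponent gives

\[
\frac{dP}{dt} = 500\, e^{rt/100}\,\frac{d}{dt}\!\left(\frac{rt}{100}\right)
             = 5\, e^{rt/100}\!\left(r + t\,\frac{dr}{dt}\right),
\]

which I would then evaluate at r = 3, t = 2, and dr/dt = 0.3.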
asked by Christian · 2007-10-28 17:40:22