
An item costs $500 at time t = 0 and costs $P in year t. When inflation is r% per year, the price is given by the following equation.

P = 500e^(rt/100)

(a) If r is a constant, at what rate is the price rising (in dollars per year) initially? My answer: 5r dollars per year

(b) At what rate is the price rising after 2 years? My answer: 5re^(r/50) dollars per year

(c) Now suppose that r is increasing by 0.3 per year when r = 3 and t = 2. At what rate (dollars per year) is the price increasing at that time?

I just need help with part (c), but I've put up my answers to the other parts in case they are helpful for getting to (c). I've already tried (c) and I get an incorrect answer of 5.466356401 dollars per year.
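For reference, here is a quick SymPy check of my answers to (a) and (b), treating r as a constant symbol as those parts assume (this sketch is just my own verification, not part of the assignment):

from sympy import symbols, exp, diff, simplify

t, r = symbols('t r', positive=True)
P = 500 * exp(r * t / 100)          # price model from the problem

dPdt = diff(P, t)                   # = 5*r*exp(r*t/100) when r is constant
print(simplify(dPdt.subs(t, 0)))    # part (a): 5*r dollars per year
print(simplify(dPdt.subs(t, 2)))    # part (b): 5*r*exp(r/50) dollars per year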

2007-10-28 17:40:22 · 1 answer · asked by Christian 1 in Science & Mathematics > Mathematics

1 answer

Since both r and t are changing, the product rule gives d(rt)/dt = r + t(dr/dt), so
dP/dt = 5(r + t(dr/dt))e^(rt/100)
@ t = 2, r = 3, and dr/dt = 0.3
dP/dt = 5(3 + 2(0.3))e^(3*2/100) = 18e^0.06
dP/dt ≈ $19.11 / yr
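A quick SymPy sketch to confirm the number, assuming the same model P = 500e^(rt/100) with r treated as a function of t:

from sympy import symbols, Function, exp, diff, Derivative

t = symbols('t', positive=True)
r = Function('r')(t)                    # inflation rate now depends on t
P = 500 * exp(r * t / 100)

dPdt = diff(P, t)                       # = 5*(r + t*dr/dt)*exp(r*t/100)
value = (dPdt
         .subs(Derivative(r, t), 0.3)   # substitute dr/dt before r itself
         .subs(r, 3)
         .subs(t, 2))
print(value.evalf())                    # ≈ 19.11 dollars per year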

2007-10-28 18:10:29 · answer #1 · answered by Helmut 7 · 0 0
