If y is a function of x, then the trick is to find the ratio of the change in y to a tiny change in x. That ratio is defined as the "derivative". For a simple case, consider what happens if y = x^2. If we increase x by a tiny bit dx, then the change in y (i.e., dy) is given by:
(x+dx)^2 - x^2, or x^2 + 2x dx + dx^2 - x^2, or 2x dx + dx^2. Dividing by dx gives dy/dx = 2x + dx. But we suppose that dx is tiny, so we choose to ignore it, finally winding up with dy/dx = 2x. The same idea is applied to any function whose derivative is wanted. Newton and Leibniz came up with the idea, apparently independently, although Newton is generally credited with inventing it first. But the notation now used is that of Leibniz.
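The shrinking-dx argument above can be checked numerically. A minimal Python sketch (the helper names here are illustrative, not from the answer): as dx gets smaller, the ratio dy/dx for y = x^2 settles on 2x.

```python
def ratio(f, x, dx):
    """The change in y divided by a tiny change in x: (f(x+dx) - f(x)) / dx."""
    return (f(x + dx) - f(x)) / dx

def square(x):
    return x * x

# At x = 3 the derivative of x^2 is 2x = 6; the ratio is 6 + dx,
# so the leftover dx term visibly shrinks as dx does.
for dx in (0.1, 0.01, 0.001):
    print(dx, ratio(square, 3.0, dx))
```

The printed values approach 6, with the error equal to the ignored dx term.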
2007-02-02 17:17:04 · answer #1 · answered by Anonymous
With great difficulty and debate.
I believe it took over 100 years from when Newton first started using differential calculus until the definitions and proofs were truly made rigorous.
2007-02-02 17:09:43 · answer #2 · answered by Curt Monash 7
All the laws and formulae of differentiation are based on the following definition:
f'(x) = limit as h -> 0 of [ f( x + h ) - f( x ) ] / h
We get the different formulae by applying the above definition to various functions f(x) and calculating that limit.
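As a sketch of applying that definition, here is the difference quotient evaluated for a small h in Python (the function name `diff_quotient` is mine, chosen for illustration):

```python
def diff_quotient(f, x, h=1e-6):
    """Evaluate [f(x + h) - f(x)] / h for a small h, approximating f'(x)."""
    return (f(x + h) - f(x)) / h

# Applying the definition to f(x) = x**3 at x = 2: the exact derivative
# is 3*x**2 = 12, and the quotient comes out close to that for small h.
print(diff_quotient(lambda x: x**3, 2.0))
```

Taking the limit h -> 0 symbolically instead of numerically is what turns such calculations into the standard formulae (power rule, etc.).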
2007-02-02 17:18:19 · answer #3 · answered by dexter 2