(Ind. Int. = "indefinite integral")
In my book (Stewart) they say:
"Recall that the most general antiderivative on a given interval is obtained by adding a constant to a particular antiderivative. *We adopt the convention that when a formula for a general Ind. Int. is given, it is valid only on an interval.* Thus, we write
∫ 1/x^2 dx = -1/x + C
with the understanding that it is valid on the interval (0, ∞) or on the interval (-∞, 0). This is true despite the fact that the general antiderivative of the function f(x) = 1/x^2, x ≠ 0, is:
F(x) = -1/x + C1   if x < 0
F(x) = -1/x + C2   if x > 0"
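If I differentiate each branch myself (my own working, not a quote from the book), both constants do disappear:

\[
\frac{d}{dx}\left(-\frac{1}{x} + C_1\right) = \frac{1}{x^2} \quad (x < 0),
\qquad
\frac{d}{dx}\left(-\frac{1}{x} + C_2\right) = \frac{1}{x^2} \quad (x > 0).
\]

So any constants work on each piece, and I suppose that since (-∞, 0) and (0, ∞) don't touch, the theorem "two antiderivatives of the same function on an interval differ by a constant" (the Mean Value Theorem corollary) applies to each piece separately, so nothing forces C1 = C2.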
Well, I don't understand this convention. I don't know if it's something obvious and I'm overcomplicating it, like "there could be points where the Ind. Int. isn't defined," or whether there's a subtler point behind it: maybe they mean that an Ind. Int. that holds on some interval, no matter how small, is a valid Ind. Int. of the function.
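To convince myself numerically, I also tried this quick Python sketch (my own, not from Stewart): a central finite difference shows F'(x) matches 1/x^2 on both sides of 0 no matter which constants I pick (c1 = 5.0 and c2 = -3.0 below are arbitrary values I made up).

def F(x, c1=5.0, c2=-3.0):
    # Piecewise antiderivative with *independent* arbitrary constants.
    return -1.0 / x + (c1 if x < 0 else c2)

def f(x):
    # The integrand: f(x) = 1/x^2, defined for x != 0.
    return 1.0 / x**2

h = 1e-6
for x in (-2.0, -0.5, 0.5, 2.0):
    # Central difference approximates F'(x); the added constant cancels out.
    deriv = (F(x + h) - F(x - h)) / (2 * h)
    print(f"x = {x:+.1f}   F'(x) ~ {deriv:.6f}   f(x) = {f(x):.6f}")

Swapping in any other constants gives the same derivatives, which seems to be exactly what the two-constant formula is saying.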
Thank you.
2007-12-16 03:31:08 · 2 answers · asked by andacecha in Mathematics