The integral is a core concept of advanced mathematics, specifically in the fields of calculus and mathematical analysis. Given a function f(x) of a real variable x and an interval [a,b] of the real line, the integral
\int_a^b f(x) \, dx
represents the area of a region in the xy-plane bounded by the graph of f, the x-axis, and the vertical lines x=a and x=b.
The term "integral" may also refer to the notion of antiderivative, a function F whose derivative is the given function f. In this case it is called an indefinite integral, while the integrals discussed in this article are termed definite integrals. The fundamental theorem of calculus asserts that an antiderivative can be used to compute the integral over an interval. Some authors maintain a distinction between antiderivatives and indefinite integrals.
The idea of integration was formulated in the late seventeenth century by Isaac Newton and Gottfried Wilhelm Leibniz. Together with the concept of a derivative, the integral became a basic tool of calculus, with numerous applications in science and engineering.
A rigorous mathematical definition of the integral was given by Bernhard Riemann. It is based on a limiting procedure which approximates the area of a curvilinear region by breaking the region into thin vertical slabs. Beginning in the nineteenth century, more sophisticated notions of integral began to appear, in which both the type of the function and the domain over which the integration is performed are generalised. A line integral is defined for functions of two or three variables, and the interval of integration [a,b] is replaced by a curve connecting two points in the plane or in space. In a surface integral, the curve is replaced by a piece of a surface in three-dimensional space. Integrals of differential forms play a fundamental role in modern differential geometry. These generalizations of the integral first arose from the needs of physics, and they play an important role in the formulation of many physical laws, notably those of electrodynamics. Modern concepts of integration are based on the abstract mathematical theory known as Lebesgue integration, developed by Henri Lebesgue.
Integrals appear in many practical situations. Consider a swimming pool. If it is rectangular, then from its length, width, and depth we can easily determine the volume of water it can contain (to fill it), the area of its surface (to cover it), and the length of its edge (to rope it). But if it is oval with a rounded bottom, all of these quantities call for integrals. Practical approximations may suffice at first, but eventually we demand exact and rigorous answers to such problems.
Measuring depends on counting, with the unit of measurement as a key link. For example, recall the famous theorem in geometry which says that if a, b, and c are the side lengths of a right triangle, with c the longest, then a² + b² = c². If we take the side of a square as one unit, its diagonal is √2 units long, a measurement which is greater than one and less than two. If we take the side as five units, the diagonal is at least seven units long, but still not perfectly measured. A side of 12 units will have a diagonal of about 17 units, but slightly less. In fact, no choice of unit that exactly counts off the side length can ever give an exact count for the diagonal length. Thus the real number system is born, permitting an exact answer with no trace of approximating counts. It is as if the units are infinitely fine.
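A minimal Python sketch (not part of the original answer) makes the counting problem concrete: for a square with side s, the diagonal is s·√2, and because √2 is irrational no integer side ever yields an integer diagonal.

```python
import math

# For a square with side s units, the diagonal is s * sqrt(2) units.
# No integer choice of side gives an integer diagonal, because sqrt(2)
# is irrational; counting in units only ever approximates it.
for side in (1, 5, 12):
    diagonal = side * math.sqrt(2)
    print(f"side = {side:2d} units -> diagonal = {diagonal:.4f} units")
```

Running this shows the diagonals 1.4142, 7.0711, and 16.9706, matching the "at least seven" and "about 17, but slightly less" estimates above.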
[Figure: Approximations to the integral of √x from 0 to 1, with 5 right samples (above) and 12 left samples (below).]
So it is with integrals. For example, consider the curve y = f(x) between x = 0 and x = 1, with f(x) = √x. In the simpler case where y = 1, the region under the "curve" would be a unit square, so its area would be exactly 1. As it is, the area must be somewhat less. A smaller "unit of measure" should do better; so cross the interval in five steps, from 0 to 1⁄5, from 1⁄5 to 2⁄5, and so on to 1. Fit a box for each step using the right end height of each curve piece, thus √1⁄5, √2⁄5, and so on to √1 = 1. The total area of these boxes is about 0.7497; but we can easily see that this is still too large. Using 12 steps in the same way, but with the left end height of each piece, yields about 0.6203, which is too small. As before, using more steps produces a closer approximation, but will never be exact. Thus the integral is born, permitting an exact answer of 2⁄3, with no trace of approximating sums. It is as if the steps are infinitely fine (infinitesimal).
The most powerful new insight of Newton and Leibniz was that, under suitable conditions, the value of an integral over a region can be determined by looking at the region's boundary alone. For example, the surface area of the pool can be determined by an integral along its edge. This is the essence of the fundamental theorem of calculus. Applied to the square root curve, it says to look at the related function F(x) = 2⁄3√x³, and simply take F(1) − F(0), where 0 and 1 are the boundaries of the interval [0,1]. (This is an example of a general rule: for f(x) = x^q, with q ≠ −1, the related function is F(x) = x^(q+1)/(q+1).)
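The fundamental theorem turns the whole limiting procedure into one subtraction; a short Python check (an illustration, not from the original answer) applies it to the square root curve.

```python
# Antiderivative for f(x) = sqrt(x): the q = 1/2 case of the power rule,
# F(x) = x**(q + 1) / (q + 1) = x**1.5 / 1.5.
def F(x):
    return x ** 1.5 / 1.5

# The definite integral over [0, 1] is just F at the boundary points.
exact = F(1) - F(0)
print(exact)  # 2/3, with no trace of approximating sums
```

No step width appears anywhere: only the boundary values 0 and 1 are used, exactly as the boundary insight promises.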
In Leibniz's own notation — so well chosen it is still common today — the meaning of
\int f(x) \, dx \,\!
was conceived as a weighted sum (denoted by the elongated "S"), with function values (such as the heights, y = f(x)) multiplied by infinitesimal step widths (denoted by dx). After the failure of early efforts to rigorously define infinitesimals, Riemann formally defined integrals as a limit of ordinary weighted sums, so that the dx suggested the limit of a difference (namely, the interval width). Shortcomings of Riemann's dependence on intervals and continuity motivated newer definitions, especially the Lebesgue integral, which is founded on an ability to extend the idea of "measure" in much more flexible ways. Thus the notation
\int_A f(x) \, d\mu \,\!
refers to a weighted sum in which the function values are partitioned, with μ measuring the weight to be assigned to each value. (Here A denotes the region of integration.) Differential geometry, with its "calculus on manifolds", gives the familiar notation yet another interpretation. Now f(x) and dx become a differential form, ω = f(x)dx, a new differential operator d, known as the exterior derivative, appears, and the fundamental theorem becomes the more general Stokes' theorem,
\int_{A} \mathrm{d} \omega = \int_{\partial A} \omega , \,\!
from which Green's Theorem, the divergence theorem, and the fundamental theorem of calculus follow.
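To see the fundamental theorem of calculus fall out as a special case (a standard derivation, sketched here for illustration), take ω to be a 0-form, that is, a function F, on the interval A = [a,b]; the oriented boundary ∂A is then the endpoint b counted positively and a counted negatively, so Stokes' theorem reads

```latex
\int_{[a,b]} \mathrm{d}F = \int_a^b F'(x) \, dx = F(b) - F(a) ,
```

which is precisely the statement that an antiderivative evaluated at the boundary computes the integral.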
More recently, infinitesimals have reappeared with rigor, through modern innovations such as non-standard analysis. Not only do these methods vindicate the intuitions of the pioneers, they also lead to new mathematics.
Although there are differences between these conceptions of integral, there is considerable overlap. Thus the area of the surface of the oval swimming pool can be handled as a geometric ellipse, as a sum of infinitesimals, as a Riemann integral, as a Lebesgue integral, or as a manifold with a differential form. The calculated result will be the same for all.
summation::
Summation is the addition of a sequence of numbers; the result is their sum. The "numbers" to be summed may be natural numbers, complex numbers, matrices, or still more complicated objects. An infinite sum is a more subtle procedure, known as a series. Note that the term summation has a special meaning in the context of divergent series related to extrapolation.
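A brief Python sketch (added here for illustration) shows both the finite and the infinite case: a finite sum is plain repeated addition, while a series is approached through its partial sums.

```python
# Finite summation: repeated addition of a sequence of numbers.
print(sum([3, 1, 4, 1, 5]))   # 14
print(sum(range(1, 101)))     # 1 + 2 + ... + 100 = 5050

# An infinite sum (a series) is handled as a limit of partial sums:
# the geometric series 1/2 + 1/4 + 1/8 + ... converges to 1.
partial = sum(0.5 ** k for k in range(1, 50))
print(partial)  # very close to 1.0
```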
differentiation::
In calculus, a branch of mathematics, the derivative is a measurement of how a function changes when the values of its inputs change. The derivative of a function at a chosen input value describes the behavior of the function near that input value. For a real-valued function of a single real variable, the derivative at a point equals the slope of the tangent line to the graph of the function at that point. In general, the derivative of a function at a point determines the best linear approximation to the function at that point.[1]
The process of finding a derivative is called differentiation. The fundamental theorem of calculus states that differentiation is the reverse process to integration.
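The reverse relationship between differentiation and integration can be checked numerically (a sketch added for illustration, using a central finite difference as the derivative approximation): differentiating the antiderivative F(x) = x^(3⁄2)/1.5 from the integration example recovers the integrand √x.

```python
import math

def derivative(F, x, h=1e-6):
    # Central finite-difference approximation to F'(x).
    return (F(x + h) - F(x - h)) / (2 * h)

F = lambda x: x ** 1.5 / 1.5  # an antiderivative of sqrt(x)

for x in (0.25, 0.5, 1.0):
    print(derivative(F, x), math.sqrt(x))  # the two columns agree
```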
Differentiation has applications to all quantitative disciplines. In physics, the derivative of the displacement of a moving body with respect to time is the velocity of the body, and the derivative of velocity with respect to time is acceleration. Newton's second law of motion states that the derivative of the momentum of a body equals the force applied to the body. The reaction rate of a chemical reaction is a derivative. In operations research, derivatives determine the most efficient ways to transport materials and design factories. By applying game theory, differentiation can provide best strategies for competing corporations.
Derivatives are frequently used to find the maxima and minima of a function. Equations involving derivatives are called differential equations and are fundamental in describing natural phenomena. Derivatives and their generalizations appear throughout mathematics, in fields such as complex analysis, functional analysis, differential geometry, and even abstract algebra.