Constant of integration


In calculus, the constant of integration, often denoted by C (or c), is a constant term added to an antiderivative of a function f(x) to indicate that the indefinite integral of f(x) (i.e., the set of all antiderivatives of f(x)), on a connected domain, is only defined up to an additive constant. [1] [2] [3] This constant expresses an ambiguity inherent in the construction of antiderivatives.


More specifically, if a function f is defined on an interval, and F(x) is an antiderivative of f, then the set of all antiderivatives of f is given by the functions F(x) + C, where C is an arbitrary constant (meaning that any value of C would make F(x) + C a valid antiderivative). For that reason, the indefinite integral is often written as

    ∫ f(x) dx = F(x) + C, [4]

although the constant of integration is sometimes omitted in lists of integrals for simplicity.

Origin

The derivative of any constant function is zero. Once one has found one antiderivative F(x) for a function f(x), adding or subtracting any constant C will give another antiderivative, because (F(x) + C)′ = F′(x) + C′ = F′(x) = f(x). The constant is a way of expressing that every function with at least one antiderivative has infinitely many of them.

Let F : ℝ → ℝ and G : ℝ → ℝ be two everywhere differentiable functions. Suppose that F′(x) = G′(x) for every real number x. Then there exists a real number C such that F(x) − G(x) = C for every real number x.

To prove this, notice that [F(x) − G(x)]′ = 0. So F can be replaced by F − G, and G by the constant function 0, making the goal to prove that an everywhere differentiable function whose derivative is always zero must be constant:

Choose a real number a, and let C = F(a). For any x, the fundamental theorem of calculus, together with the assumption that the derivative of F vanishes, implies that

    F(x) − C = F(x) − F(a) = ∫ₐˣ F′(t) dt = 0,

thereby showing that F is a constant function.

Two facts are crucial in this proof. First, the real line is connected. If the real line were not connected, we would not always be able to integrate from our fixed a to any given x. For example, if we were to ask for functions defined on the union of intervals [0,1] and [2,3], and if a were 0, then it would not be possible to integrate from 0 to 3, because the function is not defined between 1 and 2. Here, there will be two constants, one for each connected component of the domain. In general, by replacing constants with locally constant functions, we can extend this theorem to disconnected domains. For example, there are two constants of integration for ∫ dx/x, and infinitely many for ∫ tan x dx, so for example, the general form for the integral of 1/x is: [5] [6]

    ∫ dx/x = ln(−x) + C₁ for x < 0,  and  ∫ dx/x = ln(x) + C₂ for x > 0,

where the constants C₁ and C₂ may be chosen independently.
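The branch-wise constants can be illustrated numerically. The sketch below (the helper `deriv`, the constants, and the sample points are illustrative choices, not from the article) checks that ln(−x) plus one constant on the negative axis and ln(x) plus a different constant on the positive axis together form a valid antiderivative of 1/x on the disconnected domain ℝ \ {0}:

```python
import math

def deriv(f, x, h=1e-6):
    """Central finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Independent constants of integration, one per connected component of R \ {0}.
C1, C2 = 3.0, -7.0

def antideriv(x):
    """ln(-x) + C1 for x < 0, ln(x) + C2 for x > 0: an antiderivative of 1/x."""
    return math.log(-x) + C1 if x < 0 else math.log(x) + C2

# On both components the derivative is 1/x, regardless of C1 and C2.
for x in (-2.0, -0.5, 0.5, 2.0):
    assert abs(deriv(antideriv, x) - 1.0 / x) < 1e-4
```

Any other pair (C₁, C₂) passes the same check, which is exactly the extra freedom a disconnected domain provides.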

Second, F and G were assumed to be everywhere differentiable. If F and G are not differentiable at even one point, then the theorem might fail. As an example, let F(x) be the Heaviside step function, which is zero for negative values of x and one for non-negative values of x, and let G(x) = 0. Then the derivative of F is zero where it is defined, and the derivative of G is always zero. Yet it is clear that F and G do not differ by a constant. Even if it is assumed that F and G are everywhere continuous and almost everywhere differentiable, the theorem still fails. As an example, take F to be the Cantor function and again let G = 0.
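The Heaviside counterexample can be sketched directly (a minimal illustration; the helper names are mine, and the derivative check deliberately avoids the jump at 0, where the Heaviside function is not differentiable):

```python
# Heaviside step function H versus G = 0: both have zero derivative wherever
# the derivative exists, yet H - G is not constant.
def H(x):
    return 0.0 if x < 0 else 1.0

def deriv(f, x, h=1e-6):
    """Central finite-difference approximation of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

# Away from the jump at 0, both derivatives vanish...
assert deriv(H, -1.0) == 0.0 and deriv(H, 1.0) == 0.0
# ...but the difference H - 0 takes two different values, so it is not constant.
assert H(-1.0) != H(1.0)
```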

For example, suppose one wants to find antiderivatives of cos(x). One such antiderivative is sin(x). Another one is sin(x) + 1. A third is sin(x) − π. Each of these has derivative cos(x), so they are all antiderivatives of cos(x).

It turns out that adding and subtracting constants is the only flexibility we have in finding different antiderivatives of the same function. That is, all antiderivatives are the same up to a constant. To express this fact for cos(x), we write:

    ∫ cos(x) dx = sin(x) + C.

Replacing C by a number will produce an antiderivative. By writing C instead of a number, however, a compact description of all the possible antiderivatives of cos(x) is obtained. C is called the constant of integration. It is easily determined that all of these functions are indeed antiderivatives of cos(x):

    d/dx [sin(x) + C] = cos(x) + 0 = cos(x).

Necessity

At first glance, it may seem that the constant is unnecessary, since it can be set to zero. Furthermore, when evaluating definite integrals using the fundamental theorem of calculus, the constant will always cancel with itself.

However, trying to set the constant to zero does not always make sense. For example, sin(x) cos(x) can be integrated in at least three different ways:

    ∫ sin(x) cos(x) dx = ½ sin²(x) + C = −½ cos²(x) + C = −¼ cos(2x) + C.

So setting C to zero can still leave a constant. This means that, for a given function, there is not necessarily any "simplest antiderivative".
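This can be made concrete: the three antiderivatives of sin(x) cos(x) differ from one another by nonzero constants, so zeroing C in one form still hides a constant relative to the others. A minimal sketch (the function names and sample points are illustrative):

```python
import math

# Three antiderivatives of sin(x)*cos(x), one per method of integration.
F1 = lambda x: 0.5 * math.sin(x) ** 2     # substitute u = sin x
F2 = lambda x: -0.5 * math.cos(x) ** 2    # substitute u = cos x
F3 = lambda x: -0.25 * math.cos(2 * x)    # identity sin x cos x = (1/2) sin 2x

# The pairwise differences are constant in x but nonzero, so setting C = 0
# in one form still leaves a hidden constant relative to the others.
for x in (0.0, 0.7, 1.9):
    assert abs((F1(x) - F2(x)) - 0.5) < 1e-12    # F1 - F2 = 1/2
    assert abs((F1(x) - F3(x)) - 0.25) < 1e-12   # F1 - F3 = 1/4
```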

Another problem with setting C equal to zero is that sometimes we want to find an antiderivative that has a given value at a given point (as in an initial value problem). For example, to obtain the antiderivative of cos(x) that has the value 100 at x = π, only one value of C will work (in this case C = 100).

This restriction can be rephrased in the language of differential equations. Finding an indefinite integral of a function f(x) is the same as solving the differential equation dy/dx = f(x). Any differential equation will have many solutions, and each constant represents the unique solution of a well-posed initial value problem. Imposing the condition that our antiderivative takes the value 100 at x = π is an initial condition. Each initial condition corresponds to one and only one value of C, so without C it would be impossible to solve the problem.
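Solving that initial value problem amounts to one line of arithmetic, sketched below (variable names are illustrative): the general solution of dy/dx = cos(x) is y = sin(x) + C, and the condition y(π) = 100 pins down C.

```python
import math

# Initial value problem: solve dy/dx = cos(x) with y(pi) = 100.
# The general solution is y = sin(x) + C; the initial condition fixes C.
x0, y0 = math.pi, 100.0
C = y0 - math.sin(x0)        # sin(pi) = 0, so C = 100

y = lambda x: math.sin(x) + C
assert abs(y(x0) - y0) < 1e-12   # the initial condition is satisfied
assert abs(C - 100.0) < 1e-12    # and the unique constant is C = 100
```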

There is another justification, coming from abstract algebra. The space of all (suitable) real-valued functions on the real numbers is a vector space, and the differential operator d/dx is a linear operator. The operator d/dx maps a function to zero if and only if that function is constant. Consequently, the kernel of d/dx is the space of all constant functions. The process of indefinite integration amounts to finding a pre-image of a given function. There is no canonical pre-image for a given function, but the set of all such pre-images forms a coset. Choosing a constant is the same as choosing an element of the coset. In this context, solving an initial value problem is interpreted as lying in the hyperplane given by the initial conditions.


References

  1. Stewart, James (2008). Calculus: Early Transcendentals (6th ed.). Brooks/Cole. ISBN 0-495-01166-5.
  2. Larson, Ron; Edwards, Bruce H. (2009). Calculus (9th ed.). Brooks/Cole. ISBN 0-547-16702-4.
  3. "Definition of constant of integration | Dictionary.com". www.dictionary.com. Retrieved 2020-08-14.
  4. Weisstein, Eric W. "Constant of Integration". mathworld.wolfram.com. Retrieved 2020-08-14.
  5. Leinster, Tom (March 19, 2012). "Reader Survey: log|x| + C". The n-Category Café.
  6. Banner, Adrian (2007). The Calculus Lifesaver: All the Tools You Need to Excel at Calculus. Princeton: Princeton University Press. p. 380. ISBN 978-0-691-13088-0.