Primitive may refer to:
In calculus, an antiderivative, primitive function, primitive integral or indefinite integral of a function f is a differentiable function F whose derivative is equal to the original function f. This can be stated symbolically as F′ = f. The process of solving for antiderivatives is called antidifferentiation (or indefinite integration) and its opposite operation is called differentiation, which is the process of finding a derivative.
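The defining relation F′ = f can be checked directly for a simple pair of functions; the sine/cosine pair below is a standard illustration, not one taken from the text above:

```latex
% F(x) = \sin x is an antiderivative of f(x) = \cos x,
% because differentiating F recovers f:
F(x) = \sin x, \qquad
F'(x) = \frac{d}{dx}\,\sin x = \cos x = f(x).
```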
Antiderivatives are related to definite integrals through the fundamental theorem of calculus: the definite integral of a function over an interval is equal to the difference between the values of an antiderivative evaluated at the endpoints of the interval.
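To make the statement concrete, here is a short worked instance in LaTeX, using f(x) = x² and the antiderivative discussed in the example below:

```latex
% Fundamental theorem of calculus (second part):
\int_a^b f(x)\,dx = F(b) - F(a)
% Concrete case: f(x) = x^2, F(x) = x^3/3, on the interval [0, 1]:
\int_0^1 x^2\,dx = \frac{1^3}{3} - \frac{0^3}{3} = \frac{1}{3}
```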
The discrete equivalent of the notion of antiderivative is antidifference.
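The antidifference relation can be sketched the same way; the forward difference operator Δ and the particular function chosen here are standard illustrations, not taken from the text above:

```latex
% Forward difference operator: \Delta F(n) = F(n+1) - F(n).
% F(n) = n(n-1)/2 is an antidifference of f(n) = n, since
\Delta F(n) = \frac{(n+1)n}{2} - \frac{n(n-1)}{2}
            = \frac{n\left[(n+1)-(n-1)\right]}{2} = n.
```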
The function F(x) = x³/3 is an antiderivative of f(x) = x². As the derivative of a constant is zero, x² has infinitely many antiderivatives, such as x³/3, x³/3 + 1, x³/3 − 2, etc. Thus, all the antiderivatives of x² can be obtained by changing the value of C in F(x) = x³/3 + C, where C is an arbitrary constant known as the constant of integration. Essentially, the graphs of antiderivatives of a given function are vertical translations of each other; each graph's vertical location depends on the value of C.
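A one-line check, in LaTeX, that every member of this family differentiates back to x², since the constant term vanishes:

```latex
% For any constant C, differentiating F(x) = x^3/3 + C recovers f:
\frac{d}{dx}\left(\frac{x^3}{3} + C\right) = x^2 + 0 = x^2.
```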
In computer science, a primitive data type is either of the following: a basic type, which is a data type provided by a programming language as a basic building block, or a built-in type, for which the programming language provides built-in support.
In most programming languages, all basic data types are built-in. In addition, many languages also provide a set of composite data types. Opinions vary as to whether a built-in type that is not basic should be considered "primitive".
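A minimal C sketch of this distinction; the struct name and its fields are illustrative assumptions, not types named in the text:

```c
#include <stdio.h>

/* Basic, built-in primitive types provided directly by the language. */
int    count  = 42;     /* integer primitive        */
double ratio  = 3.14;   /* floating-point primitive */
char   letter = 'A';    /* character primitive      */

/* A composite type built from primitives; "Point" is a hypothetical
   example, not a type mentioned in the text above. */
struct Point {
    double x;
    double y;
};

int main(void) {
    struct Point p = { 1.0, 2.0 };   /* composite value made of two doubles */
    printf("%d %c %f\n", count, letter, p.x + p.y + ratio);
    return 0;
}
```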
Depending on the language and its implementation, primitive data types may or may not have a one-to-one correspondence with objects in the computer's memory. However, one usually expects operations on basic primitive data types to be the fastest language constructs available. Integer addition, for example, can be performed as a single machine instruction, and some processors offer dedicated instructions for processing sequences of characters. In particular, the C standard mentions that "a 'plain' int object has the natural size suggested by the architecture of the execution environment". This means that int is likely to be 32 bits long on a 32-bit architecture. Basic primitive types are almost always value types.
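A short C sketch of these two points: sizeof(int) reports the width the implementation chose for "plain" int, and assigning one int to another copies the value (value-type semantics). The printed sizes will vary by platform.

```c
#include <stdio.h>
#include <limits.h>

int main(void) {
    /* The C standard fixes only minimum ranges; the actual width of
       "plain" int is implementation-defined (often 32 bits). */
    printf("sizeof(int) = %zu bytes\n", sizeof(int));
    printf("INT_MAX     = %d\n", INT_MAX);

    /* Value-type semantics: b receives a copy of a, so reassigning a
       afterwards does not affect b. */
    int a = 1;
    int b = a;
    a = 2;
    printf("a = %d, b = %d\n", a, b);   /* prints: a = 2, b = 1 */
    return 0;
}
```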