Mathematical notation is a system of symbolic representations of mathematical objects and ideas. Mathematical notations are used in mathematics, the physical sciences, engineering, and economics. They include relatively simple symbolic representations, such as the numbers 0, 1, and 2; variables such as x, y, and z; delimiters such as "(" and "|"; function symbols such as sin; operator symbols such as "+"; relational symbols such as "<"; conceptual symbols such as lim and dy/dx; equations; and complex diagrammatic notations such as Penrose graphical notation and Coxeter–Dynkin diagrams.
A mathematical notation is a writing system used for recording concepts in mathematics.
The media used for writing are discussed below; common materials currently include paper and pencil, board and chalk (or dry-erase marker), and electronic media. Systematic adherence to the underlying mathematical concepts is fundamental to mathematical notation. For related concepts, see logical argument, mathematical logic, and model theory.
A mathematical expression is a sequence of symbols that can be evaluated. For example, if the symbols represent numbers, then the expression is evaluated according to a conventional order of operations: first any subexpressions within parentheses, then any exponents and roots, then multiplications and divisions, and finally any additions or subtractions, with operations of equal precedence carried out from left to right.
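As a minimal illustration, the grammar of most programming languages encodes this same precedence; the short Python sketch below shows how parentheses, exponents, and associativity determine the value of an expression (the variable names are purely illustrative):

```python
# Conventional order of operations, as realized by Python's grammar:
# parentheses first, then exponentiation (**), then * and /,
# and finally + and -, with equal-precedence operators grouped
# left to right (except **, which groups right to left).

expr_a = 2 + 3 * 4     # multiplication before addition -> 14
expr_b = (2 + 3) * 4   # parentheses override precedence -> 20
expr_c = 2 ** 3 ** 2   # exponentiation is right-associative -> 2**9 = 512
expr_d = 100 - 10 - 5  # subtraction is left-associative -> 85, not 95

print(expr_a, expr_b, expr_c, expr_d)  # 14 20 512 85
```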
In a computer language, these rules are implemented by compilers and interpreters. For more on expression evaluation, see the computer science topics of eager evaluation, lazy evaluation, short-circuit evaluation, and the evaluation operator.
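The evaluation strategies just mentioned can be contrasted in a few lines; the following Python sketch (with a helper function, noisy, introduced purely for illustration) makes the order of evaluation visible:

```python
def noisy(value):
    """Print when evaluated, so the evaluation order is visible."""
    print(f"evaluating {value!r}")
    return value

# Eager evaluation: both operands are computed before the addition.
total = noisy(1) + noisy(2)

# Short-circuit evaluation: `and` skips its right operand entirely
# when the left operand is already falsy.
result = noisy(False) and noisy("never evaluated")

# Lazy evaluation (simulated): wrap the computation in a thunk and
# force it only when the value is actually needed.
thunk = lambda: noisy("computed on demand")
forced = thunk()  # evaluation happens here, not at definition time
```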
Modern mathematics needs to be precise, because ambiguous notation does not permit formal proofs. Suppose that we have statements, denoted by some formal sequence of symbols, about some objects (for example, numbers, shapes, patterns). Until the statements can be shown to be valid, their meaning is not yet resolved. During the reasoning process, we might let the symbols refer to those denoted objects, perhaps in a model. The semantics of such an object has a heuristic side and a deductive side. In either case, we might want to know the properties of that object, which we might then list in an intensional definition.
Those properties might then be expressed by some well-known and agreed-upon symbols from a table of mathematical symbols. The resulting notation might include annotations marking, for example, definitions, assumptions, and derived statements.
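As a concrete instance, an intensional definition of the even integers lists a defining property rather than enumerating elements; a minimal LaTeX rendering in standard set-builder notation (assuming the amssymb package for \mathbb) might read:

```latex
\documentclass{article}
\usepackage{amssymb} % provides \mathbb for the blackboard-bold Z
\begin{document}
% Intensional definition: membership is determined by a property
% (being twice some integer), not by listing elements.
\[
  E \;=\; \{\, n \in \mathbb{Z} \mid \exists\, k \in \mathbb{Z},\ n = 2k \,\}
\]
\end{document}
```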
In different contexts, the same symbol or notation can be used to represent different concepts, just as multiple symbols can be used to represent the same concept; for example, (a, b) may denote an ordered pair, an open interval, or the greatest common divisor of a and b, depending on the context. Therefore, to fully understand a piece of mathematical writing, it is important to first check the definitions of the notations given by the author. This may be problematic, for instance, if the author assumes the reader is already familiar with the notation in use.
It is believed that mathematical notation to represent counting was first developed at least 50,000 years ago; early mathematical ideas such as finger counting have also been represented by collections of rocks, sticks, bones, clay, stone, wood carvings, and knotted ropes. The tally stick is a way of counting dating back to the Upper Paleolithic. Perhaps the oldest known mathematical texts are those of ancient Sumer. The Census Quipu of the Andes and the Ishango Bone from Africa both used the tally mark method of accounting for numerical concepts.
The development of zero as a number is one of the most important developments in early mathematics. It was used as a placeholder by the Babylonians and by the Greeks of Ptolemaic Egypt, and then as an integer by the Maya, Indians, and Arabs (see the history of zero for more information).
The earliest mathematical viewpoints in geometry did not lend themselves well to counting. The natural numbers, their relationship to fractions, and the identification of continuous quantities took millennia to take form, and even longer to acquire a settled notation.
In fact, it was not until the invention of analytic geometry by René Descartes that geometry became more amenable to numerical notation. Symbolic shortcuts for mathematical concepts came into use in the publication of geometric proofs. Moreover, the power and authority of geometry's theorem-and-proof structure greatly influenced non-geometric treatises such as Isaac Newton's Principia Mathematica.
The 18th and 19th centuries saw the creation and standardization of mathematical notation as used today. Leonhard Euler was responsible for many of the notations currently in use: the use of a, b, c for constants and x, y, z for unknowns; e for the base of the natural logarithm; sigma (Σ) for summation; i for the imaginary unit; and the functional notation f(x). He also popularized the use of π for Archimedes' constant, following William Jones' proposal to use π in this way, which in turn built on the earlier notation of William Oughtred.
In addition, many fields of mathematics bear the notational imprint of their creators: the differential operator notation of Leibniz, the cardinal infinities of Georg Cantor (in addition to the lemniscate (∞) of John Wallis), the congruence symbol (≡) of Gauss, and so forth.
Theorem-proving software naturally comes with its own notations for mathematics; the OMDoc project seeks to provide an open commons for such notations; and the MMT language provides a basis for interoperability between other notations.
Western notation uses Arabic numerals, but modern Arabic mathematical notation goes further, replacing Latin letters and related symbols with Arabic script.
In addition to Arabic notation, mathematics also makes use of the Greek alphabet to denote a wide variety of mathematical objects and variables. On some occasions, certain Hebrew letters are also used, such as aleph (ℵ) in the context of infinite cardinals.
Descartes' great accomplishment in mathematics is invariably described as the arithmetization of geometry.