
Algorithms and Complexity

"Complexity" refers to the study of the efficiency of computer algorithms. In this a problem is specified, with the ultimate goals being to determine the greatest efficiency possible for algorithms designed to solve instances of the problem, and to design an algorithm possessing the optimal efficiency.

For example, consider the problem of multiplying two square real matrices. Here, an instance of the problem consists of two specific matrices. Applying the procedure everyone learns in linear algebra, one has an algorithm that multiplies two n-by-n matrices using n^3 scalar multiplications, along with slightly fewer scalar additions, for a total number of arithmetic operations proportional to n^3. However, this is not optimal for the problem: algorithms have been devised for which the total number of arithmetic operations is at most proportional to n^p with p = 2.376. Regarding lower bounds, all that is known is that no algorithm for multiplying two square matrices can do so using fewer than n^2 arithmetic operations in general. We thus say that the computational complexity of matrix multiplication lies somewhere between n^2 and n^p for p = 2.376.
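The schoolbook algorithm referred to above can be sketched in a few lines of Python; this is only an illustration of the operation count, not an efficient implementation. For n-by-n inputs, the innermost line runs exactly n^3 times, so the algorithm performs n^3 scalar multiplications and n^2(n - 1) scalar additions.

```python
def matmul(A, B):
    """Schoolbook multiplication of square matrices given as lists of rows.

    Performs n^3 scalar multiplications and n^2 * (n - 1) scalar additions,
    matching the n^3 operation count discussed above.
    """
    n = len(A)
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            s = 0.0
            for k in range(n):
                s += A[i][k] * B[k][j]  # one scalar multiplication per (i, j, k)
            C[i][j] = s
    return C
```

Asymptotically faster methods (e.g., Strassen's algorithm and its successors, which yield the n^2.376 bound) trade these straightforward inner products for cleverly recombined block products.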

A full formalization of complexity requires an underlying mathematical model of a computer, so that definitions and proofs can be made completely rigorous. Various mathematical models of computers are used in complexity, the most prevalent being the Turing machine; also important is the real number machine, the model that best fits the spirit of the example above. Complexity that is fully formalized is known as complexity theory.

Complexity analysis is widely practiced in science and engineering, even when it falls short of complexity theory. Indeed, when presenting a new algorithm, even one aimed squarely at applications, it is standard to argue the algorithm's superiority by bounding its running time as a function of key problem parameters (e.g., n^p for p = 2.376) and showing that this bound beats the bounds previously established for competing algorithms. Referring to an algorithm's "complexity" has thus become commonplace.