In computer science, the **time complexity** of an algorithm quantifies the amount of time it takes to run as a function of the size of its input. Time complexity is commonly expressed using **big O notation**, which suppresses multiplicative constants and lower-order terms.

When expressed this way, the time complexity is said to be described asymptotically, i.e., as the input size goes to infinity. For example, if the time required by an algorithm on all inputs of size `n` is at most `5n^3 + 3n`, the asymptotic time complexity is `O(n^3)`.
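As an illustration of why the lower-order term can be dropped (the function name here is my own, not from the original), the ratio of the exact cost `5n^3 + 3n` to `n^3` settles toward the constant `5` as `n` grows:

```python
def exact_cost(n):
    """Hypothetical operation count from the example above: 5n^3 + 3n."""
    return 5 * n**3 + 3 * n

# The ratio to n^3 approaches 5, so the 3n term becomes negligible.
for n in (10, 100, 1000, 10000):
    print(n, exact_cost(n) / n**3)
```

Since big O notation also suppresses the constant factor `5`, the complexity is simply `O(n^3)`.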

Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, where an elementary operation takes a fixed amount of time to perform. Thus the amount of time taken and the number of elementary operations performed by the algorithm differ by at most a constant factor.
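A minimal sketch of this counting approach (the function and its comparison counter are illustrative, not from the original): in a linear search, each loop iteration performs one elementary comparison, so the worst-case count equals the input size.

```python
def linear_search(items, target):
    """Return (index of target or -1, number of comparisons made)."""
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1          # one elementary operation per element
        if x == target:
            return i, comparisons
    return -1, comparisons

# Worst case: target is absent, so all n elements are compared.
_, worst = linear_search(list(range(1000)), -1)
print(worst)  # 1000 comparisons for n = 1000
```

The actual running time differs from this count only by a constant factor, which big O notation absorbs.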

Since an algorithm may take a different amount of time even on inputs of the same size, the most commonly used measure of time complexity, the worst-case time complexity of an algorithm, denoted `T(n)`, is the maximum amount of time taken on any input of size `n`. Time complexities are classified by the nature of the function `T(n)`. For instance, an algorithm with `T(n) = O(n)` is called a *linear time algorithm*, and an algorithm with `T(n) = O(2^n)` is said to be an *exponential time algorithm*.
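To make the two classes concrete, here is a hedged sketch (both functions are my own examples, not from the original): summing a list is linear time, while the naive recursive Fibonacci is exponential because each call spawns two further calls and recomputes the same subproblems.

```python
def sum_list(items):
    """Linear time, T(n) = O(n): one addition per element."""
    total = 0
    for x in items:
        total += x
    return total

def fib_naive(n):
    """Exponential time, T(n) = O(2^n): each call branches into two."""
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)
```

`sum_list` over `n` elements performs `n` additions, while `fib_naive(n)` quickly becomes infeasible as `n` grows, which is why memoization or an iterative version is preferred in practice.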

**Useful links:**

- Time complexity (Wikipedia)
- How to find time complexity of an algorithm
- Time complexities of classic algorithms

**Related tags:** complexity-theory