Definition

Time Complexity

Figure 1: Number of operations N versus input size n for common complexities, assuming a coefficient of 1.

In computer science, the time complexity of an algorithm quantifies the amount of time taken by an algorithm to run as a function of the length of the string representing the input. The time complexity of an algorithm is commonly expressed using big O notation, which excludes coefficients and lower-order terms. When expressed this way, the time complexity is said to be described asymptotically, i.e., as the input size goes to infinity. For example, if the time required by an algorithm on all inputs of size n is at most 5n³ + 3n for every n (bigger than some n₀), the asymptotic time complexity is O(n³).
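
A quick numerical check (a sketch, not part of the original text) shows why the coefficient and lower-order term are dropped: the ratio of 5n³ + 3n to its dominant term n³ settles toward the constant 5 as n grows.

# Ratio of the exact cost 5n³ + 3n to its dominant term n³
for n in (10, 1_000, 100_000):
    print(n, (5 * n**3 + 3 * n) / n**3)  # approaches 5, so the bound is O(n³)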

The O notation is also called Landau's symbol.

Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, where an elementary operation takes a fixed amount of time to perform. Thus, the amount of time taken and the number of elementary operations performed by the algorithm differ by at most a constant factor.

In other words:

The number of (machine) instructions that a program executes during its run is called its time complexity in computer science.
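
The counting idea can be made concrete with a small sketch (the function and counter below are illustrative, not from the post): instrument a loop and watch the operation count track the input size.

def count_operations(n):
    """Sum 0..n-1 while counting each addition as one elementary operation."""
    ops = 0
    total = 0
    for i in range(n):
        total += i  # one addition...
        ops += 1    # ...counted as one elementary operation
    return ops

for n in (10, 100, 1_000):
    print(n, count_operations(n))  # the count grows linearly: O(n)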

Samples

Name           Running time    Example
constant       O(1)            statement
linear         O(n)            single_loop(statement)
quadratic      O(n²)           double_loop(statement)
logarithmic    O(log₂ n)       binary search α
linearithmic   O(n log₂ n)     β

α The running time is proportional to the number of times n can be divided by 2, because the algorithm halves the working area with each iteration (see the sketch after these notes).

β The running time consists of n loops (iterative or recursive), each of which does logarithmic work, so the algorithm combines linear and logarithmic behavior; merge sort is a classic example.
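
To make the halving in α concrete, here is a minimal binary search sketch (illustrative, not from the post) that counts its iterations; for a sorted list of n items the count stays close to log₂ n.

import math

def binary_search(items, target):
    """Return (index, iterations); items must be sorted. index is None if absent."""
    lo, hi = 0, len(items) - 1
    iterations = 0
    while lo <= hi:
        iterations += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid, iterations
        elif items[mid] < target:
            lo = mid + 1  # discard the lower half
        else:
            hi = mid - 1  # discard the upper half
    return None, iterations

for n in (1_000, 1_000_000):
    _, steps = binary_search(list(range(n)), n - 1)
    print(n, steps, math.ceil(math.log2(n)))  # steps ≈ log₂ n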

The full list of common time complexities

Timing technique

import timeit

def merge_dicts(*dict_args):
    """
    Given any number of dicts, shallow copy and merge into a new dict,
precedence goes to key/value pairs in later dicts.
    """
    result = {}
    for dictionary in dict_args:
        result.update(dictionary)
    return result

x = {'a': 1, 'b': 2}
y = {'b': 3, 'c': 4}
# best of the repetitions, 100 calls each
print(min(timeit.repeat(lambda: merge_dicts(x, y), number=100)))
# best of the repetitions, with the default 1,000,000 calls each
print(min(timeit.repeat(lambda: merge_dicts(x, y))))

# Python 3.5+ alternative: min(timeit.repeat(lambda: {**x, **y}))

# dictionary item overwrite caveat: later dicts take precedence
z = {'c': 1}
assert merge_dicts(x, y, z)['c'] == 1  # z's 'c' overwrites y's 'c': 4

Sample output:

0.00024804499980746186
2.233158576999813
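
The same technique can expose a complexity class empirically. A sketch reusing merge_dicts from above (the sizes are arbitrary): doubling the dict sizes roughly doubles the measured time, consistent with linear, O(n), behavior.

for n in (10_000, 20_000, 40_000):
    a = {i: i for i in range(n)}
    b = {i: -i for i in range(n, 2 * n)}
    best = min(timeit.repeat(lambda: merge_dicts(a, b), number=100))
    print(n, best)  # time roughly doubles as n doubles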



Published

09 November 2016

Categories

algorithms, literate programming

Tags