An algorithm is an ordered set of instructions, expressed in a formal language, that satisfies the following conditions:
- Finiteness. The number of instructions must be finite.
- Executability. Every instruction must be executable, in some language-dependent way, in a finite amount of time.
An algorithm does not have to be deterministic; there are many randomized algorithms, e.g. QuickSort with a randomly selected pivot element.
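As a minimal sketch of such a randomized algorithm, here is QuickSort with a random pivot. The function name and list-comprehension style are choices for illustration, not a canonical implementation; note that the execution path varies between runs, but the output does not:

```python
import random

def quicksort(items):
    """Sort a list using QuickSort with a randomly chosen pivot.

    The random pivot makes the execution path non-deterministic,
    yet the result is always the same sorted list.
    """
    if len(items) <= 1:
        return items
    pivot = random.choice(items)  # random pivot selection
    smaller = [x for x in items if x < pivot]
    equal = [x for x in items if x == pivot]
    larger = [x for x in items if x > pivot]
    return quicksort(smaller) + equal + quicksort(larger)
```

Running `quicksort([3, 1, 2])` twice may partition the list differently each time, but both runs return `[1, 2, 3]`.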
An algorithm can be expressed in many ways:
- As a sequence of instructions
- As a block-scheme
- As a code in an existing or new programming language
- As a piece of text in a human language
- As a realization in a formal model of computation such as a Turing machine
- Or in other, similar ways
An algorithm can solve a class of problems. For example, both "sum 1 and 3" and "sum 4 and 5" are problems in the same class: "sum two integer numbers." Furthermore, a given class of problems can generally be solved by a variety of algorithms.
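The distinction between a problem instance and a problem class can be sketched in a few lines of Python (the function name here is just an illustrative choice):

```python
def add_integers(a, b):
    """One algorithm solving the whole class "sum two integer numbers"."""
    return a + b

# Two different instances of the same class, solved by one algorithm:
first = add_integers(1, 3)   # the instance "sum 1 and 3"
second = add_integers(4, 5)  # the instance "sum 4 and 5"
```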
One important aspect of algorithm design is performance. For large datasets, a good algorithm may outperform a poor algorithm by several orders of magnitude. Algorithm performance is often rated with Big O or Θ notation, but one should be cautious with asymptotic notation, since big constants may be involved.
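The "orders of magnitude" claim can be illustrated with searching: an O(n) linear scan versus an O(log n) binary search over the same sorted data. This is a sketch, assuming the standard-library `bisect` module; the function names are illustrative:

```python
from bisect import bisect_left

def linear_search(items, target):
    """O(n): examine elements one by one until target is found."""
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def binary_search(sorted_items, target):
    """O(log n): repeatedly halve the interval (requires sorted input)."""
    i = bisect_left(sorted_items, target)
    if i < len(sorted_items) and sorted_items[i] == target:
        return i
    return -1

data = list(range(1_000_000))
# Both return the same index, but binary search inspects about 20
# elements here, while linear search may inspect up to a million.
assert linear_search(data, 999_999) == binary_search(data, 999_999)
```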
Classifying algorithms in this way is known as analyzing their algorithmic complexity.
Another important step of algorithm design is the proof of correctness: the algorithm must actually produce a correct result for every instance of the problem. We should account for all the conditions under which the proposed algorithm might fail. This can be done using several techniques, for example model checking, assertions, and loop invariants.
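A loop invariant combined with assertions can be sketched as follows. The invariant here, that the running total after iteration `i` equals `i * (i + 1) // 2`, is chosen purely for illustration:

```python
def sum_first_n(n):
    """Sum the integers 1..n, checking a loop invariant at each step."""
    total = 0
    for i in range(1, n + 1):
        total += i
        # Loop invariant: total always equals the sum of 1..i,
        # which the closed-form formula lets us check cheaply.
        assert total == i * (i + 1) // 2
    return total
```

If any iteration violated the invariant, the assertion would fail immediately, pinpointing where the reasoning about the loop breaks down.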
Additional resources on algorithms include: