An analysis of the concept of algorithm

This article by Dan Polansky analyzes the concept of algorithm. The objective is to obtain a clear idea of what is and what is not an algorithm and of the concepts it is to be distinguished from.

A sketch of a definition:

  • An algorithm is a finite set of instructions/steps to perform, precise enough that a program for a computer (electronic computing machine) can be written implementing them.

Alternatively:

  • An algorithm is a finite set of instructions/steps to perform, precise enough that any untalented, uncreative, unperceptive person skilled in performing simple mechanical operations can perform the steps.

As a result, approximate synonyms are method and procedure, not in the sense of programming terminology (where a method is a member function of a class and a procedure is a kind of function, e.g. in Pascal) but in a more general sense. One might also use the phrase manner of procedure. One can contrast algorithm with method by pointing out that a method does not need to consist of simple steps; for instance, one would not describe the scientific method as an algorithm.

An example of an algorithm is bubble sort, a method of sorting in which one successively swaps neighboring items in a sequence, where applicable, until one can be sure that the resulting sequence is sorted; it is shown in pseudocode and in Python near the end of this article.

Another example is the sieve of Eratosthenes for finding prime numbers, in which one marks multiples of primes as non-primes and takes primes to be those numbers that remain unmarked; a sketch in code follows.
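
One way to make the sieve concrete is a short Python sketch such as the following (the function name and the choice to return a list are illustrative, not canonical):

  def sieve_of_eratosthenes(n):
      """Return the list of primes up to and including n."""
      if n < 2:
          return []
      is_prime = [True] * (n + 1)  # index i records whether i is still considered prime
      is_prime[0] = is_prime[1] = False
      for p in range(2, int(n ** 0.5) + 1):
          if is_prime[p]:
              # Mark all multiples of p, starting at p * p, as non-prime.
              for multiple in range(p * p, n + 1, p):
                  is_prime[multiple] = False
      return [i for i in range(n + 1) if is_prime[i]]

  print(sieve_of_eratosthenes(30))  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]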

Yet another example is the algorithm for decimal multiplication that children learn, whether or not the teacher called it an "algorithm"; a sketch in code follows.
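
A minimal Python sketch of the schoolbook multiplication algorithm, working digit by digit (the representation of numbers as digit lists is an illustrative choice, not the only possible one):

  def schoolbook_multiply(a_digits, b_digits):
      """Multiply two non-negative integers given as lists of decimal digits,
      most significant digit first; return the product in the same form."""
      a = a_digits[::-1]  # work least significant digit first for convenience
      b = b_digits[::-1]
      result = [0] * (len(a) + len(b))
      for i, da in enumerate(a):
          carry = 0
          for j, db in enumerate(b):
              total = result[i + j] + da * db + carry
              result[i + j] = total % 10
              carry = total // 10
          result[i + len(b)] += carry
      while len(result) > 1 and result[-1] == 0:  # drop leading zeros
          result.pop()
      return result[::-1]

  print(schoolbook_multiply([1, 2], [3, 4]))  # [4, 0, 8], i.e. 12 * 34 = 408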

A useful way to capture an algorithm is by way of pseudocode, which resembles computer program code to some extent but with an admixture of natural language for increased expressive power. Pseudocode provides some of the benefits of compilable/interpretable program code without some of its nuisances; one can mix the relatively free expression in natural language with such constructions as if and while. Alternatively, one could capture an algorithm in a somewhat pseudocode-like programming language, e.g. Python, but not even Python matches the free expressive power of pseudocode.

Some key defining properties and observations:

1) An algorithm is not identical with a computer program. The same algorithm can be implemented in different programming languages, and even within one programming language, the same algorithm can be implemented by different programs, for some value of "different".

2) An algorithm is not identical with a description of the algorithm in a particular natural language. The same algorithm can be described in different natural languages and even within one natural language, one can have different descriptions of what is the same algorithm.

3) An algorithm does not depend on the existence of electronic computing machines (today called computers); a human computer is enough.

4) An algorithm typically has a certain generality, having an input space and an output space. Thus, one does not talk of an algorithm for decimal addition of 1234 and 5678 but rather about an algorithm for decimal addition of any two positive integers.

5) The input space and output space are often infinite. Thus, an algorithm can operate on any two integers, any two real numbers (finitely represented in the computer as floats), any two character sequences, etc. Any given input is finite but the input space is not, although if one allows algorithms operating on mathematical reals, a particular real can contain an infinite amount of information.

6) An algorithm is not identical with a particular input-output behavior of a computer program/function: the same input-output behavior can be achieved by different algorithms. Thus, e.g. the behavior of outputting the input sorted can be achieved by different algorithms.
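
A trivial Python illustration: the two functions below proceed quite differently, yet produce the same sorted output for every input.

  def selection_sort(s):
      """Sort by repeatedly selecting the minimum of the remaining items."""
      s = list(s)
      for i in range(len(s)):
          j = min(range(i, len(s)), key=lambda k: s[k])
          s[i], s[j] = s[j], s[i]
      return s

  def insertion_sort(s):
      """Sort by inserting each item into place among the already sorted ones."""
      s = list(s)
      for i in range(1, len(s)):
          item, j = s[i], i
          while j > 0 and s[j - 1] > item:
              s[j] = s[j - 1]
              j -= 1
          s[j] = item
      return s

  data = [3, 1, 4, 1, 5]
  assert selection_sort(data) == insertion_sort(data) == sorted(data)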

7) Programs show not only input-output behavior but also interactive behavior. It is unclear whether a way to achieve the interactive behavior is called an algorithm.

8) An algorithm can be for a specific task (e.g. sorting or decimal multiplication), and this is often mentioned in definitions of the concept of algorithm. However, one can think of sets of instructions/steps to perform that are not for a task. It remains to be clarified whether the literature is willing to call taskless sets of instructions/steps to perform algorithms. In theoretical computer science, a Turing machine does not need to match a relatively brief description of a task in order to be called a Turing machine. Similarly, by creating a random program as part of, say, genetic programming (perhaps by generating Lisp code), one cannot hope to always hit on a useful or good program given the task/objective function, but the things generated are still called programs. But then, there could be programs that do not correspond to any algorithm. On the other hand, when one takes pseudocode as the substrate in which to capture algorithms, surely some pseudocode programs would not match a task, except perhaps the quasi-task of performing steps so-and-so.

9) The concept of algorithm is distinguished from that of heuristic. By one terminological choice, a heuristic is not an algorithm (since a heuristic solves the task imperfectly and from a strict point of view is incorrect). Alternatively, one can see heuristics as a certain kind of algorithm, not guaranteed to solve the task perfectly. Thus, a heuristic could find a local maximum for a task that asks for a global one, or a heuristic could work well for a large portion of possible inputs but fail on some inputs. This subclass/subsumption approach must have been taken by Daniel Dennett in his Darwin's Dangerous Idea, unless memory fails. If one rejects this subclass/subsumption approach, one would naturally want a name for the class of things that subsumes both algorithms and heuristics, and no obvious candidate presents itself. The subclass/subsumption approach is implied in the widespread use of the phrase "heuristic algorithm". It is further suggested by the phrase "face recognition algorithm", since these algorithms are imperfect at face recognition. (The term heuristic is also used in cognitive psychology in a broadly similar meaning.)

10) Correctness of an algorithm is concerned with a match to the desired input-output behavior but not with execution speed and memory requirements. An algorithm can be impractically slow (e.g. fully exhaustive game tree exploration without pruning for chess) but that does not make it incorrect.
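
A small runnable illustration of the same point, using naive Fibonacci rather than the chess example (which is too large to run): the recursive version below is perfectly correct but takes exponential time, while the iterative version computes the same values in a linear number of steps.

  def fib_naive(n):
      """Correct but exponentially slow: recomputes subproblems over and over."""
      if n < 2:
          return n
      return fib_naive(n - 1) + fib_naive(n - 2)

  def fib_linear(n):
      """Same input-output behavior, linear number of steps."""
      a, b = 0, 1
      for _ in range(n):
          a, b = b, a + b
      return a

  assert fib_naive(20) == fib_linear(20) == 6765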

11) If we are willing to talk of incorrect algorithms, we should perhaps be willing to subsume the concept of heuristic under the concept of algorithm.

12) One would initially think of an algorithm as being serial, but there are also parallel algorithms. They are important since the aggregate computing power of a collection of networked computers can greatly exceed that of a single-CPU (even if multicore) computer.

13) An algorithm can provide an infinitely long output, that is, output featuring countably infinitely many items. On the other hand, this can be reformulated as an algorithm with a finite output, by adding the index of the output item as an additional input.
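
In Python terms (an illustrative sketch), the two formulations might look as follows: a generator that yields the squares forever versus a function taking the index as an additional input.

  def all_squares():
      """Conceptually infinite output: yields 0, 1, 4, 9, ... without end."""
      n = 0
      while True:
          yield n * n
          n += 1

  def square_at(index):
      """The same infinite sequence, reformulated with the index as an input."""
      return index * index

  gen = all_squares()
  assert [next(gen) for _ in range(5)] == [square_at(i) for i in range(5)]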

14) Algorithms are usually analyzed using an imaginary machine that has a potentially infinite memory and potentially infinite time. The word potentially is important; algorithms are not allowed to complete, say, countably infinitely many steps and then proceed to another step.

15) Algorithms have the property called computational complexity, including how much time (how many steps) they take and how much space, expressed as a function of input size. Complexity can be determined on the algorithm level rather than having to consider an implementing program. For instance, one talks of linear time, polynomial time and exponential time. (The name of the property is rather misleading: it is often the simple algorithm that has bad time complexity. Put differently, one can often reduce the computational "complexity" of an algorithm by making the algorithm more complex.)
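
A small Python illustration of the parenthetical point: linear search is the simpler algorithm but takes linear time, while binary search is more involved yet runs in logarithmic time (assuming, of course, a sorted input).

  def linear_search(items, target):
      """Simple algorithm, O(n) time: inspect every item in turn."""
      for i, item in enumerate(items):
          if item == target:
              return i
      return -1

  def binary_search(sorted_items, target):
      """More complex algorithm, O(log n) time: repeatedly halve the range."""
      low, high = 0, len(sorted_items) - 1
      while low <= high:
          mid = (low + high) // 2
          if sorted_items[mid] == target:
              return mid
          if sorted_items[mid] < target:
              low = mid + 1
          else:
              high = mid - 1
      return -1

  items = [2, 3, 5, 7, 11, 13]
  assert linear_search(items, 7) == binary_search(items, 7) == 3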

16) Some algorithms are randomized, using random or pseudorandom numbers from a provider during execution. Such algorithms are non-deterministic in principle, although if one uses a pseudorandom number generator seeded with a fixed initial value, the result is technically deterministic. Examples of randomized algorithms include Monte Carlo integration and simulation. See also Wikipedia: Randomized algorithm.
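
A minimal sketch of Monte Carlo integration in Python, estimating pi by sampling random points in the unit square (the sample count is an arbitrary illustrative choice; note how a fixed seed makes the run reproducible, per the remark above):

  import random

  def estimate_pi(samples=100_000, seed=None):
      """Estimate pi as 4 times the fraction of random points in the unit
      square that fall inside the quarter circle of radius 1."""
      rng = random.Random(seed)
      inside = 0
      for _ in range(samples):
          x, y = rng.random(), rng.random()
          if x * x + y * y <= 1.0:
              inside += 1
      return 4 * inside / samples

  print(estimate_pi(seed=42))  # close to 3.14159; varies with seed and samples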

17) Randomized algorithms are one class of non-deterministic algorithms. Another class is the one corresponding to non-deterministic Turing machines. These come with a special semantics: a non-deterministic Turing machine accepts an input if at least one of the non-deterministic runs ends in an accepting state. In this semantics, the output space consists of two values, input accepted and input rejected. To implement this semantics on a computer, one could run the algorithm non-deterministically (using random or pseudorandom numbers to make the non-deterministic decisions) a huge number of times; if some run accepts the input, the input is accepted and the answer is correct; if no accepting run is ever found, the input is rejected, but this answer is tentative and possibly incorrect.
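
A hedged Python sketch of this implementation idea; run_once is a hypothetical placeholder for the algorithm in question, assumed to perform one randomized run and report whether that run accepts.

  import random

  def accepts(run_once, the_input, attempts=10_000):
      """Simulate a non-deterministic machine by many randomized runs.
      A True result is definitive; a False result is only tentative."""
      rng = random.Random()
      for _ in range(attempts):
          if run_once(the_input, rng):
              return True  # one accepting run suffices: the input is accepted
      return False  # possibly incorrect: an accepting run may exist unseen

  # Toy example: accept if the input string contains some character twice;
  # each run non-deterministically guesses two positions and checks them.
  def run_once(s, rng):
      if len(s) < 2:
          return False
      i, j = rng.randrange(len(s)), rng.randrange(len(s))
      return i != j and s[i] == s[j]

  print(accepts(run_once, "abcdb"))  # True with overwhelming probability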

Let us show the bubble sort algorithm as an example:

  1. Consider a sequence s of integers, not necessarily sorted, as a mutable object.
  2. Repeat what follows as long as at least one swap occurs in the loop.
    1. (A loop) For all values of integer index i starting at 1 and ending at the length of the sequence minus 1
      1. If the element of the sequence at index i plus one is smaller than the one at index i
        1. Swap the two elements
  3. Observation: The sequence is now sorted in ascending order.
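
The same algorithm transcribed into Python (a direct, unoptimized transcription of the pseudocode above, adjusted for Python's 0-based indexing):

  def bubble_sort(s):
      """Sort the list s in place in ascending order."""
      swapped = True
      while swapped:  # step 2: repeat as long as at least one swap occurred
          swapped = False
          for i in range(len(s) - 1):  # step 2.1: loop over the indices
              if s[i + 1] < s[i]:  # step 2.1.1: neighbors out of order?
                  s[i], s[i + 1] = s[i + 1], s[i]  # step 2.1.1.1: swap them
                  swapped = True
      # step 3: s is now sorted in ascending order

  s = [5, 1, 4, 2, 8]
  bubble_sort(s)
  print(s)  # [1, 2, 4, 5, 8]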

One will get a better idea by examining more algorithms, which is not done here.

It remains to be clarified whether the concept of algorithm only applies to what are typographical manipulations and their analogues, which include mathematical calculation such as decimal multiplication and division. If so, a cooking recipe would not be an algorithm even if sufficiently detailed and unambiguous; likewise, a geometric construction, e.g. bisecting the angle given by two half-lines using ruler and compass, would not be an algorithm.

One could conceive of an extended concept of algorithm that allows countably infinitely many steps to be performed at once, as if one had exponentially accelerated time (think Achilles and the tortoise) for a finite time window. Thus, one could say things like: let us have an infinite array a (with indices 1, 2, 3, ...) and let us initialize the array with pseudorandom uniformly distributed bytes (0-255); let us set z to 1 if at least one of the bytes is zero, else to 0. Given this extended concept, one could solve the halting problem, since one could perform all the emulation steps at once and see whether the emulated program has halted or not. It is not clear whether anyone has ever considered this extended concept seriously. On the other hand, computability theory investigates oracle machines, which give programs more power than is usually available, so this kind of idea is not unheard of. One could even consider first-order logic operating on integers to be a sort of programming language, available to express what we might call hyperalgorithms. But this is an aside.

One may want to exclude the concept of a formula from that of an algorithm. Thus, one would not speak of an algorithm for the calculation of the Newtonian gravitational force given two mass points and their distance but rather of a formula. One may hesitate to refer to something as an algorithm when that thing contains no conditional and no loop (in general, no branching), as is often the case for formulas. Even a formula with a simple conditional (e.g. of the form X = F(Y) if Z else G(Y)) may be felt not to be an algorithm. Alternatively, one may consider formulas to be algorithms. Determining the most customary usage would require more research.
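
The gravitational example rendered in Python (SI units; a straight-line function with no conditionals and no loops, which is what may make one hesitant to call it an algorithm):

  def gravitational_force(m1, m2, r):
      """Newtonian gravitational force between two point masses (SI units)."""
      G = 6.674e-11  # gravitational constant, m^3 kg^-1 s^-2
      return G * m1 * m2 / (r * r)

  print(gravitational_force(5.972e24, 7.348e22, 3.844e8))  # Earth-Moon, ~2e20 N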

The above deliberations suggest that algorithm is not a strictly (mathematically, formally) defined technical term but rather has a certain vagueness.
