FP (short for Functional Programming) was invented by John Backus and described in his famous Turing Award lecture. It is not really a programming language per se, as its content is not fixed and it lacks input/output facilities. Backus, who was also the chief designer of Fortran, one of the authors of the Backus-Naur Form (a metalanguage for describing the syntax of formal languages), and a member of the committee that created Algol 60, actually described a family of languages that he called ‘FP systems’. What came to be known as ‘the FP language’ is a particular instance of such a system, described in the mentioned article.
FP is a purely functional language. Moreover, the style of programming that it represents can be described as combinator-based. A program in FP is a set of functions. The functions have no free variables, and not even explicitly mentioned arguments. In fact, each function has precisely one argument, so no naming of it is ever required. New functions are defined from existing ones by applying a predefined set of combining forms to them. A program is executed by applying one of the functions to a data object, and it is only then that data comes into use.
Because the functions are always defined without explicit reference to data (wherever constants are needed in a definition, they are provided by constant-producing functions), and because only application is used (but not functional abstraction, as in the λ-calculus based languages), this style of programming is also known as function-level, variable-free, or applicative.
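To make the contrast concrete, here is a small Python sketch (an illustration only, not FP itself) of the same function written once in the ordinary, argument-naming style and once in a variable-free, function-level style, with `compose` and `alpha` standing in for FP's ∘ and α:

```python
# Illustrative sketch of function-level (variable-free) style in Python.
# `compose` and `alpha` are stand-ins for FP's ∘ and α combining forms.

def compose(f, g):                 # models ∘: (f ∘ g)(x) = f(g(x))
    return lambda x: f(g(x))

alpha = lambda f: lambda s: [f(x) for x in s]   # models α (apply-to-all)

# Ordinary style: the argument `xs` is named and manipulated directly.
def sum_of_squares_pointful(xs):
    return sum(x * x for x in xs)

# Function-level style: the new function is built purely by combining
# existing functions; no argument is ever mentioned in the definition.
square = lambda x: x * x
sum_of_squares = compose(sum, alpha(square))
```

Both definitions denote the same function; only the second is built, FP-style, solely by combining other functions.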
Data objects are always constant in FP. A datum can be atomic, such as a number, a Boolean, a character or a string, or it can be a compound value. The only form of data structuring in FP is sequence construction. Sequences are heterogeneous and symmetric, in that arbitrary new values (including sequences) can be appended to any of their ends. There are a number of predefined functions providing useful operations on sequences, including direct access through indexing.
A distinguishing feature of FP is the strict separation between (ordinary, first-order) functions and combining forms (higher-order functions). This is different from the treatment of functions in λ-calculus and in most other functional languages. More specifically, in FP the combining forms are predefined, with no provision for defining new ones, always have explicit arguments, and can be nested. In contrast, functions cannot be nested and never have explicit arguments, but new functions can be defined as necessary. Unlike functions, a combining form can have more than one argument.
In the mid-1980s Backus and his colleagues came up with the language FL. FL was ‘the result of an effort to design a practical functional programming language based on [...] FP’. Many new features were added in FL with respect to FP: the ability to define new combining forms, local definitions (a where clause), first-class exceptions, input/output facilities, user-defined datatypes with predicate-based pattern-matching, many predefined functions and some combining forms. In some (insignificant) respects FL's syntax differs from that of FP.
FL was implemented, and programs of various size and complexity appear to have been written in it, but currently (as of 2012) no publicly available information about any of this seems to remain.
Here is an example of a function definition in FP that makes use of most of FP's combining forms and some of its predefined functions. The function computes the Cartesian product (a.k.a. direct product) of an ordered set of ordered sets. Ordered sets are naturally represented as FP sequences. For example, for the sequence <<1,2>,<3>,<4,5,6>> the result is <<1,3,4>,<1,3,5>,<1,3,6>,<2,3,4>,<2,3,5>,<2,3,6>>.

Def cart ≡ /(cat∘α((α apndl)∘distl)∘distr)∘apndr∘[id,%<ø>]
Understanding the definition requires knowing how its building blocks work, so a short explanation of each follows.
id is the identity function.
<ø> is the sequence that contains the empty sequence (ø) as its single item, and correspondingly %<ø> is the constant function that produces that sequence for any argument.
[…] is the construction combining form. For any set of functions f1,...,fn, [f1,...,fn] transforms a value x into the sequence of values <f1:x,...,fn:x>.
cat is concatenation of sequences: it joins the sequences that comprise a sequence into a single sequence.
apndl and apndr append an item to the left and right end of a sequence, respectively. More precisely, apndl:<x,<y1,...,yn>> = <x,y1,...,yn>, and apndr:<<x1,...,xn>,y> = <x1,...,xn,y>.
distl and distr construct pairs by ‘distributing an item along a sequence’. Namely, distl:<x,<y1,...,yn>> = <<x,y1>,...,<x,yn>>, and distr:<<x1,...,xn>,y> = <<x1,y>,...,<xn,y>>.
∘ is the composition of functions – a combining form. For any two functions f and g, f∘g is the function that, for an argument x, computes f:(g:x).
α is the apply-to-all combining form (the function map in other languages). For any function f, αf transforms a sequence <x1,...,xn> into <f:x1,...,f:xn>.
/ is the insert combining form. For a function f, /f transforms a sequence <x1,...,xn> into the value x1 f' (x2 f' (...(xn-1 f' xn)...)), assuming that f' is the infix operator corresponding to f.
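As a concrete illustration of the sequence functions above, here is a Python sketch in which FP sequences are modelled as lists and a pair <x,y> as a two-element list. The names transliterate the FP originals; this is a model for experimentation, not FP itself:

```python
# Illustrative Python models of FP's sequence functions. FP sequences
# are represented as Python lists; a pair <x,y> as the list [x, y].

def apndl(p):            # apndl:<x,<y1,...,yn>> = <x,y1,...,yn>
    x, ys = p
    return [x] + ys

def apndr(p):            # apndr:<<x1,...,xn>,y> = <x1,...,xn,y>
    xs, y = p
    return xs + [y]

def distl(p):            # distl:<x,<y1,...,yn>> = <<x,y1>,...,<x,yn>>
    x, ys = p
    return [[x, y] for y in ys]

def distr(p):            # distr:<<x1,...,xn>,y> = <<x1,y>,...,<xn,y>>
    xs, y = p
    return [[x, y] for x in xs]

print(apndl([0, [1, 2]]))   # [0, 1, 2]
print(distl([0, [1, 2]]))   # [[0, 1], [0, 2]]
```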
Finally, here is how the function cart works. apndr adds the sequence <ø>, representing the empty Cartesian product, to the end of the initial sequence of sequences (the pair that apndr needs being built by [id,%<ø>]). Then the function cat∘α((α apndl)∘distl)∘distr is repeatedly (due to /) applied, once for each sequence, starting from the last one. With each application, the contribution of the corresponding sequence is accumulated into the overall result (which was initially <ø>). To that end, distr pairs the so-far-known result sequence with each item of the currently used argument sequence, so that, by applying (through the outer α) the function (α apndl)∘distl to each pair, that item is put in front of each sequence in the result (pairing it with those sequences by distl and then adding it in front of them by α apndl), and then, applying cat, all such sequences are catenated into one.
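The whole definition can be checked by transliterating it into Python. The sketch below is only an illustrative model under stated assumptions: FP sequences become lists, ∘ becomes `compose`, α becomes `alpha`, / becomes `insert`, and `prepare` models the construction [id,%<ø>]:

```python
from functools import reduce

# A transliteration into Python of the FP definition
#   Def cart ≡ /(cat ∘ α((α apndl) ∘ distl) ∘ distr) ∘ apndr ∘ [id,%<ø>]
# FP sequences are modelled as lists; this is a model, not FP itself.

def compose(*fs):                 # models ∘, applied right to left
    return lambda x: reduce(lambda v, f: f(v), reversed(fs), x)

cat   = lambda s: [x for sub in s for x in sub]   # flatten one level
apndl = lambda p: [p[0]] + p[1]                   # prepend item
apndr = lambda p: p[0] + [p[1]]                   # append item
distl = lambda p: [[p[0], y] for y in p[1]]       # pair item with each
distr = lambda p: [[x, p[1]] for x in p[0]]       # pair each with item
alpha = lambda f: lambda s: [f(x) for x in s]     # α, i.e. map

def insert(f):                                    # /f, right-insert
    def go(s):
        acc = s[-1]
        for x in reversed(s[:-1]):
            acc = f([x, acc])
        return acc
    return go

prepare = lambda x: [x, [[]]]     # models [id,%<ø>]; [[]] is <ø>

cart = compose(insert(compose(cat, alpha(compose(alpha(apndl), distl)),
                              distr)),
               apndr, prepare)
```

Running `cart` on the example sequence reproduces the result given above, each inner list being one tuple of the product.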
Both Backus's paper and FP were extremely influential in the promotion of the functional programming paradigm and the development of the subsequent functional languages. Backus eloquently exposed the inherent defects of imperative languages, and stressed the advantages of the purely functional style.
Backus's criticism of imperative languages can be summarized thus: being closely modelled on the von Neumann computer, they impose ‘word-at-a-time’ programming, keep the programmer preoccupied with the state of the store rather than with the mapping from inputs to results, and lack useful algebraic properties for reasoning about programs.
At a somewhat greater level of detail, Backus observed that the problems with imperative languages lie in the separation between expressions and statements that they exhibit, their complicated naming and substitution rules, the intimate and unavoidable interdependence between data and actions in them, etc. It was precisely these defects that Backus' proposal was meant to overcome.
Ensuring that computation (functions) is defined completely separately from the data on which it operates would make it possible to build complex programs, each part of which can be described entirely by the effect that it has on its own argument. With no data interrelations between functions, the whole program could always be constructed by combining independent parts.
An additional advantage is that, in the absence of named function arguments, it is much easier to treat functions as objects in a suitably constructed algebra, and thus to formulate and prove facts about their properties and to perform useful transformations of code.
An important trait of the line of thought that characterizes FP is the tendency to avoid all kinds of repetition, i.e., not only loops but also recursion, as much as possible. Since repetition's role is to fill the gap between ‘word-at-a-time’ processing and having some work done on a whole data collection, it had to be replaced by powerful functions and combining forms that would allow such collections to be processed as a whole.
Kenneth Iverson's work was acknowledged in this respect, for the applicative, combinator-based (rather than λ-based) style of his language APL and for the rich possibilities that language offers for operating on entire aggregates of data. However, Backus observed that APL was still too much of an imperative language, and that it had an insufficient number of combining forms.
Modern functional languages, such as Standard ML and Haskell, as well as modern thinking in functional programming theory, practice and teaching (see e.g. the papers of J. Hughes, of E. Meijer etc., and of J. Gibbons below), have much in common with the philosophy of FP. In particular, J. Hughes' article, also very influential as a functional programming advocacy text, makes a very strong point that ‘better modularity alone is the key to the power of functional languages’ and that using higher-order functions provides a ‘glue’ (i.e., means of combination) that ‘allows greatly improved modularisation’.
Much research has gone into discovering general patterns of recursion. Once found, such patterns become standard library functions, thus obviating the need for direct use of recursion in programs.
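For example (a Python sketch, with `foldr` standing in for such a library function), two recursive definitions that share the same traversal pattern can both be rewritten with a single fold and no explicit recursion:

```python
from functools import reduce

# Direct recursion: each function re-implements the same traversal pattern.
def total(xs):
    return 0 if not xs else xs[0] + total(xs[1:])

def flatten(xss):
    return [] if not xss else xss[0] + flatten(xss[1:])

# The shared pattern, captured once as a right fold (Haskell's foldr,
# FP's insert /), turns both into definitions with no explicit recursion.
def foldr(f, z, xs):
    return reduce(lambda acc, x: f(x, acc), reversed(xs), z)

total2   = lambda xs:  foldr(lambda x, acc: x + acc, 0, xs)
flatten2 = lambda xss: foldr(lambda s, acc: s + acc, [], xss)
```

The recursion lives in `foldr` alone; the derived functions merely say what to do at each step.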
On the other hand, FP's fixing of the number of combining forms was evidently considered restrictive and was not followed in most other languages. Also, except in APL and its descendants, such as J, K, and Nial, there is no distinction between ordinary and higher-order functions (combining forms), and the latter are usually not built into the language but are expressed like any other function.
Unlike FP, most modern functional languages are not combinator-based but make use of the full λ-calculus to define functions, although of course it is possible to write variable-free programs in them. A notable exception is the experimental language Joy, which not only is combinator-based, but leaves out even application, and uses composition – syntactically expressed as concatenation – as the only form of function combination.
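The flavour of the concatenative idea can be suggested by a toy Python sketch (a drastic simplification, not Joy's actual stack-based semantics), where a program is simply a list of functions and concatenating programs composes them:

```python
# A toy illustration of the concatenative idea (not actual Joy):
# a 'program' is a list of functions, concatenating programs
# concatenates the lists, and running a program composes the
# functions left to right.

def run(program, x):
    for f in program:
        x = f(x)
    return x

double = lambda n: 2 * n
succ   = lambda n: n + 1

p1 = [double]
p2 = [succ]

# Concatenation of programs is composition of the functions they denote.
print(run(p1 + p2, 3))    # 7, i.e. succ(double(3))
```

No application and no variables appear in the programs themselves; all combination is by (syntactic) concatenation.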
The article that introduced FP: Can Programming Be Liberated from the von Neumann Style? A Functional Style and Its Algebra of Programs (1977 ACM Turing Award Lecture)
The FL Project: The Design of a Functional Language
Function Level Programming and the FL Language (a 3/4-hour video lecture of J. Backus)