
Defining Information, Even More

Here is a scholarly paper that caught my eye. It appears in the latest issue of the journal Information; the title is “Naturalizing Information”; the author is Stanley N. Salthe, a professor emeritus of biology from Brooklyn College. It attempts to create a better-than-ever, all-purpose definition of “information.” A meta-definition, perhaps I should say. Let me just quote the opening sentences:

In this paper I forge a naturalistic, or naturalized, concept of information, by relating it to energy dissipative structures. This gives the concept a definable physical and material substrate.

The question “How do you define ‘information’?” is one that gives me the willies. I hear it often in the context of discussing my new book, The Information, which, after all, devotes 500+ pages to the subject. Sometimes I simply refer to the ultimate arbiter, the OED, which, however, requires 9,400 words to answer the question. Kevin Kelly, who put it to me during this interview, had an answer in mind already: Gregory Bateson’s famous phrase, “a difference which makes a difference.” Bateson, in turn, fashioned that clever epigram to encapsulate the mathematical definition created by Claude Shannon, the inventor of information theory. (What Bateson actually wrote was: “A difference which makes a difference is an idea. It is a ‘bit,’ a unit of information.”)

What makes it frustrating for me to define information (and the reason the OED needs to go on so long) is that the word is so important in such different realms, from the scientific to the everyday. These realms are joined at the hip. But the connections aren’t always obvious.

So Salthe tries to tie them up in a package—to make, as he says, a hierarchy of definitions: “The conceptual bases for this exercise will be nothing more than two commonly recognized definitions of information—Shannon’s, and Bateson’s—together with my own, thermodynamic, definition.” The thermodynamic definition, again following Shannon, involves entropy. “My perspective,” says Salthe, “is that the evident distance from thermodynamic equilibrium of our universe is a fact that contextualizes, and subsumes everything else.” (He adds, with air quotes, “Arguing that the ‘real’ definition of information is an amalgam of all three, I find it impossible to say in a single sentence what that definition is.” To which I want to say, welcome to my world.)
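Shannon’s definition, the one Bateson’s epigram glosses, is at least easy to state: the information content of a source is its entropy, H = -Σ pᵢ log₂ pᵢ, measured in bits. A minimal sketch (the function name and example distributions are mine, for illustration only):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit: Bateson's "difference that makes a difference."
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))        # 0.0
```

Note that a certain outcome (probability 1) contributes zero bits, which is one way of seeing why a difference is required before there can be information at all.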

From there, it gets complicated. Very complicated. I urge the interested reader to download the full essay. Section 6 has a title that many will consider an understatement: “Interpretation: Information Can Generate Meaning.” That is, after all, why we care. The journey from information to meaning is what matters.



  1. marian morgan says

    “The journey from information to meaning is what matters.”

    Exactly. I am still reading (and struggling somewhat with) your book, but I think that sentence sums it up perfectly.

  2. Gerard Jagers op Akkerhuis says

    The ‘difference that makes a difference’ implies that there is a brain choosing what difference we want to consider. In this sense I regard the definition of Checkland and Scholes a bit more precise: ‘data that can be endowed a meaning in a context’. Various data and contexts can be imagined. For example Kauffman has linked the context for information to the hypercycle in the cell (every molecule in an autocatalytic set representing a part of the information necessary to perform the process as a whole). In fact, also elements of other cyclic closures could be used in this way (pion exchange between protons and neutrons in the nucleus, or interactions between neural modules in the brain). The application of different contexts and endowed meanings can solve the divergence between structure as information (classical Shannon) or ‘agreements’ as information (e.g. an empty letter as a token for acceptance of a marriage). For a hierarchical ranking of increasingly ‘informed’ (= acquired a certain form!) structures, and for defining what structures could offer the basis for considerations about information, I suggest it would be profitable to use the operator hierarchy as a basis, as this recently developed theory offers a fundamental and strict rationale for analyzing organization in nature (see http://www.hypercycle.nl).

    • gleick says

      Is it more precise? This challenge arises: define “meaning”; define “context”; avoid circularity.

      • Gerard Jagers op Akkerhuis says

        Thank you for your reply. Just a short answer.

        About “meaning”: that is always endowed upon something by the reasoning of people.
        About “context”: that is the larger picture used to endow a meaning to a process.
        No circularity involved in this reasoning.

        So I consider it defendable that the definition of information by Checkland and Scholes as: “data with an endowed meaning in a (chosen) context” offers a quite useful definition for information.

        Those who consider structure per se as information, for example DNA, seem to apply a more esoteric concept of information. Yet, if one leaves out the context of the cell and its survival as the means for endowing a meaning to its functioning, DNA is nothing more than a complex molecule. This raises the question of whether every molecule represents information. I would opt for a negative answer, because what the molecule represents is a (more or less complex) structure. Accordingly, it seems logical that it is only after endowing this structure with a functionality in the context of the cell that it can be considered information.
        Kind regards, dr. mult. Gerard Jagers op Akkerhuis

  3. Stan Uffner says

    In college I was studying entropy and statistical physics, and in particular, the Boltzmann Entropy Equation, which became the Shannon Information Equation. The relationship between energy and information started to clear up for me (a little) and at one point I realized their identity in word play, that is, “energy that is in formation.” Energy that is in some way patterned, aligned, and able to be differentiated from the background field.

    I’m looking forward to reading your book.

  4. David Ellerman says

    There is a mathematically formulated information theory based directly on the basic idea of a difference or distinction that arises out of the new logic of partitions. The logical entropy of a partition, as opposed to the Shannon entropy, is just the normalized count of the number of distinctions made by the partition (see http://www.mathblog.ellerman.org/2010/02/from-partition-logic-to-information-theory/ and the links given there). Bateson’s “difference that makes a difference” may be about communicating information (i.e., the difference for the sender that makes a difference for the receiver), as was Shannon’s theory.
    I was stunned to find in your book that John Wilkins had already arrived at the “difference” or distinction definition of information back in 1641. I wanted to quote your commentary on this, in addition to the original Wilkins quote, but you end it by saying that there was four hundred years between Wilkins and Shannon (p. 161). But it is three hundred by my count. Did I miss something, or should I modify the quote?
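    For readers curious about the contrast Ellerman draws: the logical entropy of a partition with block probabilities pᵢ works out to h = 1 - Σ pᵢ², the probability that two independent draws fall in different blocks, i.e., the normalized count of distinctions, versus Shannon’s H = -Σ pᵢ log₂ pᵢ. A small sketch (function names mine, for illustration only):

```python
import math

def logical_entropy(probs):
    """Probability that two independent draws are distinguished by the partition:
    h = 1 - sum(p_i^2)."""
    return 1 - sum(p * p for p in probs)

def shannon_entropy(probs):
    """Shannon entropy in bits, for comparison."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Two equiprobable blocks: half of all ordered pairs of draws are distinctions.
print(logical_entropy([0.5, 0.5]))   # 0.5
print(shannon_entropy([0.5, 0.5]))   # 1.0
```

    Both measures are zero for the trivial partition (one block) and grow as the partition makes more distinctions; they simply count those distinctions differently.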

  5. Russell Swanborough says

    From an informationologist’s perspective (mine), all words in a definition must be defined, and uses must be excluded but can be added after the definition. For example, an earlier suggestion that [information is] “data with an endowed meaning in a (chosen) context” is not useful because ‘data’ is not defined. Similarly, “a difference which makes a difference” defines nothing, only a use. Shannon’s definition is also a use, specifically a use in communication; it does not define the basic artefact.

    So far, this is the best we can get to: “Information is signals of coherent content that occur within or between orgs.” Where signals are from the five senses (or machine), coherent means ‘not-noise’, content means 3-dimensional, 2-D, 1-D or abstract, occur means one of the four time contexts, within or between means external or internal, and orgs (organisms and organisations) are the only possible users of information.

    This avoids uses, and all words can be defined. Uses can be anything you want; go to the OED for that…

  6. Alastair McGowan says

    Circularity seems to be our core problem. My own discipline in psychology of situational awareness (functional application of consciousness, if you like) foundered on this problem of definition over ten years ago, and in my work I got no further than ‘specific in relation to context’. Many, many definitions of situation awareness were painfully drawn out, but they all failed to be precise while avoiding circularity. ‘Specific in relation to context’ is itself circular. When I put an agent in a simulated mathematical context and let it run, or two agents, or more, the degrees of freedom about states of the world eventually diminish to near zero given sufficient scale gradations. I often imagine that singularity is what is behind this. That there is no bit? There is no consciousness? Null hypothesis.
