Posts Tagged ‘Information Theory’

The second post in a series about Information Theory/Learning-based perspectives on Evolution, which started off with the last post.

Although the last post was mostly a historical perspective, it had a section reviewing the main motivation for some work in metabiology due to Chaitin (now published as a book). The starting point of that work was to view evolution solely through an information-processing lens (and hence the use of Algorithmic Information Theory). Of course this lens is not by itself a recent acquisition and goes back a few decades (although in hindsight, the fact that it goes back just a few decades is very surprising, to me at least). To illustrate this I wanted to share some analogies by John Maynard Smith (perhaps one of my favourite scientists), which I had found particularly incisive and clear. To avoid clutter, they are shared here instead (note that most of what he talks about is something we study in high school; however, the talk is quite good, especially because it emphasizes the centrality of information throughout). I also want this post to act as a reference for some upcoming posts.

Coda:

Molecular Biology is all about Information. I want to be a little more general than that; the last century, the 19th century was a century in which Science discovered how energy could be transformed from one form to another […] This century will be seen […] where it became clear that information could be translated from one form to another.

[Other parts: Part 2, Part 3, Part 4, Part 5, Part 6]

Throughout this talk he gives wonderful analogies for how information translation underlies the so-called Central Dogma of Molecular Biology, and how the translation being one-way at some stages has implications (for instance, how August Weismann argued, using a “Chinese telegram translation analogy”, that acquired characters are not inherited, since there was no mechanism to translate acquired traits (acquired information) back into the organism’s heritable information so that they could be propagated).

However, the most important point from the talk: one could see evolution as being punctuated by about six major changes or shifts, each marked by a change in the way information was stored and processed. Some that he talks about are:

1. The origin of replicating molecules.

2. The Evolution of Chromosomes: Chromosomes are just strings of the above replicating molecules, with the property that when one of these molecules is replicated, the others have to be replicated as well. The utility of this is the following: if the genes were all separate, they might have different rates of replication; the gene that replicates fastest would soon outnumber all the others, and the information in the rest would be lost. Thus this transition underlies a kind of evolution of cooperation between replicating molecules; in other words, chromosomes are a way of forcing cooperation between genes.

3. The Evolution of the Code: That information in the nucleic acids could be translated into sequences of amino acids, i.e. proteins.

4. The Origin of Sex: The evolution of sex is considered an open question. However, one argument (details in the next post or the one after) is that sexual reproduction hastens the acquisition of information from the environment (as compared to asexual reproduction), which explains why it should evolve.

5. The Evolution of Multicellular Organisms: A large, complex signalling system had to evolve for the different kinds of cells (such as muscle cells or neurons, to name some in humans) to function properly in an organism.

6. The Transition from Solitary Individuals to Societies: What made these societies of individuals (ants, humans) possible at all? If we stick to humans, this could have happened only if there was a new way to transmit information from generation to generation, and one such information-transducing machine could be language! Language thus provides an additional mechanism, besides the genetic one, for transmitting information from one generation to another (he compares the genetic code and the replication of nucleic acids with the passage of information by language). This momentous event, the evolution of language, was itself dependent on genetics. With the evolution of language other things came along: writing, memes etc., which might reproduce, self-replicate, mutate, be passed on, and accelerate the process of evolution. He ends by saying this stage of evolution could perhaps be as profound as the evolution of language itself.

________________

As a side comment: I highly recommend the following interview of John Maynard Smith as well. I rate it higher than the above lecture, although it is sort of unrelated to the topic.

________________

Interesting books to perhaps explore:

1. The Major Transitions in Evolution: John Maynard Smith and Eörs Szathmáry.

2. The Evolution of Sex: John Maynard Smith (more on this theme in later blog posts, mostly related to learning and information theory).

________________


Irving S. Reed

Prof. Irving S. Reed, noted for his various contributions to Signal Processing, Coding Theory and many other areas, and perhaps best known for the Reed-Solomon codes, passed away yesterday. His ideas have found applications in everything from CDs to cell phones to deep space communications. USC announced his passing in a press release yesterday, which rightly ends: “Millions of people today enjoy the benefits of Reed’s many inventions and contributions to technology without being aware of their remarkable benefactor.” Oftentimes I feel really sad thinking of such inventions and people, but at other times I tend to think that this is the highest possible compliment an idea or an invention can get. After all, perhaps one mark of a truly great idea or invention is that it becomes so obvious and widespread that its origins are more or less forgotten.
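Since Reed-Solomon codes come up above, here is a minimal sketch of the idea they embody: adding redundancy so that a corrupted message can still be recovered exactly. It assumes the third-party Python package reedsolo (an assumption on my part; the RSCodec API and the exact return type of decode() vary a little across versions, so treat the details as illustrative rather than authoritative):

```python
# A toy demonstration of Reed-Solomon error correction,
# assuming the third-party `reedsolo` package (pip install reedsolo).
from reedsolo import RSCodec

rsc = RSCodec(10)  # append 10 error-correction symbols (can fix up to 5 corrupted bytes)

message = b"deep space says hello"
encoded = bytearray(rsc.encode(message))

# Corrupt a few bytes, as a scratched CD or a noisy channel might.
for i in (0, 7, 15):
    encoded[i] ^= 0xFF

result = rsc.decode(bytes(encoded))
# Recent reedsolo versions return a tuple (message, message + ecc, errata positions);
# older ones return just the message, hence the guard below.
recovered = result[0] if isinstance(result, tuple) else result
print(bytes(recovered) == message)  # expected: True
```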

________________

A mildly personal post.

The title does not imply that the lines quoted below correspond to the exact origin of Kolmogorov Complexity, though they are related and give away the essence.

[Image: Andrey Kolmogorov]

Information theory must precede probability theory and not be based on it. By the very essence of this discipline, the foundations of information theory have a finite combinatorial character.

Andrey Kolmogorov

With my background in Electrical Engineering I had the opportunity to take courses in Information Theory and Coding, which made the ideas of Shannon’s Information Theory quite familiar. But there came a time when I had enough background to start noticing conversations that had earlier been relegated to the background, simply because I didn’t know enough to make any sense of them; they were more or less noise to me. These conversations happened to be about Kolmogorov Complexity. I hadn’t sat down and studied it, but I had been reading articles here and there that mentioned it alongside ideas such as the Gödel Incompleteness Theorems and the Halting Problem. This created the impression that the area must be fundamental, but it was not clear why.

And then I came across the above rather cryptic lines by Kolmogorov. Used as I was to the idea of entropy (defined in terms of probability) as information, they made my brain hurt. I spent a couple of days thinking about them and suddenly realized WHY the idea was so fundamental, and things started making more sense. Of course I still didn’t really know anything about it as such, but that two-day thinking session convinced me that, in a sense, it was as fundamental as calculus for the things I was interested in (along with Shannon’s and Fisher’s ideas). It also convinced me to learn more about it no matter what projects I was involved in, and I immediately bought a book that I have been working my way through as an aside to what I have been working on (linked below).
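As a rough gloss on why those lines invert the usual order of foundations (my own summary, not Kolmogorov’s wording): Shannon’s entropy presupposes a probability distribution, whereas Kolmogorov’s complexity is defined for a single finite object with no probabilistic apparatus at all, only a universal machine.

```latex
% Shannon entropy: information of a random variable, requires a distribution p
H(X) = -\sum_{x} p(x) \log_2 p(x)

% Kolmogorov complexity: information content of an individual string x,
% the length of a shortest program p that makes a universal machine U output x
K_U(x) = \min \{\, |p| : U(p) = x \,\}
```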

I find such insightful one-liners, which cause almost a phase transition, a complete change in the way you look at something (information theory in this case), quite remarkable; they make the new view very beautiful. Of course there is a “right” time for them to occur, and this was certainly one of those times. The lines below had an auxiliary effect too:

The applications of probability theory can be put on a uniform basis. It is always a matter of consequences of hypotheses about the impossibility of reducing in one way or another the complexity of the descriptions of the objects in question. Naturally this approach to the matter does not prevent the development of probability theory as a branch of mathematics being a special case of general measure theory.

The concepts of information theory as applied to infinite sequences give rise to very interesting investigations, which, without being indispensable as a basis of probability theory, can acquire a certain value in investigation of the algorithmic side of mathematics as a whole.

– Andrey Kolmogorov (1983)

While the above was a more personal story, there are many other examples of cryptic one-liners changing a view. Here’s a famous one:

A Famous Cryptic Comment:

[Image: Robert Fano]

I remember reading a story about the great mathematician and electrical engineer Robert Fano. Around the same time, the father of Cybernetics, Norbert Wiener, was also at MIT and was famous for wandering around campus and talking to anybody about anything that caught his fancy. There are stories of graduate students running away to save their time when Wiener was sighted approaching. Wiener’s eccentricities are famous (see recommendation [2] below), but let me not digress. On one of these routine days he appeared in Fano’s office and made a cryptic comment:

You know, information is entropy.

Fano spent a good amount of time thinking about what this might mean, and he has himself remarked that it was in part responsible for his developing, completely independently, the first law of Shannon’s theory. Claude Shannon even cited Fano in his famous paper.

I can’t help thinking that such one-liners are perhaps the best examples of information compression and Kolmogorov complexity.
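On that note, a standard compressor gives a crude, computable upper bound on Kolmogorov complexity: a highly patterned string compresses to far fewer bytes than a random-looking, essentially incompressible one. A minimal sketch, using only the Python standard library:

```python
# Compressed length as a rough, computable upper bound on Kolmogorov complexity.
import os
import zlib

patterned = b"ab" * 500          # 1000 bytes with an obvious short description
random_ish = os.urandom(1000)    # 1000 bytes with (almost surely) no short description

for name, data in [("patterned", patterned), ("random-ish", random_ish)]:
    compressed = zlib.compress(data, 9)
    print(f"{name}: {len(data)} bytes -> {len(compressed)} bytes compressed")

# Expected: the patterned string shrinks dramatically while the random one barely
# shrinks at all, mirroring the gap between low and near-maximal Kolmogorov complexity.
```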

_________________________

Recommendations:

1. An Introduction to Kolmogorov Complexity and its Applications – Ming Li and Paul Vitanyi (on the basis of the first third)

2. Dark Hero of the Information Age – Conway and Siegelman

_________________________

Earlier Related Post:

1. Ray Solomonoff is No More (has a short discussion of Solomonoff’s ideas. It is noteworthy that Solomonoff published the first paper in what is today called Kolmogorov Complexity; his approach to the area was through induction, while Kolmogorov and Chaitin approached it through randomness).

_________________________

There are two kinds of truths: those of reasoning and those of fact. The truths of reasoning are necessary and their opposite is impossible; the truths of fact are contingent and their opposites are possible. (The Monadology of Leibniz)

The past few months have made me realize more and more the sheer number of fundamental ideas that can be traced back, at least in part, to Gottfried Leibniz. Given what has been on my mind recently, the ones I find most striking (other than his countless contributions in calculus, geology, physics, philosophy, rationality, theology etc.) are his ideas on formal systems, symbolic logic and Kolmogorov Complexity.

It would not be incorrect to consider Leibniz the first computer scientist to have lived. His philosophy centered on a universal language of symbols combined with a calculus of reasoning, from which modern symbolic logic and notation have directly descended. An interest in mathematical logic also leads directly to an interest in the “mechanization of thought”, and the same could be seen in Leibniz, who was a prolific inventor of calculating devices.

His elucidation of what might be called the earliest ideas in Algorithmic Information Theory/Kolmogorov Complexity is equally intriguing. Though he explicates them at length, what he essentially talks about is the complexity of an “explanation” (basically Kolmogorov Complexity), and the observation that an arbitrarily complex explanation is no explanation at all. I also find this idea similar to the bias-variance tradeoff in machine learning and the problem of overfitting. What I find striking is the clarity with which these ideas were expressed, and how little they have changed in essence over three centuries (though they have since been formalized).
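To make the overfitting analogy concrete, here is a minimal sketch (my own illustration, nothing from Leibniz; the data, degrees and numbers are arbitrary): a model with enough free parameters can “explain” a small set of noisy observations almost perfectly, yet predicts fresh observations worse than a simple model does; it is an arbitrarily complex explanation that explains nothing.

```python
# A toy illustration of "an arbitrarily complex explanation is no explanation":
# a high-degree polynomial fits the noisy training points almost exactly,
# but generalizes worse than a simple straight-line fit on fresh data.
import numpy as np

rng = np.random.default_rng(0)
x_train = np.linspace(-1, 1, 12)
y_train = 2.0 * x_train + rng.normal(scale=0.1, size=x_train.size)  # noisy line
x_test = np.linspace(-1, 1, 200)
y_test = 2.0 * x_test + rng.normal(scale=0.1, size=x_test.size)

for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")

# Expected: the degree-9 fit has near-zero training error but a larger test error
# than the simple degree-1 fit, i.e. it "explains" the noise rather than the signal.
```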

In my intrigue, I have tried to read two of his very short works – Discours de métaphysique and The Monadology. While these have been debated over the centuries, their fundamental nature is unquestioned, and they are a recommended read. More recently, I mentioned to my teacher from my undergraduate days that I had been really intrigued by Leibniz for some months. He was instrumental in getting me to read Cybernetics (by Norbert Wiener) and in my interest in Signal Processing in general, and he was quick to point to this paragraph from Wiener’s book, which I did not even remember reading:

[Image: Norbert Wiener]

Since Leibniz there has perhaps been no man who has had a full command of all the intellectual activity of his day. Since that time, science has been increasingly the task of specialists, in fields which show a tendency to grow progressively narrower. A century ago there may have been no Leibniz, but there was a Gauss, a Faraday, and a Darwin. Today there are few scholars who can call themselves mathematicians or physicists or biologists without restriction.

A man may be a topologist or an acoustician or a coleopterist. He will be filled with the jargon of his field, and will know all its literature and all its ramifications, but, more frequently than not, he will regard the next subject as something belonging to his colleague three doors down the corridor, and will consider any interest in it on his own part as an unwarrantable breach of privacy.

Norbert Wiener, Cybernetics: or Control and Communication in the Animal and the Machine, 1948.

Since Wiener has come up, it might also be useful to consider his trenchant advice from just before the start of the above passage:

For many years Dr. Rosenblueth and I had shared the conviction that the most fruitful areas for the growth of sciences were those which had been neglected as a no-man’s land between the various established fields […]

_________________________

