On uncertainty, information and predictability.

The relationship between exergy and entropy, and between information and uncertainty.

James J. Kay

© Copyright July 1996, updated December 2002.


Uncertainty has two components:

reducible uncertainty, which is in principle resolvable by gathering more information, and

irreducible uncertainty, which in principle cannot be resolved because of indeterminacy in the world.

Uncertainty is measured by <-ln p_i>, that is, the expectation value of the negative log (base e) of the probability that the ith possible outcome will actually occur. (This is just the Shannon entropy of the probability distribution.)
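As an illustrative sketch (not part of the original note; the four-outcome distribution below is invented for the example), this measure of uncertainty can be computed directly from a probability assignment:

```python
import math

def uncertainty(p):
    """Shannon entropy <-ln p_i>: the expected negative log-probability
    (natural log, so the result is in nats) of the outcome that occurs."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# A made-up probability assignment over four possible outcomes.
p = [0.5, 0.25, 0.15, 0.10]
print(uncertainty(p))            # ~1.21 nats
print(uncertainty([0.25] * 4))   # maximal for four outcomes: ln 4 ~ 1.39 nats
```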


Useful information is that which causes a change in probability assignment. (Information is any new knowledge we obtain, useful or not.)

Shore and Johnson demonstrated that the correct way to incorporate new information into a probability distribution is to choose the p_i that minimize <ln(p_i/q_i)> subject to the constraints imposed by the information we know (expectation values of observables and their moments).

<ln(p_i/q_i)> measures the useful information gained.

<ln p_i> - <ln q_i> measures the reduction in uncertainty.

They are not equal! (The first is the average, under the new distribution p, of the differences ln p_i - ln q_i; the second is the difference of two averages, each taken under its own distribution.)
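A small numerical sketch makes the distinction concrete; the prior q and updated assignment p below are invented purely for illustration:

```python
import math

def entropy(d):
    # uncertainty <-ln x_i>, averaged under the distribution d itself
    return -sum(x * math.log(x) for x in d if x > 0)

def useful_information(p, q):
    # <ln(p_i/q_i)>: average, under p, of the differences ln p_i - ln q_i
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

q = [0.5, 0.3, 0.2]   # invented prior assignment
p = [0.7, 0.2, 0.1]   # invented updated assignment

print(useful_information(p, q))   # average of differences: ~0.09 nats
print(entropy(q) - entropy(p))    # difference of averages: ~0.23 nats
```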


In fact, as Shore and Johnson show, uncertainty can increase at the same time as useful information increases: for example, when you do a set of experiments and discover that your assumptions are incorrect and that the situation is less constrained than you thought. Your uncertainty increases, but you did gain useful information!
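A rough numerical sketch of this case (the distributions are assumptions chosen for illustration): the prior q assumes a tightly constrained situation, the experiments show it is far less constrained, so the updated p is more spread out.

```python
import math

def entropy(d):
    return -sum(x * math.log(x) for x in d if x > 0)

def useful_information(p, q):
    return sum(a * math.log(a / b) for a, b in zip(p, q) if a > 0)

q = [0.8, 0.1, 0.05, 0.05]     # assumed prior: situation thought to be tightly constrained
p = [0.25, 0.25, 0.25, 0.25]   # after experiments: far less constrained than assumed

print(entropy(p) - entropy(q))   # uncertainty rises by ~0.68 nats
print(useful_information(p, q))  # yet ~0.74 nats of useful information were gained
```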

You can also do experiments in which you gain useful information and alter the probability distribution without changing your uncertainty. For example, suppose you have a roulette wheel and begin by assuming that particular numbers come up more frequently, in a particular distribution. You then perform some experiments and realize that the wheel is biased as you thought (i.e. you got the shape of the distribution right), but toward a different set of numbers. You have certainly gained useful information, and perhaps money, but your uncertainty remains constant!
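The roulette case can be sketched the same way (a six-outcome wheel and the particular bias are assumptions for illustration): the updated distribution has exactly the same shape as the prior, just shifted to different numbers, so the entropy is unchanged even though useful information was gained.

```python
import math

def entropy(d):
    return -sum(x * math.log(x) for x in d if x > 0)

def useful_information(p, q):
    return sum(a * math.log(a / b) for a, b in zip(p, q) if a > 0)

# Assumed prior: the wheel favours the low numbers, in this particular shape.
q = [0.3, 0.3, 0.2, 0.1, 0.05, 0.05]
# After experimenting: same shape of bias, but toward the high numbers instead.
p = list(reversed(q))

print(entropy(p) - entropy(q))   # 0.0 -- uncertainty is unchanged
print(useful_information(p, q))  # ~0.97 nats of useful information gained
```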

Furthermore, Shore and Johnson provide a test for determining whether the remaining uncertainty is due to our ignorance, and hence resolvable with more information, or due to indeterminacy in the situation, and hence irreducible! Their method thus provides a calculus for deciding how to gain more information efficiently (since it defines useful information) and for recognizing when our understanding (i.e. degree of predictability) is as complete as possible.

Predictability, not information (useful or not), is the opposite of uncertainty! I can obtain lots of information without improving my predictability. What I want to do (and Shore and Johnson tell you how) is to improve my predictability (that is, limit my uncertainty) as much as possible while limiting the information I need to obtain in order to do so. So information properly applied reduces uncertainty, but gaining new information is not necessarily related to reducing uncertainty and increasing predictability. Indeed, the problem with indeterminacy is that you can gather as much information as you want and never decrease your uncertainty or increase your predictability.


About useful information, exergy and entropy.

Given the above it is clear quite quickly why exergy and entropy are NOT opposites. EXERGY is the useful information we have about energy, gained when we transform energy from one form to another. ENTROPY increase is the uncertainty we have created about energy when we transform energy from one form to another. They are not opposites, for the same reasons stated above. When exergy increases we gain more useful information about energy, that is, more ability to do something with it. An increase in entropy corresponds to an increase in our uncertainty about energy.

Work and heat are related by the first law, and this makes them in a sense opposites; but exergy and entropy are NOT conserved, and one can change without the other changing. For example, one can allow some exergy to "escape" from a system by letting it do work on its environment: by definition no entropy is produced! One can also construct systems where entropy changes without exergy changing. (Allow chemicals to combine and produce heat at the right temperatures, and the loss of exergy in the chemicals is offset by the gain in exergy from the heat added to the system, i.e. the temperature increase.)

This is why an entropy perspective on thermodynamics is NOT equivalent to an exergy perspective! One needs both to understand the second-law dynamics of a system, just as one needs to consider both useful information and uncertainty to have an "information theory".
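A rough numerical sketch of the first point (the temperatures, the heat quantity, and the idealized reversible engine are assumptions for illustration, not from the original text): in the reversible limit, exergy leaves the system entirely as work while no entropy at all is produced.

```python
# Reversible (Carnot-limit) extraction of work from heat Q available at T_hot,
# with the environment (dead state) at T0.  All numbers are invented.
T_hot = 600.0    # K, temperature at which the heat is available
T0    = 300.0    # K, environment temperature
Q     = 1000.0   # J, heat withdrawn from the source

work       = Q * (1 - T0 / T_hot)   # exergy of the heat, all delivered as work: 500 J
Q_rejected = Q - work               # heat dumped to the environment: 500 J

dS_source      = -Q / T_hot         # entropy leaving the source: -1.667 J/K
dS_environment = Q_rejected / T0    # entropy entering the environment: +1.667 J/K
entropy_produced = dS_source + dS_environment

print(work)              # 500.0 J of exergy "escapes" as work
print(entropy_produced)  # 0.0 J/K -- no entropy produced in the reversible limit
```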


For the mathematics underlying this discussion, and the relationship between exergy and "useful information", see:
Kay, J. "The relationship between information theory and thermodynamics: the mathematical basis" (90K PDF file)
particularly the very last section.

A review of the basic measures of information theory can be found in section 3.2 of Kay, J. "A Review of the Fundamental Measures of Information Theory and their Application in Ecology." (216K PDF file)

