The Law of Increasing Functional Information

Look around you. From atoms to galaxies, from rocks to living cells, from language to technology, the universe has a curious habit: over time, it keeps inventing more elaborate, structured, and capable things. This is strange, because one of the most famous laws of physics seems to say the opposite should happen.

That law is the second law of thermodynamics. It tells us that entropy—often described as disorder—tends to increase. Shoes get scuffed. Ice melts. Eggs scramble and never unscramble. The universe runs downhill.

And yet, somehow, complexity rises. Stars form from dust. Planets differentiate into oceans, continents, and atmospheres. Life emerges from chemistry. Minds appear. Civilizations build tools that evolve faster than biology ever could.

So what’s going on?

A growing number of scientists think the answer may lie in a new idea called the Law of Increasing Functional Information—a proposal that tries to explain why complexity keeps emerging without violating physics, purpose, or common sense.

————————

Evolution Didn’t Start With Life

When we talk about evolution, we usually begin with biology. But that skips a long and crucial prelude.

Before the first cell existed, the universe had already undergone billions of years of chemical and cosmic evolution. Protons and neutrons formed atoms. Atoms formed molecules. Molecules gathered into stars. Stars forged elements. Planets formed. Minerals diversified. Oceans appeared. Energy flowed.

By the time life arrived, the universe had already done an enormous amount of exploratory work.

This raises a deeper question:

Is evolution something special that only life does, or is it part of a much broader cosmic pattern?

————————

One Pattern, Many Systems

When scientists step back and compare very different systems—atoms, minerals, ecosystems, languages, technologies—they notice something striking. Despite their differences, all evolving systems seem to share three basic traits.

First, they are made of many interacting components. These might be atoms, molecules, cells, organisms, or people.

Second, those components can be rearranged in vast numbers of ways. The number of possible configurations is usually astronomical.

Third, the system experiences selection pressures. Some arrangements persist because they work better, last longer, or open new possibilities. Others disappear.

Whenever these three ingredients are present, evolution happens—whether the system is alive or not.
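The three ingredients above—many components, a vast configuration space, and selection—can be sketched as a toy simulation. Everything here (bit-string "configurations," the arbitrary target that defines what "works") is an illustrative assumption, not part of the theory's formalism:

```python
import random

random.seed(1)

N_BITS = 16            # each "configuration" is an arrangement of 16 components
POP = 200              # a population of interacting configurations
TARGET = [1] * N_BITS  # an arbitrary rare arrangement that "works" (illustrative)

def function_score(cfg):
    """How well a configuration 'works': here, similarity to the target."""
    return sum(a == b for a, b in zip(cfg, TARGET))

# Start from random configurations drawn from a vast space (2**16 possibilities).
pop = [[random.randint(0, 1) for _ in range(N_BITS)] for _ in range(POP)]

for generation in range(50):
    # Selection: arrangements that work better persist; the rest disappear.
    pop.sort(key=function_score, reverse=True)
    survivors = pop[: POP // 2]
    # Variation: survivors are copied with occasional random rearrangement.
    children = [[bit ^ (random.random() < 0.02) for bit in cfg] for cfg in survivors]
    pop = survivors + children

best = max(function_score(cfg) for cfg in pop)
print(best)  # scores climb toward N_BITS under repeated selection
```

Nothing in the loop "wants" anything; repeated variation plus differential persistence is enough for rare, functional arrangements to accumulate.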

————————

Three Ways Things Get Selected

Selection itself comes in different forms.

Some structures persist simply because they are stable. Atomic nuclei and crystal lattices fall into this category. This is called static persistence.

Other systems persist because they are dynamically active, constantly exchanging matter and energy with their surroundings. Living organisms are the obvious example. What survives here is not the material itself but the pattern of activity. This is dynamic persistence.

Then there is selection for novelty—new abilities that open previously inaccessible worlds. Eyes allow seeing. Wings allow flight. Language allows culture. Each novelty expands what the system can explore.

Life, in particular, appears to be open-ended: it doesn’t just optimize within a fixed space, it keeps discovering new spaces altogether.

————————

What Is “Functional Information”?

To make sense of this trend, scientists distinguish between two kinds of information.

One kind is descriptive information—how many bits it takes to fully specify something. This is known as Kolmogorov complexity. Shuffle a book's letters or scramble a DNA sequence at random, and the description stays just as long—or gets longer—even though all function is lost.

But that kind of information doesn’t capture function.

Functional information does.

Functional information measures how rare a useful arrangement is among all possible arrangements. If a system can be configured in trillions of ways, but only a tiny fraction actually do something, those functional configurations carry a lot of information.

The rarer the function, the higher the functional information.

As systems evolve and acquire more refined, specialized, or powerful functions, their functional information increases.
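In Szostak's formulation, functional information is I(E) = −log₂ F(E), where F(E) is the fraction of all possible configurations whose function meets some threshold E. A minimal sketch—the 10-bit "system" and its threshold are illustrative assumptions, not anything from the original papers:

```python
import math
from itertools import product

def functional_information(configs, meets_threshold):
    """I = -log2(fraction of configurations that achieve the function)."""
    total = 0
    functional = 0
    for cfg in configs:
        total += 1
        if meets_threshold(cfg):
            functional += 1
    if functional == 0:
        return float("inf")  # no configuration works at this threshold
    return -math.log2(functional / total)

# Toy system: every 10-bit configuration; "function" = at least 9 bits set.
configs = product([0, 1], repeat=10)
bits = functional_information(configs, lambda c: sum(c) >= 9)
print(round(bits, 2))  # 11 of 1024 configurations qualify -> about 6.54 bits
```

Raise the threshold and fewer configurations qualify, so the measured information goes up—exactly the "rarer function, higher information" relationship described above.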

————————

Rocks Can Evolve Too

This idea becomes clearer with a surprising example: minerals.

There are 72 chemical elements that can form minerals. The number of possible atomic combinations is mind-boggling—around 10⁴⁶ possibilities. Yet only about 6,000 minerals actually exist.

Why so few?

Because nature “selects” for configurations that persist. Most combinations fall apart. A small fraction are stable under real planetary conditions.

Over time, Earth’s mineral diversity increased. Early stars produced about 25 simple minerals. Planets enabled many more. Life introduced entirely new mineral-forming processes.

When scientists calculate the functional information of Earth’s current mineral system, they find it has risen steadily over geological time. Rocks, it turns out, evolve.
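Taking the article's figures at face value—roughly 6,000 known minerals out of some 10⁴⁶ combinatorial possibilities, both very rough—the same −log₂ formula gives a back-of-the-envelope value:

```python
import math

possible = 1e46   # rough combinatorial estimate quoted in the text
actual = 6_000    # approximate number of known mineral species

# I = -log2(fraction that persists): how rare stable arrangements are.
bits = -math.log2(actual / possible)
print(round(bits))  # on the order of 140 bits
```

The exact number matters less than the trend: as planetary and biological processes add new mineral-forming pathways, the functional fraction shifts and the measured information climbs.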

————————

Entropy Is Not the Enemy

At this point, a reasonable objection arises: doesn’t the second law of thermodynamics already explain everything?

Not quite.

Entropy explains why energy spreads out and why time has a direction. But it does not explain why matter repeatedly organizes itself into systems that perform increasingly sophisticated functions.

The Law of Increasing Functional Information does not contradict thermodynamics. It complements it. Entropy still increases overall. But within open systems that receive energy and experience selection, functional organization can grow.

Think of it like gravity. Gravity has no purpose, yet stars form. Likewise, increasing functional information doesn’t aim at life, intelligence, or meaning. Those are outcomes, not goals.

————————

Why This Matters

This framework helps scientists think more clearly about:

  • The origin of life, without invoking miracles or purpose
  • The continuity between chemistry and biology
  • Why technology and culture evolve so rapidly
  • Where to look for life-like processes beyond Earth, such as on Europa, Enceladus, or Titan

It also reframes humanity’s story. We are not an accident dropped into a meaningless universe, nor the predetermined endpoint of cosmic history. We are one expression of a much deeper tendency: matter exploring what it can do.

————————

A Theory That Must Evolve Too

The Law of Increasing Functional Information is not finished. It may be refined, challenged, or replaced. Its authors openly admit this. In a fitting twist, the theory itself must face selection pressures—scrutiny, testing, competition with better ideas.

If it survives, it will be because it works.

And if it fails, it will join the long lineage of discarded ideas that nonetheless helped us see further.

Either way, the universe keeps doing what it has always done: experimenting, selecting, and becoming more interesting with time.

References

Szostak, J. W. (2003). Functional information: Molecular messages. Nature, 423(6941), 689.

(Introduces the concept of functional information in biological molecules.)

Hazen, R. M., Griffin, P. L., Carothers, J. M., & Szostak, J. W. (2007). Functional information and the emergence of biocomplexity. Proceedings of the National Academy of Sciences, 104(Suppl. 1), 8574–8581.

(Extends functional information to complex evolving systems.)

Hazen, R. M., & Morrison, S. M. (2020). An evolutionary system of mineralogy: Proposal for a mineral evolution database. American Mineralogist, 105(9), 1262–1280.

(Shows how Earth’s minerals evolved through planetary and biological processes.)

Hazen, R. M., Griffin, P. L., Bondaruk, O., et al. (2023). On the roles of function and selection in evolving systems. Proceedings of the National Academy of Sciences, 120(15), e2216070120.

(Formal statement of the Law of Increasing Functional Information.)

Schneider, E. D., & Kay, J. J. (1994). Life as a manifestation of the second law of thermodynamics. Mathematical and Computer Modelling, 19(6–8), 25–48.

(Explores entropy in open, far-from-equilibrium systems.)

Prigogine, I. (1980). From being to becoming: Time and complexity in the physical sciences. W. H. Freeman.

(Classic work on dissipative structures and self-organization.)

Schrödinger, E. (1944). What is life? The physical aspect of the living cell. Cambridge University Press.

(Foundational insight into order, entropy, and living systems.)

Kauffman, S. A. (1993). The origins of order: Self-organization and selection in evolution. Oxford University Press.

(Explores complexity arising from self-organizing systems.)

Walker, S. I., & Davies, P. C. W. (2013). The algorithmic origins of life. Journal of the Royal Society Interface, 10(79), 20120869.

(Information-theoretic approaches to life’s emergence.)

Adami, C. (2012). The use of information theory in evolutionary biology. Annals of the New York Academy of Sciences, 1256(1), 49–65.

(Information as a measurable evolutionary quantity.)

Lineweaver, C. H., Davies, P. C. W., & Ruse, M. (Eds.). (2013). Complexity and the arrow of time. Cambridge University Press.

(Interdisciplinary perspectives on time, entropy, and complexity.)

