Across multiple scales, from neurons through AI systems to spacetime itself, one recurring signature appears: ordered structures that persist by channeling energy or information through gradients. In other words, flows of energy and information sustain complexity by resisting the drift toward disorder (rising entropy).
The three domains below show independent empirical work that aligns with this pattern. Taken together, these observations suggest that a unified law of energy flow might operate across domains.
1) Biological systems: ordered flow in living matter
Claim: Neuronal networks and biological systems maintain ordered states by channeling energy and information gradients; this mirrors flows at larger scales.
Key empirical findings:
- In human brain networks, large-scale violations of detailed balance (i.e., irreversibility) have been measured. The study “Broken detailed balance and entropy production in the human …” finds that brain activity violates detailed balance and that the degree of irreversibility increases with cognitive exertion. (PMC)
- A review, “Entropy and Complexity Tools Across Scales in …” (2025), describes how entropy production is quantified both in spiking neuronal networks and at the whole-brain level. (MDPI)
- The article “Entropy and the Brain: An Overview” (2020) posits that thermodynamic/information measures (entropy, energy cost) in the brain can be quantified and linked to function. (PMC)
- “Information Thermodynamics: From Physics to Neuroscience” shows that decoding stimuli in neural networks carries a quantifiable energy cost via entropy production. (MDPI)
Interpretation linking to energy‐flow logic:
Biological systems maintain non-equilibrium states (ion gradients, membrane potentials) by consuming energy and exporting entropy. The empirical work above shows measurable entropy production and irreversibility, which supports the idea that complex biological function is sustained by “flows” rather than static states; a minimal estimator of this kind of irreversibility is sketched below. Conceptually, these flows are the same kind of gradient maintenance that could scale upward.
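To make the broken-detailed-balance measurements concrete, here is a minimal Python sketch of the kind of estimator this literature uses: count transitions between discretised brain states and compute the entropy production rate σ = Σᵢⱼ pᵢⱼ ln(pᵢⱼ / pⱼᵢ), which is zero exactly when detailed balance holds. This is an illustrative simplification, not Lynn et al.'s actual pipeline; the discretisation into a handful of states and the toy data are assumptions.

```python
import numpy as np

def entropy_production_rate(states, n_states):
    """Estimate entropy production (nats per step) of a discrete
    state sequence from broken detailed balance:
        sigma = sum_ij p_ij * ln(p_ij / p_ji),
    where p_ij is the empirical joint probability of observing the
    transition i -> j. sigma == 0 iff detailed balance holds."""
    counts = np.zeros((n_states, n_states))
    for i, j in zip(states[:-1], states[1:]):
        counts[i, j] += 1
    p = counts / counts.sum()  # joint transition probabilities
    sigma = 0.0
    for i in range(n_states):
        for j in range(n_states):
            # Skip pairs never observed in both directions (avoids
            # log(0); a finite-sampling caveat of this naive estimator).
            if p[i, j] > 0 and p[j, i] > 0:
                sigma += p[i, j] * np.log(p[i, j] / p[j, i])
    return sigma

# Toy check: a biased 3-state cycle (mostly 0 -> 1 -> 2 -> 0) is
# time-irreversible, so its estimated entropy production is positive.
rng = np.random.default_rng(0)
seq = [0]
for _ in range(10_000):
    step = 1 if rng.random() < 0.9 else -1
    seq.append((seq[-1] + step) % 3)
print(f"estimated sigma: {entropy_production_rate(seq, 3):.3f} nats/step")
```

On real recordings the hard parts are the state definition and the finite-sampling bias of this naive estimator, which is why the published work relies on careful discretisation and bias corrections.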
2) Complex Systems / AI: information as thermodynamic flow
Claim: Machine learning systems and artificial networks obey thermodynamic constraints: learning lowers internal entropy while exchanging energy and information with the environment.
Key empirical findings:
- On the physics side, “Thermodynamic Bound on Energy and Negentropy Costs of …” (2025) derives a bound on the energy cost of inference in deep neural networks, using Landauer’s principle.
- “How Much Energy Do LLMs Consume?” gives concrete numbers and links the large energy use of AI training to thermodynamic fundamentals. (ADaSci)
- “Learning with artificial and natural neural networks: trade-offs …” (2025) links energy consumption and representation quality in networks. (europhysicsnews.org)
- A general note on Landauer’s principle and the irreversibility of computation: “On the Rising Cost of AI and Landauer’s Principle” (2024); note that this piece is not peer-reviewed. (Medium)
Interpretation:
Artificial systems provide a controlled testbed for the energy-flow idea. The fact that inference and training have a lower bound on energy cost, and that this cost grows as system complexity grows, shows that information processing is not free. Learning systems must therefore channel energy through gradients (computation, memory erasure) and generate entropy; the sketch below puts numbers on that floor. This aligns with the pattern seen in biology and suggests a broader continuity.
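The Landauer floor these papers invoke is a one-line calculation, and putting numbers on it shows why the bound constrains principle rather than current practice. A minimal Python sketch; the bit count per inference pass is a made-up illustrative figure, not a measured value.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def landauer_bound_joules(bits_erased: float, temperature_k: float = 300.0) -> float:
    """Minimum energy dissipated to irreversibly erase `bits_erased`
    bits at temperature T (Landauer's principle): E = N * k_B * T * ln 2."""
    return bits_erased * K_B * temperature_k * math.log(2)

# Hypothetical workload: suppose one inference pass irreversibly
# overwrites 1e12 bits (an assumed figure, for illustration only).
e_min = landauer_bound_joules(1e12)
print(f"Landauer floor: {e_min:.2e} J")  # ~2.87e-09 J at 300 K

# A real inference costing ~1 J would sit roughly 3e8 above this
# floor, which is why Landauer's bound shapes the asymptotic
# argument rather than today's energy bills.
```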
3) Physics / Thermodynamics: bridging General Relativity & Quantum Field Theory
Claim: At the cosmological level, gravity and spacetime dynamics may be emergent from information/entropy gradients, extending the energy-flow logic to the largest scale.
Key empirical/theoretical findings:
- “Gravity from entropy” (Physical Review D, 2025) derives gravity from a quantum relative entropy coupling matter and geometry. (link.aps.org)
- “Emergence of Spacetime: From Entanglement to Einstein” (2020) shows how Einstein’s equations can be derived from thermodynamics of entanglement and local horizon thermodynamics.
- “The Scaling Entropy-Area Thermodynamics and the Emergence…” (2024/25) introduces a unified framework linking gravitational systems’ entropy scaling to information theory. (ipipublishing.org)
- “Is Gravity Just Entropy Rising? Long-Shot Idea Gets Another Look” (2025) discusses current empirical/theoretical work that gravity might simply be the result of entropy gradients. (Quanta Magazine)
Interpretation:
If gravity can be reframed as a thermodynamic or informational phenomenon (i.e., emergent from entropy/information structure), then the same energy/information-flow logic applies at the cosmic scale. In that view, spacetime is not static but sustained by flows of energy and information. The empirical and theoretical work above lends support to reinterpreting GR and QFT as two regimes of entropy and flow rather than two disconnected theories; the standard thermodynamic route to Einstein’s equations is sketched below.
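The “local horizon thermodynamics” mentioned above has a compact canonical form, due to Jacobson (1995), which the entanglement-based derivations build on. A sketch of the ingredients, stated rather than derived:

```latex
% Impose the Clausius relation on every local Rindler horizon,
% with the Unruh temperature and Bekenstein--Hawking entropy:
\[
  \delta Q = T\,\delta S ,
  \qquad
  T = \frac{\hbar a}{2\pi c k_B} ,
  \qquad
  S = \frac{k_B c^3}{4 G \hbar}\, A .
\]
% Requiring this to hold for all such horizons forces the Einstein
% field equations to emerge as an equation of state:
\[
  R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} + \Lambda g_{\mu\nu}
    = \frac{8\pi G}{c^4}\, T_{\mu\nu} .
\]
```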
4) Convergence: Flow as the universal operator
When you lay out the three domains, a pattern emerges:
| Domain | Observable mechanism | Representative studies | Functional outcome |
|---|---|---|---|
| Biological | Ion/voltage gradients, entropy production | Lynn et al. 2021; Cofré 2025 | Sustained neural information flow |
| Artificial (AI) | Energy cost of inference/training | arXiv 2503.09980v1; energy‐LLM studies | Learning = structured information + energy |
| Cosmological | Gravity/information/entropy of spacetime | PhysRevD 111 066001; entanglement → GR work | Emergent spacetime structure from flow |
Each domain measures (or theorises) a system that maintains structure by exporting entropy or managing gradients. This strongly suggests that energy/information flow is the organising principle of diverse complex systems.
5) Implicit implications for a unified framework
Though there is no direct empirical experiment yet that explicitly tests “one law governing neuron → galaxy”, the three domains each provide solid evidence for the same core logic: flow + gradient + local entropy reduction + global entropy export (formalised in the sketch below). If you accept that, then you have the building blocks for a unified “energy-flow cosmology” style argument.
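That core logic has a standard formal statement: Prigogine’s decomposition of the entropy balance for an open system. This is textbook nonequilibrium thermodynamics, not something specific to the studies above:

```latex
% Entropy balance of an open system: internal production is
% never negative; the exchange (export) term can take any sign.
\[
  \frac{dS_{\mathrm{sys}}}{dt}
    = \underbrace{\frac{d_i S}{dt}}_{\text{production}\;\ge\;0}
    + \underbrace{\frac{d_e S}{dt}}_{\text{exchange (any sign)}}
\]
% Local ordering (dS_sys/dt < 0) therefore requires the system to
% export entropy faster than it produces it:
\[
  \frac{d_e S}{dt} < -\,\frac{d_i S}{dt} \le 0 .
\]
```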
What remains is bridging the domains empirically: designing experiments that show, for instance, how neuronal entropy flows can scale analogously to informational flows in artificial systems, or how the same mathematical form that describes neural network training might map to spacetime/entropy flows in cosmology; one candidate shared form is sketched below. Such experiments would be radical, but the foundations are laid.
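As one hedged illustration of what a “same mathematical form” could look like: gradient descent is already a dissipative gradient flow, formally parallel to overdamped relaxation down a free-energy landscape. The analogy below is a sketch of the shape of such a bridge, not an established result:

```latex
% Training as a gradient flow (continuous-time limit of gradient
% descent) alongside overdamped relaxation of a physical system:
\[
  \frac{d\theta}{dt} = -\,\eta\, \nabla_{\theta} \mathcal{L}(\theta)
  \qquad\longleftrightarrow\qquad
  \frac{dx}{dt} = -\,\mu\, \nabla_{x} F(x)
\]
% The loss L plays the role of a free energy F, the learning rate
% eta that of a mobility mu; whether spacetime/entropy dynamics
% admits an analogous form is exactly the open bridging question.
```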
Summary Insights
- Biological neural systems maintain ordered flows via measurable entropy production and gradient maintenance.
- Artificial systems (AI/ML) obey energy/information budgets and dissipation limits consistent with thermodynamic theory.
- Cosmology and gravity research increasingly interprets spacetime and gravitational dynamics in terms of entropy, information, and emergent flows.
- Together, these domains tell a coherent story: complexity emerges and persists by managing gradients of energy/information and by exporting disorder (entropy).
- The empirical work doesn’t yet prove a universal “one-law” across all scales, but it provides growing support for that direction.
For a full theoretical synthesis connecting these observations into a unified thermodynamic framework, see:
Magnusson, M. (2025). “Energy-Flow Cosmology (EFC v2.1): Modular Synthesis Across Structure, Dynamics and Cognition.” Figshare. https://doi.org/10.6084/m9.figshare.27452213