The Azimuth Project
Deriving Hamiltonians

2020.04.06, Andrius Kulikauskas: I have set up this page to try to understand Chris Goddard’s thoughts on deriving Hamiltonians and to engage him and develop related ideas. Discussion is currently at the thread on information theory.

Chris: Understanding where Hamiltonians come from, or having some principled idea of how to derive them, would be potentially extremely significant and a very powerful insight. Indeed, this was hammered home to me during my undergraduate and honours studies: it seemed to me that 19th and 20th century physics offered no clear explanation or consensus as to where action functionals really came from, other than tried and tested heuristics and "physical intuition" based loosely on arguments from "symmetry". (The latter, appeals to notions of symmetry, presumably amounts to thinking in terms of Lie groups acting on the geometric models in which dynamics occurs, and could potentially be made rigorous in its own right; however, this was not an avenue my interests took me down. Needless to say, I have my doubts that the existing machinery offers an acceptable level of generality.)

Chris: From a mathematician's perspective, however, appealing vaguely to "physical intuition" and "symmetry" (in a primitive fashion) is not good enough; one would really like to "derive" or "prove" that the equations of electromagnetism, general relativity, quantum mechanics, etc. follow from particular natural assumptions about the geometry in which dynamics takes place. Frieden laid the groundwork there, and I'd like to think that I did a fair bit of work from 2005 to 2010 fleshing out his framework, and then from 2010 to 2020 generalising it.

Ideas from Roy Frieden

  • Extreme physical (Fisher) information is a unifying principle of physics.
  • A 'request' for data creates the law that, ultimately, gives rise to the data. The observer creates their local reality. This is a theory of measurement that incorporates the observer into the phenomenon they observe.
  • Fisher information is a physical measure of disorder. Like entropy, it changes monotonically in time.
  • The Fisher information $I[q]$ is the amount of information contained in the data collected during the measurement process.
  • The bound information $J[q]$ is the amount of information in the phenomenon.
  • Think of the Lagrangian functional $K[q] = I[q] - J[q]$ as the physical information of the system.
  • Axiom 1. Perturbed amounts of information satisfy $\delta J[i] = \delta I[i]$. (A worked sketch follows this list.)
  • A phenomenon that only obeys normalization is said to exhibit ‘maximum ignorance’ in its independent variable.
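
To see how these pieces fit together, here is a minimal worked sketch in the style of Frieden's standard one-dimensional example. The single real amplitude $q(x)$, the factor of 4, and the form of the bound-information density $j$ are assumptions made for illustration, not anything derived on this page.

```latex
% Minimal EPI sketch, assuming one real amplitude q(x) in one dimension.
% Data information in Frieden's standard form, plus an assumed bound term:
\[
  I[q] = 4 \int \left( \frac{dq}{dx} \right)^{2} dx, \qquad
  J[q] = \int j(q(x)) \, dx, \qquad
  K[q] = I[q] - J[q].
\]
% Axiom 1 (\delta I = \delta J) is stationarity of K, so the Euler-Lagrange
% equation for the integrand 4 (q')^2 - j(q) reads
\[
  8 \, q''(x) + j'(q) = 0.
\]
```

Different choices of the density $j$, fixed by the invariances of the phenomenon under study, then yield different "derived" equations of motion; this is the sense in which EPI is said to generate physical laws.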

Ideas from Chris Goddard’s A Treatise on Information Geometry.

Chapter 6. Statistical Geometry. “We are first and foremost interested in generalising the notion of a particle path. … We would like the measure to maximise at the centre of this tube, ie about some optimal geodesic path, but also somehow model uncertainty in position by having exponential falloff relative to the core geodesic.”
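
In flat space this "statistical tube" is easy to caricature: take the core geodesic to be a straight line and let the weight of a nearby point decay exponentially in its squared perpendicular distance to that line. A minimal sketch follows; the Gaussian form and the width `sigma` are illustrative assumptions, since Goddard's construction lives on a general manifold.

```python
import numpy as np

# Cartoon of a statistical tube in flat R^2: the core geodesic is the
# x-axis, and a point's (unnormalized) weight decays exponentially in its
# squared perpendicular distance y to the core path.
# The Gaussian form and the width sigma are illustrative assumptions.

def tube_weight(point, sigma=0.1):
    """Unnormalized measure: maximal on the core geodesic (y = 0)."""
    _, y = point
    return np.exp(-y**2 / (2.0 * sigma**2))

print(tube_weight((0.5, 0.0)))  # 1.0: the measure peaks on the geodesic
print(tube_weight((0.5, 0.2)))  # ~0.135: exponential falloff off the core
```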

Chapter 7. Fisher Information and application to the theory of Physical Manifolds.

  • 7.1 Entropy is latent information. Gaining information eliminates uncertainty. If we perform an experiment, we gain information by finding out which event occurs. Shannon entropy is an external, global measure; we need instead an internal, local measure of information.
  • 7.2 Fisher Information: $I(x,X) = E\left[\left(\frac{\partial}{\partial x}\ln f(x,\theta)\right)^{2}\right]$ is the variance of the score of $f$. Fisher Information is the maximal measure of variability in the probability distribution over $M$, subject to linearity in its second argument for independent random observations. (A numerical sketch follows this list.)
  • 7.2 The channel information is the total Fisher information $I = \int_M I(x,X)\,dx$. It is the total amount of information that the manifold $M$ can carry.
  • 7.2 The bound information $J$ is the unavoidable information contained within the system. For a physical manifold, $I = J$.
  • 7.2 The Refined Principle of Extreme Physical Information (EPI): The physical information $K = I - J$ must be locally minimal: $\delta K = 0$ and $\delta^2 K \geq 0$.
  • 7.2 The universe organizes a trade-off between knowing exactly what happens everywhere (which would require a massive investment of energy in pinning down the geometry) and spending the least energy possible.
  • 7.2 To a reasonable approximation, we can think of the universe as sitting in solution space at the global minimum for information. We can draw various cartoon models of increasing sophistication. Our act of eliminating noise due to statistical fluctuations will always leave higher order noise yet to be quantified and understood. Our goal is to produce models that work under a range of conditions that we can experimentally measure and understand.
  • 7.4 Fisher information is the best information in the sense that it is the unique functional which is zeroed by the maximum likelihood estimator. Also, as the disorder of the system decreases, so does the Fisher information.
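
Here is the numerical sketch promised above: a check of the "variance of the score" reading of Fisher information on a Gaussian family, where the information about the mean is analytically $1/\sigma^2$. The choice of model, parameter values, and sample size are illustrative assumptions.

```python
import numpy as np

# Fisher information as the variance of the score, checked on a Gaussian
# family f(X; mu, sigma). For the mean parameter the analytic value is
# I(mu) = 1 / sigma^2. Model, parameters, and sample size are illustrative.

rng = np.random.default_rng(0)
mu, sigma = 1.5, 2.0
X = rng.normal(mu, sigma, size=200_000)

# Score with respect to mu: d/dmu ln f(X; mu, sigma) = (X - mu) / sigma^2.
score = (X - mu) / sigma**2

print(np.var(score))  # ~0.25, the empirical variance of the score
print(1 / sigma**2)   # 0.25, the analytic Fisher information
```

The score has mean zero, so its variance equals the expected squared score in the definition above; this is also why the maximum likelihood estimator, which sets the empirical score to zero, is the natural companion of Fisher information (7.4).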

Thoughts by Andrius Kulikauskas

  • I think of the Lagrangian as relating kinetic energy (cash we carry) and potential energy (a bank account). Minimizing the action (the time integral of the Lagrangian) means that physics is constrained so that there is minimal conversion between the cash we carry and our account. (A discretized sketch follows this list.)
  • I think of entropy as a measure of (non)deliberateness. Intuitively, a deliberate subsystem is one that is structured by causes outside of the subsystem, and thus generally needs outside energy to sustain it. Alternatively, it is one where there is a stark distinction between the observer and the observed. I believe that the key to understanding entropy is to understand what it means to partition phase space, because the partition chosen can make all the difference.
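
As a discretized sketch of the least-action intuition above: the action of a unit-mass free particle between fixed endpoints, where the straight-line path is the minimizer. The unit mass, endpoints, grid, and the choice $V = 0$ are illustrative assumptions; a nonzero potential can be swapped in to watch the kinetic/potential "cash flow".

```python
import numpy as np

# Discretized action S = sum of (T - V) * dt for a unit-mass particle
# moving from x(0) = 0 to x(1) = 1. With V = 0 the action is purely
# kinetic, and the straight-line path minimizes it.
# Mass, endpoints, grid, and V = 0 are illustrative assumptions.

def action(x, dt, V=lambda x: 0.0):
    v = np.diff(x) / dt                      # velocity on each step
    T = 0.5 * v**2                           # kinetic energy per step
    Vx = np.array([V(xi) for xi in x[:-1]])  # potential energy per step
    return float(np.sum((T - Vx) * dt))      # discrete action

t = np.linspace(0.0, 1.0, 101)
dt = t[1] - t[0]
straight = t.copy()                     # x(t) = t: the straight line
wiggly = t + 0.1 * np.sin(np.pi * t)    # same endpoints, perturbed

print(action(straight, dt))  # 0.5, the minimal action
print(action(wiggly, dt))    # ~0.525, strictly larger
```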

Readings

Questions

  • Andrius: Can we meaningfully think of $I[q]$ as the kinetic energy and $J[q]$ as the potential energy?
  • Andrius: What does it mean to say that $J[\phi]$ is a universal constant?
  • Andrius: Can we think of the physics of information in terms of the cognitive concepts Whether, What, How, Why, and of what these concepts mean, for example, in the Yoneda Lemma?