I want to keep telling you about information geometry… but I got sidetracked into thinking about something slightly different, thanks to some fascinating discussions here at the CQT. There are a lot of people interested in entropy here, so some of us - Oscar Dahlsten, Mile Gu, Elisabeth Rieper, Wonmin Son and me - decided to start meeting more or less regularly. I'm learning a lot of wonderful things, and I hope to tell you about them someday. But for now, here's a little idea I came up with, triggered by our conversations:

• John Baez, Rényi entropy and free energy.

In 1960, Alfred Rényi defined a generalization of the usual Shannon entropy that depends on a parameter. If $p$ is a probability distribution on a finite set, its Rényi entropy of order $\beta$ is defined to be

$$H_\beta = \frac{1}{1 - \beta} \ln \sum_i p_i^\beta$$

This looks pretty weird at first, and we need $\beta \ne 1$ to avoid dividing by zero, but you can show that the Rényi entropy approaches the Shannon entropy as $\beta$ approaches $1$:

$$\lim_{\beta \to 1} H_\beta = -\sum_i p_i \ln p_i$$

(A fun puzzle, which I leave to you.) So, it's customary to define $H_1$ to be the Shannon entropy… and then the Rényi entropy generalizes the Shannon entropy by allowing an adjustable parameter.

If you ask people what's good about the Rényi entropy, they'll usually say: it's additive! In other words, when you combine two independent probability distributions into a single one, their Rényi entropies add. And that's true - but there are other quantities that have the same property. So I wanted a better way to think about Rényi entropy, and here's what I've come up with so far.

Any probability distribution can be seen as the state of thermal equilibrium for some Hamiltonian at some fixed temperature, say $T = 1$. Starting with that Hamiltonian, we can then compute the free energy at any temperature $T$, and up to a certain factor this free energy turns out to be the Rényi entropy of order $\beta$, where $\beta = 1/T$. So, up to the fudge factor, Rényi entropy is the same as free energy. It seems like a good thing to know - but I haven't seen anyone say it anywhere! Have you?

Let me show you why it's true - the proof is pathetically simple. We start with our probability distribution $p_i$. We can always write

$$p_i = e^{-E_i}$$

for some real numbers $E_i$. Let's think of these numbers as energies. Then the state of thermal equilibrium, also known as the canonical ensemble or Gibbs state at inverse temperature $\beta$, is the probability distribution

$$\frac{e^{-\beta E_i}}{Z(\beta)}, \qquad Z(\beta) = \sum_i e^{-\beta E_i} = \sum_i p_i^\beta$$

Since $\beta = 1$ when $T = 1$, the Gibbs state reduces to our original probability distribution at $T = 1$.

Let's see how the Rényi entropy is related to the free energy

$$F(\beta) = -\frac{1}{\beta} \ln Z(\beta)$$

It's important, because it equals the total expected energy of our system, minus the energy in the form of heat. Roughly speaking, it's the energy that you can use.

The proof is a trivial calculation:

$$H_\beta = \frac{1}{1 - \beta} \ln \sum_i p_i^\beta = \frac{1}{1 - \beta} \ln Z(\beta) = -\frac{\beta}{1 - \beta} F(\beta)$$

At least for $\beta \ne 1$. But you can also check that both sides of this equation have well-defined limits as $\beta \to 1$.

The relation between free energy and Rényi entropy looks even neater if we solve for $F$ and write the answer using $T$ instead of $\beta$:

$$F = (1 - T) \, H_{1/T}$$
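If you want to see this relation in action, here is a minimal numerical sanity check - a Python sketch, not something from the post itself; the helper names `renyi_entropy` and `free_energy` and the random test distribution are just illustrative assumptions. It picks a probability distribution, sets $E_i = -\ln p_i$ so that distribution is the Gibbs state at $T = 1$, and compares $F(T)$ with $(1 - T)\,H_{1/T}$ at a few temperatures.

```python
import numpy as np

# Sanity check of F(T) = (1 - T) * H_{1/T} (sketch; names and the random
# test distribution are illustrative, not taken from the original post).

def renyi_entropy(p, beta):
    """Rényi entropy of order beta (beta != 1) of a discrete distribution p."""
    return np.log(np.sum(p ** beta)) / (1.0 - beta)

def free_energy(E, T):
    """Free energy F(T) = -T ln Z(T) for energy levels E at temperature T."""
    Z = np.sum(np.exp(-E / T))
    return -T * np.log(Z)

rng = np.random.default_rng(0)
p = rng.random(5)
p /= p.sum()          # an arbitrary probability distribution
E = -np.log(p)        # energies chosen so p is the Gibbs state at T = 1

for T in (0.5, 2.0, 3.7):
    beta = 1.0 / T
    print(f"T = {T}:  F(T) = {free_energy(E, T):+.6f}   "
          f"(1 - T) H_beta = {(1.0 - T) * renyi_entropy(p, beta):+.6f}")
```

For any distribution the two columns agree (for $T \ne 1$), since $F(T) = -T \ln Z(1/T)$ while $H_{1/T} = \ln Z(1/T) / (1 - 1/T)$.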