A red-hot rock has a lot of energy due to the random motion of its molecules. You can't do anything with this energy if the rock is in an equally red-hot furnace. You can if you put it in contact with something colder: you can boil water, make steam and drive a piston. (2/n)
The thermal energy in a red-hot rock can't do work in an environment at the same temperature. So this energy is not "free energy". But if the rock is moving, it has "free energy". You can do work with this energy - even in an environment at the same temperature! (3/n)
Amazingly, there's a formula for free energy, which turns it into a precise and useful concept. It's F = <E> - TS where <E> is the system's expected energy, T is its temperature and S is its entropy. (Experts will now start to complain, but I know what I'm doing.) (4/n)
Why do I say "expected" energy? "Expected" means "average" or "mean". We're actually doing probability theory here, since our rock (or whatever) may have randomly moving parts. Concepts like "temperature" and "entropy" also involve probabilities. (5/n)
What's the basic idea of F = <E> - TS ? I like to say: free energy is the energy minus the energy due to being hot. The "energy due to being hot" is temperature times entropy. (6/n)
But what's really going on here? In which situations does "free energy" make sense? It's very general. We can define free energy whenever we have a finite set X with a probability distribution p, a real-valued function E on X, and a number T called "temperature". (7/n)
We can define the "entropy" S of a probability distribution p on a finite set X. It's S = -sum p(i) log(p(i)) where we sum over all points i of X. This is biggest when p is smeared-out and flat, smallest when p is zero except at one point. It measures randomness. (8/n)
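The entropy formula above is easy to check numerically. Here's a minimal sketch in Python (the function name `entropy` is my own, not from the thread); note that terms with p(i) = 0 are skipped, since p log p tends to 0 there:

```python
import math

def entropy(p):
    """Shannon entropy S = -sum_i p(i) log(p(i)) of a probability
    distribution given as a list of probabilities summing to 1.
    Terms with p(i) = 0 contribute nothing (the limit of p log p is 0)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Flat distribution on 4 points: maximal entropy, log(4)
print(entropy([0.25, 0.25, 0.25, 0.25]))   # ≈ 1.386

# All probability on one point: minimal entropy
print(entropy([1.0, 0.0, 0.0, 0.0]))       # 0.0
```

As the tweet says, the flat distribution gives the biggest value and the concentrated one gives zero.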
We can also define the "expected value" of any function E: X -> R when we have a probability distribution p on a finite set X. It's <E> = sum p(i) E(i) where we sum over all points i of X. This is just the average value of E, weighted by the probability of each point. (9/n)
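The expected value is just a probability-weighted sum, which is a one-liner. A sketch (again with a made-up function name):

```python
def expected(E, p):
    """Expected value <E> = sum_i p(i) E(i), where E is a list of
    values of the function and p the matching list of probabilities."""
    return sum(pi * Ei for pi, Ei in zip(p, E))

# E takes values 1, 2, 3 with probabilities 0.5, 0.25, 0.25:
print(expected([1.0, 2.0, 3.0], [0.5, 0.25, 0.25]))   # 1.75
```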
So, now you know the definition of the "free energy" F = <E> - TS for any number T, any real-valued function E: X -> R and any probability measure p on any finite set X. Learning why this is so great takes longer! You need to learn what you can do with it. (10/n)
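Putting the two ingredients together, the whole definition fits in a few lines of Python (function names are mine, chosen for readability):

```python
import math

def entropy(p):
    """S = -sum_i p(i) log(p(i)); zero-probability terms contribute 0."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def expected(E, p):
    """<E> = sum_i p(i) E(i)."""
    return sum(pi * Ei for pi, Ei in zip(p, E))

def free_energy(E, p, T):
    """F = <E> - T S for energies E, distribution p, temperature T."""
    return expected(E, p) - T * entropy(p)

E = [0.0, 1.0, 2.0]
p = [0.5, 0.3, 0.2]
print(free_energy(E, p, 1.0))   # ≈ -0.33
```

At T = 0 the free energy is just the expected energy; raising T subtracts off more and more "energy due to being hot".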
If you want to learn a tiny bit more about why free energy is important, try Section 1 of this paper: https://arxiv.org/abs/1311.0813 Here Blake Pollard and I quickly explain why a system in equilibrium at some temperature T will minimize its free energy. (11/n, n = 11)
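You can see the minimization claim numerically: among all distributions p on X, the Boltzmann distribution p(i) proportional to exp(-E(i)/T) gives the smallest free energy, and that minimum equals -T log Z where Z is the normalizing constant. A sketch checking this against random competitors (not code from the paper):

```python
import math, random

def entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def free_energy(E, p, T):
    return sum(pi * Ei for pi, Ei in zip(p, E)) - T * entropy(p)

def boltzmann(E, T):
    """Equilibrium distribution p(i) = exp(-E(i)/T) / Z."""
    w = [math.exp(-Ei / T) for Ei in E]
    Z = sum(w)
    return [wi / Z for wi in w], Z

E = [0.0, 1.0, 2.0, 5.0]
T = 1.5
p_eq, Z = boltzmann(E, T)
F_eq = free_energy(E, p_eq, T)

# The equilibrium free energy equals -T log Z:
print(F_eq, -T * math.log(Z))

# No random distribution does better:
random.seed(0)
for _ in range(1000):
    w = [random.random() for _ in E]
    p = [wi / sum(w) for wi in w]
    assert free_energy(E, p, T) >= F_eq - 1e-12
```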
Nice thread. Accessibility and precision are in perfect harmony.
Thanks.