Conversation

i don't have very clean formulations of anything yet or i'd just write them down, but for starters: 1) there's no such thing as entropy, only relative entropy (KL divergence); entropy as normally understood is, up to sign and an additive constant, relative entropy wrt the uniform distribution (on a finite set)
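A quick numerical check of that identity (just a sketch; the function names and the `numpy` usage are illustrative choices, not from the thread): for a distribution p on n points, D(p || uniform) = log n − H(p), so Shannon entropy is log n minus the relative entropy to the uniform distribution.

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum_i p_i log p_i, in nats."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -np.sum(nz * np.log(nz))

def kl_divergence(p, q):
    """D(p || q) = sum_i p_i log(p_i / q_i), in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / q[nz]))

p = np.array([0.5, 0.25, 0.125, 0.125])
n = len(p)
uniform = np.full(n, 1.0 / n)

# H(p) = log n - D(p || uniform): Shannon entropy is determined by
# the relative entropy to the uniform distribution, and vice versa
assert np.isclose(shannon_entropy(p), np.log(n) - kl_divergence(p, uniform))
```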
wait, so entropy is a statistical phenomenon and not strictly a 'natural property' of the world (unlike, e.g., the Planck constant)? So it seems entropy would occur in a fairly simple mathematical facsimile of reality
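For what it's worth, here is a minimal sketch of such a "simple mathematical facsimile": the Ehrenfest urn model (the parameters and variable names below are arbitrary choices, not anything from the thread). N particles sit in two boxes; at each step one particle, chosen uniformly at random, hops to the other box. Started from the all-in-one-box state, the system drifts to the half-and-half macrostate and then fluctuates around it.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 1000        # number of particles
steps = 5000
left = N        # start in a very non-generic (low-entropy) state: everything in the left box

history = []
for _ in range(steps):
    # pick a particle uniformly at random and move it to the other box
    if rng.random() < left / N:
        left -= 1
    else:
        left += 1
    history.append(left / N)

# the fraction in the left box relaxes toward 1/2 (the "generic" macrostate)
print(history[0], history[steps // 2], history[-1])
```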
and i'm still learning about the mathematics of very large numbers of microscopic bits, but the basic idea is that in that regime things have a "generic" ("high-entropy") behavior, and everything else is exponentially unlikely relative to it (e.g. heat spontaneously flowing from cold to hot)
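A concrete instance of "exponentially unlikely", using the standard large-deviations (Sanov-type) estimate rather than anything specific to the thread (the numbers below are arbitrary): for N independent fair bits, the probability of seeing a fraction x ≠ 1/2 of ones decays like exp(−N · D(x || 1/2)), so the suppression of non-generic macrostates is again governed by a relative entropy.

```python
from math import comb, exp, log

def kl_bernoulli(x, q=0.5):
    """D(Bernoulli(x) || Bernoulli(q)) in nats."""
    return x * log(x / q) + (1 - x) * log((1 - x) / (1 - q))

N = 200
x = 0.7                      # an atypical macrostate: 70% of the bits are 1
k = int(x * N)

# exact probability that exactly k of N fair bits come up 1
exact = comb(N, k) * 0.5 ** N

# large-deviations estimate: exp(-N * relative entropy)
estimate = exp(-N * kl_bernoulli(x))

# the two agree up to a prefactor that is only polynomial in N,
# and both shrink exponentially as N grows
print(exact, estimate)
```

Doubling N roughly squares the suppression factor, which is the sense in which something like heat flowing from cold to hot becomes effectively impossible at macroscopic scales.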
most of this should be basically standard statistical mechanics in some sense, certainly not claiming to have any new ideas here, but hopefully i can at least try to write things down in a language that makes sense to me and other mathematically inclined people