i don't have very clean formulations of anything yet or i'd just write them down, but for starters: 1) there's no such thing as entropy, only relative entropy (KL divergence); entropy as normally understood is, up to sign and an additive constant, relative entropy wrt the uniform distribution on a finite set: H(p) = log n − D(p ‖ U_n)
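a quick numerical sanity check of that identity (names here are just illustrative, not from any particular library): for a distribution p on n points, H(p) = log n − D(p ‖ uniform).

```python
import math

def entropy(p):
    # Shannon entropy in nats
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def kl(p, q):
    # relative entropy D(p || q) in nats
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.5, 0.25, 0.25]
n = len(p)
uniform = [1 / n] * n

# identity: H(p) == log n - D(p || uniform)
print(entropy(p))                       # ~1.0397
print(math.log(n) - kl(p, uniform))     # ~1.0397, same value
```

so "entropy" is really just measuring how far you are from uniform, with the reference distribution baked in silently.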