Poll: If information is yin what is the most natural and interesting yang to it?
-
-
It's more basic than that. Entropy can be cast precisely as a measure of ignorance: more ignorance about the system means higher entropy. Granting information about the system (and thereby constraining its possible states) reduces the entropy.
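A minimal sketch of that reading, assuming discrete states and Shannon entropy in bits: total ignorance over 8 states is 3 bits, and learning the state lies in a 2-state subset drops it to 1 bit.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Maximal ignorance: 8 equally likely states.
prior = [1 / 8] * 8
print(shannon_entropy(prior))  # 3.0 bits

# Information constrains the possible states to a 2-state subset,
# and the entropy of the conditioned distribution drops.
posterior = [1 / 2] * 2
print(shannon_entropy(posterior))  # 1.0 bit
```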
-
Feels like there is a real difference here. Talking explicitly about an observer means talking about *two* information states and their mutual information, whereas going from a single sinusoid to white noise seems like simply less vs more information. Maybe the signal view and the system view really are different
End of conversation