Conversation

Poll: If information is yin what is the most natural and interesting yang to it?
  • Energy: 14.6%
  • Ignorance: 17.5%
  • Emotions: 22.6%
  • Randomness: 45.3%
371 votes · Final results
It's more basic than that. Entropy can be cast precisely as a measure of ignorance: the more ignorant you are about the system's state, the higher its entropy. Granting information about the system (and thereby constraining its possible states) reduces that entropy.
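A quick numerical sketch of that claim (my own toy example, not from the thread): for any joint distribution over a system X and an observation Y, conditioning on Y can only lower the average uncertainty about X, i.e. H(X|Y) ≤ H(X). The joint probabilities below are made up purely for illustration.

```python
import numpy as np

# Toy joint distribution p(x, y) for a 2-state system X observed through Y.
# (Made-up numbers, chosen only to illustrate H(X|Y) <= H(X).)
p_xy = np.array([[0.40, 0.10],
                 [0.05, 0.45]])

p_x = p_xy.sum(axis=1)   # marginal over the system X
p_y = p_xy.sum(axis=0)   # marginal over the observation Y

def H(p):
    """Shannon entropy in bits, ignoring zero-probability outcomes."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_x = H(p_x)                      # ignorance about X before observing
H_xy = H(p_xy.flatten())          # joint entropy H(X, Y)
H_x_given_y = H_xy - H(p_y)       # chain rule: H(X|Y) = H(X,Y) - H(Y)

print(f"H(X)   = {H_x:.3f} bits")          # ~1.000 bit of ignorance
print(f"H(X|Y) = {H_x_given_y:.3f} bits")  # less: the observation reduced entropy
```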
Feels like there is a real difference here. Talking explicitly about an observer means talking about *two* information states and their mutual information. Whereas going from a single sinusoid to white noise seems like simply less vs. more information. Maybe the signal view vs. the system view is different 🤔
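To make the "signal view" concrete (again my own sketch, not something from the thread): one common way to quantify it is spectral entropy, which is near zero for a pure tone (all power in one frequency bin) and near its maximum for white noise (power spread over every bin).

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4096
t = np.arange(n)

sinusoid = np.sin(2 * np.pi * 50 * t / n)  # single tone: power in one frequency bin
noise = rng.standard_normal(n)             # white noise: power spread across all bins

def spectral_entropy(x):
    """Entropy (bits) of the normalized power spectrum -- one 'signal view' of information."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

print(f"sinusoid:    {spectral_entropy(sinusoid):.2f} bits")  # near 0
print(f"white noise: {spectral_entropy(noise):.2f} bits")     # near log2(n/2 + 1)
```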