1/ Information is rare and hides in pockets as it battles the universe’s perennial march to disorder: the growth of entropy.
2/ Earth is a very special place in the Universe: it is a singularity of physical order, that is, of information as defined by Shannon in information theory.
3/ First, information is physical, or more precisely, it is physically embodied. Information is not a thing; rather, it is the arrangement of physical things. It is not an amorphous soup of atoms but physical order.
4/ Crash a $1M Ferrari into a wall: its dollar value evaporates while its weight doesn't. The value was stored in the way those atoms were arranged. That arrangement is information.
5/ Second, information is meaningless. Meaning only emerges when a message reaches a life-form or a machine with the ability to process information. Meaning is not carried in the message; it is derived from context and prior knowledge.
6/ History from the lifeless (physical) to the living (biological) to the social and then to the economic is centered not so much on the arrow of time as on the arrow of complexity: the growth of information.
7/ The mechanisms supporting this growth of information rest on three main pillars: out-of-equilibrium systems, the accumulation of information in solids, and the ability of matter to compute.
8/ In a closed physical system, the second law of thermodynamics states that entropy always tends to increase, meaning that systems march from order to disorder, from an information-rich state to an information-poor state.
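This march from order to disorder can be illustrated with a toy simulation (not from the thread; the box size, particle count, and step count are arbitrary choices): particles start packed into one cell of a box, then hop randomly, and the Shannon entropy of their occupancy histogram climbs toward its maximum.

```python
import math
import random

def shannon_entropy(counts):
    """Shannon entropy (in bits) of a histogram of counts."""
    total = sum(counts)
    h = 0.0
    for c in counts:
        if c:
            p = c / total
            h -= p * math.log2(p)
    return h

random.seed(0)
CELLS = 8
particles = [0] * 200  # all particles start in cell 0: maximal order

def entropy_now():
    counts = [0] * CELLS
    for pos in particles:
        counts[pos] += 1
    return shannon_entropy(counts)

h_start = entropy_now()  # 0.0 bits: one certain configuration
# Random hops: each step, one particle moves to a neighbouring cell
# (moves off the edge leave it in place).
for _ in range(20000):
    i = random.randrange(len(particles))
    particles[i] = max(0, min(CELLS - 1, particles[i] + random.choice((-1, 1))))
h_end = entropy_now()  # approaches log2(8) = 3 bits, the disordered maximum

print(f"entropy: {h_start:.2f} -> {h_end:.2f} bits")
```

The ordered start carries zero entropy; after enough random hops the histogram flattens out and the entropy saturates near 3 bits, the value of a uniform spread over 8 cells.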
Replying to @Kpaxs
I would say that in this case it is clearly the opposite: you cannot extract data from a perfectly ordered system; entropy = 0 is equal to information = 0. It is when entropy grows that you obtain information from it. The greater the number of possible states, the more data it generates.
True if you consider the definition of information as stated by Shannon. But there are many other possible definitions.
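The point under debate is concrete under Shannon's definition (a small sketch added for illustration, not from the thread): a perfectly ordered source, where one outcome is certain, has zero entropy, while a maximally disordered source over N equally likely outcomes has the maximum entropy log2(N).

```python
import math

def shannon_entropy(probs):
    """H(p) = -sum p*log2(p), in bits; zero-probability terms are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A perfectly ordered source: one outcome is certain.
h_ordered = shannon_entropy([1.0])       # 0.0 bits
# A maximally disordered source over 8 equally likely outcomes.
h_uniform = shannon_entropy([1 / 8] * 8)  # log2(8) = 3.0 bits

print(h_ordered, h_uniform)
```

Under this measure, the reply is right that a perfectly ordered source yields no data per message, which is exactly why the thread's "information as physical order" reading and Shannon's "information as surprise" reading can point in opposite directions.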