Your prior probability that my next tweet would be this exact sentence was probably somewhere around 10^-30.
Replying to @ModelOfTheory
Each letter in a typical English sentence carries around 1 bit of information, and a tweet is at most 140 characters, so that's around 2^140 possibilities, which is about 10^42. I'm not sure how much information is gained from knowing who the tweeter is.
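The arithmetic above can be sketched directly (the 140-character length and 1 bit/character figure are the thread's rough assumptions, not exact values):

```python
import math

# Rough estimate from the thread: a tweet of up to 140 characters,
# at ~1 bit of information per character of typical English text.
chars = 140
bits_per_char = 1.0

total_bits = chars * bits_per_char            # 140 bits of information
log10_outcomes = total_bits * math.log10(2)   # convert 2^140 to a power of 10

print(f"2^{total_bits:.0f} is about 10^{log10_outcomes:.1f}")
print(f"so the prior probability is roughly 10^-{log10_outcomes:.0f}")
```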
Replying to @noop_noob @ModelOfTheory
I'd have put the prior probability higher than 10^-20. *Typical English* has an entropy of about a bit per letter, but weird sun tweets are much more predictable, and talking about priors and self-references is common. And once you get to "was," the remaining text is low entropy.
Replying to @davidmanheim @noop_noob
Typical English also has long stretches of predictable text, and I assumed the 1 bit per character estimate takes that into account. I'm not convinced that weird sun tweets are significantly more predictable than typical English text.
I thought about it again, and I think you're right that my tweet was significantly lower entropy than typical English. I could buy that the prior probability of it should have been more like 10^-20.
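Working backwards from the revised 10^-20 figure gives the implied per-character entropy (a hypothetical back-calculation, again assuming the 140-character length):

```python
import math

# If the prior probability of the exact tweet is ~10^-20, how many bits of
# information does that correspond to, and what per-character entropy does
# that imply over a 140-character tweet?
chars = 140
target_log10_prob = -20

bits_needed = -target_log10_prob / math.log10(2)  # total information content
per_char = bits_needed / chars                    # implied entropy per character

print(f"total information: {bits_needed:.1f} bits")
print(f"implied entropy: {per_char:.2f} bits per character")
```

A result of roughly half a bit per character is consistent with the claim that this kind of tweet is much more predictable than typical English prose.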