5/ As you read that list, your mind might readily jump to examples of each of them, so effortlessly that they seem pre-formed in some vast internal archive, ready to be “read off” at a moment’s notice
6/ But your mind is making each of those up in the moment, the same way the next tweet seems to exist pre-made just off the edge of the screen, but is actually being generated on demand right at the instant you scroll up
7/ Much of the book is dedicated to detailed explanations of a long series of experiments that support these assertions, from perception to memory to visual acuity to emotion to logic to physiology
8/ In AI, the long-running project was to “excavate our mental depths, and to bring to the surface as much of this supposed inner storehouse of beliefs as we can.” But the knowledge, insights, and strategies that top human players claim to be using turn out to be mutually contradictory
9/ Instead, huge strides in machine learning have been made by completely bypassing human knowledge extraction, and creating algorithms that directly confront the problem to be solved and learn from experience
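The "learn from experience" idea can be made concrete with a toy sketch (mine, not the book's): an epsilon-greedy bandit that figures out which of two slot-machine arms pays more, using nothing but observed rewards. No human knowledge about the arms is encoded anywhere; the payout rates below are illustrative assumptions.

```python
import random

random.seed(0)

true_payout = [0.3, 0.7]   # hidden from the learner
estimates = [0.0, 0.0]     # learned value of each arm
counts = [0, 0]            # pulls per arm

for step in range(2000):
    # explore 10% of the time, otherwise exploit the current estimates
    if random.random() < 0.1:
        arm = random.randrange(2)
    else:
        arm = 0 if estimates[0] >= estimates[1] else 1
    reward = 1.0 if random.random() < true_payout[arm] else 0.0
    counts[arm] += 1
    # incremental average: nudge the estimate toward the observed reward
    estimates[arm] += (reward - estimates[arm]) / counts[arm]

print(estimates, counts)
```

After a few hundred pulls the estimates approach the true payout rates and the learner concentrates its pulls on the better arm, despite never being told anything about the problem beyond the reward signal.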
10/ “Chess grandmasters can’t really explain how they play chess; doctors can’t explain how they diagnose patients; and none of us can remotely explain how we understand the everyday world of people and objects.”
11/ “What we say sounds like explanation – but really it is a terrible jumble that we are making up as we go along... analysis of these streams of verbal description, however long they continue, shows that they are little more than a series of loosely connected fragments.”
12/ This is called the Illusion of Explanatory Depth: “the bizarre contrast between our feeling of understanding and our inability to produce cogent explanations, whether explaining how a fridge works, how to steer a bicycle, or the origin of the tides...”
13/ In linguistics, Noam Chomsky’s generative grammar tried to systematize the structure of language, but it turns out that even the structural patterns observed in language – not just its meaning – are a jumble of inconsistent regularities, sub-regularities and outright exceptions
14/ In economics, the assumption that consumers and companies have a complete and consistent understanding of their own preferences has also foundered
To be fair, those assumptions themselves are post hoc rationalizations of simpler models that just worked!