The Sapir-Whorf hypothesis (that language defines what we can perceive and think) is mostly wrong for natural language, but largely true for programming. Computer languages don't differ in what they can do, but in how they let us think.
You're right; Everett more or less demonstrated that ambiguity in this paper (https://daneverettbooks.com/wp-content/uploads/2014/04/FEFG-cognition-in-press.pdf). He also showed that numbers are better thought of as compressing information into what's useful, the implication being that it's not at all inevitable that we develop numbers.
I also like how counting-as-a-technology gels with the paper @devonzuegel posted: https://twitter.com/devonzuegel/status/987860902056681472. Moreover, @michael_nielsen's response has a clear analog to the Pirahã: they have no means of conveying where something is, and they have a problem with cultural memory.