Any readings on the 1980s culture in AI research?
Replying to @11kilobytes
Hmm… nothing comes immediately to mind, I’m afraid
Replying to @Meaningness @11kilobytes
I’m sure lots was written at the time though
Replying to @Meaningness
That's a good suggestion. What are the "cognitive distortions"? Is it something like wrong modes-of-being that are promoted by Rationalist eternalism? Or perhaps cognitivism + representationalism?
Replying to @11kilobytes
Yes, exactly those! Plus maybe a kind of naive idealism and cluelessness about the world in general.
Replying to @Meaningness @11kilobytes
I'm guessing a lot of this has to do with reification of the left hemisphere's world, which is baked into most paradigms but particularly loud & explicit in AI-land. Thread, which I think you'd find worth reading and I'd be interested in your thoughts on: https://twitter.com/Malcolm_Ocean/status/1187752394702151681
Quoted tweet (Malcolm ☣️cean @Malcolm_Ocean): Reading this post by @JanelleCShane and I'm like "oh man, our current AI bots are basically left hemispheres". > "When confused, it tends not to admit it" https://aiweirdness.com/post/175110257767/the-visual-chatbot
Relatedly, the traditional "laws of thought" are "laws of the left hemisphere": the law of identity, the law of non-contradiction, the law of excluded middle. They only make sense in contextless spaces of zero nebulosity. AI folks may not literally think these, but... something similar. https://en.wikipedia.org/wiki/Law_of_thought#The_three_traditional_laws
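(For reference, here is a minimal formal statement of those three traditional laws in propositional notation; this gloss is mine, not part of the thread.)

```latex
% The three traditional "laws of thought", stated in propositional form
\begin{align*}
\text{Identity:}          \quad & A \to A                 \\
\text{Non-contradiction:} \quad & \lnot(A \land \lnot A)  \\
\text{Excluded middle:}   \quad & A \lor \lnot A
\end{align*}
```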
Replying to @Malcolm_Ocean @11kilobytes
I’m interested in McGilchrist because so many people are fans. However, everything I’ve read about the book suggests it’s a pretty standard dual-process theory. That does give some insight but is limited and well-trod. Maybe I’m missing what’s distinctive? https://meaningness.com/eggplant/cognitive-science#dual-process
I am trying to check my understanding here, but would it be correct to say that the problem with dual-process theories is that they: 1) assume reasonableness and rationality are "modules" in the mind, and 2) mix up reasonableness and the relevance realization (RR) system? RR maps onto...
Replying to @tr4nsmute @Meaningness and others
...what Heidegger meant by "the world presented to us is already meaningful", and reasonableness and rationality are things we do, which do not necessarily correspond to specific cognitive modules.
Yes, all those! And additionally a dual process theory has to lump together a lot of disparate things to get down to just two categories. Important distinctions are deliberately obscured to make the classification seem to work.
Replying to @Meaningness @tr4nsmute and others
Example. When the LW rationalists first encountered my ranting against rationalism, they assumed I meant “emotions are important too” or “intuition is superior” or “sometimes it’s useful to believe false things” etc.
Replying to @Meaningness @tr4nsmute and others
Which was natural because those ARE standard critiques of rationalism; but they aren’t at all what I was saying. If the 10,000 things we do that aren’t rationality are lumped together, then you can gesture at the whole lot and say “important too,” which is true but vague.
(8 more replies)