Yes, this is another version of the “billions of tiny spooks” problem. As usual, no one has written a good explanation of this for a general audience, although it’s well understood in philosophy of mind. Why do I always have to do all the translational work? https://meaningness.com/representational-theory-of-mind …
Curious about why you people think the homunculus is still there.
(Awake briefly at 2:30am, will follow up later) (Speaking only for myself): It’s not there; it’s that cognitivist explanations of subjectivity, and of intentionality, can’t work without it.
Replying to @Meaningness @xuenay and
So, the question is: how do you get mental things in a materialist metaphysics, i.e. one without spooks (such as a homunculus)? Dennett is unusual in seemingly taking an “eliminativist” approach, i.e. simply denying that the mental phenomena (qualia, intentionality) exist.
(“Seemingly” because when I last read his stuff, which was like 30 years ago, he waffled a bit. He may have clarified or changed his position since, but I haven’t heard so.)
Eliminativism solves the logical problem, but hardly anyone else buys it. Also it seems to make the substantive part of the job of cogsci much harder, because you can no longer use mental entities in your explanations.
Also, Eliminativist: You don’t have subjective experiences. Anyone else: Yes I do! E: That’s just an illusion. A: An illusion is a mistaken subjective experience, and you just said I don’t have them.
E: Well, I was being polite. Actually, you are just wired up to say you have experiences. And beliefs. You don’t actually believe you have experiences, or anything else. No one is home; you are a low-quality robot. A: [Punches him]
So say we admit there is subjective experience and want to explain it. Generally experience is experience *of* something; it is “intentional” in the technical sense of *about* something. So how does it get its aboutness?
The usual cognitivist move is to make the intentionality of experience dependent on the intentionality of representations. That’s because for a while they thought they had an explanation for the intentionality of representations.
Or, actually, they thought the AI guys did; and we in AI thought they did. So both fields proceeded on the assumption that intentionality was understood, each leaving the hard part to the other. Once both sides realized this, the whole thing imploded.
If there were an explanation of intentionality, that wouldn’t be an explanation of subjectivity. One could imagine an AI with genuinely referring representations that has no subjectivity. In fact it is commonly (though mistakenly) believed that programs routinely do just that.
The prototype for the physical theory of mental representations is sentences written on paper. The difficulty is that they have meaning (intentionality) only for a reader. Who reads the sentences in our heads?