Of course, I might just be entirely misunderstanding the whole thing. :-) Feel free to let me know in that case.
Hmm, yes, this does need more explanation than is feasible on Twitter!
Replying to @Meaningness @xuenay
I guess I can try to clarify one thing. The observation is not that submind theory requires a homunculus any more than some other similar theory. It’s that it doesn’t need spookiness any less. That is: sharding the ghost in the machine doesn’t help; you just get lots of ghosts.
Replying to @Meaningness @xuenay
Unless we're talking about subminds that aren't minds in any typical sense of the word. Kind of like adding epicycles? Maybe wrong, but it can still be useful.
Replying to @garybasin @xuenay
Well, Minsky’s SOM project was to understand intelligence by breaking it into successively smaller, less-intelligent pieces, until you got to pieces that don’t need to be intelligent at all. That didn’t work then, but there’s no a priori reason it couldn’t work (afawk).
Replying to @Meaningness @garybasin
It doesn’t seem that you can apply the same approach to intentionality or subjectivity, though. There’s no concept of “simpler and therefore somewhat less referential” or “simpler and somewhat less aware.”
Replying to @Meaningness @garybasin
Specifically wrt submind theory, the subminds are taken as having beliefs, desires, and intentions, which are no less spooky than those of the person as a whole.
Replying to @Meaningness @garybasin
As Rin’dzin mentioned in the podcast, I find the submind approach *majorly* valuable in understanding myself, but I regard it as a heuristically useful metaphor, rather than as an actual explanation.
Replying to @Meaningness @garybasin
My experience of stuff arising in mind is that it’s arbitrary and random. I find it hard to ascribe intentionality/agency to it, even when points of reference for the content of the thought, or whatever is arising, are apparent. Didn’t think to mention that in the podcast.
Replying to @_awbery_ @garybasin
Yes... my take is that some of us organize some of that random material into person-like clusters. Probably initially through internalization of relationships, in the ways Vygotsky and the object-relations school both separately theorized.
Most of it remains unorganized and is “mine, I guess.” Maybe the extent of containerizing varies a lot? Taken to an extreme in dissociative identity disorder, and maybe you do it less than I do.
Replying to @Meaningness @garybasin
Makes sense. I think my meditation practice has also trained me to experience arbitrariness rather than agency. Not necessarily incoherent, but spontaneously random when not manipulated.