I understand, re your parable of the pebbles, that representation is a relationship between the thing with behavior and the world, which has to be actively maintained.
Replying to @robamacl @Meaningness and
This is very similar to Dennett's recent use of "aboutness" in his latest book (which I have not finished). That is perhaps a less loaded term than "representation."
I also understand that what we might call strong representationalism from AI circa 1980 did not pan out. This is the idea that we can "represent the world" in logic or whatever, then represent what we want to happen, run a SAT solver or whatever, and get useful behavior.
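A minimal caricature of that pipeline, just to make the idea concrete (the variables, the toy world model, and the brute-force search are all invented for illustration; real systems used genuine SAT solvers over much larger encodings):

```python
from itertools import product

# Toy "planning as satisfiability": encode the world and the goal as
# propositional constraints, search for a satisfying assignment, and
# read the required action off it.

VARS = ["door_open", "have_key", "action_unlock"]

def world_model(a):
    # "Representation of the world": the door is open iff we have the
    # key and perform the unlock action.
    return a["door_open"] == (a["have_key"] and a["action_unlock"])

def goal(a):
    # "Representation of what we want to happen."
    return a["door_open"]

def observed(a):
    # What we currently know about the world.
    return a["have_key"]

def solve():
    # Brute-force SAT: try every assignment (real solvers are cleverer).
    for values in product([False, True], repeat=len(VARS)):
        a = dict(zip(VARS, values))
        if world_model(a) and goal(a) and observed(a):
            return a
    return None

plan = solve()
print(plan)  # -> {'door_open': True, 'have_key': True, 'action_unlock': True}
```

The satisfying assignment sets `action_unlock` true, which the system would then execute; the '80s hope was that this scheme would scale to real environments.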
This failure is unsurprising in hindsight. One clue is that this clearly is not how relatively simple organisms such as flatworms generate behavior.
I'd say that at least one thing I mean by representation is that it causally mediates behavior: if I change the state of the representation, the behavior will change.
So, we might say the position of the bimetallic strip in an (old) thermostat represents the temperature. If I push on the strip, I can change that position and make the furnace go on.
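A toy sketch of that causal-mediation criterion (the class and method names are mine, purely illustrative): behavior depends only on the internal state, so intervening on that state changes the behavior even though the world has not changed.

```python
class Thermostat:
    """Toy thermostat whose internal state mediates its behavior."""

    def __init__(self, setpoint):
        self.setpoint = setpoint
        self.strip_reading = None  # stands in for the bimetallic strip's position

    def sense(self, room_temp):
        # Normally the strip tracks the actual room temperature.
        self.strip_reading = room_temp

    def push_strip(self, reading):
        # Intervening directly on the internal state, bypassing the world.
        self.strip_reading = reading

    def furnace_on(self):
        # Behavior depends only on the internal state, not the room itself.
        return self.strip_reading < self.setpoint

t = Thermostat(setpoint=20)
t.sense(room_temp=25)
print(t.furnace_on())     # False: the room is warm
t.push_strip(reading=15)  # "push on the strip"
print(t.furnace_on())     # True: behavior changed, the world didn't
```

The same intervention test distinguishes the shepherd's stones (removing one changes his behavior) from the girl's stones (no effect).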
Likewise, if someone stole a stone from the shepherd's basket, his behavior would change. But the stones carried by the passing girl have no such effect.
Replying to @robamacl @OortCloudAtlas and
Well, this was the main topic of philosophy of mind in the 80s, and I think it's reasonably fair to say that it ended with everyone giving up and moving on. Dennett was one major player. The problems are hairy and not easily summarized.
Replying to @Meaningness @robamacl and
The buzzword was "intentionality" in analytic philosophy and "the symbol-grounding problem" in AI. The "binding problem" was a related manifestation in connectionism. I'd suggest the SEP articles cited here? [Eggplant draft text] pic.twitter.com/qQoBMCroDV
“Intentionality” is just jargon for “aboutness.” The question is how can a physical thing-in-the-head be about something else. The usual analysis is that the bimetallic strip is *not* intentional because it’s too closely coupled causally. (Not everyone agrees on this point.)
The retinotopic maps in V1 cortex are arguably intentional but, like the bimetallic strip, they are arguably too mechanistic to count. There's a sense that a representation has to be sufficiently separated that it can be wrong, and that it involves interpretation, not just bottom-up flow.