Patterns are the presentation. Before I can say that a red octagon is a 'sign' that represents the obligation to STOP, I have to be presented with a red octagon. It need not even be considered a pattern: it is a directly visible presence.
Replying to @S33light @Abel_TorresM
Again, no. The octagon requires establishing shapes over gestalts over edges over adjacency relations. Each layer has to be constructed in a learning process and is not immediately given.
Replying to @Plinz @Abel_TorresM
The gestalts have to be presented first before anything can be learned about them. The gestalt is visible and it need not represent anything at all.
Replying to @S33light @Abel_TorresM
From the perspective of your brain, the patterns on the retina are not even ordered before it learns co-occurrence statistics. After learning a lot of those, it can get a 2D map, but not a gestalt yet. Try coding it; you'll see.
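A minimal sketch of that claim, under toy assumptions (an 8x8 "retina", smooth random stimuli, classical MDS for the embedding; none of this is anyone's actual model of cortex): pairwise correlations over unordered channels are enough to recover a 2D layout.

```python
# Pixel identities arrive unordered; co-occurrence (correlation) statistics
# across many stimuli suffice to recover a 2D map. Toy demo only.
import numpy as np
from scipy.ndimage import gaussian_filter
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
side, n_images = 8, 2000

# Smooth random stimuli: nearby pixels are correlated, distant ones are not.
images = np.stack([gaussian_filter(rng.standard_normal((side, side)), sigma=1.5)
                   for _ in range(n_images)])
X = images.reshape(n_images, -1)     # each column is one pixel's history

perm = rng.permutation(side * side)  # destroy the ordering: the "brain"
X = X[:, perm]                       # only sees unlabeled channels

# Co-occurrence statistics: high correlation ~ spatially adjacent.
corr = np.corrcoef(X.T)
dist = 1.0 - corr                    # turn similarity into dissimilarity

# Embed the dissimilarities in 2D; the grid layout reappears.
coords = MDS(n_components=2, dissimilarity='precomputed',
             random_state=0).fit_transform(dist)
print(coords.shape)                  # (64, 2): a recovered 2D map
```

The recovered coordinates match the true grid only up to rotation and reflection, which the statistics alone cannot fix, and, as the tweet says, the result is a map, not yet a gestalt.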
Gestalts are formed in the mind as a way of compressing the patterns (i.e. we discover exactly those gestalts that best encode the patterns, at the level between edge and shape).
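A hedged sketch of that compression story: sparse dictionary learning on patches of synthetic bar images (the stimuli, patch size, and hyperparameters are all assumptions for illustration) discovers bar/edge-like atoms, i.e. mid-level features of exactly the kind that best encode the patterns.

```python
# "Gestalts as compression": sparse coding forces each patch to be explained
# by a few dictionary atoms, so the learned atoms end up bar/edge-like --
# features between raw pixels and whole shapes. Illustrative, not a brain model.
import numpy as np
from sklearn.feature_extraction.image import extract_patches_2d
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.default_rng(0)

def bar_image(size=16):
    """Random image containing a few horizontal/vertical bars."""
    img = np.zeros((size, size))
    for _ in range(3):
        pos = rng.integers(0, size)
        if rng.random() < 0.5:
            img[pos, :] = 1.0
        else:
            img[:, pos] = 1.0
    return img

patches = np.concatenate([
    extract_patches_2d(bar_image(), (6, 6), max_patches=20, random_state=0)
    for _ in range(300)
])
X = patches.reshape(len(patches), -1)
X -= X.mean(axis=1, keepdims=True)   # remove the DC component per patch

dico = MiniBatchDictionaryLearning(n_components=16, alpha=1.0,
                                   batch_size=64, random_state=0)
dico.fit(X)
print(dico.components_.shape)        # (16, 36): learned mid-level atoms
```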
Replying to @Plinz @Abel_TorresM
There is no justification for a data-compression scheme to be rendered as visible. Brain activity can be re-represented at lower resolution in another part of the brain, but that activity is the data. No sights or models are generated; it's just chemistry repeating.
Replying to @S33light @Abel_TorresM
Vision _is_ the set of functions representing the data compression (as textured objects in a space of positions). It does not exist in addition to that, or independently of it.
Replying to @Plinz @Abel_TorresM
It can't be, because you can have the exact same data compression in a computer without any graphic output or isomorphic geometry in the circuits. There's no triangle in the computer hardware, only in our experience of a screen.
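A small illustration of the point, assuming a run-length codec purely for concreteness (any lossless codec would do): a triangle bitmap compressed into a flat stream that contains all the information but no triangle-shaped structure.

```python
# The same information, losslessly compressed, with nothing triangle-shaped
# left in the stored representation.
triangle = [
    "....#....",
    "...###...",
    "..#####..",
    ".#######.",
]
flat = "".join(triangle)             # 1D stream: the geometry is already gone

def rle(s):
    """Run-length encode: [(char, run_length), ...]."""
    out, i = [], 0
    while i < len(s):
        j = i
        while j < len(s) and s[j] == s[i]:
            j += 1
        out.append((s[i], j - i))
        i = j
    return out

code = rle(flat)
print(code)                          # [('.', 4), ('#', 1), ('.', 7), ('#', 3), ...]

# The encoding is invertible, so nothing is lost:
assert "".join(c * n for c, n in code) == flat
```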
Replying to @S33light @Abel_TorresM
The arrangement of the computer hardware in your mind's model of physical space is not directly relevant for the function it computes. Experience emerges over models (which are computed functions), not over the substrate.
Replying to @Plinz @Abel_TorresM
Why would any such thing as 'experience' 'emerge', and how? The brain needs no models; it's got hardware states controlling all behaviors, just like the computer.
The models are macro state descriptions of the brain's micro states. You (a model of an experiencing subject) exist within the models, not within the brain.
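A toy rendering of the macro/micro distinction (the grouping of units into regions is an arbitrary assumption; the point is only the many-to-one mapping): many distinct micro states collapse onto the same macro description.

```python
# Many micro states (individual unit activations) map onto one macro state
# (per-region rates); the model lives at the coarser level.
import numpy as np

rng = np.random.default_rng(0)
n_units, n_regions = 100, 4

micro = rng.random(n_units) < 0.3    # micro state: which units fire
regions = np.array_split(np.arange(n_units), n_regions)

macro = [micro[idx].mean() for idx in regions]
print(macro)                         # four per-region firing rates

# Permuting units within a region generally yields a different micro state,
# yet the macro description is untouched.
idx0 = regions[0]
shuffled = micro.copy()
shuffled[idx0] = micro[rng.permutation(idx0)]
assert [shuffled[idx].mean() for idx in regions] == macro
```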
Replying to @Plinz @Abel_TorresM
Models and descriptions have no physical or phenomenal power. Before an input can be read as a signal, there must be some innate phenomenological capacity to receive, inspect and signify. Without that, no physical event is an input/output or even a cause/effect.