conscious thought doesn't have to be verbal, and in fact I'd expect "inner monologue" type people to be sapir-whorfed in a much stronger way than nonverbal thinkers
Replying to @allgebrah @0xa59a2d
imagine believing that constraining your thinking to the well-worn, socially-mediated bounds of natural language is good lmfao
Replying to @regretmaximizer @allgebrah
imagine believing there's such a thing as a language whose bounds aren't socially mediated
Replying to @kushnerbomb @allgebrah
Languages in the formal sense (a mapping between arbitrary symbols and a meaning) don't have to be. Since babies can think before they can speak, there presumably exists some primitive language in our brain that we use for reasoning.
Replying to @SOXCITEDTOTWEET @kushnerbomb
"since babies can think before they can speak" I follow you this far, good example. "there presumably exists some primitive language" you lost me again.
Replying to @allgebrah @kushnerbomb
Think programming languages. Computers can run languages like javascript because they've been "taught" to do so by installing an interpreter. You could say js is socially mediated between computers! But to be able to install it, for a computer to be able to do anything...
Replying to @SOXCITEDTOTWEET @allgebrah
it must interpret a lower-level language called machine code. How that language is interpreted is hard-coded when the machine is manufactured. I meant primitive in the programming sense, i.e. something provided by the environment rather than the user. For example, ...
Replying to @SOXCITEDTOTWEET @allgebrah
addition is a primitive feature in C, but functions defined in C by the programmer are not. For anything to work, primitive features must exist - it can't be turtles all the way down. I assume there must be some hardcoded form of representing info in our minds so we can...
Replying to @SOXCITEDTOTWEET @allgebrah
manipulate it. If there's no symbolic repr that can be manipulated, everything would have to be instinctual responses, and multi-step reasoning outside a limited set of cases would be impossible, I think (but can't prove!).
Replying to @SOXCITEDTOTWEET @kushnerbomb
the part where I didn't follow was the presupposition that the baby brain thinks in any language at all; sure (assuming a computable brain) you can formalize it as a turing machine of some sort programmed in some sort of language, but it's not clear whether it's interpretable
(interpretable as in "interpretability problem") and even then, you'd need to distinguish between the formal description and what the program/process experiences, for example suppose you're doing image recognition, that program won't be experiencing its own code
Replying to @allgebrah @SOXCITEDTOTWEET
Sure if you're conscious, you'll experience a dumbed-down version of the whole machinery, but why would that experience necessarily be words instead of, say, some form of proprioception? Suppose you grab an apple, do your limbs beam words at you instead of a feeling of weight?