Conversation

[Tweet deleted by author]
[Five Tweets hidden: protected account]
(interpretable as in "interpretability problem"), and even then you'd need to distinguish between the formal description and what the program/process experiences. For example, suppose you're doing image recognition: that program won't be experiencing its own code
Sure, if you're conscious you'll experience a dumbed-down version of the whole machinery, but why would that experience necessarily be words instead of, say, some form of proprioception? Suppose you grab an apple: do your limbs beam words at you instead of a feeling of weight?