Say we discover (somehow) that it's actually impossible to simulate a human-level conscious mind on inorganic substrate. What did we discover?
(e.g. we're running in a simulation, and a resolution limit prematurely halts Moore's Law, etc.)
the word "simulate" is doing a lot of work here. simulate wrt what observables?
Turing-test-type things say more about the communication channel with the AI than they say about the AI
It's a good question! I'm not quite sure, so I'll pose a weak frame: suppose we discover (positively) that there is some reason why an inorganic computational system could never "seem" (to all humans) to behave with human-like conscious agency. What must be true of such a world?
"seem" through what medium? in what social context? with what priming? what observations do people do the "seeming" with?
Any and all—the most expansive and permissive possible combination of answers to these and related questions. People simply don't regard these entities as conscious, no matter what.
then you've baked these entities not being conscious into the definition of consciousness and so it doesn't tell you anything
I don't think that's really true. Say that no matter how much you practiced juggling, you could never make a ball appear to levitate in the air without support. I've baked this inability into the definition of non-levitation, but it can still be explained by physical law (i.e. gravity).
but you have a well-defined set of observables there (the position of the ball). you can say an object levitates iff its position wrt the ground as observed by some instrument doesn't change over time
I bet you /can/ make a ball appear to levitate with clever use of strobe lights
There are lots of human and non-human organic beings that don't pass Turing tests. So what? Inorganic beings will be conscious when people decide to treat them as such. Keep in mind that it does not have to be a universal belief in order to motivate people to act as if they are.
people treat furbys as conscious
This analogy does not convince me, because you CAN make a ball *appear* to levitate. So, if you can make a process *appear* to be conscious, what's the big deal if it actually isn't (according to your hypothesis)? What's the concrete consequence?