Siri, write this down immediately. "A movie where robots are performing reverse-Turing tests on humans to see if we're smart enough to be considered sentient..."
We have no empirical metrics for what consciousness even is. It's a completely emergent property, and this is a long-running philosophical debate. People also disagree on whether or not animals are actually conscious. So if people don't even think their dog is conscious, their ability to decide whether an algorithm is would be questionable.
The weirdest problem we're going to have is that AI could get really good at faking consciousness. It could merely be mimicking consciousness based on what it has learned consciousness should look like, while really just regurgitating information.
So how do we tell the difference between “real” consciousness and mimicry?
Philosophers haven't come up with a good way to determine whether you are the only conscious being in a universe otherwise populated by zombies. See also https://en.wikipedia.org/wiki/Philosophical_zombie
I'd also suggest going through https://www.3blue1brown.com/topics/neural-networks to understand some of the concepts behind it. The "regurgitating information" part isn't entirely accurate (and that statement is likely to evoke some debate). The source data is compressed far too much for it to be returned largely intact (see also https://en.wikipedia.org/wiki/Kolmogorov_complexity for a bit of the information theory)... though yes, there are some situations where specific passages (or images) get memorized.
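To make the compression point concrete, here's a rough back-of-the-envelope sketch. All of the numbers (training token count, bytes per token, parameter count, weight precision) are assumptions picked purely for illustration, not measurements of any specific model, but the conclusion holds across a wide range of plausible figures:

```python
# Back-of-the-envelope check of the "too compressed to be regurgitation" point.
# All figures below are illustrative assumptions, not measurements of any
# particular model; swap in numbers you trust.

TRAINING_TOKENS = 10e12   # assume ~10 trillion training tokens
BYTES_PER_TOKEN = 4       # assume ~4 bytes of text per token on average
PARAMETERS = 70e9         # assume a ~70B-parameter model
BYTES_PER_PARAM = 2       # assume 16-bit weights

corpus_bytes = TRAINING_TOKENS * BYTES_PER_TOKEN
weight_bytes = PARAMETERS * BYTES_PER_PARAM

ratio = corpus_bytes / weight_bytes
print(f"Training text:  ~{corpus_bytes / 1e12:.0f} TB")
print(f"Model weights:  ~{weight_bytes / 1e9:.0f} GB")
print(f"Required compression to store the corpus verbatim: ~{ratio:.0f}:1")

# A good general-purpose text compressor manages maybe 3-10:1, so a ratio
# in the hundreds means the weights cannot contain the training text
# verbatim -- only statistical structure, plus the occasional memorized
# passage that was repeated often enough in the data.
```

Under these assumed numbers the ratio comes out around 300:1, which is the gist of the Kolmogorov-complexity argument above: the model can't be a lookup table of its training data, even if memorization happens at the margins.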