Post by general313 on Nov 13, 2017 15:57:58 GMT
"We can't even be sure two humans see the same 'red' qualia when they look at something with red light."

True enough, but I think it's at least a reasonable assumption that two humans experience something similar when seeing red; that assumption is much harder to make when the medium used to experience it is completely different, as it would be with an AI.

I tend toward the idea that the hardware is irrelevant. Maybe I'm being misled by over-reliance on the following analogies, but they seem to be the best we have to go on for the time being. When computers transitioned from vacuum tubes to transistors, it didn't make a bit of difference to the software engineers who developed the operating systems and applications for those computers. Similarly, neural network behavior seems to be completely unaffected by whether the network runs in a software simulation, on a GPU, on more specialized digital hardware, on analog hardware, or in biological cellular tissue.

All the scientific evidence so far suggests that the brain operates very much like a highly sophisticated system of interconnected neural networks. I suspect that the properties of red qualia are determined by the structure of a neural network, and not at all by the physical details of how that network is implemented.
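To make the "structure, not implementation" point concrete, here's a toy sketch of my own (not from anyone's actual research): the same two-neuron layer computed two different ways, one vectorized-style and one accumulating each synapse in a loop the way dedicated hardware might. The weights and the sigmoid choice are arbitrary; the point is only that identical structure gives identical behavior regardless of how the arithmetic is carried out.

```python
import math

# Arbitrary example network: 2 inputs -> 2 neurons (weights and biases made up)
WEIGHTS = [[0.5, -1.2], [0.8, 0.3]]
BIASES = [0.1, -0.4]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward_vectorized(inputs):
    """'Software simulation' style: the whole layer as one matrix-vector step."""
    sums = [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(WEIGHTS, BIASES)]
    return [sigmoid(s) for s in sums]

def forward_synapse_by_synapse(inputs):
    """'Hardware' style: each neuron accumulates one synapse at a time."""
    outputs = []
    for i, row in enumerate(WEIGHTS):
        total = BIASES[i]
        for j, x in enumerate(inputs):
            total += row[j] * x
        outputs.append(sigmoid(total))
    return outputs

x = [1.0, 0.5]
print(forward_vectorized(x) == forward_synapse_by_synapse(x))  # same outputs
```

Of course this only shows implementation-independence for digital arithmetic in one program; whether the equivalence extends all the way down to analog or biological substrates is exactly the open question.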