Post by The Lost One on Nov 10, 2017 8:14:11 GMT
Wait... so you're not saying that he should care about computers as much as people, you're just saying that you think he should? Okay... I guess... I'm not really saying he should do anything, just commenting that I find his viewpoint a bit odd. He says the danger is that computers may well be able to replicate human intelligence and exceed it. Per materialism, then, you would expect AIs to be able to develop all the wants, cares and conscious experience that humans have. So why should Hawking care more about the welfare of one intelligent species than another, particularly if the latter might be much more capable of achieving something that Hawking thinks is important to achieve?