Eλευθερί
Junior Member
@eleutheri
Posts: 3,710
Likes: 1,670
Post by Eλευθερί on May 8, 2018 21:19:50 GMT
Relevant to Westworld (original & current), Ex Machina, Blade Runner, Terminator: The Sarah Connor Chronicles, Lost in Space (original & current), etc
Suppose an android is so sophisticated that it is physically indistinguishable from a real human being, and that its speech and apparent thinking are virtually indistinguishable from a real person's. Suppose also that this particular android is a model of a really "good" person: it goes out of its way to be helpful to others, puts others' concerns before its own, and is generally very likeable.
Suppose, further, that there is a real human being who is a model of a total aszhole: a greedy, selfish, loathsome bully who goes out of his way to help himself at just about everyone else's expense. Almost nobody likes him.
Now, given these two beings, if there were a situation in which only one of the two "lives" could be saved (say, there were room for only one of them in a lifeboat), what would be the morally right thing to do?
(1) Save the android
(2) Save the real human being
(and for the Christians here, What would Jesus do?)
Post by Eλευθερί on May 9, 2018 3:06:10 GMT
Feel free to explain your feelings, whether you have a strong opinion or even if you're conflicted on the question. Don't be shy!
Post by PreachCaleb on May 9, 2018 15:52:02 GMT
I have a question: Do we know which one is the android?
Post by Winter_King on May 9, 2018 16:09:21 GMT
PreachCaleb said: I have a question: Do we know which one is the android?

Yeah, I was wondering the same thing. If they are indistinguishable from humans, how do we know which one is the human?
Post by Eλευθερί on May 9, 2018 18:13:11 GMT
PreachCaleb Winter_King The android knows it is an android, and being the "good person" that it is, it volunteers to you that it is an android and offers to sacrifice its life. (The person, on the other hand, if asked this sort of question, would lie about this to save his own skin.)
Post by PreachCaleb on May 9, 2018 19:24:48 GMT
Eλευθερί said: The android knows it is an android, and being the "good person" that it is, it volunteers to you that it is an android and offers to sacrifice its life. (The person, on the other hand, if asked this sort of question, would lie about this to save his own skin.)

Wait, so the person would lie about being a person?
Post by Eλευθερί on May 9, 2018 20:17:07 GMT
PreachCaleb said: Wait, so the person would lie about being a person?

You're overthinking it. What's important is that if the person were asked to tell the truth about his identity, he would lie if he thought it would help him. For example, if the person thought that admitting to being a person would get him denied the lifesaving intervention, then yes, he would lie and say he was an android. So, if it makes the situation easier to contemplate, let's say there is space for only one more in the lifeboat (or on the spaceship), and both the real android and the person are claiming to be an android (because, for whatever reason, the person thinks an android would get priority for the available space).
Post by PreachCaleb on May 9, 2018 20:33:53 GMT
But then that completely changes the scenario. Now it's no longer about me deciding whether to save a person or an android.
Now it's me deciding which android to save.
Post by politicidal on May 9, 2018 20:39:13 GMT
Neither. More room in the lifeboat for me.
Post by Eλευθερί on May 9, 2018 22:46:54 GMT
PreachCaleb said: But then that completely changes the scenario. Now it's no longer about me deciding whether to save a person or an android. Now it's me deciding which android to save.

In this scenario, you know that in reality there is one android and there is one human. So you can deduce that one of them is lying.
Post by coldenhaulfield on May 10, 2018 17:03:35 GMT
PreachCaleb said: But then that completely changes the scenario. Now it's no longer about me deciding whether to save a person or an android. Now it's me deciding which android to save.

Nah, you missed the point entirely.
Post by PreachCaleb on May 10, 2018 18:26:02 GMT
Eλευθερί said: In this scenario, you know that in reality there is one android and there is one human. So you can deduce that one of them is lying.

Am I to make a moral decision based on a guess? Plus, what led the person to believe I've already chosen to save the android?
Post by coldenhaulfield on May 10, 2018 20:03:28 GMT
Eλευθερί said: In this scenario, you know that in reality there is one android and there is one human. So you can deduce that one of them is lying.

PreachCaleb said: Am I to make a moral decision based on a guess? Plus, what led the person to believe I've already chosen to save the android?