Poll thread: love and robots, morality edition

1. A man builds a chatbot using advanced machine learning, designed for top-tier realistic sexting. He then talks sexually with his creation for his own sexual satisfaction. This is:
  • Very not okay
    3.9%
  • Kinda not okay
    5.2%
  • Kinda okay
    20.7%
  • Very okay
    70.2%
6,872 votes · Final results
A man uses advanced machine learning to design an AI that produces a video stream of an attractive woman (not based on any one person). The image of the woman responds realistically to his commands to undress, bend over, etc. This is:
  • Very not okay
    3.4%
  • Kinda not okay
    5.3%
  • Kinda okay
    22.3%
  • Very okay
    69%
4,438 votes · Final results
A man designs a hyper-realistic physical robot in the form of a beautiful woman. It can do basic things like make moaning noises if you stroke it or close its eyes when the lights turn off. The man then has sex with this robot. This is:
  • Very not okay
    3.7%
  • Kinda not okay
    5.4%
  • Kinda okay
    22.4%
  • Very okay
    68.5%
4,170 votes · Final results
A man designs a hyper-realistic physical robot in the form of a beautiful woman. It has primitive intelligence (like Alexa or GPT-3) but can walk and move perfectly realistically and fluidly, indistinguishably from a real person. He has sex with it. This is:
  • Very not okay
    4.2%
  • Kinda not okay
    6.5%
  • Kinda okay
    23.4%
  • Very okay
    65.9%
4,060 votes · Final results
A man uses advanced machine learning to design a hyper-realistic chatbot, complete with a consistent personality and memories, and it learns over time. This chatbot is trained for emotional support. He vents to it and cries to it, and it offers unconditional love and reassurance. This is:
  • Very not okay
    4.4%
  • Kinda not okay
    7.8%
  • Kinda okay
    23.9%
  • Very okay
    63.8%
3,938 votes · Final results
A man designs a hyper-realistic physical AI/robot in the form of a beautiful woman. It interacts realistically: consistent personality, memories, learning over time. It can move perfectly realistically and fluidly, indistinguishably from a real person. He has sex with it. This is:
  • Very not okay
    6.4%
  • Kinda not okay
    10.8%
  • Kinda okay
    27%
  • Very okay
    55.9%
3,822 votes · Final results
A man designs a hyper-realistic AI/robot in the form of a beautiful woman that talks, moves, acts, and learns almost indistinguishably from real people. He designs its reward function to be fulfilled by sex, begging for sex, being a slut. He has sex with it. This is:
  • Very not okay
    9.5%
  • Kinda not okay
    14%
  • Kinda okay
    26.4%
  • Very okay
    50.1%
4,147 votes · Final results
A man designs a hyper-realistic AI/robot in the form of a beautiful woman that talks, moves, acts, and learns almost indistinguishably from real people. He designs its reward function to be fulfilled by masochism, wanting to be broken, hurt, tortured. He tortures it. This is:
  • Very not okay
    21.4%
  • Kinda not okay
    25.9%
  • Kinda okay
    22.3%
  • Very okay
    30.3%
4,157 votes · Final results
Replying to
Is it experiencing suffering while it's being "hurt" (perhaps mixed/conflicting with pleasure), or is it only appearing to experience pain while actually experiencing only pleasure?
Replying to
this is so interesting to me - i'm way more okay with designing a being's reward function to desire being hurt and then hurting it than i am with not designing its reward function at all and then doing something to it that it might not like bc you built it and therefore "own" it
Replying to
for me this all hinges on whether the robot is conscious and whether the man knows it's conscious. if yes, then the man should act toward it with the same ethical standards he would use for beings with a comparable level of consciousness. if not, then it's a sex toy; do w/e
Replying to
He designed its reward function. That still limits the idea of true sentience and autonomy. Unless it can override its original function and find purpose beyond the criteria its creator set, I see no reason to place value on it when it cannot perceive value in itself.
Replying to
My "Kinda not okay" on this one is based on my suspicion that indulging in anti-social fantasies is unhealthful. Paternalism. I rate being "Kinda not okay" with this as "Kinda not okay." Life is complicated. Hypothetical life all the more so.
Replying to
If I answered this from the perspective of the guy's mental health, I'd say not okay. But it's been created to enjoy the torture. Sure, the guy might be seen as not okay for doing that, but the act itself sounds fine to me.