The debate "Should a Fully Capable AI be Considered a Person" was started on March 25, 2020, at 5:26 am.
16 people are on the agree side of this discussion, while 54 people are on the disagree side.
That may be enough to gauge the common perception. It looks like most people are against this statement.
Nemiroff posted 5 arguments and diecinueve posted 1 argument on the agree side.
Craven198787 posted 1 argument on the disagree side.
Nemiroff, diecinueve, Dazed_Confused, chhotelalgoswami, Khairon, EdgarQer and 10 visitors agree.
akashmeka, jrardin12, Joelm, Bandile, eli, J1, tyler0300, Trump2020, purunsky, StrangeTime, Rashia, marky and 42 visitors disagree.
I don't have an opinion either way on whether an AI should automatically be a person. Personhood seems to be subjective in modern times, so the answer must take consensus into account.
The AI should be tested on what it can and cannot do, whether it feels, and whether it has a social drive, and then compared to humans. Then that information should be debated in society, with the AI, and a referendum should take place.
If I were in that society and there were no personal difference between me and an AI (except for its superior computation), if it weren't just an advanced calculator but had a social need/pain encoded into it, then I'd vote yes. Without a solid social component, my vote would always be no.
wow. hard no on this one so far.
by person, i don't mean a human, i mean an individual with a will. i think at a certain level it is irrefutable. what separates us?
burying things that don't die sounds silly tho. is this meant to grant rights or as a punishment? i don't get this analogy.
yes to time off. if they are sentient, they may want to do a personal task for a while. keeping something sentient in forced labor is asking for a revolt that we will probably lose.
i want to reference The Matrix, where humans chose not to give robots any rights, so the robots decided not to give humans any rights. if you won't do this out of morality, may i recommend doing it out of self-preservation?
I don't think so. Should we bury them too and give them time off for vacation?
you are correct that personhood is an abstract, agreed-upon definition... but the question was one of personal opinion. *should* they be considered persons, not whether they currently are.
the human personhood debate has 2 sides. 1. it's always a person if it will eventually become a human. or 2. it depends on sentience development.
ai is clearly never gonna be human, and in this example it's already sentient. so it may be quite a different scenario.
Depends on what the community that makes the AI thinks. A person is just an agreed-upon abstract entity our legislation protects. We can't even agree when a human becomes a person, so I imagine the same plurality will arise with AI, and I am fine if communities don't immediately accept them as persons.
would you deny a thinking, feeling being personhood?