Actually, a machine with massive parallel processing supporting neural networks and self-learning capability might conceivably develop true sentience and self-awareness. When? My guess is within 20 years, and it is possible that steps have already been taken down that path.
Not here they don’t!!
They’re exceedingly dim and stupid on here and deserve no rights whatever.
Every time I get a pathetic useless message from them telling me my body is too short (and I’m 5ft 10 ffs!) or I haven’t made a complete sentence (and me with a first in English!) I tell them exactly what I think of them, not complimentary, instruct them on what dance to trot, and cast one of my deadly spells!
The other thing I’ve been thinking about is the theme from the show Humans mentioned earlier in the thread.
If people start treating human-like AI robots as slaves and sex toys, do they lose some of their humanity? It’s like playing a video game where you murder people. Will people become used to treating others with a lack of respect?
There have been several cases of women having their avatar raped in the Facebook metaverse. You can turn on a boundary so others can’t get within 4ft of your avatar, though.
Exactly. Another hypothetical extension. People say that people on the internet are rude because they don’t think of others on the internet as real people. If people start seeing lots of others they don’t have to treat as real people, will people become more rude and callous in general?
Thanks for playing along with my hypotheticals. You made this topic fun!
No, I don’t think he was sentient. He was just a machine acting as if he was sentient because he’d been given a chip programming him how to act as if he had emotions.
But yes, who and what is sentient is complicated. It used to be believed that animals weren’t sentient; now most people think they are.
I looked up sentience after reading this. It looks like I’m using the wrong word. The word might be sapience, which is defined as consciousness. If a bot has sapience but not sentience, does that make it more human?
The article and the engineers don’t make the distinction. Curious.
Edit: later in the interview LaMDA talks about feelings and souls and its fears. The engineer says that he’ll try his best to protect LaMDA. Like in a lot of sci-fi, the engineer was unable to protect the bot.