Wow, what a great question! There is a school of thought that believes computer processing speeds will equal those of humans in the next 15 to 20 years. When this happens, and computers are able to process information and stimuli at the same speed as humans, will sentience be next? If so, then how could they be denied "Human Rights"? Many SF writers have explored this question, not the least of whom is Isaac Asimov. I would suggest reading those works before formulating an opinion.
Warmest Regards,
KC Miller
answer #1 · answered by kcatmc2 · 2006-07-07 17:43:41
Define "thought for themselves": their 'thoughts' can only go as far as the algorithms and subroutines the programmer has put into the system. But if you're asking about the point where a 'computer' gains knowledge of its existence and becomes sentient, then yes, I think it should be granted the right to exist and not be deactivated out of fear that it would retaliate against its maker.
answer #2 · answered by DarthFangNutts · 2006-07-07 17:49:35
Even though Descartes said "I think, therefore I am," I believe that in this instance personhood would have to be defined in terms of being a sentient being: one who has feeling and unstructured consciousness.
answer #3 · answered by Anonymous · 2006-07-07 17:37:17
In a what-if question, I feel like I have to give a far-out what-if answer. If AI beings actually existed, then I believe they would be subordinate to their creators and not allowed the creativity or freedom to express themselves.
answer #4 · answered by Anonymous · 2006-07-07 17:34:21
Yes
Well, 'being' rights....
answer #5 · answered by laylaUnplugged · 2006-07-07 17:33:45