Yes, I did see it. I don't think we have much to worry about; it would largely depend on what sort of application it is going to be used for.
Computers can already beat humans at chess, and nobody worried about that. A computer is just a machine, a tool for us to use. Why would anyone want to give it the ability to think for itself?
There are too many variables involved in making decisions, and I don't think computers should be involved in that sort of thing. Several planes have crashed because they were allowed to make decisions and made the wrong ones.
An Airbus crashed during a demonstration because the pilot wanted to make a low, slow flypast, but the computer took over because it thought the pilot was going to land, and it would not allow him to retake control.
Would you want a computer like that making important decisions? What if it makes the wrong one when millions of lives are at stake?
2006-10-26 14:39:17 · answer #1 · answered by colin.christie 3 · 0⤊ 1⤋
Didn't see it, wish I had.
If, in the 1950s, the strategic defence systems had been linked to decision-making by computer, the world population would be about 1,000 people, and they would now be dwelling in caves.
Really, the problem is the other way round: we keep trying to fit people into pigeonholes so that they fit in with computers! Despite all our SATs and IQ tests, the one thing we can do that computers cannot is imagine something that has not been thought of before. Sadly, though, our education system is rubbish at testing for or developing creativity in children.
So that's it, then: computers can process data, but they cannot create a new concept. The ultimate creepy bit will be turning us into Star Trek Borg, where we are enhanced by technology... We will then be like the end of Animal Farm (pigs and humans), where it may be impossible to know where we end and the technology begins. Sounds terrifying to me!
2006-10-26 15:28:38 · answer #2 · answered by Anonymous · 0⤊ 0⤋
I am not worried (I'll be dead by the time it happens!). It is essential that they have a fail-safe built in, so that they can at any time be deactivated or shut down with haste. Provided we can always retain control, their use could have amazing possibilities in so many fields in the future. I am not so sure that I like the idea of downloading the knowledge from human brains into them. Surely in doing this we would also part with knowledge that requires human emotions to deal with it. Would the computer be able to make a logical decision on an emotional issue? If it could not, what would happen? Would they have to be 'cold calculators', dismissing or discarding anything they could not deal with? Or do we program them with human emotions and trust that their particular human was a moral person? I am glad the decision will not be mine.
2006-10-26 12:18:09 · answer #3 · answered by Social Science Lady 7 · 0⤊ 1⤋
Yes, I did see it, and I am scared.
I'm not sure whether I'm more fearful of the potential of the technology or of the people who may rise up against it. To be honest, it reminded me of the scenario in Terminator, where the computers go mad and try to kill us. The good thing is that I'll be dead and gone by the time the pooh hits the fan.
I did wonder about the relevance of some of the research: why exactly do we need remote-controlled rats?
2006-10-26 11:26:58 · answer #4 · answered by Gomduri 2 · 1⤊ 0⤋
I'm sceptical. If a computer can be as fast and as powerful as a human brain, then we are all in trouble. But, and it's a big but, surely a computer can only learn what we (the programmers) put into it. If its program can become more powerful than us and become a threat, then can't we just 'unplug' it? Did this Horizon seem a bit 'Terminator' to you? As for the gay bomb? Well, no wonder the US pulled the plug on that one. Good watching, but I won't lose sleep because of it.
2006-10-26 11:23:26 · answer #5 · answered by Goatie 3 · 0⤊ 1⤋
It's rather arrogant to think we are alone in this universe, so I think the technical pursuit should continue so that humankind can benefit. As for AI, it's possible we will create something that will destroy us, but how is that any more dangerous than "weapons of mass destruction"? The only difference is that if we created a higher intelligence than ours, then perhaps it would be smarter than us and set about preserving life, as we should be doing.
2006-10-26 11:22:42 · answer #6 · answered by Anonymous · 0⤊ 1⤋
Yip, I saw it!
It does seem scary. In particular, the piece about 'hooking' a monkey up to some wires to allow it to control a robotic arm with just its brain was pretty surreal. I suppose it could offer possibilities to those who are paralysed, but it scares the hell out of me!
2006-10-26 11:23:56 · answer #7 · answered by MickyMick 1 · 0⤊ 0⤋
I'd better be careful about what I say, because some computer could be reading this one day, and it will spell my doom.
My computer has started acting more intelligent lately and has refused to follow my instructions.
2006-10-30 03:31:13 · answer #8 · answered by Jeff K 4 · 0⤊ 0⤋
Damn, I wish I had seen that; it's so groundbreaking.
It doesn't scare me, though, as we will build them with rules (like Asimov's three laws) or simply make them less ambitious and hateful than humans. As long as we build them sensibly, we retain control.
Tech wars will happen; in fact, they already have, e.g. cruise missiles vs. dumb drones (suicide bombers).
2006-10-26 11:19:56 · answer #9 · answered by John S 4 · 0⤊ 1⤋
Don't worry, the future may have something completely different waiting for us.
2006-10-26 11:23:40 · answer #10 · answered by Anonymous · 0⤊ 1⤋