When I let my mind wander it can turn into a dangerous weapon. I've been thinking lately about turning robots into the world's policemen. This is by no means an original concept; the first time I recall seeing it is in the movie 'The Day the Earth Stood Still.' Gort is the robot, able to vaporize anything with just a look. Klaatu is the only one who can control him, and he gives a stern warning that if we expand our petty wars into space, Gort will act unilaterally.
And that's the problem with using robots as policemen. They have to be programmed by someone who has their own particular set of morals.
So whose morals do we use? If we go by the Ten Commandments a lot of people would be jailed. Are there morals we can agree on? Stealing: bad? What constitutes stealing? Robbing a bank? Surely. Banks charging exorbitant interest? Maybe not.
Can we agree that rape is a bad thing? Apparently not. There are cultures that allow minors to marry older husbands. I'll go with husbands because that's 99.9999% of the problem. You just don't hear very often of an older woman marrying a young boy, and I would consider the boy to be the lucky one.
Murder? Another gray area. I read an article the other day about a man who was murdered in front of witnesses, but because the man himself was so bad, everyone who witnessed it claimed they were turning away in horror at the moment the crime was committed and therefore saw nothing.
You see, a robot has to see things in black and white, and the laws it would enforce have to be programmed into it. No one can agree on what a crime is.
And what about the rights of criminals? I'm not saying anyone has a right to commit a crime, but if said crime happens in your own home does a robot have the right to come in and stop it? Do we station robots in every home, or use domestic robots that people buy as watchful eyes?
It becomes a matter of what would society put up with. We already have the technology to limit cars so that they enforce speed limits, but who would want to own a car that won't let you go even one mile over the speed limit? Don't tell me I can't speed if the conditions allow it. Yes, it would cut back on accidents if everyone used self-driving cars. But apparently we as humans feel we have a right to act dangerously if we want.
Which brings us back to the question of whose rules robots follow. We're all human, with the frailties and insecurities that come with being that way. A robot being asked to police us would likely self-destruct because of all the contradictions.
Or they would make us their slaves because we don't deserve freedom.
As for me, I welcome our robot overlords.