Ethics & Robotics

Read the NY Times article entitled "War Machines: Recruiting Robots for Combat"

Respond to the following query with 250+ words: "The article describes the conflict between those who advocate using armed robots in combat and those who warn against the ethical issues around automating the decisions by the robots to use force. What is your opinion in that debate? Which side do you feel has the more compelling argument? Also, how can we as a Quaker school, with deep-rooted non-violent principles, reconcile our participation in the world of robotics with the potential use of these same devices as weapons of war?"

Remember to include your name!

//Jenna McKinley:// After fighter jets and the atomic bomb, it seems unsurprising that we would be speculating about robots in the army. As is the case with many subjects related to war, there is a great deal of controversy over the use of robots instead of humans on the battlefield.

These robots are designed to safeguard human life. Some can disarm bombs, which obviously would cause a lot of harm if detonated, but even if the bombs are not disarmed properly, the loss of a robot is much less devastating than the loss of a human.

However, the New York Times article made a good point: that if robots are more fully integrated into world warfare, wars will be much easier to start and wage—which is not what we should be aiming for. The article also acknowledged that these robots are not infallible. They could, upon receiving the wrong intelligence, target a houseful of civilians instead of a group of terrorists. Still, I don't see how this risk is much greater than with actual humans. The wrong intelligence is still the wrong intelligence, no matter who is firing the shots.

Honestly, I'm not sure where I stand on this issue, but I don't think there is anything wrong or untoward about learning about and building robots at a Quaker school. It's not as if the only possible purpose of robotics is to kill people; robotics is involved in a plethora of other things in the world today. Also, if Quakers focus their pacifist efforts on stamping out the root causes of violence and war rather than shunning everything related to it, this discussion will be, happily, irrelevant.

//Blanca Zelaya-Rincon://

Robots are useful in many situations, but there are some in which I think they should play no part. If wars would be easier to start and wage by using robots, then maybe using them in wars is not the best answer. Eventually wars would be fought with only robots, and the idea is that fewer people would be killed or harmed. However, I believe that more innocent people could be killed by these robots by accident, because they may not always be able to distinguish the opponent from the innocent. Robots do not have morals, and even if you program them so that they theoretically would, I doubt that it would work well.

I know that in the article John Arquilla was quoted as saying, “I will stand my artificial intelligence against your human any day of the week and tell you that my A.I. will pay more attention to the rules of engagement and create fewer ethical lapses than a human force,” but it is very difficult to make the best decision when so many aspects may contradict each other. For artificial intelligence to work well, the robot would indeed have to be very intelligent. Is it really a good idea to have something armed and so intelligent out there? This situation reminds me of all those sci-fi movies where the robots become smarter than the humans and start killing them for what they think is “the good of the world.” No matter how crazy this may seem to some, I do not find it completely impossible.

So, I honestly do not think that robots should be used in wars for fighting purposes. They could be used as spies, detectors, and the like, but I do not think that they should be armed. As a Quaker school, I do not think that it is wrong to study robotics at all. If you become paranoid about things like robots being used for violence, then remember that virtually anything can be converted into a tool of violence. If you think this way and live in fear, you will probably lock yourself in the basement for the rest of your life.

//Charlotte Harrington:// In the past, there was glory as well as risk attached to war; now, with developments in robotics, it is turning into more of a video game. By turning war into a game that carries less risk for the soldiers, it becomes much easier to begin, since there is less of a cost. Wars would be triggered more easily, which is not the desired effect of robots in war. There is the additional argument that while the machines would be better at firing weapons and surviving, they do not have the same instincts as soldiers, which would be more dangerous for civilians. The counterargument, however, is as Tom Malinowski said: “If the decisions are being made by a human being who has eyes on the target, whether he is sitting in a tank or miles away, the main safeguard is still there.” Therefore, as long as artificial intelligence is limited and humans still control the actions of the robot, it seems robots in war would be the better option. Their role would mean less glory for the soldiers, but in my opinion the value of their lives is worth more than that.

Robots, as this article showed, have the potential to be very dangerous weapons, and if that is considered, robotics in Quaker schools such as ours is challenged. Why should a Quaker school support the development of robots when they are used as weapons in war, which is against the core Quaker belief of non-violence? I think that the other functions of robots should be considered, such as how they can perform cleaning activities, and how knowledge in the area of robotics does not have to lead to war. While the topic of robots is controversial and I personally am slightly confused, I currently would say that robots, if their intelligence is limited, are perfectly acceptable in war. I also think Quaker schools should be allowed to study robotics, because the other functions of robotics should be remembered.