Oklahoma researchers discover why Westerners are uneasy with robots


Ben Fenwick August 28th, 2008


Somewhere now on the windswept reaches along the Korean Demilitarized Zone, a soldier may be taking a position that might mean life or death for anyone approaching.

The soldier is steely-eyed and able to kill according to the rules of engagement (ROE), orders for when to fire on someone approaching the position.


The ROE can be tricky, requiring fortitude to pull the trigger when the approaching figure doesn't respond properly. Sometimes a soldier who is a little too compassionate might freeze up and never fire, unable even to take the life of some bent on killing him.

However, those who would assign this particular soldier to the DMZ position know it can fire according to orders, with deadly precision. That's because this particular soldier is made by Samsung.

It's a robot.

"It is reportedly planned to have the ability to autonomously decide when to use lethal force," said artificial intelligence researcher Ted Metzler, a professor at Oklahoma City University and director of the Darrell W. Hughes Program for Religion and Science Dialogue. "When you invest that kind of autonomy in a robot, it becomes disturbing."

MORAL QUALMS
As robots take on increasingly complicated tasks on behalf of humans, or even in lieu of them, there are more issues such as the gun-toting Korean robot to consider, Metzler said, and not all the moral qualms are the same.

Metzler and other AI researchers recently presented a study, undertaken at OCU, that explores the moral queasiness some humans are starting to have with the increasing role that robots have in society.

Presented at the Association for the Advancement of Artificial Intelligence conference in Chicago this summer, the study, based on a survey of OCU students, showed a surprising difference between East and West when it comes to using robots for tasks thought to have moral implications, whether caring for a lonely widow or deciding when to fire a gun.

"As we make these robots more lifelike, they become more acceptable to people "¦ up to a point," Metzler said. "If they are too lifelike, a kind of revulsion sets in. Because it's creepy " at least, that's the theory of some researchers."

In the East, however, in countries such as Japan and South Korea, people may be less unsettled by the thought of robots taking over traditionally emotional roles such as caregiver, pet or, in the case of the robot sentry, executioner, a difference the OCU study, completed last fall, appears to confirm.

FORMS OF RELIGION
Metzler said the study showed a schism between adherents of Western forms of religion and those who follow Eastern religions, such as Buddhism. In the Western "Abrahamic" religions (Judaism, Christianity and Islam), there is a strong belief that humans are made in the image of God and are imbued with a kind of divine spark, Metzler said.

In Eastern religions, however, the divine spark is often found in many things, animate or inanimate. For this reason, Metzler said, humanlike robots appear to be much more easily accepted in places like Japan.

"They call it the Buddha nature," he said. "What it is really referring to is the idea that there is a spirituality imminent in everything " even stones, rock, trees. As a result of this animistic religious view, people (adhering to Eastern views) are quite satisfied with accepting a robot as a peer. After all, it is manifesting the Buddha."

What OCU's study shows, Metzler said, is that Western religion has a problem with robots that are too human.

"The students who are prevailingly Christian have a response of rejection after a certain point " 'I have a soul. A machine cannot,'" Metzler said. "We have these inherited religious concepts in the West that are increasingly going to bump heads with this technology as it becomes more sophisticated."

In some parts of society, these differences may be small, Metzler said. For instance, would a robot dog be all that much different from a remote-control car or a video game?

After a point, Metzler said, it is morally different. For example, the Japanese have an aging population that has a cultural bias against importing foreign workers. Who will take care of the aged? Who will provide them with companionship? The Japanese are seriously considering robots for this role, Metzler said.

"This raises some psychological concerns. What effect does this have on our ideas of companionship and compassion?" Metzler said. "We are seeing ourselves in our robots." "Ben Fenwick

 
  • Currently 3.5/5 Stars.
  • 1
  • 2
  • 3
  • 4
  • 5
 
 

 

 
 
Related content
Vigilante' vendetta?
Related to:News;Ben Fenwick
 
Close
Close
Close