Eureka: How Robots Can Trick You Into Loving Them

Written by Unknown on Thursday, 19 September 2013 | 18:38

I like to think of my Roomba as cute and industrious. He makes noises while he cleans that make me feel as if he's communicating with me, which contributes to the fact that I have assigned him a sex. He exists in a liminal space between animate and inanimate, but once he's done cleaning my floors, I don't mind putting him in the closet until I need his services again — he's a rolling disc, after all.

Robosimian — a headless, quadrupedal disaster-response robot designed by engineers at NASA's Jet Propulsion Laboratory — is a bit more useful than my Roomba, slightly more human-looking and a lot less cute: the C-3PO to my R2-D2. Robosimian can maneuver over rubble and through narrow corridors in order to, one day, rescue humans from peril. But its more difficult task will be forming some sort of bond with the E.M.T.'s and first responders who might use it. Robosimian will be more than just a tool, but not quite a colleague.

In the future, more robots will occupy that strange gray zone: doing not only jobs that humans can do but also jobs that require social grace. In the last decade, an interdisciplinary field of research called Human-Robot Interaction has arisen to study the factors that make robots work well with humans, and how humans view their robotic counterparts.

H.R.I. researchers have discovered some rather surprising things: a robot's behavior can have a bigger impact on its relationship with humans than its design; many of the rules that govern human relationships apply equally well to human-robot relations; and people will read emotions and motivations into a robot's behavior that far exceed the robot's capabilities. As we employ those lessons to build robots that can be better caretakers, maids and emergency responders, we risk further blurring the (once unnecessary) line between tools and beings.

Provided with the right behavioral cues, humans will form relationships with just about anything — regardless of what it looks like. Even a stick can trigger our social promiscuity. In 2011, Ehud Sharlin, a computer scientist at the University of Calgary, ran an observational experiment to test this impulse to connect. His subjects sat alone in a room with a very simple "robot": a long, balsa-wood rectangle attached to some gears, controlled by a joystick-wielding human who, hidden from view, ran it through a series of predetermined movements. Sharlin wanted to find out how much agency humans would attribute to a stick.

Some subjects tried to fight the stick, or talk it out of wanting to fight them. One woman panicked, complaining that the stick wouldn't stop pointing at her. Some tried to dance with it. The study found that a vast majority assumed the stick had its own goals and internal thought processes. They described the stick as bowing in greeting, searching for hidden items, even purring like a contented cat.

When a robot moves on its own, it exploits a fundamental social instinct that all humans have: the ability to separate things into objects (like rocks and trees) and agents (like a bug or another person). Its evolutionary importance seems self-evident; typically, kids can do this by the time they're a year old.

The distinction runs deeper than knowing something is capable of movement. "Nobody questions the motivations of a rock rolling down a hill," says Brian Scassellati, director of Yale's social robotics lab. Agents, on the other hand, have internal states that we speculate about. The ability to distinguish between agents and objects is the basis for another important human skill that scientists call "cognitive empathy" (or "theory of mind," depending on whom you ask): the ability to predict what other beings are thinking, and what they want, by watching how they move.

"We make these assumptions very quickly and naturally," Scassellati says. "And it's not new, or even limited to the world of robotics. Look at animation. They know the rules, too. A sack of flour can look sad or angry. It's all about how it moves."

Maggie Koerth-Baker is science editor at BoingBoing.net and author of "Before the Lights Go Out," on the future of energy production and consumption.

