Ellie sets up shop in a magenta-colored armchair that's displayed on a wide-screen television at USC's Institute for Creative Technologies in Playa Vista.

Dark-haired with medium-toned skin, dangly earrings and cheekbones to envy, she could be white or black or Latina, 25 or 45 years old.

As she talks with a person seated opposite the screen, her motions and utterances mimic those of a therapist — a sympathetic nod here, a pleasant "tell me more about that" there. But never so precisely that she seems real.

That's OK, say the scientists who spawned her.

"She has to offer the best of both worlds," said Gale Lucas, a social psychologist at the institute. "She has to be human enough but also look like a machine. If she was just like us, it wouldn't work."

"It" is a virtual-reality program called SimSensei, launched by USC researchers with funding from the Defense Advanced Research Projects Agency. The aim is to coax people who might be reluctant to share emotions with another human to tell Ellie, a machine.

The setup involves a computer program that directs Ellie to ask questions. A webcam and microphone allow the computer's software to "see" and "hear" her conversation partner's response. The feedback guides the direction of the discussion — including appropriately timed nods and questions from Ellie — and helps the system analyze emotional cues.
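The sense-analyze-respond loop described above can be sketched in a few lines. Everything here — the function names, the cue labels, the canned acknowledgments — is invented for illustration and is not SimSensei's actual code; it only shows the idea of perceptual feedback steering the interviewer's next move.

```python
# Hypothetical sketch of the loop: ask, sense the reply, acknowledge.
# Names and cue thresholds are invented, not taken from SimSensei.

QUESTIONS = [
    "How are you doing today?",  # opening chitchat
    "Tell me about your relationship with your family.",
    "How easy is it for you to get a good night's sleep?",
]

def choose_backchannel(cues):
    """Pick a small acknowledgment based on observed cues."""
    if cues.get("long_pause"):
        return "Take your time."
    if cues.get("emotional"):
        return "Tell me more about that."
    return "I see."

def run_session(sense):
    """`sense` stands in for the webcam/microphone analysis step:
    given a question, it waits for the reply and returns a dict of cues."""
    transcript = []
    for question in QUESTIONS:
        cues = sense(question)
        transcript.append((question, choose_backchannel(cues)))
    return transcript

# Simulated "sensor" for demonstration: the second answer reads as
# emotional, the third arrives after a long silence.
fake_cues = iter([{}, {"emotional": True}, {"long_pause": True}])
for question, ack in run_session(lambda q: next(fake_cues)):
    print(question, "->", ack)
```

In a real deployment the `sense` callable would wrap the vision and audio analysis; here it is just a lambda feeding scripted cues.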

As a human therapist might during a clinical encounter, Ellie starts off with chitchat and gradually moves to more challenging terrain, asking a subject: "Tell me about your relationship with your family" or "How easy is it for you to get a good night's sleep?"

The system probes audio responses for keywords and visually tracks 66 points on the face, taking note of signs of happiness, anger and other emotions. It also measures the interviewee's rate of speech, how long he or she pauses before answering Ellie's questions and the amount of variation in facial expression. A flat, unchanging expression can be one sign of depression.
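The three measurements named above — speech rate, pause before answering, and variation in expression — reduce to simple arithmetic once the raw signals are in hand. This is a minimal illustration with invented sample numbers, not the system's real feature pipeline; the actual software derives its scores from the 66 tracked facial points per video frame.

```python
# Hypothetical versions of the three measurements described in the text.
# All sample values are invented for the example.

from statistics import pstdev

def speech_rate(word_count, speaking_seconds):
    """Words per minute over the time actually spent talking."""
    return 60.0 * word_count / speaking_seconds

def response_latency(question_end, answer_start):
    """Seconds of silence between the end of Ellie's question
    and the start of the reply."""
    return answer_start - question_end

def expression_variation(frame_scores):
    """Spread of a per-frame expression score; a value near zero
    suggests a flat expression, one possible marker of depression."""
    return pstdev(frame_scores)

print(speech_rate(120, 60.0))          # 120 words in a minute -> 120 wpm
print(response_latency(3.2, 5.7))      # roughly a 2.5-second pause
print(expression_variation([0.1, 0.1, 0.12, 0.09]))
```

A clinician-facing report would aggregate scores like these across a whole session rather than per answer; the helpers above just show what each number means.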

Unlike a human interviewer, Ellie cannot respond to questions. Early in the session, she acknowledges she is "not a therapist" but is there to listen.

"If someone says something terrible, we want them to understand why she isn't asking them more about it," said Giota Stratou, a research programmer at the institute.

Despite Ellie's limitations, Stratou said, "people open up to her. They cry … they tell her they have nightmares."

And they do so more easily than they otherwise might, it turns out.

In a study published last year in the journal Computers in Human Behavior, Lucas and other researchers reported that participants were more likely to speak freely and display emotion when they believed they were interacting with a computer.

"People opened up more to the virtual human than to a real person. They said they felt less judged by the virtual human," Lucas said, even though they knew that their answers would be recorded and viewed later by workers in a lab.

"It's about what's happening in the moment — having a safe place to talk," she added.

Thus far, Ellie has been used only in research settings, where more than 600 subjects have submitted to her questioning for a variety of studies.

Lucas has applied for a grant from the National Institutes of Health to interview cancer patients about difficulties they're having with their illness or treatments. And a team at Emory University plans to use Ellie to ask veterans about their experiences with sexual trauma.

But as virtual-human technology improves, the hope is to use it on laptops and other inexpensive devices, increasing clinicians' ability to screen patients in remote locations. Lucas said she hoped Ellie would be "deployed" before too long to screen veterans returning from battle.

Although Ellie may not be able to do everything a therapist can, she's very consistent.

"She doesn't feel tired, she doesn't need to eat, she doesn't have a bad day," Lucas said.


Twitter: @LATerynbrown


©2015 the Los Angeles Times
