Amazon Echo's Weirdest Talent Is Making You Feel Heard

Amazon's Echo is a lot of things – shopping companion, music player, portal to Wikipedia – but its most surprising feature says more about human comfort with next-gen electronics than it does about the hardware itself. Echo's cleverness is in the cloud; the black column of local hardware is really just a gateway to that remote functionality. Yet it also has to satisfy a few core requirements, such as demonstrating attentiveness. The solution Amazon came up with is both simple and elegant.


When you speak Echo's trigger word – either "Alexa" or "Amazon" – the ring of lights around the top edge comes alive in flickering blues. A lighter blue segment then pinpoints you, as if Echo is staring in your direction.

Sometimes, as the multi-microphone array figures out where exactly the voice is coming from, the patch of blue shifts about from side to side, like a myopic person glancing around the room to figure out who is calling them. Invariably, though, it fixes its virtual gaze by the time you're asking your question or issuing your command.
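Amazon hasn't published how Echo's far-field microphone array pinpoints a speaker, but the underlying trick, estimating direction from the tiny differences in when sound reaches each microphone, is standard signal processing. Below is a toy two-microphone sketch in Python with NumPy: it cross-correlates the two channels to find the time difference of arrival and converts that delay into a bearing. The spacing, sample rate and function names are illustrative assumptions, not anything from Amazon's firmware.

```python
# Toy direction-of-arrival estimate for a two-microphone array.
# Not Amazon's implementation, just the classic time-difference-of-arrival
# (TDOA) idea: sound reaches the nearer microphone slightly earlier, and
# that delay maps to an angle.
import numpy as np

SPEED_OF_SOUND = 343.0   # metres per second, roughly, at room temperature
MIC_SPACING = 0.10       # metres between the two mics (illustrative value)
SAMPLE_RATE = 16000      # samples per second (illustrative value)

def estimate_bearing(mic_left: np.ndarray, mic_right: np.ndarray) -> float:
    """Bearing of the sound source in degrees.

    0 is straight ahead; negative values mean the source is toward the
    left microphone, positive toward the right.
    """
    # Cross-correlate the two channels; the peak offset is the sample lag
    # of the left channel relative to the right one.
    corr = np.correlate(mic_left, mic_right, mode="full")
    lag = np.argmax(corr) - (len(mic_right) - 1)
    delay = lag / SAMPLE_RATE                       # seconds
    # Path difference = delay * speed of sound; clamp for numerical safety.
    ratio = np.clip(delay * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(ratio)))

if __name__ == "__main__":
    # Simulate a clap that reaches the left microphone 3 samples earlier,
    # so we expect a negative bearing (source off to the left).
    rng = np.random.default_rng(0)
    clap = rng.standard_normal(1024)
    left = np.concatenate([clap, np.zeros(3)])
    right = np.concatenate([np.zeros(3), clap])
    print(f"Estimated bearing: {estimate_bearing(left, right):.1f} degrees")
```

Echo's real array has more microphones and presumably layers beamforming and noise suppression on top, but that wandering patch of blue light is, in effect, an estimate like this being refined in real time.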

Figuring out how to make artificial intelligences – or the blunt, rough approximations of them we've managed to create so far – more approachable and acceptable is arguably as significant a challenge as the AI itself. One possibility is giving them a human face, building increasingly realistic automata that resemble just another person.


Most people with a passing interest in the field, however, have heard of the uncanny valley: the sense of disquiet produced when an artificial being comes so close to appearing real that its small remaining flaws create discomfort rather than familiarity.

If a full replicant is a step too far, then, how close do we need to get to balance approachability with something that doesn't intimidate with its otherworldliness? It's a question many in the robotics world are asking, and one without a common answer.

Robotic companion startup Jibo, for instance, has turned to physical movement to give its tabletop 'bot a sense of personality. More a chunky, articulated pawn from a chess set than anything else, the robot borrows from nature with its LCD face and sinuous motions, without copying it wholesale.

"People and organic things move very differently from the way we think about machines and things today," Jibo creator Cynthia Breazeal told me when we spoke about the company's investment in kinematics. "Organic things move in arcs, they move in ways that trigger our brain to think "organic", whereas machines tend to move in rectilinear ways, very abrupt."

What strikes me as interesting about Echo is quite how approachable non-geeks have found it. I plugged one in while on a family vacation recently and, after a few minutes of demonstration – yes, you have to say "Alexa" each time; no, you can't hold a natural-language conversation – people I would normally think of as relatively uninterested in technology were talking to it as a matter of course.


In fact, they were the people who quickly began asking more philosophical questions (which Echo, of course, couldn't answer), eager to strike up a dialog rather than treat the rudimentary AI as a simple search engine in a box. Echo became "her" and, just as Alexa would always "look" their way when listening, they would always make "eye contact" with her when talking.

Nobody confused Echo with anything other than a gadget. Nobody mistook her at-times stilted voice for something more than a distant computer, reading but not understanding. At the same time, Alexa as a virtual presence was quickly accepted in a way that Siri or a Google web search – for all their rich capabilities – aren't.
