
Question    One of the most unexpected things about having children is how the quest to mold perfect little humans ultimately becomes a project of making yourself a better person. Though hardly revolutionary, this epiphany came to me recently when I was talking to an inanimate object, Amazon’s Echo speaker, in front of my 18-month-old, Jack.
    "Echo, turn on the lights. Echo, set my thermostat to 72 degrees. Echo, play ’Wheels on the Bus,’ " I commanded the gadget, which understands and responds to an ever growing set of orders (including, no surprise, "Echo, buy more diapers"). Every time I said "Echo," Jack’s eyes shot up to the cylinder-shaped speaker atop the refrigerator, its glowing blue halo indicating it was listening. Then, one day, the inevitable happened: "Uggo!" Jack barked. "Bus!"
    After I explained to Jack that it’s not nice to call someone an uggo, I saw myself through my son’s words—and didn’t like how I looked. Sure, Echo doesn’t care how you talk to it. But to Jack, I must have seemed like a tyrant. And by imitation, he became my little dictator. This dilemma is likely only to grow as voice-based artificial intelligence becomes more commonplace. Already, Apple’s iPhones and iPads have Siri; Google-powered devices come with a similar feature, Google Now; and Microsoft has Cortana. Soon we’ll be regularly talking to digital Moneypennys at home, work and everywhere else.
    Like most parents, my wife and I hope Jack grows up to be kind. Like most toddlers, he needs some help with this. My exchanges with my technology have clearly been setting a bad example. But how exactly to talk to our technology is far from clear. "The issue of ‘please’ is huge. It’s one of the foundations of etiquette," says Lizzie Post, president of the Emily Post Institute and the great-great-granddaughter of America’s best-known arbiter of manners. "Kids model the behavior of the parent, and if you want your child to be using the word please often, you need to use it often too."
    So now I say "please" as much as I can. I say it to my wife, my son’s teddy bear, Siri, Echo, Cortana, even my dog. But not everybody agrees that speaking to computers the way we’d like to be spoken to is the best way forward. Oren Etzioni, CEO of the Allen Institute for Artificial Intelligence in Seattle, is one of them. "I don’t say ‘please’ and ‘thank you’ to my toaster," he argues. "Why should I say it to [Echo]?"
    Etzioni believes that the machines we have now, our smartphones and tablets, are effectively appliances. "It seems to me that we reserve politeness as a social lubricant," he says. "It has a purpose." And as a father, Etzioni is concerned that his son will over-anthropomorphize smart devices. "I’d be worried that he’d get confused in the same way that we don’t want our kids to think Superman is real and then jump off something," he says.
    If you’ve ever been fooled by an online customer-service chatbot or an automated phone system, you’ll agree that this technology is evolving quickly. Coming generations will find it even harder to differentiate between bots and people, as they encounter even more artificially intelligent assistants backed by machine learning—computers that teach themselves through repeated interactions with human beings.
    At Microsoft, for instance, there’s a personality team dedicated to helping Cortana get a better grasp of manners and mannerisms. The technology is being infused with cultural cues to make it more likable. For example, Cortana’s avatar bows to Japanese users, who prefer formality. "Having a personality designed into the system, knowing some of the nuances of the way humans communicate, how they use different adjectives and how they say ‘thank you’ and ‘please’—we think it’s an important part of getting that overall speech and dialogue system right," says Marcus Ash, program manager for Cortana.
    Meanwhile, Hound, a voice-assistant app available for a broad range of devices, not only processes the magic words (please, thank you, you’re welcome, excuse me, sorry) but also softens its responses when users speak them. "When you say ‘hello’ to Hound, you might hear one type of response, but when you say ‘hey’ or ‘yo,’ you will definitely hear a different one," says Keyvan Mohajer, a co-founder and the CEO of SoundHound.
    For humans, etiquette is a kind of social algorithm for managing feelings. Computers will get better at understanding this—but that will likely take decades. Which is more than enough time for me to solve this uggo problem.
Briefly introduce the efforts made by Microsoft’s personality team and SoundHound.

Answer    Microsoft’s personality team is working to give Cortana a better grasp of "manners and mannerisms": it infuses the technology with cultural cues "to make it more likable," designs a kind of "personality" into the system, and teaches it the nuances of the way humans communicate, so as to get the overall speech and dialogue system right. SoundHound’s Hound app not only processes the polite "magic words" but also "softens its responses" when users speak them, and it gives different types of responses to different greetings.
