The surprising repercussions of making AI assistants sound human

To help rid Alexa of its cyborgian lilt, Amazon recently upgraded its Speech Synthesis Markup Language (SSML) tags, which developers use to code more natural verbal patterns into Alexa’s skills, or apps. The new tags allow Alexa to do things like whisper, pause, bleep out expletives, and vary the speed, volume, emphasis, and pitch of its speech. This means Alexa and other digital assistants might soon sound less robotic and more human. Read more at Wired.
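
For a sense of how a developer might use these tags, here is a minimal sketch of a skill response that whispers, pauses, and varies prosody. It assumes the standard Alexa Skills Kit JSON response format; the function name and wording are hypothetical, for illustration only.

```python
import json

def build_whispered_response():
    """Sketch of an Alexa skill response whose speech uses SSML to
    whisper, pause, and adjust rate and pitch (illustrative only)."""
    ssml = (
        "<speak>"
        "Here is your forecast."
        '<break time="500ms"/>'  # short pause
        '<amazon:effect name="whispered">It might rain later.</amazon:effect>'
        '<prosody rate="slow" pitch="low">Bring an umbrella.</prosody>'
        "</speak>"
    )
    # Standard Alexa Skills Kit response envelope with SSML output speech
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "SSML", "ssml": ssml},
            "shouldEndSession": True,
        },
    }

if __name__ == "__main__":
    print(json.dumps(build_whispered_response(), indent=2))
```

The whispered effect, break, and prosody tags are among the additions described in the article; a skill would return JSON like this from its intent handler.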



Categories: Technology
