AI chatbots are already talking to you. But can they tell you how they "feel"?


The next generation of AI-powered chatbots could be so good at answering your questions that you won't be able to tell them apart from their human counterparts.


Five9, the company behind the new virtual answering agent, believes its technology, which uses AI to break phrases down into sounds and tones, will save companies money on labor costs.


Training the AI on a real human voice is critical to ensuring a positive caller experience.


How does it work?


Five9 auditioned performers in London for the latest voice and selected Joseph Vaughn to record a series of scripts for the company.


The AI system was then able to reproduce not only phrases but also a range of emotions, breaking the recordings down into sounds and tones rather than words.


The program was trained to recognize combinations of words and tones in a caller's speech, as well as the caller's emotional state.


“We capture all the sound data and all the combinations of frequencies and vibrations inherent in a voice, which a human would recognize as a voice but which the machine just treats as sounds,” said Rhyan Johnson, an engineer at Wellsaid Laboratories who worked on the project.


“Eventually, the sounds and patterns come together to form what we recognize as a human voice. We can strive for perfection, but since the human voice is imperfect, we settle for human naturalness,” he added.


Five9 says its AI agents have handled more than 82 million calls to healthcare providers like Covid Clinic, large retailers like Pizza Hut, insurance companies, banks, small businesses, and state and city governments.


Five9 says its new Virtual Voiceover technology will be available next year.
