Hey Alexa, can you cheer me up? Voice assistants such as Alexa and Google Assistant have been rapidly evolving over the last several years. The algorithms are getting more and more sophisticated, and the ability to embed them in smart devices is making voice assistants a must-have in every modern household. Yet even as the algorithms grow more complex and realistic, the question remains: can voice assistant devices have feelings? Can Google or Alexa tell that you are feeling down, or is it all business and no emotion?

Well, it was only a matter of time before someone came up with a way to include emotions in voice recognition algorithms. A company called Smartmedical Corp has developed an emotion recognition program named Empath. The algorithm identifies your emotion by analyzing the physical properties of your voice; trained on tens of thousands of voice samples, Empath detects anger, joy, sadness, calmness, and vigour. According to the website, the Empath Web API is available to developers: by adding a little sample code to your site, you can integrate your apps with vocal emotion recognition technology.
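If you are curious what that integration might look like, here is a minimal sketch in Python. The endpoint URL, the "apikey" and "wav" parameter names, and the response fields are assumptions for illustration, so check the Empath developer documentation for the real interface before using it.

# Minimal sketch of calling a vocal emotion recognition web API such as Empath.
# The endpoint, parameter names, and response fields are assumed for illustration.
import requests

API_KEY = "your-api-key-here"  # issued when you register as a developer (assumed)
ENDPOINT = "https://api.webempath.net/v2/analyzeWav"  # hypothetical endpoint

def analyze_voice(wav_path):
    """Send a short WAV recording and return the emotion scores as a dict."""
    with open(wav_path, "rb") as f:
        response = requests.post(
            ENDPOINT,
            data={"apikey": API_KEY},
            files={"wav": f},
        )
    response.raise_for_status()
    # Assumed response shape: one score per emotion, e.g.
    # {"calm": ..., "anger": ..., "joy": ..., "sorrow": ..., "energy": ...}
    return response.json()

scores = analyze_voice("hello_alexa.wav")
print("Joy:", scores.get("joy"), "Sadness:", scores.get("sorrow"))

From there, an app could map the scores onto its own behaviour, for example responding differently when the sadness score outweighs the joy score.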

BrighterSight might want to try this out and perhaps test the concept of Dr. Robot.

Check out more below.

https://webempath.net/lp-eng/

On a related note, check out the links below:

https://doc.ai/

http://brightersight.ca/eds-blog/index.php/2018/01/31/ai-and-deep-learning-is-the-future-of-medicine/