
Microsoft Research studies how AI could be used to help read expressions within meetings


What you need to know
A Microsoft Research project studied the use of AI to read people's non-verbal communication during virtual meetings.
The study used a bot within Teams calls to identify various emotions.
The study suggests positive results from using AI to enhance communication.
A Microsoft Research study used an AI tool to monitor people's expressions and non-verbal communication during video calls. The tool, called AffectiveSpotlight, uses a neural network to classify attendees' facial expressions and highlight the most responsive audience member for the presenter. It was tested against a random-selection baseline to see whether it helped presenters gauge people's reactions. The study was recently highlighted by NewScientist.
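The core idea can be illustrated with a minimal sketch, not Microsoft's actual implementation: score each attendee's current video frame with an expression classifier and spotlight the highest-scoring attendee, compared against the random-selection baseline mentioned above. The score_expression function below is a hypothetical stand-in for the real neural classifier.

import random
from typing import Callable, Dict

# Hypothetical stand-in for AffectiveSpotlight's neural classifier:
# maps an attendee's video frame to a "responsiveness" score in [0, 1].
# The real system uses a trained network over facial expressions and
# other non-verbal cues; here it is only a placeholder.
def score_expression(frame: bytes) -> float:
    return random.random()

def affective_spotlight(frames: Dict[str, bytes],
                        scorer: Callable[[bytes], float]) -> str:
    """Pick the attendee whose frame scores highest, i.e. the one
    showing the strongest non-verbal response to the presenter."""
    return max(frames, key=lambda name: scorer(frames[name]))

def random_spotlight(frames: Dict[str, bytes]) -> str:
    """Baseline comparison: pick an attendee at random."""
    return random.choice(list(frames))

if __name__ == "__main__":
    # Dummy frames keyed by attendee name (real input would be live video).
    frames = {"Ana": b"...", "Ben": b"...", "Chloe": b"..."}
    print("Spotlighted by classifier:", affective_spotlight(frames, score_expression))
    print("Spotlighted at random:", random_spotlight(frames))

In this sketch, swapping the placeholder scorer for a real expression model would be the only change needed to move from the random baseline to classifier-driven spotlighting.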
Generally, people are good at reading non-verbal communication. It's normal to pick up on subtle, and not-so-subtle, facial expressions and other cues within conversations. That kind of communication tends to suffer on video calls: not only do people appear smaller on video feeds than they do in real life, but we're also often looking at several people at once in a grid.

