Scientists have developed a robotic head that matches the expressions of nearby humans in real time, thanks to its pliable blue face. Called Eva, the autonomous bot uses deep learning, a form of artificial intelligence (AI), to 'read' the expressions on human faces via a camera and then mirror them. Eva can express the six basic emotions – anger, disgust, fear, joy, sadness and surprise – as well as an 'array of more nuanced' reactions. Artificial 'muscles' – cables and motors, to be precise – pull on specific points on Eva's face, replicating the muscles beneath human skin. The scientists, from Columbia University in New York, say human-like facial expressions on robots could build trust between humans and their robotic co-workers and care-givers.
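The article does not describe the underlying software, but the pipeline it outlines – a camera feed, a deep-learning estimate of the observed expression, and motor commands that pull the face's cables – can be sketched very roughly as below. Everything in this sketch, from the motor count to the mixing matrix, is an illustrative assumption and not the Columbia team's actual code.

```python
# Minimal sketch (assumptions, not Columbia's software) of camera-driven
# expression mimicry: estimated intensities for the six basic emotions are
# mapped to positions for the motors that pull cables in the robot's face.

import numpy as np

EXPRESSIONS = ["anger", "disgust", "fear", "joy", "sadness", "surprise"]
NUM_MOTORS = 10  # assumed number of cable-pulling motors

# Hypothetical calibration matrix: each row describes how strongly one motor
# responds to each of the six basic expressions (values in [0, 1]).
rng = np.random.default_rng(0)
MIXING_MATRIX = rng.uniform(0.0, 1.0, size=(NUM_MOTORS, len(EXPRESSIONS)))


def expression_to_motor_commands(intensities: dict[str, float]) -> np.ndarray:
    """Convert per-expression intensities (0-1) into motor positions (0-1).

    In a real system the intensities would come from a deep-learning model
    watching the camera feed; here they are simply passed in directly.
    """
    vec = np.array([intensities.get(name, 0.0) for name in EXPRESSIONS])
    commands = MIXING_MATRIX @ vec
    # Clip so no cable is pulled beyond its safe range.
    return np.clip(commands, 0.0, 1.0)


if __name__ == "__main__":
    # Example: a face showing mostly joy with a hint of surprise.
    observed = {"joy": 0.8, "surprise": 0.3}
    print(expression_to_motor_commands(observed))
```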