Cambridge psychologists and computer scientists have developed a mobile phone technology that can tell whether a caller is happy, angry, bored or sad. The EmotionSense technology will enable psychologists to show links between mood, location and the people a caller interacts with. It combines speech-recognition software with the sensors built into standard smartphones to assess how people’s emotions are influenced by day-to-day factors. The software analyzes voice samples and places them into five emotional categories: happiness, sadness, fear, anger and a neutral category (covering states such as boredom or passivity). Scientists then cross-reference these emotions against the caller’s surroundings, the time of day and their relationship with the person they are speaking to. Results from a pilot scheme revealed that callers are happier at home, sadder at work and display more intense emotions in the evenings.
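The cross-referencing step described above can be pictured as tallying classified calls against their context. The sketch below is purely illustrative: the `Call` record, field names and the tallying logic are assumptions for demonstration, not the actual EmotionSense implementation, which has not been published in this article.

```python
from collections import Counter
from dataclasses import dataclass

# Hypothetical call record; the real EmotionSense data model is not public here.
@dataclass
class Call:
    emotion: str   # one of the five categories below
    location: str  # e.g. "home" or "work"
    hour: int      # hour of day, 0-23

CATEGORIES = {"happiness", "sadness", "fear", "anger", "neutral"}

def cross_reference(calls):
    """Tally emotion labels per location, mimicking the pilot analysis."""
    by_location = {}
    for call in calls:
        if call.emotion not in CATEGORIES:
            raise ValueError(f"unknown emotion: {call.emotion}")
        by_location.setdefault(call.location, Counter())[call.emotion] += 1
    return by_location

# Toy data echoing the pilot's finding: happier at home, sadder at work.
calls = [
    Call("happiness", "home", 20),
    Call("happiness", "home", 21),
    Call("sadness", "work", 10),
]
stats = cross_reference(calls)
print(stats["home"].most_common(1)[0][0])  # -> happiness
```

The same tallies could equally be keyed on `hour` or on the caller's relationship to the other party, which is how the study linked emotional intensity to evenings.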