The Little Known Field of Affective Computing: What Is It, How Is It Used, and How Can It Be Misused?
Category: Computer Science · Tuesday, May 2, 2023, 04:34 UTC

Affective computing is an emerging field that explores how technology can be used to recognize and respond to our emotions. It is currently used by organizations to screen candidates, identify the angriest clients, and boost sales. While it can be misused, affective computing can also be used to recognize and assess emotional states in a caring and ethical manner.
The little-known field of affective computing is a growing presence in our lives, including the workplace. Is it OK to have your feelings monitored if it's supposed to enhance your well-being? Pretend you're a manager who can see how your team members are feeling, in real time. You could gauge their mood—after an influx of new employees, say, or an organizational change—and act accordingly. That's the goal of affective computing research being led by Pierrich Plusquellec, a professor at Université de Montréal's School of Psychoeducation, and Pamela Lirio, a professor at its School of Industrial Relations. We spoke to them to find out more.
--- What is affective computing? ---
Pierrich Plusquellec: It's a scientific discipline that connects computing and feelings in two ways: by creating machines that effectively simulate emotions and by recognizing different emotional states. There are algorithms today that can detect our feelings in real time based on the sound of our voices, our facial expressions, our movements and even the way we type on a keyboard. This field of research was launched less than 30 years ago by Rosalind Picard and is generating a lot of buzz in the industry today. It's got tremendous potential. Affective computing is everywhere: it can even be found in the phones we're so addicted to.
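One of the signals mentioned above, typing rhythm, can illustrate at its simplest what such recognition systems consume. The sketch below is hypothetical: it extracts inter-key timing features (keystroke dynamics) and applies a naive threshold rule. Real affective-computing systems train classifiers on labeled data rather than hand-picked cutoffs; the function names and the `jitter_threshold_ms` value are illustrative assumptions, not an actual product's API.

```python
# Hypothetical sketch: keystroke-dynamics features of the kind an
# emotion-recognition model might take as input. The threshold rule
# is illustrative only; real systems learn from labeled data.
from statistics import mean, stdev

def keystroke_features(timestamps_ms):
    """Summarize inter-key intervals from a list of keypress times (ms)."""
    intervals = [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]
    return {
        "mean_interval_ms": mean(intervals),
        # Jitter: how irregular the typing rhythm is.
        "interval_jitter_ms": stdev(intervals) if len(intervals) > 1 else 0.0,
    }

def naive_arousal_guess(features, jitter_threshold_ms=80.0):
    """Toy heuristic: highly irregular typing is flagged as 'agitated'."""
    if features["interval_jitter_ms"] > jitter_threshold_ms:
        return "agitated"
    return "calm"

steady = keystroke_features([0, 150, 300, 450, 600])   # perfectly even rhythm
print(naive_arousal_guess(steady))  # -> calm
```

The point of the sketch is only that the raw signal (timestamps) must first be turned into features before any emotional inference, however crude, can be made.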
--- How is affective computing used by various organizations today? ---
Pamela Lirio: One example is in call centers, where it's used to identify the angriest clients, who are then sent to more experienced employees. In this case, it's used to perform an initial screening. It's also used by human resources. Big companies like Vodafone, Hilton, Urban Outfitters and Unilever use automatic facial expression recognition to screen candidates who are expected to submit video applications. This helps them create a shortlist from an especially large pool of candidates that recruiting teams would otherwise lack the resources to handle. They do this through recruiting platforms like HireVue that use affective computing. However, it's still best practice for HR professionals and/or the frontline manager to make the final hiring decision after looking at the data provided.
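The call-center screening described above amounts to a priority-routing rule. A minimal sketch, assuming a hypothetical `anger_score` between 0 and 1 already produced by an upstream voice-analysis model (the threshold and queue names are illustrative, not any vendor's actual configuration):

```python
# Hypothetical sketch of the call-center routing described above.
# `anger_score` (0.0-1.0) is assumed to come from an upstream
# voice-analysis model; the threshold is illustrative.
def route_call(anger_score, threshold=0.7):
    """Send the angriest callers to senior agents, the rest to the general queue."""
    if anger_score >= threshold:
        return "senior_agent_queue"
    return "general_queue"

print(route_call(0.9))  # -> senior_agent_queue
print(route_call(0.3))  # -> general_queue
```

Note that the affective model only performs the initial triage; as the interviewee stresses, the final hiring or handling decision stays with a human.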
--- There are many ways this technology could be misused, aren't there? ---
Pierrich Plusquellec: It all depends on how we use it! It's just a tool, like a knife, which can be used to sculpt glorious works of art or to kill someone. The risk is real: affective computing can be horribly misused. I've read in Le Point, the French newsmagazine, that facial or emotional recognition technology is used to torture Uyghurs in China. Meanwhile, closer to home, the commercial real-estate company Cadillac Fairview used facial recognition on unwitting customers to boost sales at Carrefour Laval mall. It installed cameras to capture the facial expressions of shoppers and assess their mood. This is illegal, so the company had to stop. That's why this kind of work has to be done in a way that's caring and ethical, and why users have to have control over their data.
--- How can that be done? ---
Pierrich Plusquellec: Well, for example, during the pandemic, many people didn't pay enough attention to how they felt. As a result, a lot of them now have misdiagnosed emotional disorders, since there weren't enough resources available to determine what was actually going on. So what we're trying to do is design ethical systems that will help track mental health in an unobtrusive manner, to nudge us all to pay more attention to emotional signals from our bodies and minds.