2. Emotions as (Personal) Data?
One thing that may come to mind when thinking about emotions legally, and especially in the context of artificial intelligence, is that they are some kind of data. Maybe even personal data. After all, what can be more personal than emotions? Than your feelings that you so carefully keep to yourself? Well, let’s briefly consider this hypothesis.
Personal data under the GDPR is defined as “any information relating to an identified or identifiable natural person.” An identifiable natural person is one who can (at least theoretically) be identified by someone somewhere, whether directly or indirectly. The slightly problematic thing about emotions in this context is that they are universal. Sadness, happiness, anger, or excitement don’t tell me anything that would allow me to identify the person experiencing them. But this is an overly simplistic view.
First of all, emotional data never exists in a vacuum. Quite to the contrary, it is inferred by processing large quantities of (sometimes more, sometimes less, but always) personal data. It is deduced by analyzing our health data, such as blood pressure and heart rate, as well as our biometric data, like eye movements, facial scans, or voice recordings. And by combining these various data points, it is in fact possible to identify a person.[3] Even the GDPR testifies to this fact by explaining, already in the definition of personal data, that indirect identification can be achieved by reference to “one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of [a] natural person.”[4]
The easiest examples are, of course, various emotion recognition systems in wearable and personal devices such as the ones Jane has, where the data is directly connected with her user profile and social media data, making identification that much simpler. However, even when we are not dealing with personal devices, it is still possible to identify people indirectly. Consider, for instance, a person standing in front of a smart billboard and receiving an ad based on their emotional state combined with other noticeable characteristics.[5] Why? Well, because identification is relative and highly context-specific. It is not the same if I say “I saw a sad-looking girl” or if I say “Look at that sad-looking girl across the street”. By narrowing the context and the number of other individuals I could possibly be referring to, identification becomes highly probable, even though all I used was very generic information.[6]
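For readers who think in code, this narrowing argument can be sketched in a few lines. The snippet below is purely illustrative: the people, attributes, and the `narrow` helper are all invented for the example, and real identification systems are of course far more sophisticated. The point it demonstrates is only this: each added generic attribute filters the candidate pool, and in a small enough context the pool collapses to a single individual.

```python
# Hypothetical illustration: generic observable attributes, combined
# within a narrow context, can single out one person.

# A "context": the handful of people currently across the street.
crowd = [
    {"id": "A", "apparent_emotion": "sad",   "coat": "red"},
    {"id": "B", "apparent_emotion": "happy", "coat": "red"},
    {"id": "C", "apparent_emotion": "sad",   "coat": "blue"},
]

def narrow(candidates, **observed):
    """Keep only the candidates matching every observed attribute."""
    return [p for p in candidates
            if all(p[k] == v for k, v in observed.items())]

# "A sad-looking girl" alone is generic: two candidates remain.
sad = narrow(crowd, apparent_emotion="sad")
print(len(sad))  # 2

# Add one more generic attribute and the context yields a single person.
matches = narrow(crowd, apparent_emotion="sad", coat="red")
print(len(matches), matches[0]["id"])  # 1 A
```

None of the individual attributes is identifying on its own; it is their combination, inside a bounded context, that does the singling out.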
Furthermore, whether someone is identifiable will also heavily depend on what we mean by that word. Namely, we could take identifying to mean ‘knowing by name and/or other citizen data’. This would, however, be an untenably narrow view: such data is changeable, can be faked or manipulated, and not all people have it in the first place. (Think of illegal immigrants, who often have no access to any form of official identification.) Are people without an ID by definition not identifiable? I think not. Or, if they are, there is something seriously wrong with how we think about identification. This is also becoming a rather common argument for considering data processing operations GDPR-relevant, with a growing number of authors adopting a broad notion of identification as ‘individuation’,[7] ‘distinction’,[8] or even ‘targeting’[9] — all of which are precisely what these systems were designed to do.
So, it would appear that emotions and emotional data might very well fall within the scope of the GDPR, regardless of whether the company processing them also uses them to identify a person. And even if they don’t, the data used to infer emotions will almost certainly be personal, which in turn makes the GDPR applicable. We are not getting into the nitty-gritty of what this means, or all the ways in which the provisions of the GDPR are being infringed by most (all?) providers of emotion recognition technologies, at this point. They are, after all, still busy arguing that emotional data isn’t personal in the first place.