
The Artificial Intelligence Act (AIA) states the following regarding emotion recognition:



In this Regulation, the term "emotion recognition system" should be defined as an AI system whose purpose is to identify or infer the emotions or intentions of individuals on the basis of their biometric data. The notion refers to emotions or intentions such as happiness, sadness, anger, surprise, disgust, embarrassment, excitement, shame, contempt, satisfaction, and amusement. It does not include physical states, such as pain or fatigue, including, for example, systems used to detect the state of fatigue of professional pilots or drivers for the purpose of preventing accidents. Nor does it cover the mere detection of readily apparent expressions, gestures, or movements, unless they are used to identify or infer emotions. Those expressions can be basic facial expressions, such as a frown or a smile; gestures such as movements of the hands, arms, or head; or characteristics of a person's voice, such as a raised voice or a whisper.


There are serious concerns about the scientific basis of AI systems aiming to identify or infer emotions, particularly because the expression of emotions varies considerably across cultures and situations, and even within a single individual. Among the key shortcomings of such systems are limited reliability, lack of specificity, and limited generalizability. Therefore, AI systems that identify or infer the emotions or intentions of individuals on the basis of their biometric data may produce discriminatory outcomes and can infringe on the rights and freedoms of the persons concerned. Given the imbalance of power in the context of work or education, combined with the intrusive nature of these systems, they could lead to detrimental or unfavorable treatment of certain persons or groups of persons. Therefore, the placing on the market, the putting into service, or the use of AI systems intended to detect the emotional state of individuals in situations related to the workplace and education should be prohibited. That prohibition should not cover AI systems placed on the market strictly for medical or safety reasons, such as systems intended for therapeutic use.


PitchPatterns does not analyze human emotions with the aim of determining how a person feels or what their intentions are. First, no biometric data of clients is created. To determine whether a voice recording constitutes biometric data, the mandatory characteristics of biometric data must be considered: 1) it is personal data resulting from specific technical processing; 2) it relates to the physical, physiological, or behavioral characteristics of a natural person; and 3) it allows or confirms the unique identification of that natural person.


Our system does not create biometric data of customers or callers in order to identify the respective person; for that, the data would need to undergo specific technical processing, and a personal "profile" would need to be created, which PitchPatterns does not do.


Regarding call center employees and technical support staff (agents), PitchPatterns does not attempt to predict how an agent feels, what the agent's emotional state is, and so on. The PitchPatterns system is capable of recognizing "apparent" emotions. Essentially, an agent is an actor who, regardless of their internal emotional state, must be polite in customer service and must be able to listen to the caller or persuade them. Even if the agent is having a bad day, they must not raise their voice at the customer or sound irritated, uninterested, etc. Our system flags the situations in which the agent should think about how they sound. That is why these are called "apparent" (portrayed) emotions.


Whether these can even be considered "emotions" in the classical sense is debatable, as emotions reflect a person's inner world. We indicate how a person sounds and what the tone of the conversation is; we do not try to predict how depressed or how tired the employee is. Employers are interested in how the employee sounds when talking to a customer, not in how the employee feels today. Like a good actor, an agent must be able to "play their role" regardless of internal feelings.


The Regulation clearly states that the term "emotion recognition system" should be defined as an AI system whose purpose is to identify or infer the emotions or intentions of individuals on the basis of their biometric data. PitchPatterns' purpose is not to identify or infer the emotions or intentions of employees or clients based on their biometric data. PitchPatterns is therefore unequivocally not an emotion recognition system within the meaning of the AIA, and the restrictions imposed by the AIA do not apply to PitchPatterns.