An Emotion Analysis Platform allows users to analyze human affective states with state-of-the-art AI algorithms. The platform offers easy-to-use, intuitive emotion analysis; no background knowledge in the science of emotions is required. Emotions play a fundamental role in social life and in the way we consume and make decisions, and emotional intelligence is a main building block of human communication.
With the help of modern techniques from the fields of Artificial Intelligence and Machine Learning, the TAWNY project aims to enable machines to understand human behavior and develop empathy. This makes it possible to adapt technology to the needs and preferences of human users. It is a powerful technology that must be used responsibly.
TAWNY is a European company, grounded in European values and principles regarding privacy, data protection, and the ethical development of technology. The company believes that unlocking the potential of Emotion AI will allow humanity to enter a new era of human-machine interaction.
Many renowned companies, such as BMW and Red Bull, already rely on TAWNY technology to ensure that their products and services are developed with user centricity in mind. Researchers rely on the algorithms to better understand human behavior and to contribute new, cutting-edge methods to the affective computing community. The TAWNY Platform is used in many different scenarios.
Typical use cases revolve around testing products, services, or advertisements in user studies. Just record your study participants on video (for example, with a webcam) while they react to or interact with your product. You can then review these recordings for additional insight into how your test users really perceive your product. For our customers, this is a valuable addition to, or even a replacement for, traditional market and user research techniques such as questionnaires.
There are several common approaches to describing facial expressions: defining categories that capture the message a facial expression usually conveys (for example, happy or sad); defining distinct (muscular) actions of the face and combinations of these (for example, facial action units such as raised eyebrows or lowered lip corners); and locating expressions in a continuous, multidimensional space (for example, along the two dimensions valence and arousal). These approaches are also related to each other.
For example, a category of emotion can generally be described by one facial action unit or a set of (combinations of) action units, and clusters that each relate to a particular emotion category can generally be identified in multidimensional models. The TAWNY Platform currently tries to recognize the presence of four different emotion categories and also tries to locate each expression along the valence and arousal dimensions. The four emotion categories on the TAWNY Platform are: Friendliness / Happiness; Surprise; Focus / Anger; Indifference / Sadness.
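To make the dimensional idea concrete, here is a deliberately simple sketch of how a point in the two-dimensional valence/arousal space could be mapped to a rough quadrant. The thresholds, the quadrant assignments, and the function name are illustrative assumptions, not TAWNY's actual model; note that Surprise does not fit a single quadrant, since it can occur at either positive or negative valence.

```python
# Illustrative only: coarse quadrant lookup in valence/arousal space.
# Both inputs are assumed to lie in [-1.0, 1.0].

def quadrant(valence: float, arousal: float) -> str:
    if valence >= 0 and arousal >= 0:
        return "Friendliness / Happiness"   # pleasant, activated
    if valence < 0 and arousal >= 0:
        return "Focus / Anger"              # unpleasant, activated
    if valence < 0:
        return "Indifference / Sadness"     # unpleasant, deactivated
    return "calm / relaxed"                 # pleasant, deactivated (no TAWNY category)

print(quadrant(0.7, 0.4))   # pleasant and activated
```

A real system would place expressions continuously in this space rather than snapping them to quadrants; the hard cutoffs here are purely for illustration.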
If you are not completely new to this topic, you may recognize these four categories as four of the basic emotions defined by the researcher Paul Ekman. In fact, he defined six basic emotions: happy, surprised, angry, sad, disgusted, and fearful (plus, additionally, neutral).
You may ask yourself: why does TAWNY use different names, and why does it offer only four of these emotions? Let's start with the names: as you can see, TAWNY keeps the original terms but adds some additional ones. One reason is that, as described above, although these terms do convey some meaning, what is actually analyzed is still a proxy (i.e., the facial expression) for the real feeling.
Many users tend to take the terms literally, which makes it difficult to draw the right conclusions. By adding more terms, TAWNY wants to clarify that the actual interpretation of a facial expression can vary, for example depending on the context. Taking the Friendliness / Happiness category as an example: expressions in that category usually occur when the observed person reacts with friendliness to the stimulus, when the situation made them happy, or when what happened created a positive feeling in the person.
This general idea is especially important because the original Ekman category names sound quite intense, while feelings as intense as anger do not generally occur when, for example, an application is being tested (except when the user experience is really bad).
However, certain parts of the facial expressions categorized as angry usually occur when a person is focused or concentrating on something. The obvious facial action unit here involves the forehead, that is, pulling the eyebrows together. If a user interface makes the user highly focused, it may be because it is difficult to understand.
So this is a very subtle form of the anger category, but it is highly relevant, for example, for UX tests. By using these more diverse terms, the company wants to make it clear that there are nuances within each category, and that especially the subtle expressions matter in many emotion analysis use cases. The reason for focusing on these four emotion categories, the company says, is that they are the most relevant and interpretable categories in typical use cases from fields such as market research, UX testing, and behavioral research.
In summary, here are the descriptions of the four categories:
- Friendliness / Happiness. Facial expressions related to the person reacting positively to the stimulus; can be described as friendliness toward what has been seen or experienced; usually means that the stimulus induced a state of happiness; in terms of action units, the relevant expressions often contain a smile, from subtle to intense.
- Surprise. Expressions related to the person being surprised by the stimulus; usually means that something happened that the person did not expect; can also mean that something did not behave the way the person expected; surprise can be positive or negative; in terms of action units, the relevant expressions generally contain raised eyebrows and, optionally, a dropped jaw.
- Focus / Anger. Expressions related to the person getting angry or confused because of the stimulus; in typical use cases, this means the person focuses, or is even forced to concentrate, on the stimulus; it can be caused by the person not liking or not understanding the stimulus; in user tests, these expressions can also mean that the task is more demanding; it can also mean that the stimulus really captures the person's attention, which can be interpreted positively if that is what you want to achieve; in terms of action units, the relevant expressions usually contain a frown and, optionally, pressed lips / clenched teeth.
- Indifference / Sadness. Expressions related to the person showing signs of indifference, disappointment, or even sadness; often means that the stimulus did not behave as desired or that a particular action did not lead to the intended result; in test settings that are not very exciting (maybe even boring), people tend to show these expressions when nothing in particular happens, which makes it especially important to look for significant changes here; in terms of action units, the relevant expressions generally contain lowered lip corners.
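The action-unit cues in the list above can be sketched as a simple lookup. The AU codes below are standard FACS notation (AU1/AU2 raised brows, AU4 brow lowerer, AU12 lip corner puller, AU15 lip corner depressor, AU24 lip pressor, AU26 jaw drop), but the mapping and the helper function are a rule-of-thumb simplification for illustration; real systems combine many action units with learned weights rather than fixed sets.

```python
# Illustrative lookup from detected FACS action units to the four
# categories described above. A deliberate simplification.

CATEGORY_CUES = {
    "Friendliness / Happiness": {"AU12"},        # smile (lip corner puller)
    "Surprise": {"AU1", "AU2", "AU26"},          # raised brows, jaw drop
    "Focus / Anger": {"AU4", "AU24"},            # frown, pressed lips
    "Indifference / Sadness": {"AU15"},          # lowered lip corners
}

def best_match(detected_aus: set) -> str:
    """Return the category sharing the most cue AUs with the detection."""
    return max(CATEGORY_CUES,
               key=lambda cat: len(CATEGORY_CUES[cat] & detected_aus))

print(best_match({"AU1", "AU2"}))   # raised brows alone already point to Surprise
```

Note that `best_match` breaks ties by dictionary order; a production system would instead output per-category confidence scores over time.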
The good news is that, thanks to huge progress in areas like machine learning and especially deep learning, computers have become much better at analyzing images and videos, which is exactly what is needed to recognize facial expressions. Simply put: by feeding a computer large amounts of sample images that have been annotated by human experts, so-called deep neural networks can be trained to automatically recognize different facial expressions, and this is one of the techniques used in this technology.
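The supervised idea in that paragraph, adjusting a model's weights so its predictions match expert labels, can be shown in miniature. This sketch trains a single softmax layer on random synthetic "images" with four labels; it is not a deep network and has nothing to do with TAWNY's actual models, but the annotate-train-predict loop is the same shape.

```python
import numpy as np

# Toy supervised training: synthetic 8x8 "images" (64 pixels) with
# labels for 4 expression categories, fitted by gradient descent.
rng = np.random.default_rng(0)
n, pixels, classes = 200, 64, 4
X = rng.normal(size=(n, pixels))          # stand-in for face crops
y = rng.integers(0, classes, size=n)      # stand-in for expert annotations

W = np.zeros((pixels, classes))           # model weights
lr = 0.1

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def loss(W):
    p = softmax(X @ W)
    return -np.log(p[np.arange(n), y]).mean()   # cross-entropy

start = loss(W)
for _ in range(100):                      # training loop
    p = softmax(X @ W)
    p[np.arange(n), y] -= 1               # gradient of loss w.r.t. logits
    W -= lr * (X.T @ p) / n               # gradient step
end = loss(W)
print(f"training loss: {start:.3f} -> {end:.3f}")
```

A real facial-expression model replaces the single linear layer with a deep convolutional network and the synthetic arrays with millions of annotated face images, but the mechanics (forward pass, loss against human labels, gradient update) are exactly these.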