Motivation and emotion/Book/2018/Affective computing
What is affective computing and how can it be used?
Overview
"The study and development of systems and devices that can recognize, interpret, process, and simulate human affects."
The above quote crudely defines "affective computing": giving robots, devices, and computer programs (commonly known as "apps") the capacity for human emotion (Kaliouby, Picard & Baron-Cohen, 2006). This chapter examines how emotion theories such as the James-Lange theory and the facial feedback hypothesis function within affective computing, and how they have been applied in practice. Affective computing points towards a fascinating future that is just around the corner, one that will touch every individual given the fast-growing use of technology in today's world, and it stands to help propel the human race to new heights.
What is affective computing?
How can a machine "understand" and "produce" human emotion?
- Affective computing is a field where technology meets human psychological processes. The idea of giving technology emotion is a scary thought, especially for anyone who has seen movies such as The Terminator or 2001: A Space Odyssey, which incite fear of what may happen to society when technology decides that humanity should cease to exist (Picard, n.d.). In reality, affective computing's goal is for machines first to recognise the emotion an individual is experiencing and then, depending on the device or program, either record the data or carry out a specific instruction to help alleviate or heighten the identified emotion (Picard, n.d.; Reeve, 2017; Morris & Aguilera, 2012; Kaliouby, Picard & Baron-Cohen, 2006).
- Affective computing could be said to still be in its infancy, but when we look at today's technology we can see that affective technology is all around us, albeit not highly noticeable (Picard, n.d.). Before surveying where affective technology sits within our lives, we must first understand the fundamentals: how a machine can recognise and interpret emotion, what emotion is, and the relevant emotion theory (Reeve, 2017). Affective computing's ultimate goal is, of course, technology that can recognise, interpret, and respond to emotion; to what degree technology should be able to access and use emotion is still up for debate (Picard, n.d.; Carr, Hofree, Sheldon, Saygin & Winkielman, 2017; Kaliouby, Picard & Baron-Cohen, 2006).
Affective computing and emotion
How does a machine recognise/interpret human emotion?
- Affective computing aims to give technology the ability to perceive and react to emotion. To understand what could enable a "lifeless, inanimate object" to recognise and interpret emotion, it is fundamental to first understand what emotion is and how it comes about (Reeve, 2017; Morris & Aguilera, 2012; Kaliouby, Picard & Baron-Cohen, 2006). Emotion has been studied most thoroughly by psychology, in which psychologists have theorised, defined, and tested accounts of emotion, which is why understanding psychological emotion theory is fundamental (Burton, Westen & Kowalski, 2012).
- What is emotion?
- Emotion is a biological or cognitive reaction to a life event that gives rise to feelings, bodily arousal, a sense of purpose, and social expression (Fehr & Stern, 1970; Boehner, DePaula, Dourish & Sengers, 2007). To recognise or detect emotion, machines start "backwards" from the outcomes of emotion. This is best explained through the biological perspective of emotion psychology, such as the James-Lange theory, in which emotion is an evaluative response that typically includes physiological arousal, subjective experience, and behavioural expression (Burton, Westen & Kowalski, 2012). Affective computing is not usually designed to give machines emotion but to identify emotion and then heighten or dampen it. Machines therefore work backwards, requiring the person to display emotion before it can be identified, because emotion starts from a stimulus whose meaning can differ from individual to individual (Reeve, 2017; Fehr & Stern, 1970).
- Although the biological perspective of emotion psychology is fundamental for affective computing, it should be noted that emotion relies on a stimulus that can be subjective, because individuals impart their own meanings and ideas on that stimulus; in emotion psychology this draws criticism of both the biological and the cognitive perspectives (Reeve, 2017; Boehner, DePaula, Dourish & Sengers, 2007). There are nonetheless fundamental emotions that have been found to be universal across culture, gender, and race, usually called primary emotions, and it is usually these emotions that affective computing relies on (Boehner, DePaula, Dourish & Sengers, 2007; Reeve, 2017). Table 1 outlines the biological and cognitive aspects of emotion.
Table 1. Aspects of emotion

| Biological aspects of emotion | Cognitive aspects of emotion |
|---|---|
| Autonomic nervous system (ANS) | |
| Subcortical brain circuits | |
Emotion theory

James-Lange Theory
- The James-Lange theory quickly became popular because it was among the first attempts to explain physiological changes in relation to emotion (Fehr & Stern, 1970; Reeve, 2017). Its central idea was that, rather than emotion producing physical changes such as increased heart rate, sweating, pupil dilation, and hormone release, the physical changes happen before the emotion (Fehr & Stern, 1970; Reeve, 2017).
- Significant stimulus event → Physiological changes/reaction → Emotion (Reeve, 2017).
- Psychologists criticised this theory, however, arguing that the sequence of events should instead be: significant stimulus event → emotion → physiological changes/reaction (Reeve, 2017; Fehr & Stern, 1970).
- Affective computing is able to apply the James-Lange idea that emotion happens after, and because of, physiological changes by measuring the ANS. Through ANS measures it is possible to accurately identify six distinct emotions: anger, fear, sadness, disgust, happiness, and embarrassment. Affective computing can therefore utilise the ANS for those emotions that have strong ANS reactions (Reeve, 2017; Fehr & Stern, 1970).
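The idea that distinct emotions carry distinguishable ANS signatures can be illustrated with a minimal sketch. All feature values below are hypothetical, chosen only to show the shape of such a classifier (nearest profile by Euclidean distance), not published physiological data:

```python
import math

# Hypothetical ANS "signatures": (heart-rate change in bpm,
# skin-conductance change in microsiemens, finger-temperature change in °C).
# Illustrative numbers only, not real physiological measurements.
ANS_PROFILES = {
    "anger":     (8.0, 0.6, 1.5),   # heart rate up, conductance up, hands warm
    "fear":      (8.0, 0.8, -1.5),  # heart rate up, conductance up, hands cool
    "sadness":   (2.0, 0.1, -0.5),
    "happiness": (3.0, 0.3, 0.5),
}

def classify_ans(sample):
    """Return the emotion whose hypothetical ANS profile lies nearest
    (Euclidean distance) to the measured sample."""
    return min(
        ANS_PROFILES,
        key=lambda emotion: math.dist(sample, ANS_PROFILES[emotion]),
    )

# A reading with raised heart rate and conductance but cooled fingers
# falls closest to the "fear" profile.
print(classify_ans((7.5, 0.7, -1.2)))  # fear
```

A real system would of course learn such profiles from calibrated sensor data rather than hard-code them, but the backwards logic is the same: measured physiological change first, inferred emotion second.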
Facial Feedback Hypothesis
- The facial feedback hypothesis is an aspect of emotion psychology that affective computing can utilise to detect emotion from facial expressions through three channels: movement of the facial muscles, facial temperature change, and glandular changes such as those of the sebaceous glands (Krumhuber & Scherer, 2016; Lewinski, den Uyl & Butler, 2014; Reeve, 2017). Related measures include electrodermal activity (skin conductance, based on the electrical properties of the skin), heart rate monitoring (e.g. ECG, which records the electrical activity of the heart), electroencephalography (EEG, which monitors the electrical activity of the brain), and eye tracking, in which pupil dilation and eye movement provide emotional and motivational information (Krumhuber & Scherer, 2016; Reeve, 2017; Lewinski, den Uyl & Butler, 2014).
- Facial feedback has been found to be an accurate way of both detecting and identifying emotion, and affective computing devices can do so about as accurately as humans (Lewinski, den Uyl & Butler, 2014). By coding facial expressions, software such as FaceReader by Noldus reaches an accuracy similar to that of human observers: 88% for basic emotions, as reported by Peter Lewinski, Tim M. den Uyl, and Crystal Butler in the study Automated facial coding: Validation of basic emotions and FACS AUs in FaceReader (2014) (Strack, Martin & Stepper, 1988; Yitzhak et al., 2017).
How can a machine "see" emotion?
- Machines are being given the ability to "see" emotion through sensors such as cameras (which can monitor facial feedback), skin conductors (electrodermal activity), heart rate monitors (ECG), and sensors that monitor brain-wave patterns (EEG). All of these sensors, when paired with a biological perspective of emotion psychology, can be used to accurately detect primary emotions in human beings (Boehner, DePaula, Dourish & Sengers, 2007; Calvo, 2015; Krumhuber & Scherer, 2016; Lewinski, den Uyl & Butler, 2014). Yet of all the components of an affective computing machine, the most vital is the human being, for we are the ones such machines are designed to respond to.
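Camera-based coders such as FaceReader score facial action units (AUs) from video and map combinations of them to basic emotions. The sketch below illustrates only the second, mapping step, using widely cited EMFACS-style prototypes (e.g. AU6 cheek raiser plus AU12 lip-corner puller for happiness); the detection step is assumed to have already produced a set of AU numbers, and the similarity rule is a simplification of what commercial coders actually do:

```python
# Prototypical action-unit combinations for some basic emotions,
# following common EMFACS-style descriptions.
EMOTION_AUS = {
    "happiness": {6, 12},
    "sadness":   {1, 4, 15},
    "surprise":  {1, 2, 5, 26},
    "anger":     {4, 5, 7, 23},
    "disgust":   {9, 15, 16},
}

def label_expression(detected_aus):
    """Pick the emotion whose prototype best overlaps the detected AU set
    (Jaccard similarity); fall back to 'neutral' when nothing overlaps."""
    best, best_score = "neutral", 0.0
    for emotion, proto in EMOTION_AUS.items():
        score = len(proto & detected_aus) / len(proto | detected_aus)
        if score > best_score:
            best, best_score = emotion, score
    return best

print(label_expression({6, 12}))        # happiness
print(label_expression({1, 2, 5, 26}))  # surprise
```

Real facial coders additionally score AU intensities and temporal dynamics, which is part of why subtle, non-stereotypical expressions remain hard for software (Yitzhak et al., 2017).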
How can affective computing be used?
- The emoSPARK is a personal device whose goal is to improve mood through content from Facebook and YouTube, or to provide knowledge via Wikipedia, all through its connection to the internet (Prakash, 2018). EmoSPARK utilises face-tracking software, algorithms, and speech input to learn and decipher its user's emotions, so as to enhance its own conversational skills and serve more of the desired content, such as music and games (Prakash, 2018). The emoSPARK is also able to "mimic" eight primary human emotions: joy, sadness, trust, disgust, fear, anger, surprise, and anticipation. This ability to "mimic" human emotion can be seen as stemming from the James-Lange theory of emotion since, as mentioned before, the autonomic nervous system can be used to distinguish between six differing emotions (Reeve, 2017). EmoSPARK is just one of many potential applications of affective computing in today's world (Morris & Aguilera, 2012).
- In some cultures mobile phones have become an extension of the self, along with providing the ability to connect with other individuals while on the move. It could be said that the mobile phone is the first step in gaining a direct connection with human emotion: the device has the potential to track facial feedback, acquire emotional information through user input, and, through the accelerometers in today's phones, determine some types of activity being conducted by the user (Morris & Aguilera, 2012). The device itself is useless without software, and it is ultimately the software (apps) that provides the mobile phone with its affective computing. The study Mobile, social, and wearable computing and the evolution of psychological practice by Margaret Morris and Adrian Aguilera (2012) found around 9,000 consumer health apps, among which "mood" monitoring was common. Although merely monitoring mood might seem pointless to some, it has real psychological benefits: it enables users to monitor and track their emotional patterns, offers the potential to learn the situational triggers of unwanted emotions, and enables health practitioners to better understand their patients' feelings (Morris & Aguilera, 2012).
- Mobile phones can also be paired with wearable technology, for example the Apple Watch, a device that can track a multitude of physiological features such as heart rate through the use of electrocardiography (ECG). With this technology, mobile phones are in a way becoming augmented for affective computing and now stand on the edge of becoming "strong" affective computing devices. Armed with an understanding of the James-Lange theory and the facial feedback hypothesis, we can see how technology is being adapted for human emotion, with each device working towards the common goal of improving the individual's mental health (Morris & Aguilera, 2012; Prakash, 2018; Reeve, 2017).
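The kind of mood-monitoring app described above can be sketched minimally: timestamped self-reports of mood, each optionally tagged with a situational trigger, which can later be tallied to reveal patterns a practitioner might review. The class and method names are hypothetical, invented for illustration:

```python
from collections import Counter
from datetime import datetime

class MoodLog:
    """Minimal mood diary of the kind common among consumer health apps:
    timestamped self-reports plus an optional situational trigger."""

    def __init__(self):
        self.entries = []

    def record(self, mood, trigger=None, when=None):
        """Store one self-reported mood entry."""
        self.entries.append({
            "when": when or datetime.now(),
            "mood": mood,
            "trigger": trigger,
        })

    def triggers_for(self, mood):
        """Count which situations co-occur with a given mood — the
        pattern-spotting that can reveal triggers of unwanted emotion."""
        return Counter(
            e["trigger"] for e in self.entries
            if e["mood"] == mood and e["trigger"] is not None
        )

log = MoodLog()
log.record("anxious", trigger="commute")
log.record("calm", trigger="exercise")
log.record("anxious", trigger="commute")
print(log.triggers_for("anxious"))  # Counter({'commute': 2})
```

Even this toy version shows why mood monitoring is more than record-keeping: the aggregated counts, not any single entry, are what expose situational triggers.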
Summary
Affective computing still has a long way to go before its devices and software are truly and properly implemented. Although a young field, it has made enormous strides in not only understanding emotion but also creating and mimicking it, as seen with the emoSPARK; and devices such as the mobile phone are continually improved through software, enabling users to inform this "extension" of the self and giving both themselves and their health practitioners a better psychological understanding of them (Morris & Aguilera, 2012; Reeve, 2017; Prakash, 2018).
Affective computing owes its achievements partly to existing research, such as emotion psychology and its theories, including the facial feedback hypothesis and the James-Lange theory on which the biological aspect of emotion psychology was founded (Reeve, 2017; Calvo, 2015; Kaliouby, Picard & Baron-Cohen, 2006). Because psychology has already researched a great deal about emotion, affective computing has been able to make quick strides towards building the technology of future societies; but further research and study will be required before society may permit "emotion" machines, owing to the psychological distinction between the nonhuman and the human (Carr, Hofree, Sheldon, Saygin & Winkielman, 2017; Waytz & Norton, 2014). The future of affective computing looks bright in terms of devices and software helping individuals to overcome difficulties and improve mental health, but conflict or discomfort may spontaneously arise because humans will struggle to cross the bridge from seeing computers as inanimate objects to seeing them as semi-conscious beings (Carr, Hofree, Sheldon, Saygin & Winkielman, 2017; Waytz & Norton, 2014).
See also
- Artificial Consciousness (Wikipedia)
- Emotion (Wikipedia)
- History of artificial intelligence (Wikipedia)
References
Boehner, K., DePaula, R., Dourish, P., & Sengers, P. (2007). How emotion is made and measured. International Journal of Human-Computer Studies, 65, 275-291. https://doi.org/10.1016/j.ijhcs.2006.11.016
Burton, L., Westen, D., & Kowalski, R. (2012). Psychology (3rd ed., pp. 400-402). Milton, Qld.: John Wiley and Sons Australia.
Calvo, R. (2015). The Oxford handbook of affective computing. New York: Oxford University Press.
Carr, E., Hofree, G., Sheldon, K., Saygin, A., & Winkielman, P. (2017). Is that a human? Categorization (dis)fluency drives evaluations of agents ambiguous on human-likeness. Journal of Experimental Psychology: Human Perception and Performance, 43, 651-666. https://doi.org/10.1037/xhp0000304
EEG (Electroencephalogram). (2018). Retrieved from https://kidshealth.org/en/parents/eeg.html
Farzanfar, R. (2006). When computers should remain computers: A qualitative look at the humanization of health care technology. Health Informatics Journal, 12, 239-254. https://doi.org/10.1177/1460458206066663
Fehr, F., & Stern, J. (1970). Peripheral physiological variables and emotion: The James-Lange theory revisited. Psychological Bulletin, 74, 411-424. https://doi.org/10.1037/h0032958
Kaliouby, R., Picard, R., & Baron-Cohen, S. (2006). Affective computing and autism. Annals of the New York Academy of Sciences, 1093, 228-248. https://doi.org/10.1196/annals.1382.016
Krumhuber, E., & Scherer, K. (2016). The look of fear from the eyes varies with the dynamic sequence of facial actions. Swiss Journal of Psychology, 75, 5-14. https://doi.org/10.1024/1421-0185/a000166
Krumhuber, E., Tamarit, L., Roesch, E., & Scherer, K. (2012). FACSGen 2.0 animation software: Generating three-dimensional FACS-valid facial expressions for emotion research. Emotion, 12, 351-363. https://doi.org/10.1037/a0026632
Lewinski, P., den Uyl, T., & Butler, C. (2014). Automated facial coding: Validation of basic emotions and FACS AUs in FaceReader. Journal of Neuroscience, Psychology, and Economics, 7, 227-236. https://doi.org/10.1037/npe0000028
Morris, M., & Aguilera, A. (2012). Mobile, social, and wearable computing and the evolution of psychological practice. Professional Psychology: Research and Practice, 43, 622-626. https://doi.org/10.1037/a0029041
Picard, R. Affective computing. M.I.T. Media Laboratory Perceptual Computing Section Technical Report No. 321. Retrieved from https://affect.media.mit.edu/pdfs/95.picard.pdf
Prakash, A. (2018). Seminar report on EmoSPARK. Retrieved from https://www.slideshare.net/ANANDPRAKASH79/seminarreport-on-emospark
Reeve, J. (2017). Understanding motivation and emotion (7th ed., pp. 287-337). Wiley.
Strack, F., Martin, L., & Stepper, S. (1988). Inhibiting and facilitating conditions of the human smile: A nonobtrusive test of the facial feedback hypothesis. Journal of Personality and Social Psychology, 54, 768-777. https://doi.org/10.1037/0022-3514.54.5.768
Waytz, A., & Norton, M. (2014). Botsourcing and outsourcing: Robot, British, Chinese, and German workers are for thinking—not feeling—jobs. Emotion, 14, 434-444. https://doi.org/10.1037/a0036054
Williams, K., Cheung, C., & Choi, W. (2000). Cyberostracism: Effects of being ignored over the Internet. Journal of Personality and Social Psychology, 79, 748-762. https://doi.org/10.1037/0022-3514.79.5.748
Yitzhak, N., Giladi, N., Gurevich, T., Messinger, D., Prince, E., Martin, K., & Aviezer, H. (2017). Gently does it: Humans outperform a software classifier in recognizing subtle, nonstereotypical facial expressions. Emotion, 17, 1187-1198. https://doi.org/10.1037/emo0000287
External links
- "Affective computing". Wikipedia. 2018-08-31. https://en.wikipedia.org/w/index.php?title=Affective_computing&oldid=857332956.