class: center, middle, inverse, title-slide

# CX Emotion 2020
## Facial and Physiological Expressions of Emotion, with a Look at Bias
### Damien Dupré
### Dublin City University - July 22nd, 2020

---
layout: true

<div class="custom-footer"><span>CX Emotion - Facial and Physiological Expressions of Emotion - Dupré (2020) </span></div>

---

# My Journey into Emotion Science

#### Development of the DynEmo Facial Expression Database (Master)
* Dynamic and spontaneous emotions
* Assessed with self-reports and by observers

#### Analysis of the Emotional User Experience of Innovative Tech. (Industrial PhD)
* Understanding users' acceptance of technologies from their emotional responses
* Based on multivariate self-reports

#### Evaluation of Emotions from Facial and Physiological Measures (Industrial PostDoc)
* Applications to the marketing, sports and automotive industries
* Dynamic changes with trend extraction techniques (2 patents)

#### Performance Prediction using Machine Learning (Academic PostDoc)
* Application to sport analytics
* Big Data processing (> 1 million users with activities recorded over the past 5 years)

---
class: inverse, mline, center, middle

# 1. A Closer Look at Emotion Science

---

# Why Measure Emotional Experiences?

Emotional experience shapes our perceptions and guides our decisions:

* In everyday life (e.g., the Phineas Gage case; see also [Bechara, Damasio, & Damasio, 2000](https://doi.org/10.1093/cercor/10.3.295))
* In consumer behaviour (e.g., emotions are triggered by product designs and can facilitate buying decisions; [Mahlke & Minge, 2008](https://doi.org/10.1007/978-3-540-85099-1_5); [Bloch, 1995](https://doi.org/10.1177/002224299505900302))

<img src="media/phineas_gage.jpg" width="60%" style="display: block; margin: auto;" />
.center.tiny[Modeling the path of the tamping iron through the Gage skull and its effects on white matter structure.<br />Credit: Van Horn, Irimia, Torgerson, Chambers, Kikinis & Toga (2012) [🔗](https://doi.org/10.1371/journal.pone.0037454)]

---

# What Are Emotions?

> *“Everyone knows what an emotion is, until asked to give a definition. Then, it seems no one knows.”* - [Fehr & Russell (1984, p. 464)](https://doi.org/10.1037/0096-3445.113.3.464)
.center.tiny[K-fee commercial. Credit: Cobblestone Filmproduktion (2004) [🔗](https://www.youtube.com/watch?v=sA_wUTLhSAU)]

---

# Characteristics of Emotions

.left-column[
<img src="media/emo_event.png" width="27%" style="display: block; margin: auto;" />
<img src="media/emo_appraisal.png" width="27%" style="display: block; margin: auto;" />
<img src="media/emo_sync.png" width="27%" style="display: block; margin: auto;" />
<img src="media/emo_change.png" width="27%" style="display: block; margin: auto;" />
<img src="media/emo_behaviour.png" width="27%" style="display: block; margin: auto;" />
<img src="media/emo_intensity.png" width="27%" style="display: block; margin: auto;" />
<img src="media/emo_rapidity.png" width="27%" style="display: block; margin: auto;" />

.center.tiny[Adapted from Scherer (2005) [🔗](https://doi.org/10.1177/0539018405058216)]
]

.right-column[
]

--

### Event Focus

--

### Appraisal Driven

--

### Response Synchronization

--

### Rapidity of Change

--

### Behavioural Impact

--

### Intense Response

--

### Short Duration

---

# What Emotions Are Not

Affective states refer to **"valenced (good versus bad) states"** ([Gross, 2010, p. 212](https://doi.org/10.1177/1754073910361982)):

* Attitudes are **relatively stable beliefs about the goodness or badness of something or someone**
* Moods are **less stable than attitudes** and, unlike attitudes, often **do not have specific objects**
* Emotions are **the shortest-lived** of these three affective processes. They are responses to situations that are perceived as **relevant to an individual’s current goals**
.center.tiny[There are many types of affective states, including attitudes, moods, and emotions.<br />Adapted from Gross (2010) [🔗](https://doi.org/10.1177/1754073910361982)]

---

# Holistic Definition of Emotions

> Emotion is defined as *“an **episode of interrelated, synchronized changes** in the states of **all or most of the five organismic subsystems** in **response to the evaluation of an external or internal stimulus** event as relevant to major concerns of the organism”.* - [Scherer (2001, p. 93)](https://global.oup.com/academic/product/appraisal-processes-in-emotion-9780195130072)

--

<img src="media/inside_out.gif" width="80%" style="display: block; margin: auto;" />
.center.tiny[Inside Out. Credit: Pixar Animation Studios (2015) [🔗](https://i.imgur.com/ifdEAOx.gif)]

---

# Functions of Emotions

Emotions have 5 functions, each corresponding to a specific component/subsystem:

* Evaluation of objects and events (**Cognitive Component**)
* System regulation (**Physiological Component**)
* Preparation and direction of action (**Motivational Component**)
* Communication of reaction and behavioural intention (**Motor/Expressive Component**)
* Monitoring of internal state (**Subjective Feeling Component**)

--
.center.tiny[The dynamic architecture of the component process model.<br />Adapted from Scherer (2009) [🔗](https://doi.org/10.1080/02699930902928969)]

---

# Development of Affective Computing

Despite the usefulness of the Component Process Model, research on emotions has led to a "conceptual and definitional chaos" ([Buck, 1990, p. 330](https://doi.org/10.1207/s15327965pli0104_15)):

* There is still no consensual agreement among researchers
* Some assumptions held by the broad audience are not supported by scientific evidence

.pull-left[
In parallel, multiple tools and databases have been developed to investigate emotions.

With the **increase in computer processing power** and the **development of machine learning algorithms**, computer scientists have created models to automatically recognize emotions...

**What Could Possibly Go Wrong?**
]

.pull-right[
<img src="media/rise_affective_computing.png" width="80%" style="display: block; margin: auto;" />
.center.tiny[Credit: The Guardian (2019) [🔗](https://www.theguardian.com/technology/2019/mar/06/facial-recognition-software-emotional-science)]
]

---
class: inverse, mline, center, middle

# 2. Emotion Recognition from Facial Expressions and Physiological Changes

---
class: title-slide, middle

## 2.1 Measuring Emotions from Facial Expression

---

# Current Methods

![:col_header Type, Measure, Tool]
![:col_row Invasive, Electric Activity of Face Muscles, Facial Electromyography (fEMG)]
![:col_row Non-Invasive, Human Observer Visual Recognition, Manual Annotation Software]
![:col_row Non-Invasive, Landmark Spatial Configuration, Automatic Recognition Classifiers]

<img src="media/affectiva.jpg" width="75%" style="display: block; margin: auto;" />
.center.tiny[Credit: Affectiva [🔗](https://www.affectiva.com/news-item/affectiva-launches-multi-modal-automotive-in-cabin-ai-to-improve-road-safety-and-accelerate-autonomy-2/)]

---

# Automatic Facial Expression Recognition

Development of the technology:

* First attempt reported by [Suwa, Sugie, & Fujimura (1978)](https://books.google.ie/books?id=P4s-AQAAIAAJ)
* Numerous academic systems since (see the review by [Zeng, Pantic, Roisman, & Huang, 2009]())
* VicarVision developed the first commercial automatic classifier ([den Uyl & van Kuilenburg, 2005](http://www.vicarvision.nl/pub/fc_denuyl_and_vankuilenburg_2005.pdf))
* Today, more than 20 companies offer applications for automotive, sport, health, human resources, security or marketing purposes ([Dupré, Andelic, Morrison, & McKeown, 2018](https://doi.org/10.1109/PERCOMW.2018.8480127))

.pull-left[
A process in 3 steps:

* Face Detection
* Facial Landmark Detection
* Classification

The result is a recognition probability for a labelled category (e.g., Action Unit, Basic Emotion, Dimensions), as sketched below
]

.pull-right[
<img src="media/automatic_steps.png" width="70%" style="display: block; margin: auto;" />
.center.tiny[Credit: Dupré, Andelic, Morrison & McKeown (2018) [🔗](https://doi.org/10.1109/PERCOMW.2018.8480127)]
]
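For illustration, a minimal R sketch of the classification output. The scores are made up, and the softmax is only one common way to map scores onto probabilities, not necessarily the one used by commercial classifiers:

```r
# Made-up classifier scores for a single video frame (illustration only)
scores <- c(happiness = 2.1, surprise = 0.4, sadness = -1.0,
            disgust = -0.6, fear = 0.2, anger = -1.3)

# Softmax: one common way to turn scores into recognition probabilities
probabilities <- exp(scores) / sum(exp(scores))
round(probabilities, 2)
#> happiness  surprise   sadness   disgust      fear     anger
#>      0.68      0.12      0.03      0.05      0.10      0.02
```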
---

# Facial Expression Categorization

Emotion categories/dimensions are inferred from facial expressions either:

.pull-left[
* Directly, by matching Action Units to prototypical expressions of emotions (emotions coded with the Facial Action Coding System, FACS; [Ekman, Friesen, & Hager, 2002](https://www.paulekman.com/facial-action-coding-system/))

<img src="media/emfacs_example.jpg" width="85%" style="display: block; margin: auto;" />
.center.tiny[Credit: Bartlett, Littlewort, Frank, Lainscsek, Fasel, & Movellan (2006) [🔗](https://www.doi.org/10.1109/FGR.2006.55)]
]

.pull-right[
* Indirectly, by generalizing features learnt from training on specific databases (pictures or videos, posed or spontaneous)

<img src="media/affdex_example.jpg" width="42%" style="display: block; margin: auto;" />
.center.tiny[Credit: ThinkApps [🔗](http://thinkapps.com/blog/development/machine-intelligence-affectiva-interview/)]
]

---

# Example of Automatic Recognition (1)

Four video clips of male expressers were taken from the [DynEmo database](https://dynemo.univ-grenoble-alpes.fr/?lang=en) and judged by human observers as expressing happiness, surprise, fear and disgust:
.center.tiny[The expression of disgust is correctly recognized; the others lead to subtle or incorrect recognition.]

---

# Example of Automatic Recognition (2)

Four video clips of female expressers were taken from the [DynEmo database](https://dynemo.univ-grenoble-alpes.fr/?lang=en) and judged by human observers as expressing happiness, surprise, fear and disgust:
.center.tiny[All four expressions are partially correctly recognized, but some non-relevant emotions are also recognized.]

---
class: title-slide, middle

## 2.2 Measuring Emotions from Physiological Changes

---

# Physiology and Emotions

Emotions modulate and influence physiological rhythms ([Kreibig, Samson, & Gross, 2013](https://doi.org/10.1111/psyp.12064)):

<img src="media/physio.png" width="70%" style="display: block; margin: auto;" />
.center.tiny[Conceptual representation of the relationship between physiological rhythms and emotions.]

---

# Current Methods

Possible measurements include:

* Heart Rate and Heart Rate Variability (with ECG, PPG, ...)
* Breathing Rate (with a mechanical belt, thermal camera, wifi, ...)
* Skin Temperature (with a wearable thermometer, thermal/infrared camera, laser, ...)
* Sweating (with EDA, also called GSR)

--

![:col_header Wired Sensors, Wearable Sensors, Remote Sensors]
![:col_row <img src="media/sensors_wired.jpg" width="100%" style="display: block; margin: auto;" /> <a href="https://doi.org/10.1109/T-AFFC.2011.15"> Credit: Koelstra et al. (2011) 🔗</a> , <img src="media/sensors_wearable.jpg" width="100%" style="display: block; margin: auto;" /> <a href="https://www.adinstruments.com/partners/equivital"> Credit: ADinstruments & Equivital 🔗</a> , <img src="media/sensors_remote.png" width="100%" style="display: block; margin: auto;" /> <a href="https://www.cu-bx.com/"> Credit: ContinUse Biometrics 🔗</a> ]
![:col_list Multiple Wired Sensors, Watches / Wristbands / Belts / Smart Devices / ..., RGB Camera / Thermal Camera / Laser / Wifi / ...]
![:col_list Gold Standard, Lower Accuracy but Competitive, Still in Development]
![:col_list Restricted to Lab. Settings, Measures "in the wild", Long Distance (e.g. through walls)]

---

# Emotion Dimensions from Physiology

By identifying the common changes across simultaneous physiological rhythms, it is possible to reveal the underlying contribution of emotions in terms of Activation/Arousal and Valence/Pleasure.
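For illustration, a minimal R sketch of this approach with simulated signals; the toy data and the use of `mgcv` are assumptions for the example, not the actual pipeline behind the figures below:

```r
library(mgcv)

set.seed(42)

# 5 s of four simulated physiological signals (toy data), each centred
# and scaled, sharing a common sinusoidal trend plus noise
time <- seq(0, 5, length.out = 100)
simulate_signal <- function(t) as.numeric(scale(sin(t) + rnorm(length(t), sd = 0.3)))

physio <- data.frame(
  time   = rep(time, times = 4),
  sensor = rep(c("HR", "BR", "ST", "SCL"), each = length(time)),
  value  = c(simulate_signal(time), simulate_signal(time),
             simulate_signal(time), simulate_signal(time))
)

# A Generalized Additive Model extracts the smooth trend shared by all
# four signals, i.e., the underlying multivariate change
trend <- gam(value ~ s(time), data = physio)
plot(trend)
```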
.center.tiny[Simulation of physiological measures (5 s) including Heart Rate (HR), Breathing Rate (BR), Skin Temperature (ST) and Skin Conductance Level (SCL). All measures are centred and scaled.]

<img src="slides_files/figure-html/unnamed-chunk-29-1.png" width="864" style="display: block; margin: auto;" />
.center.tiny[Underlying trend behind the multivariate changes, identified with a Generalized Additive Model and potentially contributing to changes in Arousal/Activation.]

---

# Example of Emotional Trends

Multivariate physiological analysis of emotions during a mountain biking experience ([Dupré, Bland, Bolster, Morrison, & McKeown, 2017](https://doi.org/10.1007/978-3-319-60822-8_4))

---
class: inverse, mline, center, middle

# 3. Current Challenges

---
class: title-slide, middle

## 3.1 Accuracy of Sensors

---

# Sensors in a Specific Context

Sensors are designed to measure facial expressions or physiology in controlled laboratory settings. They may not be accurate once applied to the real world or to different contexts.

--

.pull-left[
Face recognition depends on:
.small[
- Face orientation (e.g., inclination, rotation)
- Face features (e.g., glasses, beard, face mask)
- Ambient light
- Morphological facial configurations
]

<img src="media/interstellar_affdex.gif" width="100%" style="display: block; margin: auto;" />
.center.tiny[Interstellar by Affdex. Credit: Affectiva [🔗](https://www.youtube.com/watch?v=NsmAldoVwDs)]
]

--

.pull-right[
Physiological measures are sensitive to:
.small[
- Vibrations
- Correct positioning
- Individual genetic predispositions (e.g., morphology, sweating level, situs inversus)
]

<img src="media/heart_rate_pepper.gif" width="100%" style="display: block; margin: auto;" />
.center.tiny[Credit: Techoob.com [🔗](https://www.youtube.com/watch?v=x2YF8a0SoNU)]
]

---

# No Accuracy Standard

The precision of these devices is not monitored, and no standard provides safeguards for users:

.pull-left[
* Academic benchmarks (e.g., [Bent, Goldstein, Kibbe, & Dunn, 2020](https://doi.org/10.1038/s41746-020-0226-6)) are used to support legal class actions claiming that consumers were misled about these devices' ability to accurately monitor physiological changes ([Sawyer, 2019](https://doi.org/10.1080/07303084.2019.1649563))
* According to a benchmark of 8 commercially available systems, the accuracy of automatic classifiers of facial expressions varies between 48% and 62% ([Dupré, Krumhuber, Küster, & McKeown, 2020](https://doi.org/10.1371/journal.pone.0231968))
]

.pull-right[
<img src="media/tweet_ai__tech.png" width="100%" style="display: block; margin: auto;" />
]

---
class: title-slide, middle

## 3.2 Bias in Training Databases

---

# Limitations of WEIRD Databases

Facial expression databases lack diversity, relying on Western/White, Educated, Industrialized, Rich, and Democratic participants ([Arnett, 2008](https://doi.org/10.1037/0003-066X.63.7.602); [Raji & Buolamwini, 2019](https://doi.org/10.1145/3306618.3314244)).
Factors influencing the accuracy of emotion recognition (e.g., [Bryant & Howard, 2019](https://doi.org/10.1145/3306618.3314284); [Rhue, 2018](http://dx.doi.org/10.2139/ssrn.3281765)):

- Identity
- Gender
- Ethnicity
- Age

> *"While most users will get a spot-on result, we acknowledge that the ethnicity classifiers currently offered (Black, White, Asian, Hispanic, ‘Other’) fall short of representing the richly diverse and rapidly evolving tapestry of culture and race."* - [Brackeen (2017)](https://www.kairos.com/blog/we-ve-retired-our-diversity-recognition-app-here-s-why)

---

# Prototypical Expressions

Both facial expressions and physiological rhythms are proxies used to infer emotions **based on theoretical assumptions**

.pull-left[
In the case of facial expressions, the majority of databases used to train automatic classifiers assume that:
.small[
- Six emotions are universal (happiness, surprise, sadness, disgust, fear, anger)
- These 6 emotions have prototypical representations
]

As a result, automatic classifiers cannot recognize the diversity of facial expressions:
.small[
- More than 6 categories of facial expressions exist
- Subtle and mixed expressions are difficult to identify
]
]

.pull-right[
<img src="media/six_basic_emotion.jpg" width="100%" style="display: block; margin: auto;" />
.center.tiny[Credit: Ekman, Friesen, & Hager (2002) [🔗](https://www.paulekman.com/facial-action-coding-system/)]
]

---

# Meaning is Context Dependent

The same facial expression can be interpreted differently depending on the context in which it is produced.

Examples from athletes' victories (e.g., raging or crying after winning; see [Martinez, 2019](https://doi.org/10.1073/pnas.1902661116))

<img src="media/automatic_rec.png" width="100%" style="display: block; margin: auto;" />
.center.tiny[The emotion is recognized as 'Anger', but the context reveals an experience closer to 'Intense Joy'.]

---

# Inter-Individual Differences

For both facial expressions and physiological changes, although some similar patterns can be found, there is substantial variability in the possible responses to the same trigger event

---

# Absence of Scientific Support

Although automatic classifiers were developed on the idea that emotional categories can be inferred from sensor data, there is **no scientific evidence** of reliable expressive and physiological patterns corresponding to emotional categories:

.pull-left[
* No one-to-one mapping between patterns and categories ([Kappas, 2003](https://doi.org/10.1007/978-1-4615-1063-5_11))
* Facial expressions often communicate something other than an emotional state ([Barrett, Adolphs, Marsella, Martinez, & Pollak, 2019](https://doi.org/10.1177/1529100619832930))
]

.pull-right[
<img src="media/tweet_truthbegolduk.png" width="100%" style="display: block; margin: auto;" />
]

---
class: title-slide, middle

## 3.3 Ethical Considerations

---

# Current Challenges

Given all of the above, we have two problems here, and data/emotion privacy is not one of them:

.pull-left[
1. Physiological and expressive measures are prone to errors
2. Models used by automatic classifiers to categorise emotions are not reliable

Therefore, **should we use these automatic classifiers?**
]

.pull-right[
<img src="media/ai_now.png" width="80%" style="display: block; margin: auto;" />
.center.tiny[Credit: Tech Xplore (2019) [🔗](https://techxplore.com/news/2019-12-ai-watchdogs-rips-emotion-tech.html)]
]

> *"Regulators should ban the use of affect recognition in important decisions that impact people's lives and access to opportunities. Until then, AI companies should stop deploying it."* - [AI Now Institute (2019)](https://ainowinstitute.org/AI_Now_2019_Report.html)

---

# Future Directions

Most automatic classifiers of facial expressions have already moved from classification into categories to classification along dimensions such as Valence/Pleasure and Arousal/Activation:

* More reliable scientific evidence supports the dimensional perspective
* Not restricted to specific patterns

.pull-left[
Additionally, errors in facial and physiological measures are decreasing with improved techniques and materials.

> *"All models are wrong, but some are useful"* - [Box (1979)](https://doi.org/10.1016/B978-0-12-438150-6.50018-2)
]

.pull-right[
<img src="media/tweet_digitaltrends.png" width="100%" style="display: block; margin: auto;" />
]

---
class: inverse, mline, left, middle

<img class="circle" src="https://github.com/damien-dupre.png" width="250px"/>

# Thanks for your attention, find me at...

[<svg style="height:0.8em;top:.04em;position:relative;" viewBox="0 0 512 512"><path d="M459.37 151.716c.325 4.548.325 9.097.325 13.645 0 138.72-105.583 298.558-298.558 298.558-59.452 0-114.68-17.219-161.137-47.106 8.447.974 16.568 1.299 25.34 1.299 49.055 0 94.213-16.568 130.274-44.832-46.132-.975-84.792-31.188-98.112-72.772 6.498.974 12.995 1.624 19.818 1.624 9.421 0 18.843-1.3 27.614-3.573-48.081-9.747-84.143-51.98-84.143-102.985v-1.299c13.969 7.797 30.214 12.67 47.431 13.319-28.264-18.843-46.781-51.005-46.781-87.391 0-19.492 5.197-37.36 14.294-52.954 51.655 63.675 129.3 105.258 216.365 109.807-1.624-7.797-2.599-15.918-2.599-24.04 0-57.828 46.782-104.934 104.934-104.934 30.213 0 57.502 12.67 76.67 33.137 23.715-4.548 46.456-13.32 66.599-25.34-7.798 24.366-24.366 44.833-46.132 57.827 21.117-2.273 41.584-8.122 60.426-16.243-14.292 20.791-32.161 39.308-52.628 54.253z"/></svg> @damien_dupre](http://twitter.com/damien_dupre)
[<svg style="height:0.8em;top:.04em;position:relative;" viewBox="0 0 496 512"><path d="M165.9 397.4c0 2-2.3 3.6-5.2 3.6-3.3.3-5.6-1.3-5.6-3.6 0-2 2.3-3.6 5.2-3.6 3-.3 5.6 1.3 5.6 3.6zm-31.1-4.5c-.7 2 1.3 4.3 4.3 4.9 2.6 1 5.6 0 6.2-2s-1.3-4.3-4.3-5.2c-2.6-.7-5.5.3-6.2 2.3zm44.2-1.7c-2.9.7-4.9 2.6-4.6 4.9.3 2 2.9 3.3 5.9 2.6 2.9-.7 4.9-2.6 4.6-4.6-.3-1.9-3-3.2-5.9-2.9zM244.8 8C106.1 8 0 113.3 0 252c0 110.9 69.8 205.8 169.5 239.2 12.8 2.3 17.3-5.6 17.3-12.1 0-6.2-.3-40.4-.3-61.4 0 0-70 15-84.7-29.8 0 0-11.4-29.1-27.8-36.6 0 0-22.9-15.7 1.6-15.4 0 0 24.9 2 38.6 25.8 21.9 38.6 58.6 27.5 72.9 20.9 2.3-16 8.8-27.1 16-33.7-55.9-6.2-112.3-14.3-112.3-110.5 0-27.5 7.6-41.3 23.6-58.9-2.6-6.5-11.1-33.3 2.6-67.9 20.9-6.5 69 27 69 27 20-5.6 41.5-8.5 62.8-8.5s42.8 2.9 62.8 8.5c0 0 48.1-33.6 69-27 13.7 34.7 5.2 61.4 2.6 67.9 16 17.7 25.8 31.5 25.8 58.9 0 96.5-58.9 104.2-114.8 110.5 9.2 7.9 17 22.9 17 46.4 0 33.7-.3 75.4-.3 83.6 0 6.5 4.6 14.4 17.3 12.1C428.2 457.8 496 362.9 496 252 496 113.3 383.5 8 244.8 8zM97.2 352.9c-1.3 1-1 3.3.7 5.2 1.6 1.6 3.9 2.3 5.2 1 1.3-1 1-3.3-.7-5.2-1.6-1.6-3.9-2.3-5.2-1zm-10.8-8.1c-.7 1.3.3 2.9 2.3 3.9 1.6 1 3.6.7
4.3-.7.7-1.3-.3-2.9-2.3-3.9-2-.6-3.6-.3-4.3.7zm32.4 35.6c-1.6 1.3-1 4.3 1.3 6.2 2.3 2.3 5.2 2.6 6.5 1 1.3-1.3.7-4.3-1.3-6.2-2.2-2.3-5.2-2.6-6.5-1zm-11.4-14.7c-1.6 1-1.6 3.6 0 5.9 1.6 2.3 4.3 3.3 5.6 2.3 1.6-1.3 1.6-3.9 0-6.2-1.4-2.3-4-3.3-5.6-2z"/></svg> @damien-dupre](http://github.com/damien-dupre) [<svg style="height:0.8em;top:.04em;position:relative;" viewBox="0 0 512 512"><path d="M326.612 185.391c59.747 59.809 58.927 155.698.36 214.59-.11.12-.24.25-.36.37l-67.2 67.2c-59.27 59.27-155.699 59.262-214.96 0-59.27-59.26-59.27-155.7 0-214.96l37.106-37.106c9.84-9.84 26.786-3.3 27.294 10.606.648 17.722 3.826 35.527 9.69 52.721 1.986 5.822.567 12.262-3.783 16.612l-13.087 13.087c-28.026 28.026-28.905 73.66-1.155 101.96 28.024 28.579 74.086 28.749 102.325.51l67.2-67.19c28.191-28.191 28.073-73.757 0-101.83-3.701-3.694-7.429-6.564-10.341-8.569a16.037 16.037 0 0 1-6.947-12.606c-.396-10.567 3.348-21.456 11.698-29.806l21.054-21.055c5.521-5.521 14.182-6.199 20.584-1.731a152.482 152.482 0 0 1 20.522 17.197zM467.547 44.449c-59.261-59.262-155.69-59.27-214.96 0l-67.2 67.2c-.12.12-.25.25-.36.37-58.566 58.892-59.387 154.781.36 214.59a152.454 152.454 0 0 0 20.521 17.196c6.402 4.468 15.064 3.789 20.584-1.731l21.054-21.055c8.35-8.35 12.094-19.239 11.698-29.806a16.037 16.037 0 0 0-6.947-12.606c-2.912-2.005-6.64-4.875-10.341-8.569-28.073-28.073-28.191-73.639 0-101.83l67.2-67.19c28.239-28.239 74.3-28.069 102.325.51 27.75 28.3 26.872 73.934-1.155 101.96l-13.087 13.087c-4.35 4.35-5.769 10.79-3.783 16.612 5.864 17.194 9.042 34.999 9.69 52.721.509 13.906 17.454 20.446 27.294 10.606l37.106-37.106c59.271-59.259 59.271-155.699.001-214.959z"/></svg> damien-datasci-blog.netlify.app](https://damien-datasci-blog.netlify.app) [<svg style="height:0.8em;top:.04em;position:relative;" viewBox="0 0 512 512"><path d="M476 3.2L12.5 270.6c-18.1 10.4-15.8 35.6 2.2 43.2L121 358.4l287.3-253.2c5.5-4.9 13.3 2.6 8.6 8.3L176 407v80.5c0 23.6 28.5 32.9 42.5 15.8L282 426l124.6 52.2c14.2 6 30.4-2.9 33-18.2l72-432C515 7.8 493.3-6.8 476 3.2z"/></svg> damien.dupre@dcu.ie](mailto:damien.dupre@dcu.ie)