Facial Action Coding System

System of classifying human facial movements


The Facial Action Coding System (FACS) is a system for taxonomizing human facial movements by their appearance on the face, based on a system originally developed by the Swedish anatomist Carl-Herman Hjortsjö.[1] It was later adopted by Paul Ekman and Wallace V. Friesen, and published in 1978.[2] Ekman, Friesen, and Joseph C. Hager published a significant update to FACS in 2002.[3] The FACS encodes the movements of individual facial muscles from slight, momentary changes in facial appearance. It has proven useful to psychologists and to animators.

Muscles of head and neck

Background

Blind athlete expressing joy in athletic competition. The fact that unsighted persons use the same expressions as sighted people shows that expressions are innate.

In 2009, a study of spontaneous facial expressions in sighted and blind judo athletes found that many facial expressions are innate rather than visually learned.[4]

Method

Using the FACS,[5] human coders can manually code nearly any anatomically possible facial expression, deconstructing it into the specific "action units" (AUs) and the temporal segments that produced it. Because AUs are independent of any interpretation, they can serve as input to any higher-order decision-making process, from recognition of basic emotions to pre-programmed commands for an ambient intelligent environment. The FACS manual is over 500 pages long and provides the AUs, as well as Ekman's interpretation of their meanings.
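A scored expression can be thought of as a set of AU events, each with an intensity and temporal segments (onset, apex, offset). The following sketch illustrates one way such a coding record might be represented in software; the field names and timings are illustrative assumptions, not part of the FACS specification:

```python
from dataclasses import dataclass

@dataclass
class AUEvent:
    """One scored action unit with its temporal segments.

    Illustrative only: field names are not part of the FACS specification."""
    au: int           # action unit number, e.g. 12 for lip corner puller
    intensity: str    # intensity letter A-E (see "Intensity scoring")
    onset: float      # time (s) the action begins
    apex: float       # time (s) of peak intensity
    offset: float     # time (s) the face returns to neutral

# A Duchenne-style smile might be scored as AU 6 and AU 12 occurring together:
smile = [
    AUEvent(au=6, intensity="C", onset=0.4, apex=1.1, offset=2.0),
    AUEvent(au=12, intensity="D", onset=0.5, apex=1.2, offset=2.1),
]
print(sorted(event.au for event in smile))
```

A downstream classifier could then map such AU combinations to higher-order labels without the coder ever committing to an emotional interpretation.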

The FACS defines AUs as contractions or relaxations of one or more muscles. It also defines a number of "action descriptors", which differ from AUs in that the authors of the FACS have not specified their muscular basis and have not distinguished the specific behaviors as precisely as they have for the AUs.

For example, the FACS can be used to distinguish two types of smiles as follows:[6]

  • the insincere, voluntary Pan-Am smile: contraction of the zygomatic major alone
  • the sincere, involuntary Duchenne smile: contraction of the zygomatic major and the inferior part of the orbicularis oculi

The FACS is designed to be self-instructional. People can learn the technique from a number of sources including manuals and workshops,[7] and obtain certification through testing.[8]

Although the labeling of expressions currently requires trained experts, researchers have had some success in using computers to automatically identify the FACS codes.[9] One obstacle to automatic FACS code recognition is a shortage of manually coded ground truth data.[10]

Uses

Use in medicine

The FACS has been proposed as a tool for analyzing depression[11] and for measuring pain in patients unable to express themselves verbally.[12]

Cross-species applications

The original FACS has been modified to analyze facial movements in several non-human primates, namely chimpanzees,[13] rhesus macaques,[14] gibbons and siamangs,[15] and orangutans.[16] More recently, versions have also been developed for domestic species, including dogs,[17] horses[18] and cats.[19] As with the human FACS, the animal FACS manuals are available online for each species, together with the respective certification tests.[20]

Because of its anatomical basis, the FACS can be used to compare facial repertoires across species. A study by Vick and colleagues (2006) suggests that the FACS can be modified to take differences in underlying morphology into account. Such considerations enable a comparison of the homologous facial movements present in humans and chimpanzees, showing that the facial expressions of both species result in highly noticeable changes in appearance. The development of FACS tools for different species allows the objective and anatomical study of facial expressions in communicative and emotional contexts. Furthermore, a cross-species analysis of facial expressions can help to answer interesting questions, such as which emotions are uniquely human.[21]

The Emotional Facial Action Coding System (EMFACS)[22] and the Facial Action Coding System Affect Interpretation Dictionary (FACSAID)[23] consider only emotion-related facial actions. Examples of these are:

  • Happiness: AU 6+12
  • Sadness: AU 1+4+15
  • Surprise: AU 1+2+5B+26
  • Fear: AU 1+2+4+5+7+20+26
  • Anger: AU 4+5+7+23
  • Disgust: AU 9+15+17
  • Contempt: AU R12A+R14A

Computer-generated imagery

FACS coding is also used extensively in computer animation, with facial expressions being expressed as vector graphics of AUs.[24] FACS vectors are used as weights for blendshapes corresponding to each AU, with the resulting face mesh then being used to render the finished face.[25] Deep learning techniques can be used to determine the FACS vectors from face images obtained during motion capture acting or other performances.[26]
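The blendshape scheme described above can be sketched in a few lines: each AU corresponds to a per-vertex offset from a neutral mesh, and the FACS vector supplies the blend weights. The mesh, offsets, and AU choices below are made-up toy data for illustration, not any production rig:

```python
import numpy as np

# Toy stand-in for a face mesh: 4 vertices with xyz coordinates.
neutral = np.zeros((4, 3))

# Per-AU blendshapes: vertex offsets from the neutral mesh (made-up data).
blendshapes = {
    6: np.array([[0.0, 0.1, 0.0]] * 4),   # AU 6: cheek raiser
    12: np.array([[0.2, 0.0, 0.0]] * 4),  # AU 12: lip corner puller
}

def apply_facs(weights: dict) -> np.ndarray:
    """Blend the neutral mesh with AU-keyed shapes, weighted 0..1."""
    mesh = neutral.copy()
    for au, w in weights.items():
        mesh += w * blendshapes[au]
    return mesh

# A Duchenne-style smile pose: half-strength AU 6 plus full AU 12.
smile = apply_facs({6: 0.5, 12: 1.0})
print(smile[0])
```

In practice the weights would come from an animator's curves or from a network that regresses FACS vectors from captured footage; the blending itself stays this simple linear sum.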

Codes for action units

For clarification: the FACS is an index of facial expressions and does not actually provide any biomechanical information about the degree of muscle activation. Although muscle activation is not part of the FACS, the main muscles involved in each facial expression have been added here.

Action units (AUs) are the fundamental actions of individual muscles or groups of muscles.

Action descriptors (ADs) are unitary movements that may involve the actions of several muscle groups (e.g., a forward‐thrusting movement of the jaw). The muscular basis for these actions has not been specified and specific behaviors have not been distinguished as precisely as for the AUs.

For the most accurate annotation, the FACS suggests agreement between at least two independent, certified FACS coders.
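Inter-coder agreement on a scored event is often summarized as a simple ratio: twice the number of AUs both coders scored, divided by the total number of AUs each scored. The sketch below assumes each coder's output is reduced to a bare set of AU numbers, ignoring intensity and timing; treat it as one common formulation rather than the manual's exact procedure:

```python
def agreement_index(coder1: set, coder2: set) -> float:
    """Agreement ratio between two coders' AU sets:
    2 x (AUs scored by both) / (total AUs scored by each coder)."""
    if not coder1 and not coder2:
        return 1.0  # both coders scored nothing: trivially in agreement
    return 2 * len(coder1 & coder2) / (len(coder1) + len(coder2))

# Two coders scoring the same smile; one also scored AU 25 (lips part):
print(agreement_index({6, 12, 25}, {6, 12}))  # -> 0.8
```

Scores range from 0.0 (no overlap) to 1.0 (identical coding), so a lab can set a minimum threshold before accepting an annotation.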

Intensity scoring

Intensities are annotated by appending the letters A–E (from minimal to maximal intensity) to the action unit number (e.g., AU 1A is the weakest trace of AU 1 and AU 1E is the maximum intensity possible for the individual person).

  • A Trace
  • B Slight
  • C Marked or pronounced
  • D Severe or extreme
  • E Maximum

Other letter modifiers

There are other modifiers present in FACS codes for emotional expressions: "R" marks an action that occurs on the right side of the face and "L" one that occurs on the left. An action that is unilateral (occurs on only one side of the face) but has no specific side is marked "U", and an action that is bilateral but stronger on one side is marked "A" for asymmetric.
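Taken together, the side prefix, AU number, and intensity suffix give codes such as "12C" or "R12A" (as in the contempt example above). A minimal parser for that string format, assuming a single optional prefix letter and a single optional intensity letter (the exact grammar is an assumption of this sketch):

```python
import re

# Optional side prefix, AU number, optional intensity letter, e.g. "R12A".
CODE = re.compile(r"^(?P<side>[RLUA]?)(?P<au>\d+)(?P<intensity>[A-E]?)$")

SIDES = {
    "R": "right",
    "L": "left",
    "U": "unilateral",       # one side, unspecified which
    "A": "asymmetric",       # both sides, one stronger
    "": "bilateral",         # no prefix: both sides equally
}

def parse_code(code: str) -> dict:
    """Split a FACS-style code into AU number, side, and intensity."""
    m = CODE.match(code)
    if not m:
        raise ValueError(f"not a FACS code: {code!r}")
    return {
        "au": int(m["au"]),
        "side": SIDES[m["side"]],
        "intensity": m["intensity"] or None,
    }

print(parse_code("R12A"))  # -> {'au': 12, 'side': 'right', 'intensity': 'A'}
```

Note that "A" is disambiguated purely by position: before the number it means asymmetric, after it it is the trace-level intensity.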

List of AUs and ADs (with underlying facial muscles)

Main codes

[Table of main codes: AU number, FACS name, and underlying facial muscles]

Head movement codes

[Table of head movement codes: AU number and FACS name]

Eye movement codes

[Table of eye movement codes: AU number and FACS name]

Visibility codes

[Table of visibility codes: AU number and FACS name]

Gross behavior codes

These codes are reserved for recording information about gross behaviors that may be relevant to the facial actions that are scored.

[Table of gross behavior codes: AU number and FACS name]

References

  1. Hjortsjö CH (1969). Man's face and mimic language. Archived 2022-08-06 at the Wayback Machine.
  2. Ekman P, Friesen W (1978). Facial Action Coding System: A Technique for the Measurement of Facial Movement. Palo Alto: Consulting Psychologists Press.
  3. Ekman P, Friesen WV, Hager JC (2002). Facial Action Coding System: The Manual on CD ROM. Salt Lake City: A Human Face.
  4. Matsumoto D, Willingham B (2009). "Spontaneous facial expressions of emotion of blind individuals". Journal of Personality and Social Psychology. 96 (1): 1–10.
  5. Freitas-Magalhães (2012). "Microexpression and macroexpression". In Ramachandran VS (ed.). Encyclopedia of Human Behavior. Vol. 2. Oxford: Elsevier/Academic Press. pp. 173–183. ISBN 978-0-12-375000-6.
  6. Del Giudice M, Colle L (May 2007). "Differences between children and adults in the recognition of enjoyment smiles". Developmental Psychology. 43 (3): 796–803. doi:10.1037/0012-1649.43.3.796. PMID 17484588.
  7. Rosenberg EL. "Example and web site of one teaching professional". Archived from the original on 2009-02-06. Retrieved 2009-02-04.
  8. "Facial Action Coding System". Paul Ekman Group. Retrieved 2019-10-23.
  9. Facial Action Coding System. Retrieved July 21, 2007.
  10. Song, Juan; Liu, Zhilei (10 Mar 2023). "Self-supervised Facial Action Unit Detection with Region and Relation Learning". arXiv:2303.05708 [cs.CV].
  11. Reed LI, Sayette MA, Cohn JF (November 2007). "Impact of depression on response to comedy: a dynamic facial coding analysis". Journal of Abnormal Psychology. 116 (4): 804–9. CiteSeerX 10.1.1.307.6950. doi:10.1037/0021-843X.116.4.804. PMID 18020726.
  12. Parr LA, Waller BM, Vick SJ, Bard KA (February 2007). "Classifying chimpanzee facial expressions using muscle action". Emotion. 7 (1): 172–81. doi:10.1037/1528-3542.7.1.172. PMC 2826116. PMID 17352572.
  13. Parr LA, Waller BM, Burrows AM, Gothard KM, Vick SJ (December 2010). "Brief communication: MaqFACS: A muscle-based facial movement coding system for the rhesus macaque". American Journal of Physical Anthropology. 143 (4): 625–30. doi:10.1002/ajpa.21401. PMC 2988871. PMID 20872742.
  14. Waller BM, Lembeck M, Kuchenbuch P, Burrows AM, Liebal K (2012). "GibbonFACS: A Muscle-Based Facial Movement Coding System for Hylobatids". International Journal of Primatology. 33 (4): 809–821. doi:10.1007/s10764-012-9611-6. S2CID 18321096.
  15. Caeiro CC, Waller BM, Zimmermann E, Burrows AM, Davila-Ross M (2012). "OrangFACS: A Muscle-Based Facial Movement Coding System for Orangutans (Pongo spp.)" (PDF). International Journal of Primatology. 34: 115–129. doi:10.1007/s10764-012-9652-x. S2CID 17612028.
  16. Waller BM, Peirce K, Caeiro CC, Scheider L, Burrows AM, McCune S, Kaminski J (2013). "Paedomorphic facial expressions give dogs a selective advantage". PLOS ONE. 8 (12): e82686. Bibcode:2013PLoSO...882686W. doi:10.1371/journal.pone.0082686. PMC 3873274. PMID 24386109.
  17. Wathan J, Burrows AM, Waller BM, McComb K (2015-08-05). "EquiFACS: The Equine Facial Action Coding System". PLOS ONE. 10 (8): e0131738. Bibcode:2015PLoSO..1031738W. doi:10.1371/journal.pone.0131738. PMC 4526551. PMID 26244573.
  18. Caeiro CC, Burrows AM, Waller BM (2017-04-01). "Development and application of CatFACS: Are human cat adopters influenced by cat facial expressions?" (PDF). Applied Animal Behaviour Science. 189: 66–78. doi:10.1016/j.applanim.2017.01.005. ISSN 0168-1591.
  19. "Home". animalfacs.com. Retrieved 2019-10-23.
  20. Vick SJ, Waller BM, Parr LA, Smith Pasqualini MC, Bard KA (March 2007). "A Cross-species Comparison of Facial Morphology and Movement in Humans and Chimpanzees Using the Facial Action Coding System (FACS)". Journal of Nonverbal Behavior. 31 (1): 1–20. doi:10.1007/s10919-006-0017-z. PMC 3008553. PMID 21188285.
  21. Friesen W, Ekman P (1983), EMFACS-7: Emotional Facial Action Coding System. Unpublished manuscript, vol. 2, University of California at San Francisco, p. 1
  22. Walsh, Joseph (2016-12-16). "Rogue One: the CGI resurrection of Peter Cushing is thrilling – but is it right?". The Guardian. ISSN 0261-3077. Retrieved 2023-10-23.
  23. Gudi, Amogh; Tasli, H. Emrah; Den Uyl, Tim M.; Maroulis, Andreas (2015). "Deep learning based FACS Action Unit occurrence and intensity estimation". 2015 11th IEEE International Conference and Workshops on Automatic Face and Gesture Recognition (FG). pp. 1–5. doi:10.1109/FG.2015.7284873. ISBN 978-1-4799-6026-2. S2CID 6283665. Retrieved 2023-10-23.

This article uses material from the Wikipedia article Facial_Action_Coding_System, and is written by contributors. Text is available under a CC BY-SA 4.0 International License; additional terms may apply. Images, videos and audio are available under their respective licenses.