Unspoken science: exploring the significance of body language in science and academia

Mansi Patil, Vishal Patil, Unisha Katre, Unspoken science: exploring the significance of body language in science and academia, European Heart Journal, 2023, ehad598, https://doi.org/10.1093/eurheartj/ehad598


Introduction

Science and academia are domains deeply rooted in the pursuit of knowledge and the exchange of ideas, and both rely heavily on effective communication to share findings and foster collaboration. Scientific presentations give researchers a platform to share their work and engage with their peers. While the focus is often on the content of research papers, lectures, and presentations, another form of communication plays a significant role in these fields: body language. Non-verbal cues, such as facial expressions, gestures, posture, and eye contact, can convey a wealth of information, often subtly influencing interpersonal dynamics and the perception of scientific work. In this article, we delve into the unspoken science of body language, emphasizing its importance in scientific and academic settings and highlighting its impact on presentations, interactions, interviews, and collaborations. We also explore cultural considerations and their implications for cross-cultural communication. By understanding the unspoken science of body language, researchers and academics can enhance their communication skills and promote a more inclusive and productive scientific community.

The power of non-verbal communication

Communication is a multi-faceted process, and words are only one aspect of it. Research suggests that non-verbal communication constitutes a substantial portion of human interaction, often conveying information that words alone cannot. Body language has a direct impact on how people perceive and interpret scientific ideas and findings.1 For example, a presenter who maintains confident eye contact, uses purposeful gestures, and exhibits an open posture is likely to be seen as more credible and persuasive than someone who fidgets, avoids eye contact, and displays closed-off body language (Figure 1).

Figure 1 Types of non-verbal communication.2 Non-verbal communication comprises haptics, gestures, proxemics, facial expressions, paralinguistics, body language, appearance, eye contact, and artefacts.

In academic settings

In academia, body language plays a crucial role in various contexts. During lectures, professors who use engaging body language, such as animated gestures and expressive faces, can captivate their students and enhance the learning experience. Similarly, students who exhibit attentive and respectful body language, such as maintaining eye contact and nodding, signal their interest and engagement in the subject matter.3

Body language also influences interactions between colleagues and supervisors. For instance, in a laboratory setting, researchers who display confident and open body language are more likely to be perceived as competent and reliable by their peers. Conversely, individuals who exhibit closed-off or defensive body language may inadvertently create an environment that inhibits collaboration and knowledge sharing. Haptics, too, matters in research collaboration and networking: appropriate touch can enhance interpersonal connections and convey emotion, fostering a deeper sense of empathy and engagement among participants.

The role of body language in interviews and evaluations

Interviews and evaluations are critical moments in academic and scientific careers, and body language can significantly affect their outcomes. Candidates who display confident body language, including good posture, a firm handshake, and appropriate gestures, are more likely to make positive impressions on interviewers or evaluators. Conversely, individuals who exhibit nervousness or closed-off body language may unwittingly convey a lack of confidence or competence, even if their qualifications are strong. Recognizing the power of body language in these situations allows individuals to present themselves more effectively.

Non-verbal cues play a pivotal role during interviews and conferences, where researchers and academics showcase their work. When attending conferences or presenting research, scientists must be aware of their body language to effectively convey their expertise and credibility. Confident body language can inspire confidence in others, making it easier to establish professional connections, garner support for research projects, and secure collaborations.

Similarly, during job interviews, body language can significantly affect the outcome. An interviewee's facial non-verbal cues can have a great effect on their chances of being hired. The face as a whole, the eyes, and the mouth are the features an interviewer observes when judging a candidate's likely effectiveness at work. Applicants who smile genuinely, and whose eyes convey the same non-verbal message as their mouth, are more likely to be hired than those who do not. Research shows that a first impression can be formed in mere milliseconds, so it is crucial for an applicant to pass that first test: it sets the tone for the rest of the interview process.4

Cultural considerations

While body language is a universal form of communication, its interpretation can vary across cultures. Different cultures have distinct norms and expectations regarding body language, and what is seen as confident in one culture may be interpreted differently in another.5 Awareness of these cultural nuances is crucial for fostering effective cross-cultural communication and understanding. Scientists and academics engaged in international collaborations or interactions should familiarize themselves with cultural differences to avoid misunderstandings and promote respectful, inclusive communication.

The impact of body language on collaboration

Collaboration lies at the heart of scientific progress and academic success. Body language plays a significant role in building trust and establishing effective collaboration among researchers and academics. Open and inviting body language, along with active listening skills, can foster an environment where ideas are freely exchanged, leading to innovative breakthroughs. In research collaboration and networking, proxemics can significantly affect the level of trust and rapport between researchers. Respecting each other's personal space and maintaining appropriate distances during interactions can foster a more positive and productive working relationship, leading to better communication and idea exchange (Figure 2). Furthermore, being aware of cultural variations in proxemics can help researchers navigate diverse networking contexts, promoting cross-cultural understanding and enabling more fruitful international collaborations.

Figure 2 Overcoming the barriers of communication. The following factors are important for overcoming barriers in communication: using culturally appropriate language, being observant, assuming positive intentions, avoiding being judgemental, identifying and controlling bias, slowing down responses, emphasizing relationships, seeking help from interpreters, being eager to learn and adapt, and being empathetic.

On the other hand, negative body language, such as crossed arms, lack of eye contact, or dismissive gestures, can signal disinterest or disagreement, hindering collaboration and stifling the flow of ideas. Recognizing and addressing such non-verbal cues can help create a more inclusive and productive scientific community.

Effective communication is paramount in science and academia, where the exchange of ideas and knowledge fuels progress. While much attention is given to verbal communication, the significance of non-verbal cues cannot be overlooked, and it is crucial not to send conflicting verbal and non-verbal signals. Body language encompasses facial expressions, gestures, posture, eye contact, and other non-verbal behaviours that convey information beyond words.

Declarations

Disclosure of Interest

The authors declare no conflicts of interest.

References

1. Baugh AD, Vanderbilt AA, Baugh RF. Communication training is inadequate: the role of deception, non-verbal communication, and cultural proficiency. Med Educ Online 2020;25:1820228. https://doi.org/10.1080/10872981.2020.1820228

2. Aralia. 8 Nonverbal Tips for Public Speaking. Aralia Education Technology. https://www.aralia.com/helpful-information/nonverbal-tips-public-speaking/ (22 July 2023, date last accessed)

3. Danesi M. Nonverbal communication. In: Understanding Nonverbal Communication. Bloomsbury Academic, 2022:121–162. https://doi.org/10.5040/9781350152670.ch-001

4. Cortez R, Marshall D, Yang C, Luong L. First impressions, cultural assimilation, and hireability in job interviews: examining body language and facial expressions' impact on employer's perceptions of applicants. Concordia J Commun Res 2017;4. https://doi.org/10.54416/dgjn3336

5. Pozzer-Ardenghi L. Nonverbal aspects of communication and interaction and their role in teaching and learning science. In: The World of Science Education. Netherlands: Brill, 2009:259–271. https://doi.org/10.1163/9789087907471_019


  • Online ISSN 1522-9645
  • Print ISSN 0195-668X
  • Copyright © 2024 European Society of Cardiology
Understanding Body Language and Facial Expressions

Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."


Steven Gans, MD is board-certified in psychiatry and is an active supervisor, teacher, and mentor at Massachusetts General Hospital.


Body language refers to the nonverbal signals that we use to communicate. These nonverbal signals make up a huge part of daily communication. In fact, body language may account for between 60% and 65% of all communication.

Examples of body language include facial expressions, eye gaze, gestures, posture, and body movements. In many cases, the things we don't say can convey volumes of information.

So, why is body language important? Body language can help us understand others and ourselves. It provides us with information about how people may be feeling in a given situation. We can also use body language to express emotions or intentions.

Facial expressions, gestures, and eye gaze are often identified as the three major types of body language, but other aspects such as posture and personal distance can also be used to convey information. Understanding body language is important, but it is also essential to pay attention to other cues such as context. In many cases, you should look at signals as a group rather than focus on a single action.

This article discusses the roles played by body language in communication, as well as body language examples and the meaning behind them—so you know what to look for when you're trying to interpret nonverbal actions.


Facial Expressions

Think for a moment about how much a person is able to convey with just a facial expression. A smile can indicate approval or happiness. A frown can signal disapproval or unhappiness.

In some cases, our facial expressions may reveal our true feelings about a particular situation. While you say that you are feeling fine, the look on your face may tell people otherwise.

Just a few examples of emotions that can be expressed via facial expressions include happiness, sadness, anger, fear, and surprise.

The expression on a person's face can even help determine if we trust or believe what the individual is saying.

There are many interesting findings about body language in psychology research. One study found that the most trustworthy facial expression involved a slight raise of the eyebrows and a slight smile. This expression, the researchers suggested, conveys both friendliness and confidence.

Facial expressions are also among the most universal forms of body language. The expressions used to convey fear, anger, sadness, and happiness are similar throughout the world.

Researcher Paul Ekman has found support for the universality of a variety of facial expressions tied to particular emotions including joy, anger, fear, surprise, and sadness.

Research even suggests that we make judgments about people's intelligence based upon their faces and expressions.

One study found that individuals who had narrower faces and more prominent noses were more likely to be perceived as intelligent. People with smiling, joyful expressions were also judged as being more intelligent than those with angry expressions.

Eye Gaze

The eyes are frequently referred to as the "windows to the soul" since they are capable of revealing a great deal about what a person is feeling or thinking.

As you engage in conversation with another person, taking note of eye movements is a natural and important part of the communication process.

Some common things you may notice include whether people are making direct eye contact or averting their gaze, how much they are blinking, or if their pupils are dilated.

The best way to read someone's body language is to pay attention. Look out for any of the following eye signals.

When a person looks directly into your eyes while having a conversation, it indicates that they are interested and paying attention. However, prolonged eye contact can feel threatening.

On the other hand, breaking eye contact and frequently looking away might indicate that the person is distracted, uncomfortable, or trying to conceal his or her real feelings.

Blinking is natural, but you should also pay attention to whether a person is blinking too much or too little.

People often blink more rapidly when they are feeling distressed or uncomfortable. Infrequent blinking may indicate that a person is intentionally trying to control his or her eye movements.  

For example, a poker player might blink less frequently because he is purposely trying to appear unexcited about the hand he was dealt.

Pupil size can be a very subtle nonverbal communication signal. While light levels in the environment control pupil dilation, sometimes emotions can also cause small changes in pupil size.

For example, you may have heard the phrase "bedroom eyes" used to describe the look someone gives when they are attracted to another person. Highly dilated eyes, for example, can indicate that a person is interested or even aroused.   

The Mouth

Mouth expressions and movements can also be essential in reading body language. For example, chewing on the bottom lip may indicate that the individual is experiencing feelings of worry, fear, or insecurity.

Covering the mouth may be an effort to be polite if the person is yawning or coughing, but it may also be an attempt to cover up a frown of disapproval.

Smiling is perhaps one of the greatest body language signals, but smiles can also be interpreted in many ways.

A smile may be genuine, or it may be used to express false happiness, sarcasm, or even cynicism.

When evaluating body language, pay attention to the following mouth and lip signals:

  • Pursed lips. Tightening the lips might be an indicator of distaste, disapproval, or distrust.
  • Lip biting. People sometimes bite their lips when they are worried, anxious, or stressed.
  • Covering the mouth. When people want to hide an emotional reaction, they might cover their mouths in order to avoid displaying smiles or smirks.
  • Turned up or down. Slight changes in the mouth can also be subtle indicators of what a person is feeling. When the mouth is slightly turned up, it might mean that the person is feeling happy or optimistic. On the other hand, a slightly down-turned mouth can be an indicator of sadness, disapproval, or even an outright grimace.

Gestures

Gestures can be some of the most direct and obvious body language signals. Waving, pointing, and using the fingers to indicate numerical amounts are all very common and easy to understand gestures.

Some gestures may be cultural, however, so giving a thumbs-up or a peace sign in another country might have a completely different meaning than it does in the United States.

The following examples are just a few common gestures and their possible meanings:

  • A clenched fist can indicate anger in some situations or solidarity in others.
  • A thumbs up and thumbs down are often used as gestures of approval and disapproval.
  • The "okay" gesture, made by touching together the thumb and index finger in a circle while extending the other three fingers, can be used to mean "okay" or "all right." In some parts of Europe, however, the same signal is used to imply you are nothing. In some South American countries, the symbol is actually a vulgar gesture.
  • The V sign, created by lifting the index and middle finger and separating them to create a V-shape, means peace or victory in some countries. In the United Kingdom and Australia, the symbol takes on an offensive meaning when the back of the hand is facing outward.

The Arms and Legs

The arms and legs can also be useful in conveying nonverbal information. Crossing the arms can indicate defensiveness. Crossing legs away from another person may indicate dislike or discomfort with that individual.

Other subtle signals such as expanding the arms widely may be an attempt to seem larger or more commanding, while keeping the arms close to the body may be an effort to minimize oneself or withdraw from attention.

When you are evaluating body language, pay attention to some of the following signals that the arms and legs may convey:

  • Crossed arms might indicate that a person feels defensive, self-protective, or closed-off.
  • Standing with hands placed on the hips can be an indication that a person is ready and in control, or it can possibly be a sign of aggressiveness.
  • Clasping the hands behind the back might indicate that a person is feeling bored, anxious, or even angry.
  • Rapidly tapping fingers or fidgeting can be a sign that a person is bored, impatient, or frustrated.
  • Crossed legs can indicate that a person is feeling closed-off or in need of privacy.

Posture

How we hold our bodies is another important part of body language. Posture refers to both the way we hold our bodies and an individual's overall physical form. It can convey a wealth of information about how a person is feeling, as well as hints about personality characteristics, such as whether a person is confident, open, or submissive.

Sitting up straight, for example, may indicate that a person is focused and paying attention to what's going on. Sitting with the body hunched forward, on the other hand, can imply that the person is bored or indifferent.

When you are trying to read body language, try to notice some of the signals that a person's posture can send.

  • Open posture involves keeping the trunk of the body open and exposed. This type of posture indicates friendliness, openness, and willingness.
  • Closed posture involves hiding the trunk of the body, often by hunching forward and keeping the arms and legs crossed. This type of posture can be an indicator of hostility, unfriendliness, and anxiety.

Personal Space

Have you ever heard someone refer to their need for personal space? Have you ever started to feel uncomfortable when someone stands just a little too close to you?

The term proxemics, coined by anthropologist Edward T. Hall, refers to the distance between people as they interact. Just as body movements and facial expressions can communicate a great deal of nonverbal information, so can the physical space between individuals.

Hall described four levels of social distance that occur in different situations.

Intimate Distance: 6 to 18 inches 

This level of physical distance often indicates a closer relationship or greater comfort between individuals. It usually occurs during intimate contact such as hugging, whispering, or touching.

Personal Distance: 1.5 to 4 feet

Physical distance at this level usually occurs between people who are family members or close friends. The closer the people can comfortably stand while interacting can be an indicator of the level of intimacy in their relationship.

Social Distance: 4 to 12 feet

This level of physical distance is often used with individuals who are acquaintances.

With someone you know fairly well, such as a co-worker you see several times a week, you might feel more comfortable interacting at a closer distance.

In cases where you do not know the other person well, such as a postal delivery driver you only see once a month, a distance of 10 to 12 feet may feel more comfortable.

Public Distance: 12 to 25 feet

Physical distance at this level is often used in public speaking situations. Talking in front of a class full of students or giving a presentation at work are good examples of such situations.

It is also important to note that the level of personal distance that individuals need to feel comfortable can vary from culture to culture.

One oft-cited example is the difference between people from Latin cultures and those from North America. People from Latin countries tend to feel more comfortable standing closer to one another as they interact, while those from North America need more personal distance.

Roles of Nonverbal Communication

Body language plays many roles in social interactions. It can help facilitate the following:

  • Earning trust : Engaging in eye contact, nodding your head while listening, and even unconsciously mirroring another person's body language are all signals that you and someone else are bonding.
  • Emphasizing a point : The tone of voice you use and the way you engage listeners with your hand and arm gestures, or by how you take up space, are all ways that affect how your message comes across.
  • Revealing truths : When someone's body language doesn't match what they're saying, we might intuitively pick up on the fact that they are withholding information, or perhaps not being honest about how they feel.
  • Tuning in to your own needs : Our own body language can reveal a lot about how we're feeling. For instance, are you in a slumped posture, clenching your jaw and/or pursing your lips? This may be a signal that the environment you're currently in is triggering you in some way. Your body might be telling you that you're feeling unsafe, stressed, or any number of emotions.

Remember, though, that your assumptions about what someone else's body language means may not always be accurate.

What does body language tell you about a person?

Body language can tell you when someone feels anxious, angry, excited, or any other emotion. It may also suggest personality traits (e.g., whether someone is shy or outgoing). But body language can be misleading: it is subject to a person's mood, energy level, and circumstances.

While a lack of eye contact can indicate untrustworthiness in some cases, it doesn't mean you automatically can't trust someone who isn't looking you in the eye. They could be distracted and thinking about something else. Or, again, a cultural difference could be at play.

How to Improve Your Nonverbal Communication

The first step in improving your nonverbal communication is to pay attention. Try to see if you can pick up on other people's physical cues as well as your own.

Maybe when someone is telling you a story, you tend to look at the floor. In order to show them you're paying attention, you might try making eye contact instead, and even showing a slight smile, to show you're open and engaged.

What is good body language?

Good body language, also known as positive body language, should convey interest and enthusiasm. Some ways to do this include maintaining an upright and open posture, keeping good eye contact, smiling, and nodding while listening.

Using body language with intention is all about finding balance. For instance, when shaking someone's hand before a job interview, holding it somewhat firmly can signal professionalism. But, gripping it too aggressively might cause the other person pain or discomfort. Be sure to consider how other people might feel.

In addition, continue to develop emotional intelligence. The more in touch you are with how you feel, the easier it often is to sense how others are receiving you. You'll be able to tell when someone is open and receptive, or, on the other hand, if they are closed-off and need some space.

If we want to feel a certain way, we can use our body language to our advantage. For example, research found that people who maintained an upright seated posture while dealing with stress had higher levels of self-esteem and more positive moods compared to people who had slumped posture.

Of course, it's verbal and nonverbal communication—as well as the context of a situation—that often paints a full picture.

There isn't always a one-size-fits-all solution for what nonverbal cues are appropriate. However, by staying present and being respectful, you'll be well on your way to understanding how to use body language effectively.

A Word From Verywell

Understanding body language can go a long way toward helping you better communicate with others and interpreting what others might be trying to convey. While it may be tempting to pick apart signals one by one, it's important to look at these nonverbal signals in relation to verbal communication, other nonverbal signals, and the situation.

You can also learn more about how to improve your nonverbal communication to become better at letting people know what you are feeling—without even saying a word.



Encyclopedia of Personality and Individual Differences, pp. 527–531

Body Language

  • Ursula Hess
  • Reference work entry
  • First Online: 01 January 2020

Nonverbal communication

Body language refers to the aspect of communication that is transmitted nonverbally (and automatically) through body movements.

As such, body language strictly speaking refers only to behaviors such as facial, vocal and postural expressions, touch, proxemics, and gaze but not to stable features such as physical attractiveness and facial morphology or (conscious) behavioral choices such as hair style, clothing, and adornment. Yet, these elements of nonverbal behavior often interact with elements of body language, and hence the two terms are often used interchangeably.

The study of body language has a long tradition in psychology. In his seminal book on The Expression of the Emotions in Man and Animals, Darwin (1872/1965) already commented extensively on body language in humans and animals. His goal was to demonstrate the evolutionary continuity between emotion expressions in humans and animals, and he focused therefore on the many...


Ambady, N., & Rosenthal, R. (1992). Thin slices of expressive behavior as predictors of interpersonal consequences: A meta-analysis. Psychological Bulletin, 111 , 256–274.


Ambady, N., Hallahan, M., & Rosenthal, R. (1995). On judging and being judged accurately in zero-acquaintance situations. Journal of Personality and Social Psychology, 69 , 518–529. https://doi.org/10.1037/0022-3514.69.3.518 .

Atkinson, A. P. (2013). Bodily expressions of emotion: Visual cues and neural mechanisms. In J. A. P. Vuilleumier (Ed.), The Cambridge handbook of human affective neuroscience (pp. 198–222). New York: Cambridge University Press.

Banse, R., & Scherer, K. R. (1996). Acoustic profiles in vocal emotion expression. Journal of Personality and Social Psychology, 70 , 614–636.


Bente, G., Leuschner, H., Al Issa, A., & Blascovich, J. J. (2010). The others: Universals and cultural specificities in the perception of status and dominance from nonverbal behavior. Consciousness and Cognition, 19 (3), 762–777.

Berry, D. S., & McArthur, L. Z. (1986). Perceiving character in faces: The impact of age-related craniofacial changes on social perception. Psychological Bulletin, 100 , 3–10.

Birdwhistell, R. L. (1970). Kinesics and context: Essays on body motion communication . Philadelphia: University of Pennsylvania Press.


van der Borg, J. A., Schilder, M. B., Vinke, C. M., & de Vries, H. (2015). Dominance in domestic dogs: A quantitative analysis of its behavioural measures. PloS One, 10 (8), e0133978.


Bull, N., & Gidro-Frank, L. (1950). Emotions induced and studied in hypnotic subjects: Part II: The findings. Journal of Nervous and Mental Disease, 112 , 97–120.

Burgoon, J. (1991). Relational message interpretations of touch, conversational distance, and posture. Journal of Nonverbal Behavior, 15 , 233–259.

Chartrand, T. L., & Lakin, J. L. (2013). The antecedents and consequences of human behavioral mimicry. Annual Review of Psychology, 64 , 285–308.

Clynes, M. (1977). Sentics: The touch of emotions . New York: Doubleday and Company.

Darwin, C. (1872/1965). The expression of the emotions in man and animals . Chicago: The University of Chicago Press. (Originally published, 1872).

Duncan, S., Jr. (1969). Nonverbal communication. Psychological Bulletin, 72 (2), 118–137. https://doi.org/10.1037/h0027795 .

Efron, D. (1972). Gesture, race and culture . The Hague: Mouton.

Hall, E. T. (1963). A system for the notation of proxemic behavior. American Anthropologist, 65 , 1003–1026.

Hall, J. A. (1996). Touch, status, and gender at professional meetings. Journal of Nonverbal Behavior, 20 (1), 23–44.

Hall, J. A., & Veccia, E. M. (1990). More “touching” observations: New insights on men, women, and interpersonal touch. Journal of Personality and Social Psychology, 59 (6), 1155–1162. https://doi.org/10.1037/0022-3514.59.6.1155 .

Hareli, S., & Hess, U. (2010). What emotional reactions can tell us about the nature of others: An appraisal perspective on person perception. Cognition and Emotion, 24 , 128–140.

Henley, N. M. (1973). Status and sex: Some touching observations. Bulletin of the Psychonomic Society, 2 , 91–93.

Hertenstein, M. J., Keltner, D., App, B., Bulleit, B. A., & Jaskolka, A. R. (2006). Touch communicates distinct emotions. Emotion, 6 , 528–533.

Hess, U., & Fischer, A. (2013). Emotional mimicry as social regulation. Personality and Social Psychology Review, 17 , 142–157.

Hess, U., & Fischer, A. (2014). Emotional mimicry: Why and when we mimic emotions. Social and Personality Psychology Compass, 8 , 45–57.

Hess, U., & Fischer, A. (in press). The role of emotional mimicry in intergroup relations. In Oxford Encyclopedia of Intergroup Communication. Oxford: Oxford University Press.

Hinde, R. A. (1972). Non-verbal communication . Cambridge, UK: Cambridge University Press.

Iverson, J. M., Tencer, H. L., Lany, J., & Goldin-Meadow, S. (2000). The relation between gesture and speech in congenitally blind and sighted language-learners. Journal of Nonverbal Behavior, 24 (2), 105–130.

Juslin, P. N., & Scherer, K. R. (2005). Vocal expression of affect. In J. A. Harrigan, R. Rosenthal, & K. R. Scherer (Eds.), The new handbook of methods in nonverbal behavior research (pp. 65–135). New York: Oxford University Press.

Keating, C. F., Mazur, A., & Segall, M. H. (1981). A cross-cultural exploration of physiognomic traits of dominance and happiness. Ethology and Sociobiology, 2 , 41–48.

Kita, S. (2009). Cross-cultural variation of speech-accompanying gesture: A review. Language and Cognitive Processes, 24 (2), 145–167. https://doi.org/10.1080/01690960802586188 .

Kita, S., & Essegbey, J. (2001). Pointing left in Ghana: How a taboo on the use of the left hand influences gestural practice. Gesture, 1 (1), 73–95.

Knutson, B. (1996). Facial expressions of emotion influence interpersonal trait inferences. Journal of nonverbal behavior, 20 , 165–182.

Leary, T. F. (1957). Interpersonal diagnosis of personality . New York: Ronald Press.

Marsh, A. A., Ambady, N., & Kleck, R. E. (2005). The effects of fear and anger facial expressions on approach- and avoidance-related behaviors. Emotion, 5 , 119–124.

McDaniel, E., & Andersen, P. (1998). International patterns of interpersonal tactile communication: A field study. Journal of Nonverbal Behavior, 22 (1), 59–75. https://doi.org/10.1023/a:1022952509743 .

Mehrabian, A. (1969). Significance of posture and position in the communication of attitude and status relationships. Psychological Bulletin, 71 (5), 359–372. https://doi.org/10.1037/h0027349 .

Miller, R. E., Murphy, J. V., & Mirsky, I. A. (1959). Nonverbal communication of affect. Journal of Clinical Psychology, 15 , 155–158.

Morris, D., Collett, P., Marsh, P., & O'Shaughnessy, M. (1979). Gestures. New York: Stein & Day.

Rantala, J., Salminen, K., Raisamo, R., & Surakka, V. (2013). Touch gestures in communicating emotional intention via vibrotactile stimulation. International Journal of Human-Computer Studies, 71 (6), 679–690. https://doi.org/10.1016/j.ijhcs.2013.02.004 .

Rule, N. O., Ambady, N., & Hallett, K. C. (2009). Female sexual orientation is perceived accurately, rapidly, and automatically from the face and its features. Journal of Experimental Social Psychology, 45 , 1245–1251.

Scherer, K. R. (1978). Personality inference from voice quality: The loud voice of extraversion. European Journal of Social Psychology, 8 , 467–487.

Schneider, S., Christensen, A., Häußinger, F. B., Fallgatter, A. J., Giese, M. A., & Ehlis, A.-C. (2013). Show me how you walk and I tell you how you feel – A functional near-infrared spectroscopy study on emotion perception based on human gait. NeuroImage. https://doi.org/10.1016/j.neuroimage.2013.07.078.

Schuller, B., Batliner, A., Steidl, S., & Seppi, D. (2011). Recognising realistic emotions and affect in speech: State of the art and lessons learnt from the first challenge. Speech Communication, 53 (9–10), 1062–1087. https://doi.org/10.1016/j.specom.2011.01.011 .

Senior, C., Phillips, M. L., Barnes, J., & David, A. S. (1999). An investigation into the perception of dominance from schematic faces: A study using the world-wide web. Behavior Research Methods, Instruments and Computers, 31 , 341–346.

Tracy, J. L., & Matsumoto, D. (2008). The spontaneous expression of pride and shame: Evidence for biologically innate nonverbal displays. Proceedings of the National Academy of Sciences, 105 (33), 11655–11660.

Weisfeld, G. E., & Beresford, J. M. (1982). Erectness of posture as an indicator of dominance or success in humans. Motivation and Emotion, 6 (2), 113–131. https://doi.org/10.1007/BF00992459 .

Wiener, M., Devoe, S., Rubinow, S., & Geller, J. (1972). Nonverbal behavior and nonverbal communication. Psychological Review, 79 (3), 185–214. https://doi.org/10.1037/h0032710 .

Zebrowitz, L. A. (1997). Reading faces: Window to the soul? Boulder: Westview Press.

Zebrowitz, L. A., Fellous, J., Mignault, A., & Andreoletti, C. (2003). Trait impressions as overgeneralized responses to adaptively significant facial qualities: Evidence from connectionist modeling. Personality and Social Psychology Review, 7 , 194–215.

Author information

Ursula Hess, Department of Psychology, Humboldt University of Berlin, Berlin, Germany. Correspondence to Ursula Hess.

Editor information

Editors: Virgil Zeigler-Hill and Todd K. Shackelford, Oakland University, Rochester, MI, USA. Section editor: John F. Rauthmann, Department of Psychology, Wake Forest University, Winston-Salem, NC, USA.

Copyright information

© 2020 Springer Nature Switzerland AG

About this entry

Hess, U. (2020). Body Language. In: Zeigler-Hill, V., Shackelford, T.K. (eds) Encyclopedia of Personality and Individual Differences. Springer, Cham. https://doi.org/10.1007/978-3-319-24612-3_647

Published: 22 April 2020. Publisher: Springer, Cham. Print ISBN: 978-3-319-24610-9; Online ISBN: 978-3-319-24612-3.


Nonverbal Communication and Body Language

Contents: what is body language; the importance of nonverbal communication; types of nonverbal communication; how nonverbal communication can go wrong; how to improve nonverbal communication; how to read body language.

Your facial expressions, gestures, posture, and tone of voice are powerful communication tools. Here’s how to read and use body language to build better relationships at home and work.


While the key to success in both personal and professional relationships lies in your ability to communicate well, it’s not the words that you use but your nonverbal cues or “body language” that speak the loudest. Body language is the use of physical behavior, expressions, and mannerisms to communicate nonverbally, often done instinctively rather than consciously.

Whether you’re aware of it or not, when you interact with others, you’re continuously giving and receiving wordless signals. All of your nonverbal behaviors—the gestures you make, your posture, your tone of voice, how much eye contact you make—send strong messages. They can put people at ease, build trust, and draw others towards you, or they can offend, confuse, and undermine what you’re trying to convey. These messages don’t stop when you stop speaking either. Even when you’re silent, you’re still communicating nonverbally.

In some instances, what comes out of your mouth and what you communicate through your body language may be two totally different things. If you say one thing but your body language says something else, such as saying "yes" while shaking your head no, your listener will likely feel that you're being dishonest. When faced with such mixed signals, the listener has to choose whether to believe your verbal or nonverbal message. Since body language is a natural, unconscious language that broadcasts your true feelings and intentions, they'll likely choose the nonverbal message.


However, by improving how you understand and use nonverbal communication, you can express what you really mean, connect better with others, and build stronger, more rewarding relationships.

Your nonverbal communication cues—the way you listen, look, move, and react—tell the person you're communicating with whether or not you care, if you're being truthful, and how well you're listening. When your nonverbal signals match up with the words you're saying, they increase trust, clarity, and rapport. When they don't, they can generate tension, mistrust, and confusion.

If you want to become a better communicator, it's important to become more sensitive not only to the body language and nonverbal cues of others, but also to your own.

Nonverbal communication can play five roles:

  • Repetition: It repeats and often strengthens the message you're making verbally.
  • Contradiction: It can contradict the message you're trying to convey, thus indicating to your listener that you may not be telling the truth.
  • Substitution: It can substitute for a verbal message. For example, your facial expression often conveys a far more vivid message than words ever can.
  • Complementing: It may add to or complement your verbal message. As a boss, if you pat an employee on the back in addition to giving praise, it can increase the impact of your message.
  • Accenting: It may accent or underline a verbal message. Pounding the table, for example, can underline the importance of your message.

Source:  The Importance of Effective Communication , Edward G. Wertheim, Ph.D.

The many different types of nonverbal communication or body language include:

Facial expressions. The human face is extremely expressive, able to convey countless emotions without saying a word. And unlike some forms of nonverbal communication, facial expressions are universal. The facial expressions for happiness, sadness, anger, surprise, fear, and disgust are the same across cultures.

Body movement and posture. Consider how your perceptions of people are affected by the way they sit, walk, stand, or hold their head. The way you move and carry yourself communicates a wealth of information to the world. This type of nonverbal communication includes your posture, bearing, stance, and the subtle movements you make.

Gestures. Gestures are woven into the fabric of our daily lives. You may wave, point, beckon, or use your hands when arguing or speaking animatedly, often expressing yourself with gestures without thinking. However, the meaning of some gestures can be very different across cultures. While the “OK” sign made with the hand, for example, usually conveys a positive message in English-speaking countries, it's considered offensive in countries such as Germany, Russia, and Brazil. So, it's important to be careful of how you use gestures to avoid misinterpretation.

Eye contact. Since the visual sense is dominant for most people, eye contact is an especially important type of nonverbal communication. The way you look at someone can communicate many things, including interest, affection, hostility, or attraction. Eye contact is also important in maintaining the flow of conversation and for gauging the other person's interest and response.

Touch. We communicate a great deal through touch. Think about the very different messages given by a weak handshake, a warm bear hug, a patronizing pat on the head, or a controlling grip on the arm, for example.

Space. Have you ever felt uncomfortable during a conversation because the other person was standing too close and invading your space? We all have a need for physical space, although that need differs depending on the culture, the situation, and the closeness of the relationship. You can use physical space to communicate many different nonverbal messages, including signals of intimacy and affection, aggression or dominance.

Voice. It's not just what you say, it's how you say it. When you speak, other people “read” your voice in addition to listening to your words. Things they pay attention to include your timing and pace, how loud you speak, your tone and inflection, and sounds that convey understanding, such as “ahh” and “uh-huh.” Think about how your tone of voice can indicate sarcasm, anger, affection, or confidence.

Can nonverbal communication be faked?

There are many books and websites that offer advice on how to use body language to your advantage. For example, they may instruct you on how to sit a certain way, steeple your fingers, or shake hands in order to appear confident or assert dominance. But the truth is that such tricks aren't likely to work (unless you truly feel confident and in charge). That's because you can't control all of the signals you're constantly sending about what you're really thinking and feeling. And the harder you try, the more unnatural your signals are likely to come across.

However, that doesn't mean that you have no control over your nonverbal cues. For example, if you disagree with or dislike what someone's saying, you may use negative body language to rebuff the person's message, such as crossing your arms, avoiding eye contact, or tapping your feet. You don't have to agree, or even like what's being said, but to communicate effectively and not put the other person on the defensive, you can make a conscious effort to avoid sending negative signals—by maintaining an open stance and truly attempting to understand what they're saying, and why.

What you communicate through your body language and nonverbal signals affects how others see you, how well they like and respect you, and whether or not they trust you. Unfortunately, many people send confusing or negative nonverbal signals without even knowing it. When this happens, both connection and trust in relationships are damaged, as the following examples highlight:

  • Jack believes he gets along great with his colleagues at work, but if you were to ask any of them, they would say that Jack is “intimidating” and “very intense.” Rather than just look at you, he seems to devour you with his eyes. And if he takes your hand, he lunges to get it and then squeezes so hard it hurts. Jack is a caring guy who secretly wishes he had more friends, but his nonverbal awkwardness keeps people at a distance and limits his ability to advance at work.
  • Arlene is attractive and has no problem meeting eligible men, but she has a difficult time maintaining a relationship for longer than a few months. Arlene is funny and interesting, but even though she constantly laughs and smiles, she radiates tension. Her shoulders and eyebrows are noticeably raised, her voice is shrill, and her body is stiff. Being around Arlene makes many people feel anxious and uncomfortable. Arlene has a lot going for her that is undercut by the discomfort she evokes in others.
  • Ted thought he had found the perfect match when he met Sharon, but Sharon wasn't so sure. Ted is good looking, hardworking, and a smooth talker, but seemed to care more about his thoughts than Sharon's. When Sharon had something to say, Ted was always ready with wild eyes and a rebuttal before she could finish her thought. This made Sharon feel ignored, and soon she started dating other men. Ted loses out at work for the same reason. His inability to listen to others makes him unpopular with many of the people he most admires.

These smart, well-intentioned people struggle in their attempt to connect with others. The sad thing is that they are unaware of the nonverbal messages they communicate.


If you want to communicate effectively, avoid misunderstandings, and enjoy solid, trusting relationships both socially and professionally, it's important to understand how to use and interpret body language and improve your nonverbal communication skills.


Nonverbal communication is a rapidly flowing back-and-forth process that requires your full focus on the moment-to-moment experience. If you're planning what you're going to say next, checking your phone, or thinking about something else, you're almost certain to miss nonverbal cues and not fully understand the subtleties of what's being communicated. As well as being fully present, you can improve how you communicate nonverbally by learning to manage stress and developing your emotional awareness.

Learn to manage stress in the moment

Stress compromises your ability to communicate. When you're stressed out, you're more likely to misread other people, send confusing or off-putting nonverbal signals, and lapse into unhealthy knee-jerk patterns of behavior. And remember: emotions are contagious. If you are upset, it is very likely to make others upset, thus making a bad situation worse.

If you're feeling overwhelmed by stress, take a time out. Take a moment to calm down before you jump back into the conversation. Once you've regained your emotional equilibrium, you'll feel better equipped to deal with the situation in a positive way.

The fastest and surest way to calm yourself and manage stress in the moment is to employ your senses—what you see, hear, smell, taste, and touch—or through a soothing movement. By viewing a photo of your child or pet, smelling a favorite scent, listening to a certain piece of music, or squeezing a stress ball, for example, you can quickly relax and refocus. Since everyone responds differently, you may need to experiment to find the sensory experience that works best for you.

Develop your emotional awareness

In order to send accurate nonverbal cues, you need to be aware of your emotions and how they influence you. You also need to be able to recognize the emotions of others and the true feelings behind the cues they are sending. This is where emotional awareness comes in.


Being emotionally aware enables you to:

  • Accurately read other people, including the emotions they're feeling and the unspoken messages they're sending.
  • Create trust in relationships by sending nonverbal signals that match up with your words.
  • Respond in ways that show others that you understand and care.

Many of us are disconnected from our emotions—especially strong emotions such as anger, sadness, fear—because we've been taught to try to shut off our feelings. But while you can deny or numb your feelings, you can't eliminate them. They're still there and they're still affecting your behavior. By developing your emotional awareness and connecting with even the unpleasant emotions, though, you'll gain greater control over how you think and act. To start developing your emotional awareness, practice the mindfulness meditation in HelpGuide's free Emotional Intelligence Toolkit .


Once you've developed your abilities to manage stress and recognize emotions, you'll start to become better at reading the nonverbal signals sent by others. It's also important to:

Pay attention to inconsistencies. Nonverbal communication should reinforce what is being said. Is the person saying one thing, but their body language conveying something else? For example, are they telling you “yes” while shaking their head no?

Look at nonverbal communication signals as a group. Don't read too much into a single gesture or nonverbal cue. Consider all of the nonverbal signals you are receiving, from eye contact to tone of voice and body language. Taken together, are their nonverbal cues consistent—or inconsistent—with what their words are saying?

Trust your instincts. Don't dismiss your gut feelings. If you get the sense that someone isn't being honest or that something isn't adding up, you may be picking up on a mismatch between verbal and nonverbal cues.

Evaluating nonverbal signals

Eye contact – Is the person making eye contact? If so, is it overly intense or just right?

Facial expression – What is their face showing? Is it masklike and unexpressive, or emotionally present and filled with interest?

Tone of voice – Does the person's voice project warmth, confidence, and interest, or is it strained and blocked?

Posture and gesture – Is their body relaxed or stiff and immobile? Are their shoulders tense and raised, or relaxed?

Touch – Is there any physical contact? Is it appropriate to the situation? Does it make you feel uncomfortable?

Intensity – Does the person seem flat, cool, and disinterested, or over-the-top and melodramatic?

Timing and place – Is there an easy flow of information back and forth? Do nonverbal responses come too quickly or too slowly?

Sounds – Do you hear sounds that indicate interest, caring or concern from the person?

More Information

  • About Nonverbal Communications - Different categories of nonverbal communication, along with a detailed list of signals. (Adam Blatner, M.D.)
  • Body Language: Understanding Nonverbal Communication - Particularly as it applies to the workplace. (MindTools)
  • Take Control of Your Nonverbal Communication (video) - How to notice and use body language. (Harvard Business Review)
  • The Importance of Nonverbal Communication (PDF) - Piece by Edward G. Wertheim, Ph.D. about the communication process. (Northeastern University)


Healthcare (Basel)

Body Language Analysis in Healthcare: An Overview

Rawad Abdulghafor

1 Department of Computer Science, Faculty of Information and Communication Technology, International Islamic University Malaysia, Kuala Lumpur 53100, Malaysia

Sherzod Turaev

2 Department of Computer Science and Software Engineering, College of Information Technology, United Arab Emirates University, Al-Ain, Abu Dhabi P.O. Box 15556, United Arab Emirates

Mohammed A. H. Ali

3 Department of Mechanical Engineering, Faculty of Engineering, University of Malaya, Kuala Lumpur 50603, Malaysia

Associated Data

Not applicable.

Given the current COVID-19 pandemic, medical research today focuses on epidemic diseases. Innovative technology is incorporated in most medical applications, emphasizing the automatic recognition of physical and emotional states. Most research is concerned with the automatic identification of symptoms displayed by patients through analyzing their body language. The development of technologies for recognizing and interpreting arm and leg gestures, facial features, and body postures is still at an early stage, and more extensive research using artificial intelligence (AI) techniques in disease detection is needed. This paper presents a comprehensive survey of the research performed on body language processing. After defining and explaining the different types of body language, we justify the use of automatic recognition and its application in healthcare. We briefly describe an automatic recognition framework that uses AI to recognize various body language elements, and we discuss automatic gesture recognition approaches that help better identify the external symptoms of epidemic and pandemic diseases. From this study, we found that body language can be analyzed and understood by machine learning (ML). Because diseases produce clear and distinct symptoms in the body, body language is affected in ways that carry features specific to a particular disease. It is therefore possible to characterize the body-language changes associated with each disease, and hence ML can detect diseases, including pandemic and epidemic diseases.

1. Introduction

Body language constitutes one of the languages of communication. Languages are classified into verbal and non-verbal; body language is a non-verbal language in which movements and behaviors of the body are used instead of words to express and convey information. Body language may involve hand movements, facial expressions and hints, eye movements, tone of voice, body movements and positions, gestures, use of space, and the like. This research focuses on interpretations of human body language, a field classified under kinesics.

Body language is entirely different from sign language, which, like verbal language, is a complete language with its own basic rules and complex grammar systems [ 1 , 2 ]. Body language, on the other hand, has no grammatical rules and is usually classified according to culture [ 3 ]. Interpretations of body language may differ from country to country and from one culture to another. There is some controversy over whether body language can be regarded as a universal language for all people. Some researchers have concluded that most communication among individuals involves physical symbols or gestures, since body language speeds up the transmission and understanding of information [ 4 ]. According to [ 5 ], body language conveys more, and richer, content than verbal language. When, for example, an individual speaks over the phone about an inquiry, information can be lost because body language is unavailable. An individual speaking directly in front of an audience faces fewer such restrictions. Information accompanied by body language is more easily transmitted and received, even more so if the speaker is standing, which allows greater freedom of movement. Thus, body language enhances communication. This work also argues that body language enhances workplace positivity.

Several experiments were performed in [ 6 ] on facial expressions and body movements as affected by human emotions. The study showed that facial expressions and body movements can accurately determine human emotions, and that combining facial features and expressions with body movements is essential for analyzing human expressions. Three stages of experiments were conducted to determine whether the two sources must be combined, and combining them proved essential for identification. Reading someone's eyes should not be ignored either; it is an important factor in expressing and understanding human emotions, and we can generally tell what others want from their eye movements. According to [ 7 ], the dilation and tightening of the eyes are affected by emotions and convey additional information to the observer. The human eye blinks, on average, 6 to 10 times per minute; however, when someone is attracted to someone else, the blink rate drops. Study [ 8 ] discovered that human feelings can be identified through body position. For example, when people feel angry, they push their bodies forward to express dominance, with the upper body tilted and no longer upright. If they feel intimidated by an opponent, they signal submission by retreating or moving the head back. A person's emotional state can also be read from their sitting position: sitting with the upper body and head slightly tilted forward indicates attentiveness and eagerness to follow what is being said, whereas sitting with legs and arms crossed suggests discomfort with what is being said, or with the person saying it, and an unwillingness to engage [ 5 ].
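The blink-rate observation above lends itself to a simple rule-based sketch. The 6-10 blinks-per-minute baseline is the figure cited in the text; the function name, labels, and interface are illustrative only, and a real system would estimate the rate from video frames rather than take it as a number.

```python
def classify_blink_rate(blinks_per_minute):
    """Interpret a blink rate against the 6-10/min resting baseline."""
    if blinks_per_minute < 6:
        # Fewer blinks than baseline, as when attention is heightened
        return "below baseline (possible heightened attention)"
    elif blinks_per_minute <= 10:
        return "within resting baseline"
    else:
        return "above baseline (possible stress or fatigue)"

print(classify_blink_rate(4))   # below baseline (possible heightened attention)
print(classify_blink_rate(8))   # within resting baseline
```

The thresholds are crisp here for clarity; in practice, blink estimates are noisy and a classifier would smooth over a longer observation window.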

Body language analysis is also essential to avoid confusion in a single movement’s meanings and purposes that carry more than one meaning. For example, the expressive movement of a person may be due to a physical handicap or a compulsive movement rather than an intentional one. Furthermore, a particular movement in the body of someone may not mean the same to another. For example, a person may rub their eyes due to itchiness and not fatigue. Foreign cultures also need careful analysis due to their social differences. Although most body movements are universal, there are also movements specific to each culture. This may vary from country to country, region to region, and even social group.

Pandemic and epidemic diseases constitute a global risk factor responsible for the deaths of millions of people worldwide. The ability to detect and treat casualties is limited, primarily due to the lack of human and technical resources. When patients are not physically accessible, remote diagnosis is required. All pandemic and epidemic diseases are characterized by distinct body movements affecting the face, shoulders, chest, and hands, and AI technology has shown positive results in reading some of these gestures. Hence, the idea is to use body language to detect epidemic diseases early and provide treatment. It should be noted that the primary catalyst for this study is COVID-19, which is presently terrorizing the whole world. As researchers in information technology and computer science, we must play our part in rapidly detecting this disease.

This paper aims to study the previous literature and identify body language expressions that indicate disease. Body language is defined as certain expressions, movements, and gestures that point to the physical and emotional state of the bearer. Certain parts of the body can express different characteristics or feelings. Some studies have demonstrated the presence of certain emotional conditions as reflected in particular facial expressions (e.g., joy, sadness, surprise, and anger). Regarding the relationship between diseases and body language, it is known that diseases affect the body parts and qualities and are reflected in the movements and expressions of parts of the body. Different diseases affect different body parts and can be measured, identified, and used for diagnosis.

Hence, this paper is proposed to study some diseases that can be diagnosed by identifying and measuring the external movements of the body. In addition, this paper discusses the findings of previous studies to demonstrate the usefulness and contribution of AI in detecting diseases through body language. One of the biggest obstacles to treating COVID-19 patients effectively is speedy diagnosis. However, the large number of cases exceeds the capacity of most hospitals. Hence, AI offers a solution through ML. ML can detect disease symptoms as manifested in the patient’s body language and can be used to generate correct readings and predictions.

Therefore, the main contribution of this paper is to show the potential use of analyzing body language in health care. The importance of body language analysis in health care and patient body language analysis using AI will be discussed in the following sections. The added tables list previous studies that used ML to identify symptoms through body expressions. The findings demonstrate that a patient’s body language can be analyzed using ML for diagnostic purposes.

2. Methodology

The review method used in this work is as follows (see also Figure 1 ): first, the importance of body language analysis is highlighted to establish that body movements can be read and analyzed to produce outcomes useful for many applications; second, body language analysis in healthcare is presented to show the importance of body language in medical diagnosis research; third, studies are surveyed in which ML successfully identifies characteristic symptoms; fourth, Table 1 lists studies that used ML as a diagnostic tool, together with the algorithms used. Each topic is discussed separately in the following sections.

Some Studies of AI Methods for Body Language Elements to Identify the Symptoms.

Figure 1. The Review Stages.

3. The Importance of Body Language Analysis

AI is one of the most significant technological developments, increasing in popularity and used in all application areas, and one of the most important of these applications is healthcare. Health is the most important human factor for life on this planet. Recently, AI applications in healthcare have played a significant role in helping doctors discover diseases and improve human health. The use of AI in health depends on symptoms appearing on parts of the body; these symptoms are reflected in the movements and expressions of the body, which manifest as body language. These body-language features can therefore be used to classify disease symptoms by detecting them with ML. In this section, we explain the importance of analyzing body language with AI. Features that appear in body language can be analyzed by AI to solve problems in many applications. For example, facial expressions can be analyzed to infer human feelings, benefiting psychotherapy and the study of subjects' emotions; movements of the hand, shoulder, or leg can likewise be analyzed to discover valuable features in medicine, security, and other domains. Body language thus has many benefits and applications, and we suggest that it can also be used to detect infectious diseases such as COVID-19 using ML.
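The pipeline implied here (body-language features in, symptom class out) can be sketched as a nearest-centroid classifier. The feature names, centroid values, and class labels below are invented for illustration; a real system would extract such features from video with computer-vision models and learn the centroids from labeled patient data.

```python
import math

def nearest_centroid(sample, centroids):
    """Assign a feature vector to the closest class centroid (Euclidean).

    `centroids` maps a class label to a reference feature vector, e.g.
    averaged body-language measurements for subjects in that class.
    """
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(sample, centroids[label]))

# Hypothetical 3-feature vectors: [blink rate, head tilt (deg), gesture rate]
centroids = {
    "baseline": [8.0, 2.0, 12.0],
    "fatigued": [14.0, 10.0, 4.0],
}
print(nearest_centroid([13.0, 9.0, 5.0], centroids))  # fatigued
```

Nearest-centroid is the simplest instance of the idea; the surveyed studies use richer models (CNNs, SVMs), but the input/output contract is the same.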

Now, it is feasible to employ this technology in healthcare systems. Pandemic and epidemic diseases are an intractable matter that adversely affects human health, people's most valuable asset. The biggest worry is that new pandemics or epidemics will suddenly appear and turn deadly, as COVID-19 has, claiming nearly a million lives so far. This motivates us to develop AI technologies that help detect a disease's external symptoms by analyzing patients' body language. This section deals with general studies that prove the importance of body language processing in various fields.

Every computer user interacts with the device via mouse and keyboard, and researchers are now developing computer systems that interact and respond through body language such as hand gestures and movement. In [ 8 ], a comprehensive survey evaluated the published literature on the visual interpretation of hand gestures for interacting with computing devices, introducing methods of analyzing body language that are more advanced than mouse and keyboard input. The study of [ 9 ] considered the problem of recognition accuracy in robots and proposed a fusion system to identify fall movement types and abnormal directions with an accuracy rate of 99.37%. A facial coding system was developed in [ 10 ] to measure and analyze facial muscle movements and identify facial expressions; a database of 1100 images was created, and the system analyzed and classified facial creases and wrinkles to match their movements, improving performance to 92%. Combining facial features and movements with body movements is essential for analyzing individual expressions: three different experiments were conducted to determine whether facial expressions and body language should be combined, and they concluded in the affirmative. Another study [ 11 ] focused on deep learning techniques to identify emotions revealed in facial expressions, using pure convolutional neural network techniques to show that deep learning with these networks successfully recognizes emotions, significantly improving usability. A new model was introduced in [ 12 ] that detected body gestures and movements from a pair of digital video images, supplying a set of three-dimensional monitoring vectors.
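The fusion idea behind systems like the fall-recognition work in [9] can be sketched as late fusion: each modality produces per-class scores, and a weighted average decides. The class labels, score values, and equal weighting below are placeholders, not details from the cited study.

```python
def late_fusion(scores_a, scores_b, weight_a=0.5):
    """Combine per-class scores from two modalities by weighted average
    and return the winning class label."""
    fused = {
        label: weight_a * scores_a[label] + (1 - weight_a) * scores_b[label]
        for label in scores_a
    }
    return max(fused, key=fused.get)

# Hypothetical confidences from a posture model and a motion model
posture = {"normal": 0.40, "fall": 0.60}
motion = {"normal": 0.20, "fall": 0.80}
print(late_fusion(posture, motion))  # fall
```

Late fusion keeps the two models independent, so either can be retrained or replaced without touching the other; early fusion (concatenating raw features) is the usual alternative.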

The first study, by Hjortsjö (1970) [ 13 ], established the relationship between the contraction of the internal muscles of the face and facial movements, developing a coding system that identified the minor units of facial muscle movement and then drew coordinates defining the facial expressions. The recognition of people's emotions has merited much attention; however, detecting emotions from facial expressions and speech remains problematic for researchers. The work presented in [ 14 ] offered a comprehensive survey to facilitate further research in this field. It focused on identifying gender-specific characteristics, setting out an automatic framework to determine the physical manifestation of emotions, and identifying static and dynamic body-shape cues. It also examined recent studies on learning and emotion that identify gestures from photos or video. Several methods combining speech, body, and facial gestures to better identify emotions were also discussed. The study concluded that knowledge of a person's feelings through such overtones is still incomplete.

4. Body Language Analysis in Healthcare

A coding system was created in [ 10 ] to classify facial expressions by analyzing more than 1100 pictures. Three ways of classifying facial expressions were compared: analyzing image components in the gray field, measuring wrinkles, and a template for facial movements. The accuracy of the coding system for the three methods was 89%, 57%, and 85%, respectively, while combining the methods raised accuracy to 92%. Online learning is challenged by the difficulty of knowing how students participate in learning processes. In [ 15 ], an algorithm is introduced to gauge student interaction and identify students' problems. Two methods were used to collect evidence of student participation: the first collected facial expressions using a camera, and the second collected hand-movement data from mouse movements. The data were used to train two groups, one combining facial data with mouse data and one without the mouse; the first group performed better than the second, at 94.60% compared to 91.51%. Work [ 14 ], commenting on the recognition of facial and speech gestures, provides a comprehensive survey of body language and a framework for the automatic identification of dynamic and fixed emotional body gestures, combining facial and speech cues to improve the recognition of a person's emotions. Paper [ 16 ] defines facial expressions by matching them with body positions. The work demonstrated that effects and expressions are more evident when the major expressions on the face are similar to those highlighted in the body, although the model produces different results depending on whether the properties used are physical, dimensional, or latent. Another significant finding is that expressions of fear are recognized better when paired with facial expressions than in isolation.

In [ 17 ], the authors stated that a medical advisor must exhibit engaging communication qualities that make the patient comfortable enough to make a correct decision. They advised doctors to know how to use facial expressions, the eyes, hand gestures, and other body expressions. A smile is the most powerful expression a doctor can use to communicate with patients: it puts the patient at ease, and a comfortable patient appears confident and answers the doctor's questions with clear, credible responses. Eye contact is also very important, as its absence may suggest that the doctor does not care. The research in [ 18 ] concludes that appropriate nonverbal communication by the doctor positively affects the patient. Objective evidence has shown that patients improve and recover better and faster when the doctor smiles and makes direct eye contact than when the doctor does not. It was also concluded that patients who receive more attention, feeling, and participation from the doctor respond better to treatment, as tone of voice, facial and body movement, and eye gaze all affect the patient. Clint [ 19 ] reported on his first day working in the intensive care unit. He felt fear and anxiety that day, as the unit was comprehensive and demanding, and asked himself whether it was worth working there. He had a patient with her sister at her side. The patient glimpsed Clint's nervousness and anxiety but did not dare ask him directly, so she whispered to her sister that the nurse seemed nervous. Her sister then asked Clint, "You seem worried and anxious today; why? What is there to be so nervous about?"
Hoping to hide his nervousness and restore confidence, Clint smiled and replied, "I am not nervous; however, sometimes we have to ask our patients ridiculous questions that make us tense." Clint noticed from the patient's looks that he could not persuade her: his body language and facial expressions gave him away. He realized that patients are affected by, and can read, their caregivers' body language, and that the anxiety and stress transmitted to his patient might worsen her condition.

In one of his articles [ 20 ], Henry wrote that treating a patient with appropriate behavior and body language has a greater impact than drugs alone. The work [ 21 ] concluded that non-verbal language between a doctor and patient plays a vital role in treatment. The doctor can use non-verbal signals sent by the patient to collect information about the condition and decide on diagnosis and treatment. The research summarized that the non-verbal techniques a doctor uses toward the patient, such as eye gaze, closeness to the patient, and relaxed facial and hand gestures, help in obtaining information and in the patient's recovery. The research suggests that non-verbal cues have a positive effect on the patient and recommends that doctors be trained to incorporate them as a significant way of dealing with patients and speeding up treatment.

5. Patient’s Body Language Analysis Using AI

Different AI methods and techniques have been used to analyze patients' body language; we briefly discuss some studies conducted in this area. Focusing on facial recognition, a simple system was introduced in [ 22 ] to analyze facial muscles and thus identify different emotions. The proposed system automatically tracks faces in video and extracts geometric shapes for facial features. The study was conducted on eight patients with schizophrenia and collected dynamic information on facial muscle movements, showing that geometric measurements can be identified for individual faces and their exact differences determined for recognition purposes. Three methods were used in [ 23 ] to measure facial expressions, define emotions, and identify persons with mental illness. The study's proposed facial action coding system enabled the interpretation of emotional facial expressions and thus contributed to knowledge of therapeutic intervention for patients with mental illnesses.

Many people suffer from an imbalance in the nervous system, which can paralyze the patient's movement and cause falls without prior warning. The study [ 24 ] aimed to improve the detection and identification rate of early warning signs using the R platform. Wireless sensor devices were placed on the chest and waist, and the collected data were fed to an analysis algorithm that extracted features and raised an alert when there was risk. The results showed that a patient at risk engaged in specific typical movements indicating an imminent fall. The authors further suggested applying this algorithm to patients with seizures to warn of an imminent attack and alert the emergency services.
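The chest/waist sensor pipeline in [24] can be sketched as a threshold rule on acceleration magnitude: a near-free-fall dip followed by an impact spike. The thresholds and sample values below are illustrative, not figures from the study, which used a learned algorithm rather than fixed rules.

```python
import math

def detect_fall(samples, free_fall_g=0.4, impact_g=2.5):
    """Flag a fall when acceleration magnitude (in g) drops near free fall
    and then spikes on impact. `samples` is a list of (x, y, z) readings."""
    saw_free_fall = False
    for x, y, z in samples:
        mag = math.sqrt(x * x + y * y + z * z)
        if mag < free_fall_g:
            saw_free_fall = True          # body briefly in free fall
        elif saw_free_fall and mag > impact_g:
            return True                   # impact after the free-fall dip
    return False

standing = [(0.0, 0.0, 1.0)] * 5          # ~1 g while upright
falling = standing + [(0.0, 0.0, 0.2), (0.1, 0.1, 0.1), (1.5, 2.0, 2.0)]
print(detect_fall(standing), detect_fall(falling))  # False True
```

Real deployments add a time window between the dip and the spike and suppress alerts during vigorous activity; the two-phase structure, however, is the standard starting point.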

In research [ 25 ], a computational framework was designed to monitor the movements of older adults and signal organ failures and other sudden drops in vital body functions. The system monitored the patient's activity and determined its level using sensors placed on different body parts; experiments showed that it identifies the correct locations in real time with an accuracy of 95.8%. Another data-analysis approach was presented in [ 26 ] for an intelligent home that uses sensors to monitor residents' movements and behaviors. This system helps detect behaviors and forecast diseases or injuries that residents, especially older people, may experience, and it is helpful to doctors providing remote care and monitoring their patients' progress. The target-object capture model proposed in [ 27 ] is based on a candidate region-suggestion network that detects the grasping position of a manipulator by combining color and depth image information with deep learning; it achieved a 94.3% grasp-detection success rate on multiple target-detection datasets by merging color-image information. Paper [ 28 ] deals with the elderly and their struggle to continue living independently without relying on the support of others. The research project compared machine learning algorithms used to monitor their body functions and movements; among the eight algorithms studied, the support vector machine achieved the highest accuracy rate of 95% using the reference traits. Some jobs require prolonged sitting, which can result in long-term spinal injury and nervous disease. Earlier surveys informed the design of sitting position monitoring systems (SPMS), which assess the position of the seated person using sensors attached to the chair; the drawback of the original method was that it required too many sensors. This problem was resolved by [ 29 ], who designed an SPMS that needed only four sensors.
This improved system defined six different sitting positions through several machine learning algorithms applied to average body-weight measurements. The positions were analyzed and classified with the approach that produced the highest accuracy, which reached 97.20% to 97.94%. In most hospitals, medical staff face anxiety about treating patients with mental illness because of potential bodily harm, staff risks, and damage to hospital equipment. The study [ 30 ] devised a method to analyze the patient's movements and identify the risk of harmful behavior by extracting visual data from cameras installed in patients' rooms. The proposed method traced movement points, accumulated them, and extracted their properties, analyzing the characteristics of the movement points by spacing, position, and speed. The study concluded that the method could also be used to explore features for other purposes, such as analyzing the nature of a disease and determining its level of progression. In the study [ 31 ], wireless intelligent sensor applications and devices were designed to care for patient health, provide better patient monitoring, and facilitate disease diagnosis. Wireless sensors installed on the body periodically monitor the patient's health, update the information, and send it to the service center. The researchers investigated a multi-level decision system (MDS) to monitor patient behaviors and match them with stored historical data, allowing decision makers in medical centers to give treatment recommendations. The proposed system could also record new cases, store new disease data, and reduce the effort and time doctors spend examining patients. The results proved the MDS accurate and reliable in predicting and monitoring patients.
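A four-sensor SPMS of the kind described in [29] can be sketched with a simple rule-based classifier over normalized load-cell readings. The sensor layout (front-left, front-right, back-left, back-right), the tolerance, and the five position labels are assumptions for illustration; the cited system distinguished six positions with learned models rather than fixed rules.

```python
def classify_sitting(fl, fr, bl, br, tol=0.1):
    """Classify a sitting position from four seat load cells
    (front-left, front-right, back-left, back-right)."""
    total = fl + fr + bl + br
    fl, fr, bl, br = (v / total for v in (fl, fr, bl, br))  # normalize to 1
    front, back = fl + fr, bl + br
    left, right = fl + bl, fr + br
    if front - back > tol:
        return "leaning forward"
    if back - front > tol:
        return "leaning backward"
    if left - right > tol:
        return "leaning left"
    if right - left > tol:
        return "leaning right"
    return "upright"

print(classify_sitting(0.4, 0.4, 0.1, 0.1))  # leaning forward
```

Normalizing by total weight makes the rule independent of the sitter's mass, which is why average body-weight measurements suffice as features in the cited work.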

The study of [ 32 ] applied the Short-Time Fourier Transform to monitor the patient's movements and voice through sensors and microphones. The system transmitted sound and accelerometer data and analyzed them to identify the patient's condition, achieving high accuracy. Three experiments on recognizing full-body expressions were conducted in [ 33 ]. The first matched body expressions across all emotions, with fear the most difficult emotion to express. The second focused on facial expressions strongly influenced by physical expression, which proved ambiguous. The last experiment attended to tone of voice to identify emotional feelings related to the body. It was concluded that the results of the three experiments must be pooled to reveal true body expression.
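The Short-Time Fourier Transform used in [32] slides a window over the signal and takes a DFT of each frame, turning a time series into per-frame frequency content. A minimal stdlib sketch follows (no window function, illustrative frame sizes; production code would use an FFT library):

```python
import cmath
import math

def stft(signal, frame_len=8, hop=4):
    """Return DFT magnitude frames for a real-valued signal.
    Each frame covers `frame_len` samples, advancing by `hop`."""
    frames = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        mags = []
        for k in range(frame_len):  # naive O(n^2) DFT per frame
            acc = sum(frame[n] * cmath.exp(-2j * cmath.pi * k * n / frame_len)
                      for n in range(frame_len))
            mags.append(abs(acc))
        frames.append(mags)
    return frames

# A pure tone at bin 2 of an 8-point frame concentrates energy in bins 2 and 6
tone = [math.cos(2 * math.pi * 2 * n / 8) for n in range(16)]
spec = stft(tone)
```

For accelerometer or voice monitoring, the per-frame spectra become the features fed to a classifier, which is the role the STFT plays in the cited system.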

A valuable study was conducted at MIT [ 34 ] to develop a system that detects pain in patients by analyzing brain-activity data from a wearable device that scans brain nerves, helping to diagnose and treat patients who have lost consciousness or the sense of touch. The researchers placed several fNIRS sensors on the patient's forehead to measure activity in the frontal lobe and developed ML models to determine the levels of oxygenated hemoglobin associated with pain. The results showed that pain was detected with an accuracy of 87%.

The study [ 35 ] considered the heartbeat a type of body language; checking a patient's heartbeat is a crucial medical examination tool. The researcher proposed a one-dimensional (1D) convolutional neural network (CNN) model that classified the vibrational signals of regular and irregular heartbeats from an electrocardiogram. The model used a de-noising auto-encoder (DAE) algorithm, and the results showed that it classified heart sound signals with an accuracy of up to 99%.
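The 1D CNN in [35] rests on the 1D convolution operation, sketched below in pure Python (valid padding, a single filter, followed by ReLU). The kernel values and the toy signal are placeholders, not the trained model from the study.

```python
def conv1d(signal, kernel):
    """Valid-mode 1D convolution (cross-correlation, as in CNN layers)."""
    k = len(kernel)
    return [sum(signal[i + j] * kernel[j] for j in range(k))
            for i in range(len(signal) - k + 1)]

def relu(xs):
    """Zero out negative activations."""
    return [max(0.0, x) for x in xs]

# A difference kernel responds to the sharp rise of a QRS-like spike
signal = [0.0, 0.0, 0.1, 1.0, 0.2, 0.0, 0.0]
feature_map = relu(conv1d(signal, [-1.0, 1.0]))
```

A trained 1D CNN stacks many such filters with learned weights, so each feature map responds to a different local waveform shape; that is what lets the model separate regular from irregular beats.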

6. Discussion

We conclude from this study that reading and understanding body language through AI will help automatically detect epidemic diseases. The sheer number of epidemic patients is a significant obstacle to detecting every infected person; the most prominent current example is COVID-19. Developed and developing countries alike have faced major problems screening for the disease because of the large number of infected people and its rapid spread, so infections increased significantly and detection could not keep pace. We suggest conducting a study to determine the movements and gestures of the body associated with epidemic diseases such as COVID-19; an epidemic disease will show unique and distinct movements in some body parts. Thermal cameras that detect high body temperature certainly play a significant role in flagging a patient with disease, but it is difficult to determine which disease is involved; a patient with an epidemic disease may not show a significantly raised temperature; and the temperature rise may come late, when the patient is already at a critical stage of treatment. We focus in this study on the body language of some epidemics, especially COVID-19, which has changed our lives for the worse. We have learned a harsh lesson from this deadly enemy: not to stand still. We must help our people, our countries, and the world defend against and attack this disease. Hence, we propose studying the use of body language with AI, and in upcoming studies, on which we are currently working, we hope to collect and identify the gestures of body parts that characterize the epidemic.

Table 1 lists studies that have used ML to discover diseases and symptoms through gestures, hand movements, and facial expressions. The table shows that CNN algorithms are the most common and efficient methods of identifying disease symptoms through facial expressions and hand gestures. Some studies indicate that analyzing other body parts is also helpful in identifying certain diseases using different ML algorithms, such as SVM and LSTM. It appears that combining a CNN with a newly proposed algorithm for determining facial expressions would yield high-quality results for detecting some epidemic diseases. It is essential first to study the symptoms that characterize an epidemic disease and how they are reflected in body expressions, and then to use the ML algorithm that identifies these expressions most efficiently.

The studies in Table 1 are classified as follows:

  • (1) Studies on medical diagnosis using AI for analyzing body language.
  • (2) Studies on medical diagnosis using electronic devices and AI for analyzing body language.
  • (3) Studies on COVID-19 diagnosis using other methods.

This study surveys research using ML algorithms to identify body features, movements, and expressions. Each movement is affected by disease, and each disease has a distinct effect on the body: some body parts undergo certain changes that point to a specific disease. We therefore propose that ML algorithms capture images of body movements and expressions, analyze them, and identify diseases. This study surveyed a selection of existing studies that use different ML algorithms to detect body movements and expressions. Since those studies do not address this epidemiological use, this study seeks to document the use of ML algorithms in discovering epidemics such as COVID-19. Our survey analysis concludes that the reported results indicate the possibility of identifying body movements and expressions, and that ML, and convolutional neural networks in particular, is the most proficient at determining body language.

From an epidemiological, diagnostic, and pharmacological standpoint, AI has yet to play a substantial part in the fight against coronavirus. Its application is limited by a shortage of data, outlier data, and an abundance of noise. It is vital to create unbiased time series data for AI training. While the expanding number of worldwide activities in this area is promising, more diagnostic testing is required, not just for supplying training data for AI models but also for better controlling the epidemic and lowering the cost of human lives and economic harm. Clearly, data are crucial in determining if AI can be used to combat future diseases and pandemics. As [ 91 ] previously stated, the risk is that public health reasons will override data privacy concerns. Long after the epidemic has passed, governments may choose to continue the unparalleled surveillance of their population. As a result, worries regarding data privacy are reasonable.

7. Conclusions

According to patient surveys, communication is one of the most crucial skills a physician should have. However, communication encompasses more than just what is spoken. From the time a patient first visits a physician, nonverbal communication, or “body language”, shapes the course of therapy. Body language encompasses all nonverbal forms of communication, including posture, facial expression, and body movements. Being aware of such cues can help doctors better understand their patients. Patient involvement, compliance, and treatment outcomes can all be influenced by effective nonverbal communication.

Pandemic and epidemic illnesses are a worldwide threat that might kill millions, and doctors have limited means to recognize and treat victims. Human and technological resources remain in short supply under epidemic and pandemic conditions. To improve the treatment process when the patient cannot travel to the treatment location, remote diagnosis is necessary, and the patient’s status should be examined automatically. Altered facial wrinkles, movements of the eyes and eyebrows, protrusion of the nose, changes to the lips, and certain motions of the hands, shoulders, chest, head, and other areas of the body can all be characteristic of pandemic and epidemic illnesses. AI technology has shown promise in interpreting these motions and cues in some cases. This gave rise to the idea of applying body language analysis to identify epidemic diseases in patients early, treat them sooner, and assist doctors in recognizing them, given the speed with which such diseases spread and kill. It should be emphasized that COVID-19, which horrified the entire world and transformed daily life, was the crucial motivator for this study, after we reviewed body language analysis research in healthcare and defined an automatic recognition frame using AI to recognize various body language elements.

As researchers in information technology and computer science, we must contribute by developing automatic gesture recognition models that better identify the external symptoms of epidemic and pandemic diseases, to help humanity.


Acknowledgments

The first author’s research has been supported by Grant RMCG20-023-0023, Malaysia International Islamic University, and the second author’s work has been supported by the United Arab Emirates University Start-Up Grant 31T137.

Funding Statement

This research was funded by Grant RMCG20-023-0023, Malaysia International Islamic University, and United Arab Emirates University Start-Up Grant 31T137.

Author Contributions

Conceptualization, R.A. and S.T.; methodology, R.A.; software, R.A.; validation, R.A. and S.T.; formal analysis, R.A.; investigation, M.A.H.A.; resources, M.A.H.A.; data curation, R.A.; writing—original draft preparation, R.A.; writing—review and editing, S.T.; visualization, M.A.H.A.; supervision, R.A. and S.T.; project administration, R.A. and S.T.; funding acquisition, R.A. and S.T. All authors have read and agreed to the published version of the manuscript.

Institutional Review Board Statement

Informed Consent Statement, Data Availability Statement, Conflicts of Interest

The authors declare no conflict of interest.

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Romeo Vitelli Ph.D.

Body Language

How Universal Is Body Language?

New research suggests emotional body language may transcend culture.

Posted April 12, 2017


For all the importance we place on words, whether spoken or written, much of the communicating we do on a regular basis comes through body language.

According to pioneering research by Dr. Albert Mehrabian, only seven percent of the meaning we derive from human communication comes from the actual spoken words used. An additional 38 percent comes from tone of voice, while a whopping 55 percent comes from body language alone. Though these findings remain controversial, there is no disputing that facial expressions, physical gestures, body posturing, and even our patterns of breathing can provide an amazing array of information for other people to interpret.

Researchers have long identified that certain kinds of body movements and facial expressions can convey information about the emotions we happen to be experiencing at the time. Even when physical movements are broken down into point-light displays that convey minimal information about how we move, research subjects are still able to interpret emotional states based solely on body language.

But are these emotional signals shaped by different cultures or are they universal to all humans? A new research article published in the journal Emotion attempts to answer this question through an ambitious cross-cultural study. Thalia Wheatley of Dartmouth College and a team of co-researchers travelled to Ratanakiri, Cambodia to study members of a remote Kreung hill tribe. One of the indigenous groups living in Cambodia's highlands, the Kreung are still largely isolated from the outside world except for occasional visitors.

With the assistance of representatives from the Cambodian government and local authorities who acted as translators, the researchers collected a series of videos featuring a Kreung male who was asked to display different emotions (anger, disgust, fear, happiness, sadness). The participant used was an experienced performer of traditional dance and music in the Kreung community and had considerable experience in performing before an audience.

With each emotional display, the participant was presented with different scenarios and was asked to perform each scenario as if he were the character described. Scenarios included: "I am very mad that I lost the stuff in my home" (anger), "I want to vomit. This soup is spoiled" (disgust), "I am so scared. Why are there so many tigers in this forest?" (fear), "I am very happy to be sharing these stories with other people" (happiness), and "I feel so miserable when my child has gone far away" (sadness).

These videos were later used in a study involving twenty-eight Dartmouth students or employees (thirteen were female, and the average age was 21.9) who were asked to judge which emotions were being displayed. The videos were shown in a random order, with no sound or other verbal cues, and replayed on a continuous loop. In the first study, all participants were given a choice of five emotional labels to endorse and asked to view each video carefully before making a choice.

Results showed an eighty-five percent success rate which was far greater than what would be expected by chance alone. Of the emotions studied, participants were most accurate in rating fear followed by anger, disgust, and sadness. Happiness was the emotion least likely to be rated accurately though participants still scored better than chance.
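As a back-of-the-envelope check (our illustration, not an analysis from the paper; the trial count below is hypothetical), one can ask how likely an 85% hit rate would be if raters were simply guessing among the five labels, where chance accuracy is 20%:

```python
from math import comb

def binom_tail(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p), computed exactly."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

chance = 1 / 5       # five emotion labels -> 20% accuracy by guessing
n_trials = 100       # hypothetical number of ratings, for illustration only
k_correct = 85       # 85% observed accuracy

p_value = binom_tail(k_correct, n_trials, chance)
print(f"P(>= {k_correct}/{n_trials} correct by chance) = {p_value:.3e}")
```

Under these illustrative numbers the probability of guessing that well is vanishingly small, which is what "far greater than what would be expected by chance alone" means in statistical terms.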

In another study, a set of videos was prepared featuring an American woman displaying three positive emotions (happiness, love, pride) and three negative emotions (anger, fear, sadness) using body language alone. The effectiveness of these videos was tested using thirty-four participants recruited through Amazon's Mechanical Turk. To minimize the visual cues that would be received, the videos were converted into point-light displays featuring fourteen light points corresponding to the major joints in the body as well as the torso and head.

The videos were then presented to twenty-six Kreung individuals (eleven of whom were female). Since Kreung don't formally document age, there was no way to distinguish between adults and adolescents who participated. All participants were presented the videos and a translator helped explain the experiment and what they would be required to do. Instead of being given specific emotional labels such as in the first experiment, the Kreung participants were asked to describe the emotions being displayed in their own words.


Results showed that the Kreung participants tended to be quite accurate in guessing which emotions were being presented. The overall accuracy rate was sixty-two percent though their accuracy in detecting specific emotions such as anger and happiness was far higher (virtually everyone guessed anger correctly). They were also reasonably accurate in detecting sadness and, to a lesser extent, fear.

For emotions such as love and pride, however, the Kreung participants did much worse and often misidentified these videos as examples of happiness. Overall, there was no significant difference between Kreung and American raters in detecting emotions such as anger, happiness, sadness, or fear, though American participants did much better in detecting pride and love.

In a third study, sixteen Kreung participants were given only five words to choose from in identifying emotions (the Kreung words for: anger, disgust, fear, happiness, sadness). This was intended to make this study as similar to the first study as possible. As with the previous study, Kreung participants detected anger, disgust, and happiness at rates far above chance though their performance on sadness and fear was much lower.

So, what do these results suggest? While Kreung and American participants showed no significant difference in detecting emotions, there were still limitations with this kind of research considering differences in how the research was conducted. For example, Kreung were interviewed directly while American participants did their rating online and without any direct interaction with researchers.

Still, the results of these studies do seem to suggest that body movements can convey emotions such as anger, fear, sadness, and love even for individuals belonging to different cultures. By using remote tribal groups such as the Kreung who have yet to be assimilated as many other preliterate societies have been, Thalia Wheatley and her colleagues were able to show that emotional signals may well be universal since they reflect basic human needs and desires that all humans share.

As the world becomes more assimilated, studies such as this will likely become rarer with time. That may also mean that culture clashes will become more common, something we are already seeing firsthand in many countries. Learning more about how basic biology and social factors shape the way we communicate may well be vital in helping to understand ourselves better.

Parkinson, C., Walker, T. T., Memmi, S., & Wheatley, T. (2017). Emotions are understood from biological motion across remote cultures. Emotion, 17(3), 459-477. http://dx.doi.org/10.1037/emo0000194

Romeo Vitelli, Ph.D. is a psychologist in private practice in Toronto, Canada.


What Is Active Listening?


Tips for practicing this essential communication skill.

Active listening requires mastering many skills, including reading body language and tone of voice, maintaining your attention, and being aware of and controlling your emotional response. In this article, the author explains what active listening is and how to improve this essential communication skill.

Are you a good listener at work? You might think you are because you put away distractions, stay quiet, and nod your head when someone is talking to you. You might even repeat back your conversation partner’s main points to demonstrate that you’ve heard and absorbed them. These are all smart things to do, but they can still leave the speaker feeling unheard or even dismissed.

  • Amy Gallo is a contributing editor at Harvard Business Review, cohost of the Women at Work podcast, and the author of two books: Getting Along: How to Work with Anyone (Even Difficult People) and the HBR Guide to Dealing with Conflict. She writes and speaks about workplace dynamics. Watch her TEDx talk on conflict and follow her on LinkedIn.


The Role Of Body Language In Communication

Body language often plays a significant role in communication and can be as important as the words we say. It can involve eye contact, head movement, posture, gestures, and facial expressions, all of which can add meaning to our verbal communication. Non-human primates also frequently use body language to communicate. Today, body language may not always play a role in communication, as many of our interactions tend to happen online through text only. However, body language will likely continue to be a crucial element of communication as long as people continue to have face-to-face interactions. If you struggle to communicate effectively or have trouble understanding various body language cues, working with a therapist in person or online may be helpful.

What Is Body Language?

Body language generally refers to nonverbal communication through physical behaviors, which can include:

  • Facial expressions
  • Head movement
  • Eye contact

These can be universal to all humans, and people may perform them consciously or subconsciously to convey their thoughts and feelings. Experts say body language usually constitutes about half of what we are trying to communicate. 

For example, a person may not always need to verbally say "no" to communicate that something is wrong or that they disagree with what a person is saying. Instead, they can shake their head from side to side to share the same sentiment. Moreover, if a student slouches in their chair in class and doesn’t make eye contact with their teacher, this may signal that they are bored.

Body language can also enhance and complement our verbal communication skills. For instance, if someone in a store is asking for directions on where to find a product, and an employee merely says, "over there," this information may be too vague to be helpful to the customer.

At that point, the employee can be more specific with the location of the item by stating what aisle or department it is in. However, they may also gesture or point in the direction where the product is located. Even if the employee was not very specific and simply said "over there" while pointing, it would likely be more helpful than the original scenario with no body language.

Body language often plays a significant role in everyday interactions, which may be why it tends to be one of the most popular topics in communication studies. It is believed to have been of interest for thousands of years; even the Ancient Greeks interpreted the meanings behind human physical behavior. 

Body Language As A Form Of Unconscious Communication

The previous section discussed a couple of examples that show how movement can be used to enhance speech. However, body language psychology may also consider unconscious communication. Although these physical cues might be unintentional, they can still be interpreted by others.

Consider law enforcement as an example. A forensic psychologist or someone working with intelligence may be trained to notice brief micro-expressions, or quick, unconscious expressions of emotion that can appear on a person’s face.

People in charge of investigations may be interested in these nonverbal cues because they can indicate whether a person is lying or trying to conceal something from the interrogator. These cues can happen in a split second, but if an observer slows or freezes a video, they might witness an apparent expression change at that moment.

Some other everyday situations where unconscious body language can occur may be during periods of nervousness or attraction. Specific expressions can vary from person to person. For example, someone might cough when placed in a scenario that makes them nervous, whereas another might touch their face or scratch themselves as though they have an itch.

People may be unaware of their body language in these situations because these cues tend to be performed subconsciously. However, they can be observable to others, and people might notice patterns over time. This may be especially true for people who interact with each other regularly, such as parents and their children, for example. 

Since people close to one another usually know each other's baseline or default personality, they can spot when something is off by noticing changes in body language. For instance, if a child lies to their mother about where they are going, they might exhibit distinct body cues that are out of the ordinary, such as avoiding eye contact or speaking more rapidly.

Evolution And The Origins Of Body Language

By researching non-human primates, we may better understand how we used body language early in our evolution as a species. The use of body language generally predates any spoken or written language that humans have created. Since they do not have the same vocal anatomy and brain size as humans do to produce speech, non-human primates frequently use body language to communicate with each other.

It is also generally believed that genetic differences may be responsible for why we can speak while our closest living relatives, chimpanzees and bonobos, cannot. A variation of the FOXP2 gene is suggested to be why this is the case, and humans may have a unique mutation. This mutation likely occurred within the last four to six million years, since that is when the last common ancestor of the Homo and Pan lineages lived. The mutation is believed to have stuck around, rather than gradually being bred out, because increased communication abilities likely enhanced our chances of survival.

Although they may not speak as we can, non-human primates can provide insight into why body language developed in the first place. We can observe them and see how they use nonverbal communication with one another to fulfill their need to communicate.

Gestures have often been noted in monkeys and great apes to produce different signals, some of which humans also use. For example, a hard touch or brush of the hand can tell another individual to stop, whereas a soft one or a light pull can be more inviting. Some species, such as orangutans, also embrace one another.

Others have unique forms of body language to communicate. Male gorillas may attempt to show dominance by standing on two legs and beating their chests. Although chest-beating is exclusive to gorillas, humans also typically have ways to assert power and strength nonverbally, such as standing with our feet at a wider stance than usual. Some primates, such as chimpanzees and bonobos, may pout; however, instead of signaling sadness or disappointment, pouting usually means wanting something related to food or grooming.

In primates, gestures are often accompanied by facial expressions and eye contact. Baring teeth can be a universal sign of aggression among non-human primates. On the other hand, lip-smacking can be a friendly facial signal and may be a form of submission in some situations.

As our brains have grown and our facial structure has changed over time, humans have generally been able to utilize other types of body language in communication. While we may not show our teeth to express aggression, we frequently have other ways to convey the same message, such as scowling, glaring, or using unique gestures like the "middle finger" (which can tie in with language and culture).

The Importance Of Body Language In Modern Society

In today's digital age, many people rely on social media and text messaging to communicate with each other. Although virtual interaction may allow people to talk at their leisure and can minimize social pressure and anxiety for some, certain things can be lost in translation, so to speak. 

By being unable to see or hear the other person as you speak with them, you might miss critical nonverbal cues, as well as verbal ones, like vocal inflection. Online communication is generally becoming the primary modality for millions of people, and body language may continue to evolve to accommodate this shift.

Still, body language has likely been around for millions of years, and despite it being absent from certain situations, it can still be relevant. It may continue for the foreseeable future as long as people continue interacting face-to-face. Research has shown that body language can be vital for human cognitive functioning because it can enhance information transfer and lexical retrieval. 

For some, nonverbal communication may not come easily, and this difficulty may be exacerbated by the frequent use of technology, which may not allow for as many opportunities to learn and practice. If you struggle with communication, whether verbal or nonverbal, therapy can be helpful.

Benefits Of Online Therapy

Online therapy can be convenient if you struggle with communicating or need extra help and support with mental health-related concerns. You generally won't need to leave your house to work with a licensed therapist suited to your needs, and if you're worried about the ability to pick up on nonverbal cues like body language, video-chatting with your therapist may be an option, in addition to phone call or online chat sessions.

Effectiveness Of Online Therapy

A common reason for communication struggles can be social anxiety disorder. If you experience symptoms of social anxiety, it can be challenging to fully engage in conversation and pick up on body language cues. A 2022 study indicated that online therapy could be effective in treating social anxiety disorder . However, if communication difficulties stem from another cause, it may be helpful to know that online therapy is generally as effective as in-person therapy for a variety of mental health-related concerns, according to a growing body of evidence. 

Please continue reading for reviews of some of our therapists from people experiencing similar challenges.

Therapist Reviews

"I have been working with Heather for several months. She handles difficult conversations delicately but says what needs to be said. She is timely and thinks through her responses when we communicate via text. Occasionally, when I have a difficult question with multiple parts, she acknowledges that she saw my message and assures me she wants some time to be sure she gives me a thoughtful response and not just type back to be speedy and off-the-cuff. These responses are always well-phrased and include examples she knows I can relate to. Her follow-up of these difficult questions during our phone sessions is consistent, and she checks if anything needs clarification."


"So far, Meashline has been a true gift. I've made a lot of progress with my anxiety, handling life challenges, understanding myself and what I want, and how to communicate with people who can't communicate. The list goes on. She guides me through every struggle and helps me develop effective tools I'll use. I could write for days about how helpful she's been. 10/10 recommend."


What is the 7%-38%-55% rule?

Generally speaking, body language plays a large role in our ability to communicate as humans. Understanding how to read body language can give someone a deeper connection and understanding of what is truly being said and felt by someone else. 

The 7%-38%-55% rule suggests that a mere 7% of communication is done verbally. It then hypothesizes that 38% of communication comes across in our tone and voice inflection, leaving 55% of the communication to come from someone’s body movement and language. 

Whether these exact percentages are true or not, it does show us just how much of a role body language, hand gestures, and facial expressions play in communication — possibly showing our unspoken emotions. 

How much does body language contribute to communication?

Our body movements and hand gestures can convey emotions that we may not even be consciously aware of. Even if we only use subtle movements, someone who is using active listening skills can understand these additions to our verbal message. Seeking out body language tips, as well as signs of positive body language and negative body language can help us to use these skills more effectively socially. 

What are the 4 types of body language?

Generally speaking, people recognize four main types of body language. These can include soft and fluid, precise and bold, dynamic and determined, and light and bouncy movements. Each of these types can convey understanding and support our speech in a visual sense. 

What are the 3 V's of communication?

Many recognize that the three V’s of communication are visual, vocal, and verbal, which can be expressed through positive body language, vocal inflection, and word choice. For example, maintaining open posture and open body language as you welcome a new friend to a group can send the message that you’re genuinely a warm, safe person to be around. Alternatively, an open posture combined with steady eye contact can generate tension if you’re angry, signaling that you’re ready for conflict.

What is the most effective body language used in speaking to someone face-to-face?

Many sources find that the most effective body language type for face-to-face communication is simply the management of your facial expression. A nice smile can be a great way to facilitate connection and conversation, for example. 

What are some examples of bad body language?

“Bad body language” is entirely subjective, and can be formed by a person’s unique experiences. However, common examples of body language that people may perceive negatively can include: 

  • Shifting one’s weight from side to side 
  • Tensing your cheek muscles 


The Critical Need to Read Body Language in Qualitative Research


Communication contradictions: whether we realize it or not, we’ve all experienced them in some form or another. What do I mean? A customer expresses delirious interest in your proposal, but you never hear from them again. A job applicant confidently expresses their ability to fulfill a role, but two weeks into the job it’s clear they are unqualified, perhaps even toxic. A research participant has a positive verbal reaction to a new product concept, but their body language is not quite so convincing.

What Interview Were You Watching?

Not long ago, I interviewed a respondent to whom I showed a rough concept for a new product. As soon as I presented the concept, I noticed her body language was screaming that the concept did not resonate. There were several tells:

  • Her arms were folded. While this is often a misunderstood body language position, in this case her arms were folded and stiff, and her fingers were pressing into her biceps, indicating stress.
  • She was leaning back. Instinctively, we lean away from things we dislike and lean into things that we prefer.
  • Her feet were pointing toward the door. The feet are the most honest part of the body, as they are always pointed where the body wants to go (Navarro 2008, 76). This is why we recommend focus group facilities (or negotiators) use glass tables so the lower extremities can be viewed. We also recommend swivel chairs to help accentuate body movement.
  • One corner of her mouth quickly moved up, ever so slightly. This is a clear sign of contempt, which can indicate either dislike or a feeling of superiority. In this case, an element of the concept rubbed her the wrong way.
  • A wayward index finger began to tap to an unknown beat. Our digits have a language all their own. This was a clear indication that she was frustrated and ready to move on.
  • She began to pick lint off her blazer, but there was no lint. This is a form of contempt; she was essentially saying that she would rather pick imaginary lint off her jacket than tell me she disliked the concept.

“So,” I asked her, “tell me, what do you think?” “Oh, I like it,” she replied. “I’d buy it. Certainly.”

But that’s not what her body said. The unspoken signals I picked up suggested her reaction was not quite as rosy as her verbal feedback suggested. Eventually, I was able to determine what it was that bothered her about the concept, and it turned out to be valuable feedback.

After thanking her for her opinions, I hustled back to the observation room where the clients were huddled in the dark with glowing laptop screens. “She loved it!” one client exclaimed, high-fiving the other. “Killed it!” yelled the other.

I looked at them the way a dog sometimes does when it is trying to understand what its master just said: head cocked, one ear up, and one ear down. I thought to myself, “What interview were you watching?” That was when I advised my client to watch what participants do and not only listen to what they say. The concept just did not resonate with this respondent, and there were good reasons for that.

Lying: Ubiquitous but Not Always Sinister

Why would a person say one thing when their body language suggests something else? It is estimated that we hear as many as several hundred lies per day. Paul Ekman writes in his seminal book on lying, Telling Lies: Clues to Deceit in the Marketplace, Politics, and Marriage (Revised Edition), “Lies occur between friends (even your best friend won’t tell you), teacher and student, doctor and patient, husband and wife, witness and jury, lawyer and client, salesperson and customer.”

Let’s get this straight. There is no Pinocchio effect! (Navarro 2008, 230). The foremost global experts on lying all agree that there is no single sign that someone is fibbing (cf. DePaulo, Ekman, Ford, Frank, Friesen, Hartwig, Hwang, Levine, Matsumoto, Navarro, Skinner). Anyone who tells you otherwise is likely lying, and because liars lie about lying (John 8:44 NASB), it’s hard to know what to believe.


Moreover, the major worldviews—Islam, Judaism, Eastern Orthodox Christianity, Western Christianity—all denounce lying, although the Jewish tradition has exceptions for keeping the peace, protection against theft and harm, or for the sake of decency and/or humility (Friedman, Weisel 2003, 8).

Lying is not only an abomination and hard to detect, but it is ubiquitous, too (Navarro 2008, 208).

But how does lying really work? A closer look yields that lying is used for social survival (Navarro 2008, 208), and it’s used quite often. “She’s not home” (sure she is, she’s sitting right next to you). “I already donated” (no, you didn’t). “I’m not available” (you could go, but you don’t really want to). “I love it” (it’s the same tie you gifted me last year). That’s right, to avoid rocking the social boat unnecessarily, we lie.

But it gets worse. Not only are lies shunned, omnipresent, and elusive; numerous studies also show that we humans are very poor lie detectors. Enter Timothy R. Levine’s Duped and the Truth Default Theory (TDT). We humans are “hardwired” to give the benefit of the doubt (Levine 2020, 14). The TDT posits that most of the time, most people tell the truth unless there is some detrimental consequence to being honest; then people will lie (Levine 2020, 246–247). Furthermore, most lies are not caught in real time (Levine 2020, 244).

According to Levine, truth engenders trust, and trust greases the wheels of social harmony. Thus, we lean toward belief for the sake of social cohesion. But what about that swirl of deception around us?

The TDT accounts for that, stating that the most destructive lies are told by a few prolific liars with catastrophic consequences (Levine 2020, 247). Think Bernie Madoff—the convicted Ponzi-scheme mastermind. His lies were not the “I-gave-at-the-office” type. They were sociopathic lies, devastating to those who believed them.

So, you think to yourself, okay . . . lying is shunned, hard to catch in real time, exists everywhere, and is woven into the social fabric. Why bother trying to catch lies at all?

Rather, one should focus on whether someone is comfortable or uncomfortable with what you are saying or with what is coming out of their own mouth. Why? Because the same behaviors and gestures that make us look guilty are also the ones we exhibit when we are merely uncomfortable.

Want the Truth? Just BLINK

Here’s another research study in which we employed a technique designed to surface emotions: BLINK, short for Body Language Intuition Numinology Know-How.

We emote faster than we think, which is exactly why the BLINK technique is so powerful. The basic premise of this technique is that instead of asking someone a question and waiting for their reply, you posit a supposition and watch the reaction on their face. That is, did they show signs of disgust, contempt, happiness, fear, sadness, anger, or surprise—or no emotion at all?

This technique takes advantage of the fact that we think more slowly than we emote. Our emotions are governed by the limbic system, the part of the brain often loosely called the “reptilian brain.” Ask a person a question, and you must wait for them to formulate an answer. Use the BLINK technique, and you watch the answer flash across their face without them even knowing it.

Using the BLINK Technique in Real-Life Settings

At the Merrill Institute, I used the BLINK technique in a qualitative channel study among Value Added Resellers (VARs), and the technique yielded jaw-dropping results. Back to that in a minute. First, let’s understand the target audience in this study. These VARs resold computer software, hardware, and networking products to small- and medium-sized companies, providing value beyond order fulfillment.

I talked with VARs in New York City during a series of hour-long in-person interviews. The objective of the study was to understand what VARs would need in order to sell the client’s devices. What kind of support would they require? Better pricing and margins? Lead generation support? Market development funds? What would be the ideal channel plan? After asking a variety of questions regarding channel needs, we turned our attention to the major brands in the space.

“So,” I asked one of the VARs, “would it matter to you who the manufacturer is behind this channel plan?”

“Yes,” he replied, and before he could say anything else, I interjected the following presupposition—“Suppose I told you that the manufacturer behind this plan is Brand X.” In an instant the VAR flashed a half smile of contempt, a quick sign of fear, and then a sure sign of disgust, but then he said, “Brand X would be fine.” I could sense something was amiss.

“You know,” I said to him, “I’m sensing that there’s more to the story about Brand X.”

He looked at me and said, “Brand X is fine from the perspective of customer awareness. Everyone knows Brand X, but I’ll never do business with them again after they went around my back directly to my clients.”

We later learned that this VAR had registered a sizeable deal (over 50 devices) with Brand X, only to find out later that Brand X had tried to cut him out by going directly to the VAR’s customer. The lesson? Deal registration is sacred, and channel conflict is unforgivable! We might have missed that had we not employed the BLINK technique.

This is just one of the many instances where the BLINK technique can be used. Another example: an HR executive could quickly assess whether a job candidate is less than forthcoming about their previous work experiences just by looking at their reactions.

Both experienced and new moderators, along with client observers, often miss the subtle (and sometimes not-so-subtle) signs of what a person is really saying, paying too much attention to the spoken word and not enough to nonverbal cues. We founded the Merrill Institute for Body Language Training for this reason: so that everyone involved in qualitative research can become a body language expert.


  • Ekman, Paul. Telling Lies: Clues to Deceit in the Marketplace, Politics, and Marriage. Rev. ed. New York: W. W. Norton & Co., 2009.
  • John 8:44 (NASB): “You are of your father the devil, and you want to do the desires of your father. He was a murderer from the beginning, and does not stand in the truth, because there is no truth in him. Whenever he speaks a lie, he speaks from his own nature; for he is a liar, and the father of lies.”
  • Friedman and Weisel. “Should Moral Individuals Ever Lie? Insights from Jewish Law.” 2003, p. 8.
  • Hartwig, Maria. “Telling Lies: Fact, Fiction, and Nonsense: Should You Believe Paul Ekman, the World’s Most Famous Deception Researcher?” Psychology Today, 2014.
  • Navarro, Joe, and Marvin Karlins. What Every BODY Is Saying: An Ex-FBI Agent’s Guide to Speed-Reading People. New York: Harper Collins, 2008.
  • Levine, Timothy R. Duped: Truth-Default Theory and the Social Science of Lying and Deception. Tuscaloosa: The University of Alabama Press, 2020.
  • Matsumoto, David, Hyi Sung Hwang, Lisa Skinner, and Mark G. Frank. “Evaluating Truthfulness and Detecting Deception” and “New Tools to Aid Investigators.” FBI Law Enforcement Bulletin, U.S. Department of Justice, Federal Bureau of Investigation.
Keywords: BLINK technique, body language, facial expressions, moderator training, nonverbal communication, qualitative research, Truth Default Theory


© 2022 Qualitative Research Consultants Association. All rights reserved.

  • Open access
  • Published: 20 December 2023

Pathways from media attention and peer communication to body dissatisfaction: the moderating role of protective filtering

Jing Ji, Xiaoli Xiang, Ren Chen, Zenghong Chen & Jing Yan
BMC Psychology, volume 11, Article number: 447 (2023)


Negative body image is a common psychological phenomenon among young Chinese women and merits further investigation. Peers and the media are important factors associated with body image. This study explored how media and peers promote body dissatisfaction among young Chinese women, examining the mediating role of body surveillance and the moderating role of protective filtering.

A total of 3,499 women from the general community in China, aged 18–40 years (M = 23.44 years, SD = 1.18 years), completed the Sociocultural Attitudes Towards Appearance Scale-3, the Objectified Body Consciousness Scale, and a protective filtering scale. The data were analyzed using a moderated mediation model in SPSS with the PROCESS 4.0 macro.

The results indicated that body surveillance mediated the relationship between the internalization of media information and body dissatisfaction, as well as between peer comparison and body dissatisfaction. Moreover, protective filtering moderated both the path from media attention to the internalization of media information and the path from peer communication to peer comparison.

Our results contribute to the understanding of the sociocultural mechanisms underlying young women’s negative body image. Investigating the moderating effect of protective filtering can also help guide future interventions to promote positive body image in women.



Body dissatisfaction is a negative body image evaluation characterized by negative self-perceptions of one’s body, including body size, weight, and attractiveness [ 1 ]. Social media is full of idealized and sexualized portrayals of women’s bodies. Compared to male groups of all ages, young adult women pay more attention to ideal-beauty photos, videos, and messages on social media, and body dissatisfaction is a pervasive problem among them [ 2 , 3 ]. People’s attitudes toward their own bodies affect not only their cognition but also their behavior. Body dissatisfaction can lead to a range of harms such as psychological distress, low self-esteem, and eating disorders [ 4 , 5 ]. Most studies have focused on the negative consequences of body dissatisfaction rather than its antecedents [ 6 , 7 ]. In addition, few studies have examined, from an information dissemination perspective, how certain protective factors moderate negative body image.

The tripartite influence model, rooted in sociocultural theory, is an important framework for interpreting female body image; it proposes that sociocultural factors affect individuals’ satisfaction with their physical appearance [ 8 ]. The model has been widely used to investigate individuals’ body image and the distress caused by body dissatisfaction. A substantial body of research has shown that the model’s three main factors (social media, peers, and parents) affect women’s negative body image through two mechanisms: internalization of the beauty ideal and body comparison processes [ 3 ].

However, the tripartite influence model does not elaborate on how internalization and comparison processes affect body dissatisfaction. Self-objectification theory holds that when women are constantly exposed to ideal-beauty messages on social media and to negative comments about their bodies from peers, a process of self-objectification is initiated: they come to examine their bodies from an observer’s perspective. Such continuous monitoring of the body eventually leads to body dissatisfaction and shame [ 9 ]. Meanwhile, peer influence, especially peer conversation, stimulates appearance comparison among peers, which in turn affects body dissatisfaction [ 10 , 11 ].

In addition, recent research proposes that protective filtering, whereby women process information in a self-protective manner, internalizing most positive body image messages while rejecting and reconstructing most negative ones, is a protective factor against body dissatisfaction [ 4 ] that protects and promotes body satisfaction among women. The term “protective filtering” was first coined by Wood-Barcalow et al. [ 11 ] based on their qualitative investigation of women with a positive body image, and is defined as “accepting information that is consistent with positive body image while rejecting messages that could endanger it.” In other words, it is an information processing strategy that supports the construction of a positive body image and helps women defend themselves against negative external messages.

Although these theories and findings were initially developed in a Western sociocultural context, they have also proved applicable in the Chinese cultural context [ 12 ]. In China, a collectivist spirit is rooted in the social environment, and individuals tend to feel more social pressure to attend to others’ judgments and opinions [ 13 , 14 ]. Chinese young adult women may therefore be especially susceptible to the influence of mass media and interpersonal relationships. For social media regulators, parents, and young Chinese women themselves, it is essential to understand the components of women’s body dissatisfaction. Drawing on the tripartite influence model of sociocultural theory, objectification theory, and previous literature on protective filtering, this study developed a hypothesized framework to better understand how media attention to appearance-related messages, peer conversation, and protective filtering influence body dissatisfaction. Particular attention was given to the role of protective filtering.

Theoretical framework and hypothesis development

The tripartite influence model.

The tripartite influence model is commonly used to explain how body dissatisfaction forms under the influence of photo-based social media platforms and peer interactions [ 15 ]. Its main point is that three factors affect body image (media, parents, and peers) through two primary mechanisms (internalization of the beauty ideal and appearance comparison). Internalization, one of the model’s central constructs, means that individuals absorb sociocultural values and social standards and adopt them as their own norms of behavior [ 16 , 17 ]. The media, especially social media, is a powerful channel for transmitting sociocultural beauty standards and expectations, such as ideal size, weight, and fashion. With its rapid development, the number of young adult women attending to information that promotes the “ideal body image” is rapidly increasing. Media attention refers to exposure to or use of certain media types, most commonly television, newspapers, the Internet, or social media. Slater et al. [ 18 ] described it as “people’s tendency to consciously devote cognitive effort to particular types of media messages.” In the field of body image, media attention manifests behaviorally as spending excessive time and energy on body-related social media and actively following and searching for “ideal beauty” information online [ 17 , 19 ]. Accumulating evidence suggests that excessive attention to the idealized societal standards of beauty created by social media may affect how individuals process body-related information, making it easier for them to form negative body images [ 20 , 21 ].

Meanwhile, a person’s values and self-image are internalized through the subtle influence of parents and significant others [ 22 ]. Research has shown that women’s attitudes toward their bodies are influenced by memories of emotional indifference from their parents [ 23 ]. As women reach adulthood, however, they have less contact with their parents and more with their peers, whose influence gradually strengthens. Peer communication refers to talking with peers about body image, including appearance, image, and attractiveness [ 24 ]. These appearance-related conversations create an environment in which image concerns are focused upon, interpreted, and subsequently come to be valued. Peer conversations about appearance are common in the daily lives of young adult women, indicating that individuals attend to body image during interpersonal communication. Several studies have found that when women talk more frequently about their appearance with friends, their sense of body comparison also increases [ 25 , 26 ]. Furthermore, when discussing their bodies with friends, women tend to evaluate their appearance by comparing themselves with others, which is associated with more severe negative body image and disordered eating [ 27 ]. Based on these viewpoints, we hypothesize:

H1: Media attention is positively associated with the internalization of media information.

H2: Peer communication is positively correlated with peer comparison.

The mediating role of body surveillance

Objectification theory indicates that self-objectification is the process by which women treat themselves as objects to be evaluated and internalize others’ evaluations of their bodies [ 9 ]; it has been shown to be associated with increased anxiety and body dissatisfaction [ 28 , 29 ]. Body surveillance is a behavioral manifestation of self-objectification and is commonly used to reflect its level [ 30 , 31 ]. Women with higher levels of self-objectification spend more time monitoring their bodies (i.e., body surveillance) to ensure that they conform to social beauty ideals [ 29 ]. The extent of body surveillance has been shown to correlate positively with body dissatisfaction [ 32 ].

In addition, objectification experiences include not only women’s internalization of sexualized information conveyed by social media but also interactions and commentary with peers about their own and others’ appearance. On the one hand, social media often depicts women’s bodies in a sexualized way, which tends to standardize aesthetic norms for women’s appearance [ 33 ]. Previous studies have shown that once young adult women internalize these standards, body surveillance may be triggered to monitor how their bodies are being evaluated by others, which then leads to negative psychological outcomes or perceived flaws in appearance, producing body dissatisfaction [ 34 ]. On the other hand, peer interaction and commentary about appearance reinforce body image comparisons between young adult women and their peers, which subsequently trigger body surveillance. Wang et al. [ 33 ] found that body surveillance mediated the relationship between appearance-related comparison and body shame. Based on the above, we posit the following hypotheses:

H3a: Body surveillance mediates the relationship between internalization of media information and body dissatisfaction.

H3b: Body surveillance mediates the relationship between peer comparison and body dissatisfaction.

The moderating role of protective filtering

Protective filtering was first proposed, in a qualitative study, as an intervention strategy incorporating multiple features of positive body image; it is thought to help women direct body investment toward self-care and functionality and to preserve positive body evaluation [ 11 ]. When exposed to appearance-related information in social media and the immediate social environment, protective filtering means accepting positive information that benefits women’s body appreciation while rejecting or reconstructing negative information [ 35 ]. In other words, selectively filtering in positive messages and counteracting negative ones helps women promote and maintain positive body evaluation [ 36 , 37 ]. For example, Andrew et al. [ 38 ] found that women who used a protective filtering strategy did not, after viewing images of slim women on social media, compare themselves with the beauty ideal or experience a negative change in body satisfaction.

Furthermore, in prior body image studies [ 35 , 39 ], protective filtering proved to be an effective intervention for lowering women’s internalization of the beauty ideal and body-related distress. Based on these studies, it can be supposed that protective filtering moderates the effects of external information (from social media and peer communication) on internal cognition (such as internalization and peer comparison). Accordingly, we hypothesize:

H4a: Protective filtering moderates the effects of media attention on the internalization of media information.

H4b: Protective filtering moderates the effects of peer communication on peer comparison.

Based on the aforementioned hypotheses, the research framework is depicted in Fig.  1 .

Figure 1. Hypothesized model


Participants were recruited through Sojump, a questionnaire platform with more than 260 million registered users in mainland China that provides functions equivalent to Qualtrics [ 37 ] and is widely used by Chinese researchers for online surveys [ 40 ]. Because this study included only women, we randomly sent the survey link and a brief introduction to female registered users of Sojump via email. Informed consent was obtained through a consent page at the start of the online questionnaire, and participants were assured that the questionnaire contained no identifying information. After consenting, participants completed the questionnaire in a self-administered manner. Data collection consisted of two parts: a demographic survey and a series of body image questionnaires. In the demographic section, participants indicated their age, marital status, and educational background. After completing the survey, each participant received an online cash reward of CNY 5 (about USD 0.7) for their participation.

The online survey ran from March 4 to May 15, 2022. In total, 4,057 questionnaires were completed. Of these, 558 were declared invalid based on three criteria: missing data, uniform responses to all questions, and completion time (less than five minutes was deemed invalid). This left 3,499 valid questionnaires, a valid response rate of 86.2%. All participants were from the general community in China, aged 18–40 years (M = 23.44 years, SD = 1.18 years), and the great majority (62.2%) held a college degree or higher.
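The three exclusion criteria can be expressed as simple row filters. Below is a hypothetical illustration in Python with pandas; the column names (`q1`…`q5`, `minutes`) and values are invented for the sketch, not the study’s actual data:

```python
import pandas as pd

# Hypothetical raw responses: five Likert items plus completion time in minutes.
raw = pd.DataFrame({
    "q1": [3, 4, 2, 5, 3],
    "q2": [3, 4, 2, 1, 3],
    "q3": [3, 2, 2, 4, 3],
    "q4": [3, 5, None, 2, 3],
    "q5": [3, 1, 2, 3, 3],
    "minutes": [12.0, 8.5, 9.0, 3.0, 15.0],
})
items = ["q1", "q2", "q3", "q4", "q5"]

complete = raw[items].notna().all(axis=1)   # criterion 1: no missing data
varied = raw[items].nunique(axis=1) > 1     # criterion 2: not all-identical answers
unhurried = raw["minutes"] >= 5             # criterion 3: at least five minutes
valid = raw[complete & varied & unhurried]  # rows passing all three screens
```

Each mask corresponds to one of the paper’s stated criteria; combining them with `&` keeps only rows that pass all three.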

Outcome measures

Beauty-ideal media attention and internalization of the beauty-ideal scale.

We used the Sociocultural Attitudes Towards Appearance Questionnaire-3 (SATAQ-3) [ 41 ] to assess media attention and internalization of the beauty ideal. The scale contains two subscales: a Media Attention Subscale (six items, e.g., “Social media is an important source for me to get information on fashion, beauty, weight loss, and more”) and an Internalization Subscale (nine items, e.g., “When I see photos of other people on social media, I compare my appearance or body to them”). Responses were given on a 5-point Likert scale from completely disagree (1) to completely agree (5); higher scores reflect higher levels of media attention and information internalization. Cronbach’s α was 0.89.

Peer communication and peer comparison scale

The frequency of peer communication and peer comparison was assessed with the Peer Influence Scale drawn from the SATAQ-3 [ 41 ], which contains six items divided into two subscales: a Peer Communication Subscale (three items, e.g., “I often talk to my friends about physical appearance”) and a Peer Comparison Subscale (three items, e.g., “When I see pictures posted by my friends, I compare myself to them”). Responses were given on a 7-point Likert scale from completely disagree (1) to completely agree (7); higher scores reflect more peer communication and comparison. Cronbach’s α was 0.88.

Objectification body awareness scale-body surveillance subscale

The frequency of body surveillance was assessed with the Body Surveillance Subscale of the Objectified Body Consciousness Scale (OBCS) [ 42 ], which contains eight items (e.g., “When I look in the mirror before I go out, I am often dissatisfied with how I look”) rated on a 5-point Likert scale (1 = completely disagree, 5 = completely agree); higher scores reflect more body surveillance. Cronbach’s α was 0.86.

Protective filtering scale

Items measuring protective filtering were newly developed. Although protective filtering is an important construct for examining how individuals process and respond to appearance-related information, body image research lacks a validated quantitative protective filtering scale. The present study therefore measured protective filtering with five items modified from the work of Ornella et al. [ 4 , 43 ] and Tylka and Wood-Barcalow [ 11 ]. The first three items (e.g., “I accept information from social media that encourages women to be themselves and form their own body image,” “I often actively block messages, photos, and videos that make me anxious about my appearance,” and “I try to relate the ideas in body-related information or comments to my health”) were drawn from Ornella et al. [ 4 , 43 ] and measured information permitting. The last two items (“I don’t pay too much attention to information about ideal beauty on the Internet” and “I ignore negative body-related information”) were drawn from Tylka and Wood-Barcalow [ 11 ] and measured information forefending, or blocking out. Responses were given on a 7-point Likert scale from completely disagree (1) to completely agree (7). The mean item score served as the total protective filtering score; higher scores reflect more protective filtering. Cronbach’s α was 0.86.

Body dissatisfaction scale

The extent of body dissatisfaction was assessed with the Body Dissatisfaction Scale of the SATAQ-3 [ 41 ]. The scale consists of eight items (e.g., “I think my natural, authentic looks and body shape are also good”) rated on a 5-point Likert scale (1 = completely disagree, 5 = completely agree). The mean item score served as the total body dissatisfaction score, with higher scores indicating greater dissatisfaction with body image. Cronbach’s α was 0.87.
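Each scale’s internal consistency is reported as Cronbach’s α, which can be computed directly from an (n respondents × k items) score matrix. A minimal sketch, using simulated Likert responses rather than the study’s data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Simulated 5-point Likert responses for an 8-item scale: one latent factor
# plus item noise, rounded and clipped to the 1-5 range (illustrative only)
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
noise = rng.normal(scale=0.8, size=(200, 8))
scores = np.clip(np.round(3 + latent + noise), 1, 5)
alpha = cronbach_alpha(scores)
```

Because all eight simulated items load on the same latent factor, α comes out high, mirroring the 0.86–0.89 values reported for the study’s scales.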

Data analysis

All analyses were conducted with SPSS 26.0. Regression analysis was used to examine the effect of social media attention on the internalization of media information and the effect of peer communication on peer comparison. Hierarchical regression was used to test the moderating effects of protective filtering on the relationship between social media attention and media information internalization, as well as between peer communication and peer comparison. The PROCESS macro (Model 4) in SPSS was used to test the mediating effect of body surveillance between the internalization of media information and body dissatisfaction, and between peer comparison and body dissatisfaction.
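Conceptually, PROCESS Model 4 estimates the indirect effect as the product of the X→M path (a) and the M→Y path controlling for X (b), with a percentile bootstrap confidence interval. The sketch below reproduces that logic in NumPy on simulated data; the path coefficients (0.7, 0.5, 0.3) are arbitrary illustrations, not the study’s estimates:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Simulated standardized variables with a hypothetical X -> M -> Y structure
x = rng.normal(size=n)                                   # internalization (X)
m = 0.7 * x + rng.normal(scale=0.7, size=n)              # body surveillance (M)
y = 0.5 * m + 0.3 * x + rng.normal(scale=0.7, size=n)    # body dissatisfaction (Y)

def indirect_effect(x, m, y):
    # a path: slope of M on X
    a = np.polyfit(x, m, 1)[0]
    # b path: slope of Y on M, controlling for X
    design = np.column_stack([np.ones_like(x), x, m])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return a * coef[2]  # indirect effect a * b

# Percentile bootstrap CI for a*b, resampling respondents with replacement
boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)
    boot.append(indirect_effect(x[idx], m[idx], y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
```

If the bootstrap interval `[lo, hi]` excludes zero, the indirect effect is judged significant, which is the decision rule applied in the Results section.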

Preliminary analyses

Pearson’s correlations were computed (Table  1 ). The results show that media attention correlates positively with the internalization of media information, and that peer communication correlates positively with peer comparison. Because some correlations between constructs exceeded the benchmark of 0.6, a multicollinearity test was needed. The highest VIF was 3.18, indicating that multicollinearity is not a significant problem in our dataset [ 44 ].
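A variance inflation factor is obtained by regressing each predictor on the remaining predictors and computing VIF_j = 1 / (1 − R²_j). A self-contained sketch on simulated predictors (not the study’s variables):

```python
import numpy as np

def vif(X):
    """Variance inflation factor for each column of predictor matrix X."""
    X = np.asarray(X, dtype=float)
    n, k = X.shape
    out = np.empty(k)
    for j in range(k):
        target = X[:, j]
        # Regress column j on all remaining columns (with an intercept)
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        coef, *_ = np.linalg.lstsq(others, target, rcond=None)
        resid = target - others @ coef
        r_squared = 1.0 - resid.var() / target.var()
        out[j] = 1.0 / (1.0 - r_squared)
    return out

# Simulated predictors: the first two are correlated, the third is independent
rng = np.random.default_rng(1)
a = rng.normal(size=300)
b = 0.8 * a + rng.normal(scale=0.6, size=300)
c = rng.normal(size=300)
vifs = vif(np.column_stack([a, b, c]))
```

The correlated pair produces inflated VIFs while the independent predictor stays near 1; values well below the common cutoff of 5 or 10, like the study’s 3.18, are taken as unproblematic.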

Main analyses

Test for mediation effect.

Hypothesis 3a proposes that body surveillance mediates the link between the internalization of media information and body dissatisfaction. The PROCESS macro (Model 4) in SPSS was used to test this hypothesis. As shown in Table  2 , internalization of media information positively predicted body dissatisfaction, β  = 0.87, p < 0.001 (Model 1). Internalization of media information was positively linked with body surveillance, β  = 0.73, p < 0.001 (Model 2), and body surveillance was positively correlated with body dissatisfaction, β  = 0.75, p < 0.001 (Model 3).

The indirect effect of internalization of media information on body dissatisfaction via body surveillance was 0.85 (SE = 0.02, 95% CI = [0.40, 0.45]); the CI did not include zero. Consistent with our hypothesis, internalization of media information had a significant indirect effect on body dissatisfaction via body surveillance, indicating that body surveillance plays a mediating role between the internalization of media information and body dissatisfaction. Hypothesis 3a was therefore supported.

Hypothesis 3b proposed that body surveillance mediates the relationship between peer comparison and body dissatisfaction. As shown in Table  3 , peer comparison was positively correlated with body dissatisfaction, β  = 0.34, p < 0.001 (Model 1), and with body surveillance, β  = 0.77, p < 0.001 (Model 2); body surveillance was positively linked to body dissatisfaction, β  = 0.53, p < 0.001 (Model 3).

The indirect effect of peer comparison on body dissatisfaction through body surveillance was 0.54 (SE = 0.03, 95% CI = [0.53, 0.58]). The results indicate that peer comparison exerted a significant indirect effect on body dissatisfaction through body surveillance; that is, body surveillance mediated the relationship between peer comparison and body dissatisfaction. Hypothesis 3b was therefore supported.

Testing for Moderation Effect

In Hypotheses 4a and 4b, we assumed that protective filtering would moderate the relationship between media attention and the internalization of media information, and the relationship between peer communication and peer comparison. Hierarchical regression was used to test the first of these. First, media attention and protective filtering were entered; then, the interaction term (media attention × protective filtering) was entered. The results demonstrated significant main effects of media attention and protective filtering on the internalization of media information. The interaction term was also significant, indicating a significant moderating effect of protective filtering on the relationship between media attention and internalization of media information (Table  4 ). A simple slope test was then conducted for women with low (− 1 SD) and high (+ 1 SD) levels of protective filtering (Fig.  2 ). The link between media attention and internalization of media information was significant among women with a low level of protective filtering (β = 0.48, p < 0.001) and nonsignificant among women with a high level of protective filtering (β = 0.07, p > 0.05). These results support the hypothesis that protective filtering buffers the positive relationship between media attention and women’s internalization of media information.

Figure 2. The interaction between media attention and protective filtering on internalization of media information.
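The hierarchical test above boils down to adding a mean-centered product term to an OLS model and inspecting its coefficient. A minimal sketch on simulated data follows; the variable names and coefficients are illustrative assumptions, not the study's dataset or estimates.

```python
# Sketch: moderated regression with a mean-centered interaction term
# (media attention x protective filtering). Simulated, illustrative data.
import numpy as np

rng = np.random.default_rng(1)
n = 800
attention = rng.normal(size=n)
filtering = rng.normal(size=n)
# Internalization rises with attention, but less so when filtering is high:
internalization = (0.3 * attention - 0.1 * filtering
                   - 0.2 * attention * filtering + rng.normal(size=n))

# Center predictors before forming the product term (standard practice
# in moderated regression, reducing collinearity with the main effects).
a_c = attention - attention.mean()
f_c = filtering - filtering.mean()
X = np.column_stack([np.ones(n), a_c, f_c, a_c * f_c])
beta, *_ = np.linalg.lstsq(X, internalization, rcond=None)
b0, b_att, b_filt, b_inter = beta
# A negative interaction coefficient (b_inter) means the attention ->
# internalization slope weakens as protective filtering increases.
```

A significance test for `b_inter` (e.g. its t-statistic from a full regression package) then corresponds to the "interaction term was also significant" step reported above.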

To examine whether protective filtering moderates the relationship between peer communication and peer comparison, hierarchical regression was again used. First, peer communication and protective filtering were entered; then, the interaction term (peer communication × protective filtering) was entered. The results demonstrated significant main effects of peer communication and protective filtering on peer comparison. The interaction term was also significant, indicating that protective filtering moderated the relationship between peer communication and peer comparison (Table 5). A simple slope test was then conducted among females with low (−1 SD) and high (+1 SD) levels of protective filtering (Fig. 3). The relationship between peer communication and peer comparison was significant among females with a low level of protective filtering (β = 0.64, p < 0.001) and non-significant among females with a high level of protective filtering (β = 0.15, p > 0.05). These results support the hypothesis that protective filtering buffers the positive relationship between peer communication and peer comparison.

Figure 3. The interaction between peer communication and protective filtering on peer comparison.
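The simple slope probe used in both moderation tests reduces to evaluating the conditional slope b1 + b3·m of the fitted model y = b0 + b1·x + b2·m + b3·x·m at chosen moderator values (here ±1 SD). The coefficients below are assumed values for illustration, not the study's estimates.

```python
# Sketch: simple slopes at -1 SD and +1 SD of the moderator, from a
# moderated-regression fit y = b0 + b1*x + b2*m + b3*x*m.
b1, b3 = 0.40, -0.25   # main effect and interaction (assumed values)
sd_m = 1.0             # moderator standard deviation (assumed)

def simple_slope(m_value):
    # Conditional slope of y on x at a given moderator value: dy/dx = b1 + b3*m
    return b1 + b3 * m_value

slope_low = simple_slope(-sd_m)   # at -1 SD: steeper positive slope
slope_high = simple_slope(+sd_m)  # at +1 SD: attenuated slope
```

With a negative interaction, the slope at low protective filtering exceeds the slope at high protective filtering, mirroring the reported pattern (significant β at −1 SD, non-significant β at +1 SD).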

Overview of findings

Women’s thoughts and feelings about their appearance are complex and shaped by many factors. This study combined sociocultural theory, objectification theory, and a protective factor (protective filtering) to examine how media attention and peer communication about appearance-related information affect women’s body image. Several key insights emerge from the findings. First, attention to beauty-ideal information on social media significantly and positively predicted the internalization of beauty ideals, supporting Hypothesis 1. Second, peer communication about appearance-related information was positively related to peer comparison, supporting Hypothesis 2. Third, body surveillance mediated both the relationship between internalization of beauty ideals and body dissatisfaction and the relationship between peer comparison and body dissatisfaction, supporting Hypotheses 3a and 3b. Finally, and most importantly, protective filtering moderated both the effect of media attention on the internalization of beauty-ideal information and the effect of peer communication on peer comparison, supporting Hypotheses 4a and 4b. Overall, these findings not only support the integrated sociocultural model of body image but also enrich it with constructs from the tripartite influence model, objectification theory, and previous work on protective filtering.

Consistent with our expectations, individuals who paid excessive attention to appearance information in mass media were more likely to show a high degree of internalization, in line with Shen et al. [ 13 ]. Furthermore, girls who discussed appearance-related information more frequently with their peers were more likely to compare their bodies with others’, consistent with Dohnt and Tiggemann [ 9 ]. These two findings support the tenets of sociocultural theory, which holds that pressures from social media and peers, once internalized, can lead to anxiety about appearance or body shape. Notably, media attention to beauty-ideal information and peer communication did not directly predict body dissatisfaction; rather, the internalization of beauty-ideal information indirectly linked media attention to body dissatisfaction. Online media efficiently set unrealistic standards of ideal beauty and transmit information encouraging women to aspire to them [ 45 ]. With frequent exposure to this information, Chinese young adult women, who tend to be more socially oriented, are more likely to internalize and be influenced by these online standards. Second, peer communication was indirectly connected with body dissatisfaction. Peer communication about body appearance, especially talk about one’s own appearance, motivated self-evaluation and peer comparison. It is therefore reasonable to speculate that excessive communication about body shape, or unfavorable evaluations, stimulates young adult women to compare themselves with peers with “good bodies” [ 25 ].

Body surveillance was shown to mediate the effects of internalization of media information and of peer comparison on body dissatisfaction. This crucial finding provides empirical evidence for the further integration of sociocultural and objectification theory. First, our findings support the view that internalization of media information is positively correlated with body dissatisfaction via greater body surveillance. According to feminist theorists [ 29 ], internalization and body surveillance are crucial components of women’s negative experiences of their bodies, and internalization supplies an ideal standard for the cultural body. When women compare themselves to this standard but cannot reduce the discrepancy, they may feel bad about their bodies. Meanwhile, consistent with Vandenbosch and Eggermont’s research [ 26 ] on adolescent females, exposure to sexually objectifying media and internalization of beauty-ideal information may lead females to habitually assess their bodies from an observer’s perspective; negative evaluations of the body are then generated through body surveillance. In China’s traditional patriarchal culture, women’s appearance is often tied to their social and economic benefits [ 12 ]: Chinese young adult women with beautiful appearances are seen as likely to have more satisfying marriages and more successful careers, a view also consistent with the mainstream ideology promoted on social media. When young adult women internalize this information, they develop a negative attitude toward their bodies through the mediation of body surveillance.

Our findings also suggest that body surveillance mediates the relationship between peer comparison and body dissatisfaction, consistent with previous research on the link between peer effects and body dissatisfaction [ 24 ]. As self-objectification theory depicts, self-objectification manifests behaviorally as body surveillance, which can negatively affect psychological well-being [ 46 ]. Our study adds further empirical support for aspects of self-objectification theory and verifies the mediating role of body surveillance between peer comparison and body dissatisfaction.

First, the initial stage of the mediation model shows that peer comparison is positively correlated with body surveillance, consistent with previous findings of a significant relationship between the two [ 47 ]: women who frequently compared themselves to their peers showed higher levels of body surveillance. Second, the path from peer comparison to body dissatisfaction was also significant. This finding can be explained by social comparison theory [ 48 ], which indicates that women frequently make appearance-related social comparisons, and that such comparisons are usually upward or horizontal. Compared with upward comparisons to unrealistic images of ideal beauty on social media, comparison with peers is more realistic and feasible, because peers’ lifestyles and resources resemble women’s own more closely than celebrities’ do. Horizontal comparisons between peers may therefore be more common among women, and these comparisons often lead to negative outcomes such as body dissatisfaction. It is worth noting that some studies have shown that horizontal comparison with peers can, to a certain extent, help establish women’s positive feelings about their bodies [ 49 ]. Future studies may focus on the positive effects of peer comparison on female body image.

In addition, another core contribution of this study is the moderating effect of protective filtering on two paths: the relationship between media attention and internalization of media information, and the link between peer communication and peer comparison. This finding extends the work of Halliwell [ 34 ], who used a controlled trial to examine whether body appreciation can protect women exposed to negative body-image information in the media. First, the influence of appearance-related media attention on internalization depended on the level of personal protective filtering: when individuals strongly perceive that information will negatively affect their perception of their body and try to reject it, the link between media attention and internalization is weakened. Second, the effect of appearance-related peer communication on peer comparison depended on protective filtering: when a protective filtering approach is applied while communicating about body image with peers, the urge to compare one’s body is diminished. Conversations and interactions with peers create a daily context for attending to, constructing, and interpreting information related to appearance or body shape. Protective filtering is defined similarly to systematic processing, one of the main information-processing strategies in the Heuristic-Systematic Model (HSM) in the field of communication. The HSM states that systematic processing involves making judgments through thoughtful consideration of concepts and comparison of those concepts with information already available [ 50 ]. It is therefore reasonable to speculate that women with stronger protective filtering, relying on such rational information-processing strategies, are less likely to be negatively influenced by appearance-related peer communication. To some extent, this finding explains why social media and peer comparison are not always directly related to women’s body image [ 51 ].

Strengths, limitations and future directions

The findings of this study make several theoretical contributions. First, the study investigates the impact of media attention and peer influence on negative body image and explores the underlying mechanisms. It further enriches research on women’s body image by combining the sociocultural and self-objectification models and applying them to the Chinese context, providing a deeper understanding of how social media and peer influence shape body image. Second, our study creatively incorporates the protective filtering variable, a positive body-image cognitive strategy, into the model of factors influencing negative body image, which not only enriches and extends the tripartite influence model but also supports the development of body-image interventions based on this model.

Moreover, the study has several practical implications. First, by building awareness of surveillance behaviors and encouraging structured cognitive and behavioral change, it may be possible to prevent these behaviors from consolidating into negative body image. Second, more effective strategies should be adopted to stimulate protective filtering in women. For example, mass media campaigns promoting positive body image can cultivate a social environment that encourages young adult women to focus not only on their physical appearance but also on other valued domains of their lives, and to spend time with people who are not invested in physical appearance [ 7 ].

However, this study has some limitations. First, because it is cross-sectional, causal relationships between the variables cannot be determined; experimental methods should be applied to disentangle the complex relationships among them. Second, because consumers aged 20 to 29 are the heaviest users of social media (Footnote 1), the sample is concentrated in this age group, so the results may not apply to women in their forties and older. In addition, the study did not include men, whose body image has attracted growing concern in recent years. Future research should enrich the current findings with samples more diverse in gender, age, socioeconomic status, and ethnicity. Third, our study verified only the moderating effect of protective filtering on the internalization of media information and on peer comparison, without investigating how protective filtering specifically affects body dissatisfaction. Future research should explore precisely the mechanisms by which protective filtering influences negative body image. For example, self-compassion and body appreciation are highly correlated [ 52 ]; such orienting cognitive processes may help women reject and reconstruct negative messages from social media and promote a positive view of their bodies.


In summary, this study is an important extension of research on the factors influencing body dissatisfaction among young Chinese women. It uncovered a positive relationship between media attention and internalization of media information, as well as between peer communication and peer comparison. It also confirmed the mediating role of body surveillance in the relationships between internalization of media information and body dissatisfaction, and between peer comparison and body dissatisfaction. The study offers a new perspective on objectified body consciousness and expands the scope of applicability of the tripartite influence model. Furthermore, our findings indicate that protective filtering can moderate the path from media attention to internalization of media information and the path from peer communication to peer comparison, which has important implications for future intervention research and practice.

Data Availability

The datasets generated and analyzed during the current study are available from the corresponding author on reasonable request.

Footnote 1: What Age Group Uses Social Media the Most? [Aug 2023 Update] (oberlo.com).

Grogan S. Body image and health: contemporary perspectives. J Health Psychol. 2006;11(4):523–30.


Fisher S, Cleveland SE. Body image and personality. Oxford, England: Van Nostrand; 1958.

Frederick DA, et al. Pathways from sociocultural and objectification constructs to body satisfaction among women: the US Body Project I. Body Image. 2022;41:195–208.


Tylka TL, Wood-Barcalow NL. A positive complement. Elsevier; 2015. pp. 115–7.

Brechan I, Kvalem IL. Relationship between body dissatisfaction and disordered eating: mediating role of self-esteem and depression. Eat Behav. 2015;17:49–58.

Weinberger N-A, et al. Body dissatisfaction in individuals with obesity compared to normal-weight individuals: a systematic review and meta-analysis. Obes Facts. 2017;9(6):424–41.


Shroff H, Thompson JK. The tripartite influence model of body image and eating disturbance: a replication with adolescent girls. Body Image. 2006;3(1):17–23.

Van den Berg P, et al. Body dissatisfaction and body comparison with media images in males and females. Body Image. 2007;4(3):257–68.

Dohnt H, Tiggemann M. The contribution of peer and media influences to the development of body satisfaction and self-esteem in young girls: a prospective study. Dev Psychol. 2006;42(5):929.

Fredrickson BL, Roberts T-A. Objectification theory: toward understanding women’s lived experiences and mental health risks. Psychol Women Q. 1997;21(2):173–206.

Wood-Barcalow NL, Tylka TL, Augustus-Horvath CL. But I like my body: positive body image characteristics and a holistic model for young-adult women. Body Image. 2010;7(2):106–16.

Wu Y, Mulkens S, Alleva JM. Body image and acceptance of cosmetic surgery in China and the Netherlands: a qualitative study on cultural differences and similarities. Body Image. 2022;40:30–49.

Shen J, et al. The effects of media and peers on negative body image among Chinese college students: a chained indirect influence model of appearance comparison and internalization of the thin ideal. J Eat Disorders. 2022;10(1):1–9.


Karsay K, Knoll J, Matthes J. Sexualizing media use and self-objectification: a meta-analysis. Psychol Women Q. 2018;42(1):9–28.

Karazsia BT, et al. Thinking meta-theoretically about the role of internalization in the development of body dissatisfaction and body change behaviors. Body Image. 2013;10(4):433–41.

Hogue JV, Mills JS. The effects of active social media engagement with peers on body image in young women. Body Image. 2019;28:1–5.

Holmqvist K, Frisen A. I bet they aren’t that perfect in reality: appearance ideals viewed from the perspective of adolescents with a positive body image. Body Image. 2012;9(3):388–95.

Slater MD, Goodall CE, Hayes AF. Self-reported news attention does assess differential processing of media content: an experiment on risk perceptions utilizing a random sample of US local crime and Accident news. J Communication. 2009;59(1):117–34.

Koetsier J. Massive TikTok growth: up 75% this year, now 33x more users than nearest direct competitor. Retrieved Dec 15, 2020.

Shan Y, Chen K-J, Lin J-S. When social media influencers endorse brands: the effects of self-influencer congruence, parasocial identification, and perceived endorser motive. Int J Advertising. 2020;39(5):590–610.

Michael SL, et al. Parental and peer factors associated with body image discrepancy among fifth-grade boys and girls. J Youth Adolesc. 2014;43:15–29.

Ricciardelli LA, McCabe MP, Banfield S. Body image and body change methods in adolescent boys: role of parents, friends and the media. J Psychosom Res. 2000;49(3):189–97.

Lawler M, Nixon E. Body dissatisfaction among adolescent boys and girls: the effects of body mass, peer appearance culture and internalization of appearance ideals. J Youth Adolesc. 2011;40(1):59–71.

Jones DC, Vigfusdottir TH, Lee Y. Body image and the appearance culture among adolescent girls and boys: an examination of friend conversations, peer criticism, appearance magazines, and the internalization of appearance ideals. J Adolesc Res. 2004;19(3):323–39.

Arroyo A, Harwood J. Theorizing fat talk: Intrapersonal, interpersonal, and intergroup communication about groups. Annals of the International Communication Association. 2014;38(1):175–205.

Vandenbosch L, Eggermont S. Understanding sexual objectification: a comprehensive approach toward media exposure and girls’ internalization of beauty ideals, self-objectification, and body surveillance. J Communication. 2012;62(5):869–87.

Dimas MA, Galway SC, Gammage KL. Do you see what I see? The influence of self-objectification on appearance anxiety, intrinsic motivation, interoceptive awareness, and physical performance. Body Image. 2021;39:53–61.

Fredrickson BL, et al. That swimsuit becomes you: sex differences in self-objectification, restrained eating, and math performance. J Personal Soc Psychol. 1998;75(1):269.

Moradi B, Huang Y-P. Objectification theory and psychology of women: a decade of advances and future directions. Psychol Women Q. 2008;32(4):377–98.

Aubrey JS. Effects of sexually objectifying media on self-objectification and body surveillance in undergraduates: results of a 2-year panel study. J Communication. 2006;56(2):366–86.

Gattino S, et al. A cross-cultural study of biological, psychological, and social antecedents of self-objectification in Italy and Romania. Sex Roles. 2018;78:325–37.

Mercurio A, Rima B. Watching my weight: Self-weighing, body surveillance, and body dissatisfaction. Sex Roles. 2011;65:47–55.

Wang Y, et al. Body talk on social networking sites, body surveillance, and body shame among young adults: the roles of self-compassion and gender. Sex Roles. 2020;82:731–42.

Halliwell E. The impact of thin idealized media images on body satisfaction: does body appreciation protect women from negative effects? Body Image. 2013;10(4):509–14.

Poulter PI, Treharne GJ. I’m actually pretty happy with how I am: a mixed-methods study of young women with positive body image. Psychol Health. 2021;36(6):649–68.

Tylka TL, Wood-Barcalow NL. What is and what is not positive body image? Conceptual foundations and construct definition. Body Image. 2015;14:118–29.

Boas TC, Christenson DP, Glick DM. Recruiting large online samples in the United States and India: Facebook, mechanical Turk, and qualtrics. Political Sci Res Methods. 2020;8(2):232–50.

Andrew R, Tiggemann M, Clark L. The protective role of body appreciation against media-induced body dissatisfaction. Body Image. 2015;15:98–104.

Becker CB, Smith LM, Ciao AC. Peer-facilitated eating disorder prevention: a randomized effectiveness trial of cognitive dissonance and media advocacy. J Couns Psychol. 2006;53(4):550.

Yan J, Ji J, Gao L. From health campaign to interpersonal communication: does traditional diet culture hinder the communication of the Chinese Gongkuai campaign? Int J Environ Res Public Health. 2022;19(16):9992.

Thompson JK, et al. The sociocultural attitudes towards appearance Scale-3 (SATAQ-3): development and validation. Int J Eat Disord. 2004;35(3):293–304.

McKinley NM, Hyde JS. The objectified body consciousness scale: development and validation. Psychol Women Q. 1996;20(2):181–215.

Evens O, Stutterheim SE, Alleva JM. Protective filtering: a qualitative study on the cognitive strategies young women use to promote positive body image in the face of beauty-ideal imagery on Instagram. Body Image. 2021;39:40–52.

James WL, Hatten KJ. Further evidence on the validity of the self typing paragraph approach: Miles and snow strategic archetypes in banking. Strateg Manag J. 1995;16(2):161–8.

Giorgi A. Religious feminists and the intersectional Feminist movements: insights from a case study. Eur J Women’s Stud. 2021;28(2):244–59.

Brajdić Vuković M, Lucić M, Štulhofer A. Internet use associated body-surveillance among female adolescents: assessing the role of peer networks. Sex Cult. 2018;22(2):521–40.

Festinger L. A theory of social comparison processes. Hum Relat. 1954;7(2):117–40.

Yang J, et al. Selfie-viewing and facial dissatisfaction among emerging adults: a moderated mediation model of appearance comparisons and self-objectification. Int J Environ Res Public Health. 2020;17(2).

Fitzsimmons-Craft EE, et al. Examining social physique anxiety and disordered eating in college women. The roles of social comparison and body surveillance. Appetite. 2012;59(3):796–805.

Trumbo CW. Information processing and risk perception: an adaptation of the heuristic-systematic model. J Communication. 2002;52(2):367–82.

Keery H, van den Berg P, Thompson JK. An evaluation of the tripartite influence model of body dissatisfaction and eating disturbance with adolescent girls. Body Image. 2004;1(3):237–51.

Wasylkiw L, MacKinnon AL, MacLellan AM. Exploring the link between self-compassion and body image in university women. Body Image. 2012;9(2):236–45.



We acknowledge Wanwan Yu and Mingjun Zhou for their assistance with data collection and coding.

This study was supported by the Anhui Provincial Social Science Fund for Distinguished Young Scholars (2022AH020049) and the Anhui Provincial Scientific Research Projects for Higher Education (2023AH010036).

Author information

Jing Ji and Xiaoli Xiang have contributed equally to this work.

Authors and Affiliations

School of Health Service Management, Anhui Medical University, Hefei, 230032, China

Jing Ji, Ren Chen & Jing Yan

Department of Ophthalmology, The Affiliated Changshu Hospital of Nantong University, Changshu, 215500, China

Xiaoli Xiang

Department of Plastic Surgery, The Second Hospital of Anhui Medical University, Hefei, 230601, China

Zenghong Chen



Jing Yan and Jing Ji contributed to the conception of the study, secured funding, organized the investigation, and revised the manuscript. Jing Ji analyzed the data and wrote the original draft. Ren Chen assisted with data collection and analyzed the data. Zenghong Chen and Xiaoli Xiang organized the investigation and assisted with data collection.

Corresponding author

Correspondence to Jing Yan .

Ethics declarations

Ethics approval and consent to participate.

The original study was performed in accordance with the Declaration of Helsinki and was approved by the Biomedical Ethics Committee of Anhui Medical University (reference number: 20210542), with which the researchers were affiliated. The participants received oral and written information and provided written informed consent before participating in the study.

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article.

Ji, J., Xiang, X., Chen, R. et al. Pathways from media attention and peer communication to body dissatisfaction: the moderating role of protective filtering. BMC Psychol 11 , 447 (2023). https://doi.org/10.1186/s40359-023-01491-x


Received : 28 April 2023

Accepted : 17 December 2023

Published : 20 December 2023

DOI : https://doi.org/10.1186/s40359-023-01491-x


  • Body dissatisfaction
  • Media attention
  • Peer communication
  • Protective filtering
  • Body surveillance

BMC Psychology

ISSN: 2050-7283


Original research article

Body language in the brain: constructing meaning from expressive movement


  • 1 Department of Psychiatry, University of British Columbia, Vancouver, BC, Canada
  • 2 Mental Health and Integrated Neurobehavioral Development Research Core, Child and Family Research Institute, Vancouver, BC, Canada
  • 3 Psychiatric Epidemiology and Evaluation Unit, Saint John of God Clinical Research Center, Brescia, Italy
  • 4 Department of Psychological and Brain Sciences, University of California, Santa Barbara, CA, USA

This fMRI study investigated neural systems that interpret body language—the meaningful emotive expressions conveyed by body movement. Participants watched videos of performers engaged in modern dance or pantomime that conveyed specific themes such as hope, agony, lust, or exhaustion. We tested whether the meaning of an affectively laden performance was decoded in localized brain substrates as a distinct property of action separable from other superficial features, such as choreography, kinematics, performer, and low-level visual stimuli. A repetition suppression (RS) procedure was used to identify brain regions that decoded the meaningful affective state of a performer, as evidenced by decreased activity when emotive themes were repeated in successive performances. Because the theme was the only feature repeated across video clips that were otherwise entirely different, the occurrence of RS identified brain substrates that differentially coded the specific meaning of expressive performances. RS was observed bilaterally, extending anteriorly along middle and superior temporal gyri into temporal pole, medially into insula, rostrally into inferior orbitofrontal cortex, and caudally into hippocampus and amygdala. Behavioral data on a separate task indicated that interpreting themes from modern dance was more difficult than interpreting pantomime; a result that was also reflected in the fMRI data. There was greater RS in left hemisphere, suggesting that the more abstract metaphors used to express themes in dance compared to pantomime posed a greater challenge to brain substrates directly involved in decoding those themes. We propose that the meaning-sensitive temporal-orbitofrontal regions observed here comprise a superordinate functional module of a known hierarchical action observation network (AON), which is critical to the construction of meaning from expressive movement. The findings are discussed with respect to a predictive coding model of action understanding.


Body language is a powerful form of non-verbal communication providing important clues about the intentions, emotions, and motivations of others. In the course of our everyday lives, we pick up information about what people are thinking and feeling through their body posture, mannerisms, gestures, and the prosody of their movements. This intuitive social awareness is an impressive feat of neural integration; the cumulative result of activity in distributed brain systems specialized for coding a wide range of social information. Reading body language is more than just a matter of perception. It entails not only recognizing and coding socially relevant visual information, but also ascribing meaning to those representations.

We know a great deal about brain systems involved in the perception of facial expressions, eye movements, body movement, hand gestures, and goal directed actions, as well as those mediating affective, decision, and motor responses to social stimuli. What is still missing is an understanding of how the brain “reads” body language. Beyond the decoding of body motion, what are the brain substrates directly involved in extracting meaning from affectively laden body expressions? The brain has several functionally specialized structures and systems for processing socially relevant perceptual information. A subcortical pulvinar-superior colliculus-amygdala-striatal circuit mediates reflex-like perception of emotion from body posture, particularly fear, and activates commensurate reflexive motor responses ( Dean et al., 1989 ; Cardinal et al., 2002 ; Sah et al., 2003 ; de Gelder and Hadjikhani, 2006 ). A region of the occipital cortex known as the extrastriate body area (EBA) is sensitive to bodily form ( Bonda et al., 1996 ; Hadjikhani and de Gelder, 2003 ; Astafiev et al., 2004 ; Peelen and Downing, 2005 ; Urgesi et al., 2006 ). The fusiform gyrus of the ventral occipital and temporal lobes has a critical role in processing faces and facial expressions ( McCarthy et al., 1997 ; Hoffman and Haxby, 2000 ; Haxby et al., 2002 ). Posterior superior temporal sulcus is involved in perceiving the motion of biological forms in particular ( Allison et al., 2000 ; Pelphrey et al., 2005 ). Somatosensory, ventromedial prefrontal, premotor, and insular cortex contribute to one's own embodied awareness of perceived emotional states ( Adolphs et al., 2000 ; Damasio et al., 2000 ). 
Visuomotor processing in a functional brain network known as the action observation network (AON) codes observed action in distinct functional modules that together link the perception of action and emotional body language with ongoing behavioral goals and the formation of adaptive reflexes, decisions, and motor behaviors ( Grafton et al., 1996 ; Rizzolatti et al., 1996b , 2001 ; Hari et al., 1998 ; Fadiga et al., 2000 ; Buccino et al., 2001 ; Grézes et al., 2001 ; Grèzes et al., 2001 ; Ferrari et al., 2003 ; Zentgraf et al., 2005 ; Bertenthal et al., 2006 ; de Gelder, 2006 ; Frey and Gerry, 2006 ; Ulloa and Pineda, 2007 ). Given all we know about how bodies, faces, emotions, and actions are perceived, one might expect a clear consensus on how meaning is derived from these percepts. Perhaps surprisingly, while we know these systems are crucial to integrating perceptual information with affective and motor responses, how the brain deciphers meaning based on body movement remains unknown. The focus of this investigation was to identify brain substrates that decode meaning from body movement, as evidenced by meaning-specific neural processing that differentiates body movements conveying distinct expressions.

To identify brain substrates sensitive to the meaningful emotive state of an actor conveyed through body movement, we used repetition suppression (RS) fMRI. This technique identifies regions of the brain that code for a particular stimulus dimension (e.g., shape) by revealing substrates that have different patterns of neural activity in response to different attributes of that dimension (e.g., circle, square, triangle; Grill-Spector et al., 2006 ). When a particular attribute is repeated, synaptic activity and the associated blood oxygen level-dependent (BOLD) response decreases in voxels containing neuronal assemblies that code that attribute ( Wiggs and Martin, 1998 ; Grill-Spector and Malach, 2001 ). We have used this method previously to show that various properties of an action such as movement kinematics, object goal, outcome, and context-appropriateness of action mechanics are uniquely coded by different neural substrates within a parietal-frontal action observation network (AON; Hamilton and Grafton, 2006 , 2007 , 2008 ; Ortigue et al., 2010 ). Here, we applied RS-fMRI to identify brain areas in which activity decreased when the meaningful emotive theme of an expressive performance was repeated between trials. The results demonstrate a novel coding function of the AON—decoding meaning from body language.
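The logic of the RS contrast can be illustrated with a toy simulation (not part of the study's analysis pipeline; the function name, themes, and suppression value are hypothetical): a region whose cell assemblies code theme responds less when the theme on the previous trial repeats, while a theme-insensitive region responds equally on every trial, so only the coding region shows an RS effect.

```python
import numpy as np

def simulate_region(trial_themes, codes_theme, suppression=0.4):
    """Per-trial response of a simulated region.

    A theme-coding region re-activates the same cell assembly when a
    theme repeats, at reduced strength (repetition suppression). A
    non-coding region responds identically to every trial.
    """
    responses, prev = [], None
    for theme in trial_themes:
        r = 1.0
        if codes_theme and theme == prev:
            r -= suppression  # suppressed re-activation of the same assembly
        responses.append(r)
        prev = theme
    return np.array(responses)

themes = ["hope", "hope", "agony", "lust", "lust", "agony"]
novel = np.array([i == 0 or themes[i] != themes[i - 1]
                  for i in range(len(themes))])

coding = simulate_region(themes, codes_theme=True)
control = simulate_region(themes, codes_theme=False)

# RS effect = mean(novel trials) - mean(repeated trials)
rs_coding = coding[novel].mean() - coding[~novel].mean()     # 0.4
rs_control = control[novel].mean() - control[~novel].mean()  # 0.0
```

Only the theme-coding region shows a positive novel-minus-repeated difference, which is the signature the whole-brain RS contrast searches for voxel by voxel.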

Working with a group of professional dancers, we produced a set of video clips in which performers intentionally expressed a particular meaningful theme either through dance or pantomime. Typical themes consisted of expressions of hope, agony, lust, or exhaustion. The experimental manipulation of theme was studied independently of choreography, performer, or camera viewpoint, which allowed us to repeat the meaning of a movement sequence from one trial to another while varying physical movement characteristics and perceptual features. With this RS-fMRI design, a decrease in BOLD activity for repeated relative to novel themes (RS) could not be attributed to specific movements, characteristics of the performer, “low-level” visual features, or the general process of attending to body expressions. Rather, RS revealed brain areas in which specific voxel-wise neural population codes differentiated meaningful expressions based on body movement (Figure 1 ).


Figure 1. Manipulating trial sequence to induce RS in brain regions that decode body language . The order of video presentation was controlled such that themes depicted in consecutive videos were either novel or repeated. Each consecutive video clip was unique; repeated themes were always portrayed by different dancers, different camera angles, or both. Thus, RS for repeated themes was not the result of low-level visual features, but rather identified brain areas that were sensitive to the specific meaningful theme conveyed by a performance. In brain regions showing RS, a particular affective theme—hope, for example—will evoke a particular pattern of neural activity. A novel theme on the subsequent trial—illness, for instance—will trigger a different but equally strong pattern of neural activity in distinct cell assemblies, resulting in an equivalent BOLD response. In contrast, a repetition of the hopefulness theme on the subsequent trial will trigger activity in the same neural assemblies as the first trial, but to a lesser extent, resulting in a reduced BOLD response for repeated themes. In this way, the RS contrast reveals regions that support distinct patterns of neural activity in response to different themes.

Participants were scanned using fMRI while viewing a series of 10-s video clips depicting modern dance or pantomime performances that conveyed specific meaningful themes. Because each performer had a unique artistic style, the same theme could be portrayed using completely different physical movements. This allowed the repetition of meaning while all other aspects of the physical stimuli varied from trial to trial. We predicted that specific regions of the AON engaged by observing expressive whole body movement would show suppressed BOLD activation for repeated relative to novel themes (RS). Brain regions showing RS would reveal brain substrates directly involved in decoding meaning based on body movement.

The dance and pantomime performances used here conveyed expressive themes through movement, but did not rely on typified, canonical facial expressions to invoke particular affective responses. Rather, meaningful themes were expressed with unique artistic choreography while facial expressions were concealed with a classic white mime's mask. The result was a subtle stimulus set that promoted thoughtful, interpretive viewing that could not elicit reflex-like responses based on prototypical facial expressions. In so doing, the present study shifted the focus away from automatic affective resonance toward a more deliberate ascertainment of meaning from movement.

While dance and pantomime both expressed meaningful emotive themes, the quality of movement and the types of gestures used were different. Pantomime sequences used fairly mundane gestures and natural, everyday movements. Dance sequences used stylized gestures and interpretive, prosodic movements. The critical distinction between these two types of expressive movement is in the degree of abstraction in the metaphors that link movement with meaning (see Morris, 2002 for a detailed discussion of movement metaphors). Pantomime by definition uses gesture to mimic everyday objects, situations, and behavior, and thus relies on relatively concrete movement metaphors. In contrast, dance relies on more abstract movement metaphors that draw on indirect associations between qualities of movement and the emotions and thoughts it evokes in a viewer. We predicted that since dance expresses meaning more abstractly than pantomime, dance sequences would be more difficult to interpret than pantomimed sequences, and would likewise pose a greater challenge to brain processes involved in decoding meaning from movement. Thus, we predicted greater involvement of thematic decoding areas for danced than for pantomimed movement expressions. Greater RS for dance than pantomime could result from dance triggering greater activity upon a first presentation, a greater reduction in activity with a repeated presentation, or some combination of both. Given our prediction that greater RS for dance would be linked to interpretive difficulty, we hypothesized it would be manifested as an increased processing demand resulting in greater initial BOLD activity for novel danced themes.


Forty-six neurologically healthy, right-handed individuals (30 women, mean age = 24.22 years, range = 19–55 years) provided written informed consent and were paid for their participation. Performers also agreed in writing to allow the use of their images and videos for scientific purposes. The protocol was approved by the Office of Research Human Subjects Committee at the University of California Santa Barbara (UCSB).

Eight themes were depicted, including four danced themes (happy, hopeful, fearful, and in agony) and four pantomimed themes (in love, relaxed, ill, and exhausted). Performance sequences were choreographed and performed by four professional dancers recruited from the SonneBlauma Danscz Theatre Company (Santa Barbara, California; now called ArtBark International, http://www.artbark.org/ ). Performers wore expressionless white masks so body language was conveyed through gestural whole-body movement as opposed to facial expressions. To express each theme, performers adopted an affective stance and improvised a short sequence of modern dance choreography (two themes per performer) or pantomime gestures (two themes per performer). Each of the eight themes was performed by two different dancers and recorded from two different camera angles, resulting in four distinct videos representing each theme (32 distinct videos in total; clips available in Supplementary Materials online).

Behavioral Procedure

In a separate session outside the scanner either before or after fMRI data collection, an interpretation task measured observers' ability to discern the intended meaning of a performance (Figure 2 ). The interpretation task was carried out in a separate session to avoid confounding movement observation in the scanner with explicit decision-making and overt motor responses. Participants were asked to view each video clip and choose from a list of four options the theme that best corresponded with the movement sequence they had just watched. Responses were made by pressing one of four corresponding buttons on a keyboard. Two behavioral measures were collected to assess how well participants interpreted the intended meaning of expressive performances. Consistency scores reflected the proportion of observers' interpretations that matched the performer's intended expression. Response times indicated the time taken to make interpretive judgments. In order to encourage subjects to use their initial impressions and to avoid over-deliberating, the four response options were previewed briefly immediately prior to video presentation.


Figure 2. Experimental testing procedure . Participants completed a thematic interpretation task outside the scanner, either before or after the imaging session. Performance on this task allowed us to test whether there was a difference in how readily observers interpreted the intended meaning conveyed through dance or pantomime. Any performance differences on this explicit theme judgment task could help interpret the functional significance of observed differences in brain activity associated with passively viewing the two types of movement in the scanner.

For the interpretation task collected outside the scanner, videos were presented and responses collected on a Mac Powerbook G4 laptop programmed using the Psychtoolbox (v. 3.0.8) extension ( Brainard, 1997 ; Pelli and Brainard, 1997 ) for Mac OSX running under Matlab 7.5 R2007b (the MathWorks, Natick, MA). Each trial began with the visual presentation of a list of four theme options corresponding to four button press responses (“u,” “i,” “o,” or “p” keyboard buttons). This list remained on the screen for 3 s, the screen blanked for 750 ms, and then the movie played for 10 s. Following the presentation of the movie, the four response options were presented again, and remained on the screen until a response was made. Each unique video was presented twice, resulting in 64 trials total. Video order was randomized for each participant, and the response options for each trial included the intended theme and three randomly selected alternatives.

Neuroimaging Procedure

fMRI data were collected with a Siemens 3.0 T Magnetom Tim Trio system using a 12-channel phased array head coil. Functional images were acquired with a T2* weighted single shot gradient echo, echo-planar sequence sensitive to Blood Oxygen Level Dependent (BOLD) contrast (TR = 2 s; TE = 30 ms; FA = 90°; FOV = 19.2 cm). Each volume consisted of 37 slices acquired parallel to the AC–PC plane (interleaved acquisition; 3 mm thick with 0.5 mm gap; 3 × 3 mm in-plane resolution; 64 × 64 matrix).

Each participant completed four functional scanning runs lasting approximately 7.5 min while viewing danced or acted expressive movement sequences. While there were a total of eight themes in the stimulus set for the study, each scanning run depicted only two of those eight themes. Over the course of all four scanning runs, all eight themes were depicted. Trial sequences were arranged such that the theme of a movement sequence was either novel or repeated with respect to the previous trial. This allowed for the analysis of BOLD response RS for repeated vs. novel themes. Each run presented 24 video clips (3 presentations of 8 unique videos depicting 2 themes × 2 dancers × 2 camera angles). Novel and repeated themes were intermixed within each scanning run, with no more than three sequential repetitions of the same theme. Two scanning runs depicted dance and two runs depicted pantomime performances. The order of runs was randomized for each participant. The experiment was controlled using Presentation software (version 13.0, Neurobehavioral Systems Inc, CA). Participants were instructed to focus on the movement performance while viewing the videos. No specific information about the themes portrayed or types of movement used was provided, and no motor responses were required.
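The ordering constraints described above (theme either novel or repeated relative to the previous trial, no more than three consecutive trials with the same theme, and repeats never reusing the identical clip back-to-back) can be sketched as follows. This is an illustrative reconstruction, not the study's actual stimulus code; the function names and seed are hypothetical.

```python
import random

def valid(seq, max_same=3):
    """Check a candidate ordering: no theme run longer than `max_same`
    trials, and no identical clip shown on consecutive trials."""
    run, prev_theme, prev_clip = 0, None, None
    for clip in seq:
        theme = clip[0]
        run = run + 1 if theme == prev_theme else 1
        if run > max_same or clip == prev_clip:
            return False
        prev_theme, prev_clip = theme, clip
    return True

def make_run(themes=("hope", "agony"), dancers=2, angles=2,
             copies=3, seed=0):
    """One scanning run: 3 presentations of 8 unique clips
    (2 themes x 2 dancers x 2 camera angles) = 24 trials."""
    clips = [(t, d, a) for t in themes
             for d in range(dancers) for a in range(angles)]
    trials = clips * copies
    rng = random.Random(seed)
    while not valid(trials):
        rng.shuffle(trials)
    # label each trial relative to the theme of the preceding trial
    labels = ["repeat" if i and trials[i][0] == trials[i - 1][0] else "novel"
              for i in range(len(trials))]
    return trials, labels
```

Rejection sampling like this keeps the clip counts exactly balanced while guaranteeing the sequence constraints that make the novel/repeated labeling meaningful.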

For the behavioral data collected outside the scanner, mean consistency scores and mean response time (RT; ms) were computed for each participant. Consistency and RT were each submitted to an ANOVA with Movement Type (dance vs. pantomime) as a within-subjects factor using Stata/IC 10.0 for Macintosh.
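With a single within-subjects factor at two levels, the ANOVA used here is algebraically equivalent to a paired t-test, so the test statistic can be sketched directly. The study ran this in Stata; the sketch below is a minimal Python equivalent, and the consistency scores shown are hypothetical.

```python
import numpy as np

def within_subjects_F(cond_a, cond_b):
    """One-way repeated-measures ANOVA with two levels reduces to a
    paired t-test on per-subject differences: F(1, n-1) = t**2."""
    d = np.asarray(cond_a, float) - np.asarray(cond_b, float)
    n = d.size
    t = d.mean() / (d.std(ddof=1) / np.sqrt(n))
    return t ** 2, (1, n - 1)

# Hypothetical per-subject consistency scores (proportion of
# interpretations matching the performer's intended theme)
pantomime = [0.90, 0.85, 0.95, 0.80]
dance     = [0.70, 0.85, 0.75, 0.80]
F, df = within_subjects_F(pantomime, dance)  # F = 3.0, df = (1, 3)
```

The same function applies unchanged to the response-time data, since both measures were analyzed with the identical one-factor within-subjects design.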

Statistical analysis of the neuroimaging data was organized to identify: (1) brain areas responsive to the observation of expressive movement sequences, defined by BOLD activity relative to an implicit baseline, (2) brain areas directly involved in decoding meaning from movement, defined by RS for repeated themes, (3) brain areas in which processes for decoding thematic meaning varied as a function of abstractness, defined by greater RS for danced than pantomimed themes, and (4) the specific pattern of BOLD activity differences for novel and repeated themes as a function of danced or pantomimed movements in regions showing greater RS for dance.

The fMRI data were analyzed using Statistical Parametric Mapping software (SPM5, Wellcome Department of Imaging Neuroscience, London; www.fil.ion.ucl.ac.uk/spm ) implemented in Matlab 7.5 R2007b (The MathWorks, Natick, MA). Individual scans were realigned, slice-time corrected and spatially normalized to the Montreal Neurological Institute (MNI) template in SPM5 with a resampled resolution of 3 × 3 × 3 mm. A smoothing kernel of 8 mm was applied to the functional images. A general linear model was created for each participant using SPM5. Parameter estimates of event-related BOLD activity were computed for novel and repeated themes depicted by danced and pantomimed movements, separately for each scanning run, for each participant.

Because the intended theme of each movement sequence was not expressed at a discrete time point but rather throughout the duration of the 10 s video clip, the most appropriate hemodynamic response function (HRF) with which to model the BOLD response at the individual level was determined empirically prior to parameter estimation. Of interest was whether the shape of the BOLD response to these relatively long video clips differed from the canonical HRF typically implemented in SPM. The shape of the BOLD response was estimated for each participant by modeling a finite impulse response function ( Ollinger et al., 2001 ). Each trial was represented by a sequence of 12 consecutive TRs, beginning at the onset of each video clip. Based on this deconvolution, a set of beta weights describing the shape of the response over a 24 s interval was obtained for both novel and repeated themes depicted by both danced and pantomimed movement sequences. To determine whether adjustments should be made to the canonical HRF implemented in SPM, the BOLD responses of a set of 45 brain regions within a known AON were evaluated (see Table 1 for a complete list). To find the most representative shape of the BOLD response within the AON, deconvolved beta weights for each condition were averaged across sessions and collapsed by singular value decomposition analysis ( Golub and Reinsch, 1970 ). This resulted in a characteristic signal shape that maximally described the actual BOLD response in AON regions for both novel and repeated themes, for both danced and pantomimed sequences. This examination of the BOLD response revealed that its time-to-peak was delayed 4 s compared to the canonical HRF response curve typically implemented in SPM. That is, the peak of the BOLD response was reached at 8–10 s following stimulus onset instead of the canonical 4–6 s. 
Given this result, parameter estimation for conditions of interest in our main analysis was based on a convolution of the design matrix for each participant with a custom HRF that accounted for the observed 4 s delay. Time-to-peak of the HRF was adjusted from 6 to 10 s while keeping the same overall width and height of the canonical function implemented in SPM. Using this custom HRF, the 10 s video duration was modeled as usual in SPM by convolving the HRF with a 10 s boxcar function.
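A delayed double-gamma HRF of the kind described above can be sketched as follows. The parameterization (shape = peak + 1, unit scale) only approximates SPM's canonical function and is illustrative; the point is that shifting time-to-peak from 6 to 10 s captures the 4-s delay estimated from the deconvolved AON responses.

```python
import numpy as np
from math import gamma as gamma_fn

def gamma_pdf(t, shape, scale=1.0):
    """Gamma probability density, zero for t <= 0."""
    t = np.asarray(t, float)
    out = np.zeros_like(t)
    pos = t > 0
    out[pos] = (t[pos] ** (shape - 1) * np.exp(-t[pos] / scale)
                / (gamma_fn(shape) * scale ** shape))
    return out

def hrf(t, peak=6.0, undershoot=16.0, ratio=6.0):
    """Double-gamma HRF; `peak` sets the time-to-peak in seconds."""
    h = gamma_pdf(t, peak + 1) - gamma_pdf(t, undershoot + 1) / ratio
    return h / h.max()

t = np.arange(0, 32, 0.1)           # seconds, 0.1-s resolution
canonical = hrf(t, peak=6.0)        # peaks near 6 s
custom = hrf(t, peak=10.0)          # peaks near 10 s, as observed

# Model the 10-s video by convolving the HRF with a 10-s boxcar
boxcar = (t < 10).astype(float)
predicted = np.convolve(custom, boxcar)[: t.size] * 0.1
```

Convolving with the boxcar pushes the predicted response peak later still, which is why modeling the full clip duration matters for these long stimuli.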


Table 1. The action observation network, as defined by previous investigations .

Second-level whole-brain analysis was conducted with SPM8 using a 2 × 2 random effects model with Movement Type and Repetition as within-subject factors using the weighted parameter estimates (contrast images) obtained at the individual level as data. A gray matter mask was applied to whole-brain contrast images prior to second-level analysis to remove white matter voxels from the analysis. Six second-level contrasts were computed, including (1) expressive movement observation (BOLD relative to baseline), (2) dance observation effect (danced sequences > pantomimed sequences), (3) pantomime observation effect (pantomimed sequences > danced sequences), (4) RS (novel themes > repeated themes), (5) dance × repetition interaction (RS for dance > RS for pantomime), and (6) pantomime x repetition interaction (RS for pantomime > RS for dance). Following the creation of T-map images in SPM8, FSL was used to create Z-map images (Version 4.1.1; Analysis Group, FMRIB, Oxford, UK; Smith et al., 2004 ; Jenkinson et al., 2012 ). The results were thresholded at p < 0.05, cluster-corrected using FSL subroutines based on Gaussian random field theory ( Poldrack et al., 2011 ; Nichols, 2012 ). To examine the nature of the differences in RS between dance and pantomime, a mask image was created based on the corresponding cluster-thresholded Z-map of regions showing greater RS for dance, and the mean BOLD activity (contrast image values) was computed for novel and repeated dance and pantomime contrasts from each participant's first-level analysis. Mean BOLD activity measures were submitted to a 2 × 2 ANOVA with Movement Type (dance vs. pantomime) and Repetition (novel vs. repeat) as within-subjects factors using Stata/IC 10.0 for Macintosh.
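The second-level contrasts above amount to weight vectors applied to the four condition estimates. For a single hypothetical voxel (the beta values below are invented for illustration), the key RS and interaction contrasts work out as:

```python
import numpy as np

# Hypothetical first-level estimates for one voxel, ordered:
# [novel dance, repeated dance, novel pantomime, repeated pantomime]
beta = np.array([1.2, 0.7, 1.0, 0.9])

contrasts = {
    "RS (novel > repeated)":            np.array([1, -1,  1, -1]),
    "dance > pantomime":                np.array([1,  1, -1, -1]),
    "dance x repetition (RS_d > RS_p)": np.array([1, -1, -1,  1]),
}
effects = {name: float(w @ beta) for name, w in contrasts.items()}
# effects: RS = 0.6, dance > pantomime = 0.0, interaction = 0.4
```

Note that this voxel shows a robust RS effect and a dance-by-repetition interaction even though its overall response to dance and pantomime is identical, which is exactly the dissociation the factorial design is built to detect.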

In order to ensure that observed RS effects for repeated themes were not due to low-level kinematic effects, a motion tracking analysis of all 32 videos was performed using Tracker 4.87 software for Mac (written by Douglas Brown, distributed on the Open Source Physics platform, www.opensourcephysics.org ). A variety of motion parameters, including velocity, acceleration, momentum, and kinetic energy, were computed within the Tracker software based on semi-automated/supervised motion tracking of the top of the head, one hand, and one foot of each performer. The key question relevant to our results was whether there was a difference in motion between videos depicting novel and repeated themes. One-factor ANOVAs for each motion parameter revealed no significant differences in coarse kinematic profiles between “novel” and “repeated” theme trials (all p's > 0.05). This was not particularly surprising given that all videos were used for both novel and repeated themes, which were defined entirely based on trial sequence. In contrast, the comparison between danced and pantomimed themes did reveal significant differences in kinematic profiles. A 2 × 3 ANOVA with Movement Type (Dance, Pantomime) and Body Point (Hand, Head, Foot) as factors was conducted for each motion parameter (velocity, acceleration, momentum, and kinetic energy), and revealed greater motion energy on all parameters for the danced themes compared to the pantomimed themes (all p's < 0.05). Any differences in RS between danced and pantomimed themes may therefore be attributed to differences in kinematic properties of body movement. Importantly, however, because there were no systematic differences in motion kinematics between novel and repeated themes, any RS effects for repeated themes could not be attributed to the effect of motion kinematics.
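A coarse kinematic profile of the kind computed for each tracked point (head, hand, foot) can be sketched as below. Tracker itself computed these quantities; this hypothetical function shows only the standard definitions, with a nominal unit mass so momentum and kinetic energy are in arbitrary units.

```python
import numpy as np

def kinematics(xy, fps=30.0, mass=1.0):
    """Coarse motion parameters for one tracked body point.
    `xy` is an (n_frames, 2) array of positions; `mass` is nominal."""
    dt = 1.0 / fps
    vel = np.gradient(xy, dt, axis=0)         # per-frame velocity vectors
    speed = np.linalg.norm(vel, axis=1)
    accel = np.linalg.norm(np.gradient(vel, dt, axis=0), axis=1)
    return {
        "velocity": speed.mean(),
        "acceleration": accel.mean(),
        "momentum": mass * speed.mean(),
        "kinetic_energy": (0.5 * mass * speed ** 2).mean(),
    }

# Sanity check: a point moving at constant velocity (1 unit/frame)
# has zero mean acceleration
xy = np.stack([np.arange(10.0), np.zeros(10)], axis=1)
profile = kinematics(xy, fps=1.0)
```

Comparing such profiles between novel and repeated trials (or between dance and pantomime) is then a matter of feeding the per-video summaries into the ANOVAs described above.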

Figure 3 illustrates the behavioral results of the interpretation task completed outside the scanner. Participants had higher consistency scores for pantomimed movements than danced movements [ F (1, 42) = 42.06, p < 0.0001], indicating better transmission of the intended expressive meaning from performer to viewer. Pantomimed sequences were also interpreted more quickly than danced sequences [ F (1, 42) = 27.28, p < 0.0001], suggesting an overall performance advantage for pantomimed sequences.


Figure 3. Behavioral performance on the theme judgment task . Participants more readily interpreted pantomime than dance. This was evidenced by both greater consistency between the meaningful theme intended to be expressed by the performer and the interpretive judgments made by the observer (left), and faster response times (right). This pattern of results suggests that dance was more difficult to interpret than pantomime, perhaps owing to the use of more abstract metaphors to link movement with meaning. Pantomime, on the other hand, relied on more concrete, mundane sorts of movements that were more likely to carry meaningful associations based on observers' prior everyday experience. SEM, standard error of the mean.

Expressive Whole-body Movements Engage the Action Observation Network

Brain activity associated with the observation of expressive movement sequences was revealed by significant BOLD responses to observing both dance and pantomime movement sequences, relative to the inter-trial resting baseline. Figure 4 depicts significant activation ( p < 0.05, cluster corrected in FSL) rendered on an inflated cortical surface of the Human PALS-B12 Atlas ( Van Essen, 2005 ) using Caret (Version 5. 61; http://www.nitrc.org/projects/caret ; Van Essen et al., 2001 ). Table 2 presents the MNI coordinates for selected voxels within clusters active during movement observation, as labeled in Figure 4 . Region names were obtained from the Harvard-Oxford Cortical and Subcortical Structural Atlases ( Frazier et al., 2005 ; Desikan et al., 2006 ; Makris et al., 2006 ; Goldstein et al., 2007 ; Harvard Center for Morphometric Analysis; www.partners.org/researchcores/imaging/morphology_MGH.asp ), and Brodmann Area labels were obtained from the Juelich Histological Atlas ( Eickhoff et al., 2005 , 2006 , 2007 ), as implemented in FSL. Observation of body movement was associated with robust BOLD activation encompassing cortex typically associated with the AON, including fronto-parietal regions linked to the representation of action kinematics, goals, and outcomes ( Hamilton and Grafton, 2006 , 2007 ), as well as temporal, occipital, and insular cortex and subcortical regions including amygdala and hippocampus—regions typically associated with language comprehension ( Kirchhoff et al., 2000 ; Ni et al., 2000 ; Friederici et al., 2003 ) and socio-affective information processing and decision-making ( Anderson et al., 1999 ; Adolphs et al., 2003 ; Bechara et al., 2003 ; Bechara and Damasio, 2005 ).


Figure 4. Expressive performances engage the action observation network . Viewing expressive whole-body movement sequences engaged a distributed cortical action observation network ( p < 0.05, FWE corrected). Large areas of parietal, temporal, frontal, and insular cortex included somatosensory, motor, and premotor regions that have been considered previously to comprise a human “mirror neuron” system, as well as non-motor areas linked to comprehension, social perception, and affective decision-making. Number labels correspond to those listed in Table 2 , which provides anatomical names and voxel coordinates for areas of peak activation. Dotted line for regions 17/18 indicates medial temporal position not visible on the cortical surface.


Table 2. Brain regions showing a significant BOLD response while participants viewed expressive whole-body movement sequences .

The Action Observation Network “Reads” Body Language

To isolate brain areas that decipher meaning conveyed by expressive body movement, regions showing RS (reduced BOLD activity for repeated compared to novel themes) were identified. Since theme was the only stimulus dimension repeated systematically across trials for this comparison, decreased activation for repeated themes could not be attributed to physical features of the stimulus such as particular movements, performers, or camera viewpoints. Figure 5 illustrates brain areas showing significant suppression for repeated themes ( p < 0.05, cluster corrected in FSL). Table 3 presents the MNI coordinates for selected voxels within significant clusters. RS was found bilaterally on the rostral bank of the middle temporal gyrus extending into temporal pole and orbitofrontal cortex. There was also significant suppression in bilateral amygdala and insular cortex.


Figure 5. BOLD suppression (RS) reveals brain substrates for “reading” body language . Regions involved in decoding meaning in body language were isolated by testing for BOLD suppression when the intended theme of an expressive performance was repeated across trials. To identify regions showing RS, BOLD activity associated with novel themes was contrasted with BOLD activity associated with repeated themes ( p < 0.05, cluster corrected in FSL). Significantly greater activity for novel relative to repeated themes was evidence of RS. Given that the intended theme of a performance was the only element that was repeated between trials, regions showing RS revealed brain substrates that were sensitive to the specific meaning infused into a movement sequence by a performer. Number labels correspond to those listed in Table 3 , which provides anatomical names and voxel coordinates for key clusters showing significant RS. Blue shaded area indicates vertical extent of axial slices shown.


Table 3. Brain regions showing significant BOLD suppression for repeated themes ( p < 0.05, cluster corrected in FSL) .

Movement Abstractness Challenges Brain Substrates that Decode Meaning

The behavioral analysis indicated that interpreting danced themes was more difficult than interpreting pantomimed themes, as evidenced by lower consistency scores and greater RTs. Previous research indicates that greater difficulty discriminating a particular stimulus dimension is associated with greater BOLD suppression upon repetition of that dimension's attributes ( Hasson et al., 2006 ). To test whether greater difficulty decoding meaning from dance than pantomime would also be associated with greater RS in the present data, the magnitude of BOLD response suppression was compared between movement types. This was done with the Dance × Repetition interaction contrast in the second-level whole brain analysis, which revealed regions that had greater RS for dance than for pantomime. Figure 6 illustrates brain regions showing greater RS for themes portrayed through dance than pantomime ( p < 0.05, cluster corrected in FSL). Significant differences were entirely left-lateralized in superior and middle temporal gyri, extending into temporal pole and orbitofrontal cortex, and also present in laterobasal amygdala and the cornu ammonis of the hippocampus. Table 4 presents the MNI coordinates for selected voxels within significant clusters. The reverse Pantomime × Repetition interaction was also tested, but did not reveal any regions showing greater RS for pantomime than dance ( p > 0.05, cluster corrected in FSL).


Figure 6. Regions showing greater RS for dance than pantomime . RS effects were compared between movement types. This was implemented as an interaction contrast within our Movement Type × Repetition ANOVA design [(Novel Dance > Repeated Dance) > (Novel Pantomime > Repeated Pantomime)]. Greater RS for dance was lateralized to left hemisphere meaning-sensitive regions. The brain areas shown here have been linked previously to the comprehension of meaning in verbal language, suggesting the possibility they represent shared brain substrates for building meaning from both language and action. Number labels correspond to those listed in Table 4 , which provides anatomical names and voxel coordinates for key clusters showing significantly greater RS for dance. Blue shaded area indicates vertical extent of axial slices shown.


Table 4. Brain regions showing significantly greater RS for themes expressed through dance relative to themes expressed through pantomime ( p < 0.05, cluster corrected in FSL) .

In regions showing greater RS for dance than pantomime, mean BOLD responses for novel and repeated dance and pantomime conditions were computed across voxels for each participant based on their first-level contrast images. This was done to test whether the greater RS for dance was due to greater activity in the novel condition, lower activity in the repeated condition, or some combination of both. Figure 7 illustrates the pattern of BOLD activity across conditions, demonstrating that the greater RS for dance was the result of greater initial BOLD activation in response to novel themes. The ANOVA results showed a significant Movement Type × Repetition interaction [ F (1, 42) = 7.83, p < 0.01], indicating that BOLD activity in response to novel danced themes was greater than BOLD activity for all other conditions in these regions.


Figure 7. Novel danced themes challenge brain substrates that decode meaning from movement . To determine the specific pattern of BOLD activity that resulted in greater RS for dance, average BOLD activity in these areas was computed for each condition separately. Greater RS for dance was driven by a larger BOLD response to novel danced themes. Considered together with behavioral findings indicating that dance was more difficult to interpret, greater RS for dance seems to result from a greater processing “challenge” to brain substrates involved in decoding meaning from movement. SEM, standard error of the mean.

This study was designed to reveal brain regions involved in reading body language—the meaningful information we pick up about the affective states and intentions of others based on their body movement. Brain regions that decoded meaning from body movement were identified with a whole brain analysis of RS that compared BOLD activity for novel and repeated themes expressed through modern dance or pantomime. Significant RS for repeated themes was observed bilaterally, extending anteriorly along middle and superior temporal gyri into temporal pole, medially into insula, rostrally into inferior orbitofrontal cortex, and caudally into hippocampus and amygdala. Together, these brain substrates comprise a functional system within the larger AON. This strongly suggests that decoding meaning from expressive body movement constitutes a dimension of action representation not previously isolated in studies of action understanding. In the following, we argue that the embedding of this system within the AON is consistent with the hierarchical organization of the AON.

Body Language as Superordinate in a Hierarchical Action Observation Network

Previous investigations of action understanding have identified the AON as a key cognitive system for the organization of action in general, highlighting the fact that both performing and observing action rely on many of the same brain substrates ( Grafton, 2009 ; Ortigue et al., 2010 ; Kilner, 2011 ; Ogawa and Inui, 2011 ; Uithol et al., 2011 ; Grafton and Tipper, 2012 ). Shared brain substrates for controlling one's own action and understanding the actions of others are often taken as evidence of a "mirror neuron system" (MNS), following from physiological studies showing that cells in area F5 of the macaque monkey premotor cortex fired in response to both performing and observing goal-directed actions ( Pellegrino et al., 1992 ; Gallese et al., 1996 ; Rizzolatti et al., 1996a ). Since these initial observations in monkeys, there has been a tremendous effort to characterize a human analog of the MNS, and incorporate it into theories of not only action understanding, but also social cognition, language development, empathy, and neuropsychiatric disorders in which these faculties are compromised ( Gallese and Goldman, 1998 ; Rizzolatti and Arbib, 1998 ; Rizzolatti et al., 2001 ; Gallese, 2003 ; Gallese et al., 2004 ; Rizzolatti and Craighero, 2004 ; Iacoboni et al., 2005 ; Tettamanti et al., 2005 ; Dapretto et al., 2006 ; Iacoboni and Dapretto, 2006 ; Shapiro, 2008 ; Decety and Ickes, 2011 ). A fundamental assumption common to all such theories is that mirror neurons provide a direct neural mechanism for action understanding through "motor resonance," or the simulation of one's own motor programs for an observed action ( Jacob, 2008 ; Oosterhof et al., 2013 ). One proposed mechanism for action understanding through motor resonance is the embodiment of sensorimotor associations between action goals and specific motor behaviors ( Mitz et al., 1991 ; Niedenthal et al., 2005 ; McCall et al., 2012 ).
While the involvement of the motor system in a range of social, cognitive and affective domains is certainly worthy of focused investigation, and mirror neurons may well play an important role in supporting such “embodied cognition,” this by no means implies that mirror neurons alone can account for the ability to garner meaning from observed body movement.

Since the AON is a distributed cortical network that extends beyond motor-related brain substrates engaged during action observation, it is best characterized not as a homogeneous “mirroring” mechanism, but rather as a collection of functionally specific but interconnected modules that represent distinct properties of observed actions ( Grafton, 2009 ; Grafton and Tipper, 2012 ). The present results build on this functional-hierarchical model of the AON by incorporating meaningful expression as an inherent aspect of body movement that is decoded in distinct regions of the AON. In other words, the bilateral temporal-orbitofrontal regions that showed RS for repeated themes comprise a distinct functional module of the AON that supports an additional level of the action representation hierarchy. Such an interpretation is consistent with the idea that action representation is inherently nested, carried out within a hierarchy of part-whole processes for which higher levels depend on lower levels ( Cooper and Shallice, 2006 ; Botvinick, 2008 ; Grafton and Tipper, 2012 ). We propose that the meaning infused into the body movement of a person having a particular affective stance is decoded superordinately to more concrete properties of action, such as kinematics and object goals. Under this view, while decoding these representationally subordinate properties of action may involve motor-related brain substrates, decoding “body language” engages non-motor regions of the AON that link movement and meaning, relying on inputs from lower levels of the action representation hierarchy that provide information about movement kinematics, prosodic nuances, and dynamic inflections.

While the present results suggest that the temporal-orbitofrontal regions identified here as decoding meaning from emotive body movement constitute a distinct functional module within a hierarchically organized AON, it is important to note that these regions have not previously been included in anatomical descriptions of the AON. The present study, however, isolated a property of action representation that had not been previously investigated, so it is perhaps not surprising that it identified regions of the AON not previously included in its functional-anatomic definition. This underscores the important point that the AON is functionally defined, such that its apparent anatomical extent in a given experimental context depends upon the particular aspects of action representation that are engaged and isolable. Previous studies of another abstract property of action representation, namely intention understanding, also illustrate this point. Inferring the intentions of an actor engages medial prefrontal cortex, bilateral posterior superior temporal sulcus, and left temporo-parietal junction—non-motor regions of the brain typically associated with "mentalizing," or thinking about the mental states of another agent ( Ansuini et al., 2015 ; Ciaramidaro et al., 2014 ). A key finding of this research is that intention understanding depends fundamentally on the integration of motor-related ("mirroring") brain regions and non-motor ("mentalizing") brain regions ( Becchio et al., 2012 ). The present results parallel this finding, and point to the idea that in the context of action representation, motor and non-motor brain areas are not two separate brain networks, but rather one integrated functional system.

Predictive Coding and the Construction of Meaning in the Action Observation Network

A critical question raised by the idea that the temporal-orbitofrontal brain regions in which RS was observed here constitute a superordinate, meaning-sensitive functional module of the AON is how activity in subordinate AON modules is integrated at this higher level to produce differential neural firing patterns in response to different meaningful body expressions. That is, what are the neural mechanisms underlying the observed sensitivity to meaning in body language, and furthermore, why are these mechanisms subject to adaptation through repetition (RS)? While the present results do not provide direct evidence to answer these questions, we propose that a "predictive coding" interpretation provides a coherent model of action representation ( Brass et al., 2007 ; Kilner and Frith, 2008 ; Brown and Brüne, 2012 ) that yields useful predictions about the neural processes by which meaning is decoded, and about how those processes could account for the observed RS effect. The primary mechanism invoked by a predictive coding framework of action understanding is recurrent feed-forward and feedback processing across the various levels of the AON, which supports a Bayesian system of predictive neural coding, feedback processes, and prediction error reduction at each level of action representation ( Friston et al., 2011 ). According to this model, each level of the action observation hierarchy generates predictions to anticipate neural activity at lower levels of the hierarchy. Predictions in the form of neural codes are sent to lower levels through feedback connections, and compared with actual subordinate neural representations. Any discrepancy between neural predictions and actual representations is coded as prediction error. Information regarding prediction error is sent through recurrent feed-forward projections to superordinate regions, and used to update predictive priors such that subsequent prediction error is minimized.
Together, these Bayes-optimal neural ensemble operations converge on the most probable inference for representation at the superordinate level ( Friston et al., 2011 ) and, ultimately, action understanding based on the integration of representations at each level of the action observation hierarchy ( Chambon et al., 2011 ; Kilner, 2011 ).
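The recurrent generate-compare-update cycle described above can be caricatured in a few lines of code. This is a deliberately simplified toy sketch, not a claim about neural implementation: the templates, learning rate, and multiplicative update rule are illustrative assumptions. Its one relevant property is that a sharper prior over candidate meanings accrues less total prediction error before converging, which parallels the RS account developed below:

```python
import numpy as np

def infer_meaning(observed, templates, prior, n_iter=20, lr=0.5):
    """Toy predictive-coding loop.

    Infers which candidate 'meaning' (a row of `templates`, each a
    movement-feature vector) best explains an observed feature vector,
    by iteratively reducing prediction error.
    Returns the posterior over meanings and the total prediction error
    (a crude proxy for superordinate activity accrued during inference).
    """
    posterior = prior.copy()
    total_error = 0.0
    for _ in range(n_iter):
        prediction = posterior @ templates   # feedback: generative prediction
        error = observed - prediction        # feed-forward: prediction error
        total_error += np.sum(error**2)
        # Boost evidence for meanings whose templates reduce the error,
        # then renormalize (a Bayes-like multiplicative update).
        evidence = templates @ error
        posterior = posterior * np.exp(lr * evidence)
        posterior /= posterior.sum()
    return posterior, total_error
```

Running this with a flat prior versus a prior concentrated on the correct meaning shows the same qualitative pattern as RS: both converge on the right answer, but the sharper prior generates less cumulative prediction error along the way.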

A predictive coding account of the present results would suggest that initial feed-forward inputs from subordinate levels of the AON provided the superordinate temporal-orbitofrontal module with information regarding movement kinematics, prosody, gestural elements, and dynamic inflections, which, when integrated with other inputs based on prior experience, would provide a basis for an initial prediction about potential meanings of a body expression. This prediction would yield a generative neural model about the movement dynamics that would be expected given the predicted meaning of the observed body expression, which would be fed back to lower levels of the network that coded movement dynamics and sensorimotor associations. Predictive activity would be contrasted with actual representations as movement information was accrued throughout the performance, and the resulting prediction error would be utilized via feed-forward projections to temporal-orbitofrontal regions to update predictive codes regarding meaning and minimize subsequent prediction error. In this way, the meaningful affective theme being expressed by the performer would be converged upon through recurrent Bayes-optimal neural ensemble operations. Thus, meaning expressed through body language would be accrued iteratively in temporal-orbitofrontal regions by integrating neural representations of various facets of action decoded throughout the AON. Interestingly, and consistent with a model in which an iterative process accrued information over time, we observed that BOLD responses in AON regions peaked more slowly than expected based on SPM's canonical HRF as the videos were viewed over an extended (10 s) duration. 
Under an iterative predictive coding model, RS for repeated themes could be accounted for by reduced initial generative activity in temporal-orbitofrontal regions due to better constrained predictions about potential meanings conveyed by observed movement, more efficient convergence on an inference due to faster minimization of prediction error, or some combination of both of these mechanisms. The present results provide indirect evidence for the former account, in that more abstract, less constrained movement metaphors relied upon by expressive dance resulted in greater RS due to larger BOLD responses for novel themes relative to the more concrete, better-constrained associations conveyed by pantomime.

Shared Brain Substrates for Meaning in Action and Language

The middle temporal gyrus and superior temporal sulcus regions identified here as part of a functional module of the AON that "reads" body language have been linked previously to a variety of high-level linguistic domains related to understanding meaning. Among these are conceptual knowledge ( Lambon Ralph et al., 2009 ), language comprehension ( Hasson et al., 2006 ; Noppeney and Penny, 2006 ; Price, 2010 ), sensitivity to the congruency between intentions and actions, both verbal/conceptual ( Deen and McCarthy, 2010 ) and perceptual/implicit ( Wyk et al., 2009 ), as well as understanding abstract language and metaphorical descriptions of action ( Desai et al., 2011 ). While together these studies demonstrate that high-level linguistic processing involves bilateral superior and middle temporal regions, there is evidence for a general predominance of the left hemisphere in comprehending semantics ( Price, 2010 ), and a predominance of the right hemisphere in incorporating socio-emotional information and affective context ( Wyk et al., 2009 ). For example, brain atrophy associated with a primary progressive aphasia characterized by profound disturbances in semantic comprehension occurs bilaterally in anterior middle temporal regions, but is more pronounced in the left hemisphere ( Gorno-Tempini et al., 2004 ). In contrast, neural degeneration in right hemisphere orbitofrontal, insula, and anterior middle temporal regions is associated not only with semantic dementia but also deficits in socio-emotional sensitivity and regulation ( Rosen et al., 2005 ).

This hemispheric asymmetry in brain substrates associated with interpreting meaning in verbal language is paralleled in the present results, which not only link the same bilateral temporal-orbitofrontal brain substrates to comprehending meaning from affectively expressive body language, but also demonstrate a predominance of the left hemisphere in deciphering the particularly abstract movement metaphors conveyed by dance. This asymmetry was evident as greater RS for repeated themes for dance relative to pantomime, which was driven by a greater initial activation for novel themes, suggesting that these left-hemisphere regions were engaged more vigorously when decoding more abstract movement metaphors. Together, these results illustrate a striking overlap in the brain substrates involved in processing meaning in verbal language and decoding meaning from expressive body movement. This overlap suggests that a long-hypothesized evolutionary link between gestural body movement and language ( Hewes et al., 1973 ; Harnad et al., 1976 ; Rizzolatti and Arbib, 1998 ; Corballis, 2003 ) may be instantiated by a network of shared brain substrates for representing semiotic structure, which constitutes the informational scaffolding for building meaning in both language and gesture ( Lemke, 1987 ; Freeman, 1997 ; McNeill, 2012 ; Lhommet and Marsella, 2013 ). While speculative, under this view the temporal-orbitofrontal AON module for coding meaning observed here may provide a neural basis for semiosis (the construction of meaning), which would lend support to the intriguing philosophical argument that meaning is fundamentally grounded in processes of the body, brain, and the social environment within which they are immersed ( Thibault, 2004 ).

Summary and Conclusions

The present results identify a system of temporal, orbitofrontal, insula, and amygdala brain regions that supports the meaningful interpretation of expressive body language. We propose that these areas reveal a previously undefined superordinate functional module within a known, stratified hierarchical brain network for action representation. The findings are consistent with a predictive coding model of action understanding, wherein the meaning that is imbued into expressive body movements through subtle kinematics and prosodic nuances is decoded as a distinct property of action via feed-forward and feedback processing across the levels of a hierarchical AON. Under this view, recurrent processing loops integrate lower-level representations of movement dynamics and socio-affective perceptual information to generate, evaluate, and update predictive inferences about expressive content that are mediated in a superordinate temporal-orbitofrontal module of the AON. Thus, while lower-level action representation in motor-related brain areas (sometimes referred to as a human “mirror neuron system”) may be a key step in the construction of meaning from movement, it is not these motor areas that code the specific meaning of an expressive body movement. Rather, we have demonstrated an additional level of the cortical action representation hierarchy in non-motor regions of the AON. The results highlight an important link between action representation and language, and point to the possibility of shared brain substrates for constructing meaning in both domains.

Author Contributions

CT, GS, and SG designed the experiment. CT and GS created stimuli, which included recruiting professional dancers and filming expressive movement sequences. GS carried out video editing. CT completed computer programming for experimental control and data analysis. GS and CT recruited participants and conducted behavioral and fMRI testing. CT and SG designed the data analysis and CT and GS carried it out. GS conducted a literature review, and CT wrote the paper with reviews and edits from SG.

Conflict of Interest Statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.


Funding

Research supported by the James S. McDonnell Foundation.

Supplementary Material

The Supplementary Material for this article can be found online at: http://dx.doi.org/10.6084/m9.figshare.1508616

Adolphs, R., Damasio, H., Tranel, D., Cooper, G., and Damasio, A. R. (2000). A role for somatosensory cortices in the visual recognition of emotion as revealed by three-dimensional lesion mapping. J. Neurosci. 20, 2683–2690.

Adolphs, R., Tranel, D., and Damasio, A. R. (2003). Dissociable neural systems for recognizing emotions. Brain Cogn. 52, 61–69. doi: 10.1016/S0278-2626(03)00009-5

Allison, T., Puce, A., and McCarthy, G. (2000). Social perception from visual cues: role of the STS region. Trends Cogn. Sci. 4, 267–278. doi: 10.1016/S1364-6613(00)01501-1

Anderson, S. W., Bechara, A., Damasio, H., Tranel, D., and Damasio, A. R. (1999). Impairment of social and moral behavior related to early damage in human prefrontal cortex. Nat. Neurosci. 2, 1032–1037. doi: 10.1038/14833

Ansuini, C., Cavallo, A., Bertone, C., and Becchio, C. (2015). Intentions in the brain: the unveiling of Mister Hyde. Neuroscientist 21, 126–135. doi: 10.1177/1073858414533827

Astafiev, S. V., Stanley, C. M., Shulman, G. L., and Corbetta, M. (2004). Extrastriate body area in human occipital cortex responds to the performance of motor actions. Nat. Neurosci. 7, 542–548. doi: 10.1038/nn1241

Becchio, C., Cavallo, A., Begliomini, C., Sartori, L., Feltrin, G., and Castiello, U. (2012). Social grasping: from mirroring to mentalizing. Neuroimage 61, 240–248. doi: 10.1016/j.neuroimage.2012.03.013

Bechara, A., and Damasio, A. R. (2005). The somatic marker hypothesis: a neural theory of economic decision. Games Econ. Behav. 52, 336–372. doi: 10.1016/j.geb.2004.06.010

Bechara, A., Damasio, H., and Damasio, A. R. (2003). Role of the amygdala in decision making. Ann. N.Y. Acad. Sci. 985, 356–369. doi: 10.1111/j.1749-6632.2003.tb07094.x

Bertenthal, B. I., Longo, M. R., and Kosobud, A. (2006). Imitative response tendencies following observation of intransitive actions. J. Exp. Psychol. 32, 210–225. doi: 10.1037/0096-1523.32.2.210

Bonda, E., Petrides, M., Ostry, D., and Evans, A. (1996). Specific involvement of human parietal systems and the amygdala in the perception of biological motion. J. Neurosci. 16, 3737–3744.

Botvinick, M. M. (2008). Hierarchical models of behavior and prefrontal function. Trends Cogn. Sci. 12, 201–208. doi: 10.1016/j.tics.2008.02.009

Brainard, D. H. (1997). The psychophysics toolbox. Spat. Vis. 10, 433–436. doi: 10.1163/156856897X00357

Brass, M., Schmitt, R. M., Spengler, S., and Gergely, G. (2007). Investigating action understanding: inferential processes versus action simulation. Curr. Biol. 17, 2117–2121. doi: 10.1016/j.cub.2007.11.057

Brown, E. C., and Brüne, M. (2012). The role of prediction in social neuroscience. Front. Hum. Neurosci . 6:147. doi: 10.3389/fnhum.2012.00147

Buccino, G., Binkofski, F., Fink, G. R., Fadiga, L., Fogassi, L., Gallese, V., et al. (2001). Action observation activates premotor and parietal areas in a somatotopic manner: an fMRI study. Eur. J. Neurosci. 13, 400–404. doi: 10.1046/j.1460-9568.2001.01385.x

Calvo-Merino, B., Glaser, D. E., Grèzes, J., Passingham, R. E., and Haggard, P. (2005). Action observation and acquired motor skills: an FMRI study with expert dancers. Cereb. Cortex 15, 1243. doi: 10.1093/cercor/bhi007

Cardinal, R. N., Parkinson, J. A., Hall, J., and Everitt, B. J. (2002). Emotion and motivation: the role of the amygdala, ventral striatum, and prefrontal cortex. Neurosci. Biobehav. Rev. 26, 321–352. doi: 10.1016/S0149-7634(02)00007-6

Chambon, V., Domenech, P., Pacherie, E., Koechlin, E., Baraduc, P., and Farrer, C. (2011). What are they up to? The role of sensory evidence and prior knowledge in action understanding. PLoS ONE 6:e17133. doi: 10.1371/journal.pone.0017133

Ciaramidaro, A., Becchio, C., Colle, L., Bara, B. G., and Walter, H. (2014). Do you mean me? Communicative intentions recruit the mirror and the mentalizing system. Soc. Cogn. Affect. Neurosci . 9, 909–916. doi: 10.1093/scan/nst062

Cooper, R. P., and Shallice, T. (2006). Hierarchical schemas and goals in the control of sequential behavior. Psychol. Rev. 113, 887–916. discussion 917–931. doi: 10.1037/0033-295x.113.4.887

Corballis, M. C. (2003). “From hand to mouth: the gestural origins of language,” in Language Evolution: The States of the Art , eds M. H. Christiansen and S. Kirby (Oxford University Press). Available online at: http://groups.lis.illinois.edu/amag/langev/paper/corballis03fromHandToMouth.html

Cross, E. S., Hamilton, A. F. C., and Grafton, S. T. (2006). Building a motor simulation de novo : observation of dance by dancers. Neuroimage 31, 1257–1267. doi: 10.1016/j.neuroimage.2006.01.033

Cross, E. S., Kraemer, D. J. M., Hamilton, A. F. D. C., Kelley, W. M., and Grafton, S. T. (2009). Sensitivity of the action observation network to physical and observational learning. Cereb. Cortex 19, 315. doi: 10.1093/cercor/bhn083

Damasio, A. R., Grabowski, T. J., Bechara, A., Damasio, H., Ponto, L. L., Parvizi, J., et al. (2000). Subcortical and cortical brain activity during the feeling of self-generated emotions. Nat. Neurosci. 3, 1049–1056. doi: 10.1038/79871

Dapretto, M., Davies, M. S., Pfeifer, J. H., Scott, A. A., Sigman, M., Bookheimer, S. Y., et al. (2006). Understanding emotions in others: mirror neuron dysfunction in children with autism spectrum disorders. Nat. Neurosci. 9, 28–30. doi: 10.1038/nn1611

Dean, P., Redgrave, P., and Westby, G. W. M. (1989). Event or emergency? Two response systems in the mammalian superior colliculus. Trends Neurosci . 12, 137–147. doi: 10.1016/0166-2236(89)90052-0

Decety, J., and Ickes, W. (2011). The Social Neuroscience of Empathy . Cambridge, MA: MIT Press.

Deen, B., and McCarthy, G. (2010). Reading about the actions of others: biological motion imagery and action congruency influence brain activity. Neuropsychologia 48, 1607–1615. doi: 10.1016/j.neuropsychologia.2010.01.028

de Gelder, B. (2006). Towards the neurobiology of emotional body language. Nat. Rev. Neurosci. 7, 242–249. doi: 10.1038/nrn1872

de Gelder, B., and Hadjikhani, N. (2006). Non-conscious recognition of emotional body language. Neuroreport 17, 583. doi: 10.1097/00001756-200604240-00006

Desai, R. H., Binder, J. R., Conant, L. L., Mano, Q. R., and Seidenberg, M. S. (2011). The neural career of sensory-motor metaphors. J. Cogn. Neurosci. 23, 2376–2386. doi: 10.1162/jocn.2010.21596

Desikan, R. S., Ségonne, F., Fischl, B., Quinn, B. T., Dickerson, B. C., Blacker, D., et al. (2006). An automated labeling system for subdividing the human cerebral cortex on MRI scans into gyral based regions of interest. Neuroimage 31, 968–980. doi: 10.1016/j.neuroimage.2006.01.021

Eickhoff, S. B., Heim, S., Zilles, K., and Amunts, K. (2006). Testing anatomically specified hypotheses in functional imaging using cytoarchitectonic maps. Neuroimage 32, 570–582. doi: 10.1016/j.neuroimage.2006.04.204

Eickhoff, S. B., Paus, T., Caspers, S., Grosbras, M. H., Evans, A. C., Zilles, K., et al. (2007). Assignment of functional activations to probabilistic cytoarchitectonic areas revisited. Neuroimage 36, 511–521. doi: 10.1016/j.neuroimage.2007.03.060

Eickhoff, S. B., Stephan, K. E., Mohlberg, H., Grefkes, C., Fink, G. R., Amunts, K., et al. (2005). A new SPM toolbox for combining probabilistic cytoarchitectonic maps and functional imaging data. Neuroimage 25, 1325–1335. doi: 10.1016/j.neuroimage.2004.12.034

Fadiga, L., Fogassi, L., Gallese, V., and Rizzolatti, G. (2000). Visuomotor neurons: ambiguity of the discharge or motor perception? Int. J. Psychophysiol. 35, 165–177. doi: 10.1016/S0167-8760(99)00051-3

Ferrari, P. F., Gallese, V., Rizzolatti, G., and Fogassi, L. (2003). Mirror neurons responding to the observation of ingestive and communicative mouth actions in the monkey ventral premotor cortex. Eur. J. Neurosci. 17, 1703–1714. doi: 10.1046/j.1460-9568.2003.02601.x

Frazier, J. A., Chiu, S., Breeze, J. L., Makris, N., Lange, N., Kennedy, D. N., et al. (2005). Structural brain magnetic resonance imaging of limbic and thalamic volumes in pediatric bipolar disorder. Am. J. Psychiatry 162, 1256–1265. doi: 10.1176/appi.ajp.162.7.1256

Freeman, W. J. (1997). A neurobiological interpretation of semiotics: meaning vs. representation. IEEE Int. Conf. Syst. Man Cybern. Comput. Cybern. Simul. 2, 93–102. doi: 10.1109/ICSMC.1997.638197

Frey, S. H., and Gerry, V. E. (2006). Modulation of neural activity during observational learning of actions and their sequential orders. J. Neurosci. 26, 13194–13201. doi: 10.1523/JNEUROSCI.3914-06.2006

Friederici, A. D., Rüschemeyer, S.-A., Hahne, A., and Fiebach, C. J. (2003). The role of left inferior frontal and superior temporal cortex in sentence comprehension: localizing syntactic and semantic processes. Cereb. Cortex 13, 170–177. doi: 10.1093/cercor/13.2.170

Friston, K., Mattout, J., and Kilner, J. (2011). Action understanding and active inference. Biol. Cybern. 104, 137–160. doi: 10.1007/s00422-011-0424-z

Gallese, V. (2003). The roots of empathy: the shared manifold hypothesis and the neural basis of intersubjectivity. Psychopathology 36, 171–180. doi: 10.1159/000072786

Gallese, V., Fadiga, L., Fogassi, L., and Rizzolatti, G. (1996). Action recognition in the premotor cortex. Brain 119, 593. doi: 10.1093/brain/119.2.593

Gallese, V., and Goldman, A. (1998). Mirror neurons and the simulation theory of mind-reading. Trends Cogn. Sci. 2, 493–501. doi: 10.1016/S1364-6613(98)01262-5

Gallese, V., Keysers, C., and Rizzolatti, G. (2004). A unifying view of the basis of social cognition. Trends Cogn. Sci. 8, 396–403. doi: 10.1016/j.tics.2004.07.002

Goldstein, J. M., Seidman, L. J., Makris, N., Ahern, T., O'Brien, L. M., Caviness, V. S., et al. (2007). Hypothalamic abnormalities in Schizophrenia: sex effects and genetic vulnerability. Biol. Psychiatry 61, 935–945. doi: 10.1016/j.biopsych.2006.06.027

Golub, G. H., and Reinsch, C. (1970). Singular value decomposition and least squares solutions. Numer. Math. 14, 403–420. doi: 10.1007/BF02163027

Gorno-Tempini, M. L., Dronkers, N. F., Rankin, K. P., Ogar, J. M., Phengrasamy, L., Rosen, H. J., et al. (2004). Cognition and anatomy in three variants of primary progressive aphasia. Ann. Neurol. 55, 335–346. doi: 10.1002/ana.10825

Grafton, S. T. (2009). Embodied cognition and the simulation of action to understand others. Ann. N.Y. Acad. Sci. 1156, 97–117. doi: 10.1111/j.1749-6632.2009.04425.x

Grafton, S. T., Arbib, M. A., Fadiga, L., and Rizzolatti, G. (1996). Localization of grasp representations in humans by positron emission tomography. Exp. Brain Res. 112, 103–111. doi: 10.1007/BF00227183

Grafton, S. T., and Tipper, C. M. (2012). Decoding intention: a neuroergonomic perspective. Neuroimage 59, 14–24. doi: 10.1016/j.neuroimage.2011.05.064

Grèzes, J., and Decety, J. (2001). Functional anatomy of execution, mental simulation, observation, and verb generation of actions: a meta-analysis. Hum. Brain Mapp. 12, 1–19. doi: 10.1002/1097-0193(200101)12:1<1::AID-HBM10>3.0.CO;2-V

Grezes, J., Fonlupt, P., Bertenthal, B., Delon-Martin, C., Segebarth, C., Decety, J., et al. (2001). Does perception of biological motion rely on specific brain regions? Neuroimage 13, 775–785. doi: 10.1006/nimg.2000.0740

Grill-Spector, K., Henson, R., and Martin, A. (2006). Repetition and the brain: neural models of stimulus-specific effects. Trends Cogn. Sci. 10, 14–23. doi: 10.1016/j.tics.2005.11.006

Grill-Spector, K., and Malach, R. (2001). fMR-adaptation: a tool for studying the functional properties of human cortical neurons. Acta Psychol. 107, 293–321. doi: 10.1016/S0001-6918(01)00019-1

Hadjikhani, N., and de Gelder, B. (2003). Seeing fearful body expressions activates the fusiform cortex and amygdala. Curr. Biol. 13, 2201–2205. doi: 10.1016/j.cub.2003.11.049

Hamilton, A. F. C., and Grafton, S. T. (2006). Goal representation in human anterior intraparietal sulcus. J. Neurosci. 26, 1133. doi: 10.1523/JNEUROSCI.4551-05.2006

Hamilton, A. F. D. C., and Grafton, S. T. (2008). Action outcomes are represented in human inferior frontoparietal cortex. Cereb. Cortex 18, 1160–1168. doi: 10.1093/cercor/bhm150

Hamilton, A. F., and Grafton, S. T. (2007). “The motor hierarchy: from kinematics to goals and intentions,” in Sensorimotor Foundations of Higher Cognition: Attention and Performance , Vol. 22, eds P. Haggard, Y. Rossetti, and M. Kawato (Oxford: Oxford University Press), 381–402.

Hari, R., Forss, N., Avikainen, S., Kirveskari, E., Salenius, S., and Rizzolatti, G. (1998). Activation of human primary motor cortex during action observation: a neuromagnetic study. Proc. Natl. Acad. Sci. U.S.A. 95, 15061–15065. doi: 10.1073/pnas.95.25.15061

Harnad, S. R., Steklis, H. D., and Lancaster, J. (eds.). (1976). “Origins and evolution of language and speech,” in Annals of the New York Academy of Sciences (New York, NY: New York Academy of Sciences), 280.

Hasson, U., Nusbaum, H. C., and Small, S. L. (2006). Repetition suppression for spoken sentences and the effect of task demands. J. Cogn. Neurosci. 18, 2013–2029. doi: 10.1162/jocn.2006.18.12.2013

Haxby, J. V., Hoffman, E. A., and Gobbini, M. I. (2002). Human neural systems for face recognition and social communication. Biol. Psychiatry 51, 59–67. doi: 10.1016/S0006-3223(01)01330-0

Hewes, G. W., Andrew, R. J., Carini, L., Choe, H., Gardner, R. A., Kortlandt, A., et al. (1973). Primate communication and the gestural origin of language [and comments and reply]. Curr. Anthropol. 14, 5–24. doi: 10.1086/201401

Hoffman, E. A., and Haxby, J. V. (2000). Distinct representations of eye gaze and identity in the distributed human neural system for face perception. Nat. Neurosci. 3, 80–84. doi: 10.1038/71152

Iacoboni, M., and Dapretto, M. (2006). The mirror neuron system and the consequences of its dysfunction. Nat. Rev. Neurosci. 7, 942–951. doi: 10.1038/nrn2024

Iacoboni, M., Molnar-Szakacs, I., Gallese, V., Buccino, G., Mazziotta, J. C., and Rizzolatti, G. (2005). Grasping the intentions of others with one's own mirror neuron system. PLoS Biol. 3:e79. doi: 10.1371/journal.pbio.0030079

Jacob, P. (2008). What do mirror neurons contribute to human social cognition? Mind Lang. 23, 190–223. doi: 10.1111/j.1468-0017.2007.00337.x

Jenkinson, M., Beckmann, C. F., Behrens, T. E. J., Woolrich, M. W., and Smith, S. M. (2012). FSL. Neuroimage 62, 782–790. doi: 10.1016/j.neuroimage.2011.09.015

Kilner, J. M. (2011). More than one pathway to action understanding. Trends Cogn. Sci. 15, 352–357. doi: 10.1016/j.tics.2011.06.005

Kilner, J. M., and Frith, C. D. (2008). Action observation: inferring intentions without mirror neurons. Curr. Biol. 18, R32–R33. doi: 10.1016/j.cub.2007.11.008

Kirchhoff, B. A., Wagner, A. D., Maril, A., and Stern, C. E. (2000). Prefrontal-temporal circuitry for episodic encoding and subsequent memory. J. Neurosci. 20, 6173–6180.

Lambon Ralph, M. A., Pobric, G., and Jefferies, E. (2009). Conceptual knowledge is underpinned by the temporal pole bilaterally: convergent evidence from rTMS. Cereb. Cortex 19, 832–838. doi: 10.1093/cercor/bhn131

Lemke, J. L. (1987). “Strategic deployment of speech and action: a sociosemiotic analysis,” in Semiotics 1983: Proceedings of the Semiotic Society of America ‘Snowbird’ Conference , eds J. Evans and J. Deely (Lanham, MD: University Press of America), 67–79.

Lhommet, M., and Marsella, S. C. (2013). “Gesture with meaning,” in Intelligent Virtual Agents , eds Y. Nakano, M. Neff, A. Paiva, and M. Walker (Berlin; Heidelberg: Springer), 303–312. doi: 10.1007/978-3-642-40415-3_27


Makris, N., Goldstein, J. M., Kennedy, D., Hodge, S. M., Caviness, V. S., Faraone, S. V., et al. (2006). Decreased volume of left and total anterior insular lobule in schizophrenia. Schizophr. Res. 83, 155–171. doi: 10.1016/j.schres.2005.11.020

McCall, C., Tipper, C. M., Blascovich, J., and Grafton, S. T. (2012). Attitudes trigger motor behavior through conditioned associations: neural and behavioral evidence. Soc. Cogn. Affect. Neurosci. 7, 841–849. doi: 10.1093/scan/nsr057

McCarthy, G., Puce, A., Gore, J. C., and Allison, T. (1997). Face-specific processing in the human fusiform gyrus. J. Cogn. Neurosci. 9, 605–610. doi: 10.1162/jocn.1997.9.5.605

McNeill, D. (2012). How Language Began: Gesture and Speech in Human Evolution . Cambridge: Cambridge University Press.

Morris, D. (2002). Peoplewatching: The Desmond Morris Guide to Body Language . New York, NY: Vintage Books.

Ni, W., Constable, R. T., Mencl, W. E., Pugh, K. R., Fulbright, R. K., Shaywitz, S. E., et al. (2000). An event-related neuroimaging study distinguishing form and content in sentence processing. J. Cogn. Neurosci. 12, 120–133. doi: 10.1162/08989290051137648

Nichols, T. E. (2012). Multiple testing corrections, nonparametric methods, and random field theory. Neuroimage 62, 811–815. doi: 10.1016/j.neuroimage.2012.04.014

Niedenthal, P. M., Barsalou, L. W., Winkielman, P., Krauth-Gruber, S., and Ric, F. (2005). Embodiment in attitudes, social perception, and emotion. Personal. Soc. Psychol. Rev. 9, 184–211. doi: 10.1207/s15327957pspr0903_1

Noppeney, U., and Penny, W. D. (2006). Two approaches to repetition suppression. Hum. Brain Mapp. 27, 411–416. doi: 10.1002/hbm.20242

Ogawa, K., and Inui, T. (2011). Neural representation of observed actions in the parietal and premotor cortex. Neuroimage 56, 728–735. doi: 10.1016/j.neuroimage.2010.10.043

Ollinger, J. M., Shulman, G. L., and Corbetta, M. (2001). Separating processes within a trial in event-related functional MRI: II. Analysis. Neuroimage 13, 218–229. doi: 10.1006/nimg.2000.0711

Oosterhof, N. N., Tipper, S. P., and Downing, P. E. (2013). Crossmodal and action-specific: neuroimaging the human mirror neuron system. Trends Cogn. Sci. 17, 311–338. doi: 10.1016/j.tics.2013.04.012

Ortigue, S., Sinigaglia, C., Rizzolatti, G., Grafton, S. T., and Rochelle, E. T. (2010). Understanding actions of others: the electrodynamics of the left and right hemispheres. A high-density EEG neuroimaging study. PLoS ONE 5:e12160. doi: 10.1371/journal.pone.0012160

Peelen, M. V., and Downing, P. E. (2005). Selectivity for the human body in the fusiform gyrus. J. Neurophysiol. 93, 603–608. doi: 10.1152/jn.00513.2004

di Pellegrino, G., Fadiga, L., Fogassi, L., Gallese, V., and Rizzolatti, G. (1992). Understanding motor events: a neurophysiological study. Exp. Brain Res. 91, 176–180. doi: 10.1007/BF00230027

Pelli, D. G., and Brainard, D. H. (1997). The VideoToolbox software for visual psychophysics: transforming numbers into movies. Spat. Vis. 10, 433–436. doi: 10.1163/156856897X00366

Pelphrey, K. A., Morris, J. P., Michelich, C. R., Allison, T., and McCarthy, G. (2005). Functional anatomy of biological motion perception in posterior temporal cortex: an fMRI study of eye, mouth, and hand movements. Cereb. Cortex 15, 1866–1876. doi: 10.1093/cercor/bhi064

Poldrack, R. A., Mumford, J. A., and Nichols, T. E. (2011). Handbook of Functional MRI Data Analysis . New York, NY: Cambridge University Press. doi: 10.1017/cbo9780511895029

Price, C. J. (2010). The anatomy of language: a review of 100 fMRI studies published in 2009. Ann. N.Y. Acad. Sci. 1191, 62–88. doi: 10.1111/j.1749-6632.2010.05444.x

Mitz, A. R., Godschalk, M., and Wise, S. P. (1991). Learning-dependent neuronal activity in the premotor cortex: activity during the acquisition of conditional motor associations. J. Neurosci. 11, 1855–1872.

Rizzolatti, G., and Arbib, M. A. (1998). Language within our grasp. Trends Neurosci. 21, 188–194. doi: 10.1016/S0166-2236(98)01260-0

Rizzolatti, G., and Craighero, L. (2004). The mirror-neuron system. Annu. Rev. Neurosci. 27, 169–192. doi: 10.1146/annurev.neuro.27.070203.144230

Rizzolatti, G., Fadiga, L., Gallese, V., and Fogassi, L. (1996a). Premotor cortex and the recognition of motor actions. Cogn. Brain Res. 3, 131–141. doi: 10.1016/0926-6410(95)00038-0

Rizzolatti, G., Fadiga, L., Matelli, M., Bettinardi, V., Paulesu, E., Perani, D., et al. (1996b). Localization of grasp representations in humans by PET: 1. Observation versus execution. Exp. Brain Res. 111, 246–252. doi: 10.1007/BF00227301

Rizzolatti, G., Fogassi, L., and Gallese, V. (2001). Neurophysiological mechanisms underlying the understanding and imitation of action. Nat. Rev. Neurosci. 2, 661–670. doi: 10.1038/35090060

Rosen, H. J., Allison, S. C., Schauer, G. F., Gorno-Tempini, M. L., Weiner, M. W., and Miller, B. L. (2005). Neuroanatomical correlates of behavioural disorders in dementia. Brain 128, 2612–2625. doi: 10.1093/brain/awh628

Sah, P., Faber, E. S. L., De Armentia, M. L., and Power, J. (2003). The amygdaloid complex: anatomy and physiology. Physiol. Rev. 83, 803–834. doi: 10.1152/physrev.00002.2003

Shapiro, L. (2008). Making sense of mirror neurons. Synthese 167, 439–456. doi: 10.1007/s11229-008-9385-8

Smith, S. M., Jenkinson, M., Woolrich, M. W., Beckmann, C. F., Behrens, T. E. J., Johansen-Berg, H., et al. (2004). Advances in functional and structural MR image analysis and implementation as FSL. Neuroimage 23(Suppl. 1), S208–S219. doi: 10.1016/j.neuroimage.2004.07.051

Tettamanti, M., Buccino, G., Saccuman, M. C., Gallese, V., Danna, M., Scifo, P., et al. (2005). Listening to action-related sentences activates fronto-parietal motor circuits. J. Cogn. Neurosci. 17, 273–281. doi: 10.1162/0898929053124965

Thibault, P. (2004). Brain, Mind and the Signifying Body: An Ecosocial Semiotic Theory . London: A&C Black.

Tunik, E., Rice, N. J., Hamilton, A. F., and Grafton, S. T. (2007). Beyond grasping: representation of action in human anterior intraparietal sulcus. Neuroimage 36, T77–T86. doi: 10.1016/j.neuroimage.2007.03.026

Uithol, S., van Rooij, I., Bekkering, H., and Haselager, P. (2011). Understanding motor resonance. Soc. Neurosci. 6, 388–397. doi: 10.1080/17470919.2011.559129

Ulloa, E. R., and Pineda, J. A. (2007). Recognition of point-light biological motion: Mu rhythms and mirror neuron activity. Behav. Brain Res. 183, 188–194. doi: 10.1016/j.bbr.2007.06.007

Urgesi, C., Candidi, M., Ionta, S., and Aglioti, S. M. (2006). Representation of body identity and body actions in extrastriate body area and ventral premotor cortex. Nat. Neurosci. 10, 30–31. doi: 10.1038/nn1815

Van Essen, D. C. (2005). A Population-Average, Landmark- and Surface-based (PALS) atlas of human cerebral cortex. Neuroimage 28, 635–662. doi: 10.1016/j.neuroimage.2005.06.058

Van Essen, D. C., Drury, H. A., Dickson, J., Harwell, J., Hanlon, D., and Anderson, C. H. (2001). An integrated software suite for surface-based analyses of cerebral cortex. J. Am. Med. Inform. Assoc. 8, 443–459. doi: 10.1136/jamia.2001.0080443

Wiggs, C. L., and Martin, A. (1998). Properties and mechanisms of perceptual priming. Curr. Opin. Neurobiol. 8, 227–233. doi: 10.1016/S0959-4388(98)80144-X

Vander Wyk, B. C., Hudac, C. M., Carter, E. J., Sobel, D. M., and Pelphrey, K. A. (2009). Action understanding in the superior temporal sulcus region. Psychol. Sci. 20, 771–777. doi: 10.1111/j.1467-9280.2009.02359.x

Zentgraf, K., Stark, R., Reiser, M., Künzell, S., Schienle, A., Kirsch, P., et al. (2005). Differential activation of pre-SMA and SMA proper during action observation: effects of instructions. Neuroimage 26, 662–672. doi: 10.1016/j.neuroimage.2005.02.015

Keywords: action observation, dance, social neuroscience, fMRI, repetition suppression, predictive coding

Citation: Tipper CM, Signorini G and Grafton ST (2015) Body language in the brain: constructing meaning from expressive movement. Front. Hum. Neurosci . 9:450. doi: 10.3389/fnhum.2015.00450

Received: 28 March 2015; Accepted: 28 July 2015; Published: 21 August 2015.


Copyright © 2015 Tipper, Signorini and Grafton. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Christine M. Tipper, Mental Health and Integrated Neurobehavioral Development Research Core, Child and Family Research Institute, 3rd Floor - 938 West 28th Avenue, Vancouver, BC V5Z 4H4, Canada, [email protected]

