Do Blind People Develop Facial Expressions?
Sabrina L. Strong
Eastern Mennonite University
Are we born knowing facial expressions, or are they something we learn as we grow up? How do we perceive others’ facial expressions? Studies of blind versus sighted children show similarities in spontaneous facial expression, supporting the idea that facial expressions are genetic: we are born knowing how to make certain faces for emotions and how to perceive others’ facial expressions. These studies also reveal a flip side to this theory, however, in that environmental factors produce differences in facial expressions as well.
Other research has shown that many other factors play a role in learning how to perceive and display facial expressions. Studies of the comprehension deficits experienced by children with language impairment show significant results. Brain injuries, autism, prosopagnosia, and cultural differences all play a role in how one learns to understand facial expressions.
The amygdala was found to be the part of the brain most responsible for learning and understanding facial expressions. Studies have shown activity in this area specifically when subjects are asked to classify or look at certain facial expressions.
How do we learn facial expressions? How do we know that someone’s facial expression for a happy emotion is the same as the next person’s? Are all facial expressions universal, or does every culture have different ways of expressing emotions? People with impairments may perceive facial expressions differently than most others do. According to Sternberg (2003), a mental percept is a mental representation of a stimulus that is perceived. Is everyone’s mental percept of facial expressions the same? Sternberg also notes that for face perception to be a modular process, we need additional evidence of domain specificity and informational encapsulation.
Darwin was the first to research facial expressions and their importance. Studies have shown that perceiving facial expressions is a combination of nature and nurture. Faces, second perhaps only to language in conveying critical social information, are a key channel for social communication. The ability to use the face as a channel for social communication develops early in infancy (Adolphs, Sears, & Piven, 2001). Children as young as two months old can respond to facial expressions such as happy and sad (Ford & Milosky, 2003). Many factors affect how one learns to perceive facial expressions. One large debate is nature versus nurture. Other key factors are language impairment, blindness versus sightedness, brain injuries, developmental disorders such as autism, and psychopathy.
After one learns how to perceive facial expressions, can children be taught to understand them better, even across cultures?
The information that is sensed from a face allows for an immediate assessment of the person’s mood. The process of decoding the emotion in faces can inform models of perceptual and cognitive face processing. Information about one’s emotional state is shown through the same features as everyone else’s; however, it is assumed that there is a higher level of processing that allows for separate analysis of identity and emotion. If different levels of processing are needed to identify the emotion of a face, then one must first recognize a face, then recognize that the expression shows an emotion, and then figure out the specific emotion (Prkachin, 2003).
Sternberg (2003) explains that prosopagnosics have deficits that prevent them from perceiving human faces at all. He goes on to explain that right-hemisphere activity is strongly implicated in prosopagnosia; however, we need more evidence about the neurological source of prosopagnosia before anything can be said about the neural design of facial recognition. What part of the brain is responsible for perceiving facial expressions? Some studies show that the categorical awareness of human facial expressions has a perceptual basis in the bilateral occipito-temporal regions (Guerit, 2002). Other studies have suggested that the amygdala may play a role in this function, but we will first look at other factors.
One study found that facial expressions in blind children from ages eight to eleven were the same as sighted children’s. A single-case observation also noted that some spontaneous emotional expressions, such as smiling, are present in blind infants.
Genetics play a large part in who we are and what we do. Ortega, Iglesias, Fernandez, and Corraliza (2003) used the Facial Action Coding System to describe facial expressions objectively. This system allows one to record each facial movement in terms of so-called action units. With this system, Ortega et al. (2003) studied twenty-two congenitally blind children and adolescents ages seven to thirteen, monitoring their voluntary and spontaneous facial expressions.
Voluntary expressions are produced by the subject at the request of the experimenter and do not necessarily reflect an inner feeling or emotion. Spontaneous facial expressions are evoked by an authentic emotional sensation. An intriguing result came from this study. Comparing the facial expressions of the blind children to those of the sighted children, Ortega et al. (2003) found that the spontaneous expressions were comparable but that there were differences in voluntary expressions. These conclusions support Ortega et al.’s study and put forward two factors that influence the development of facial expression: first, a genetic factor that accounts for the similarities between the two groups in spontaneous expressions, and second, an environmental factor that accounts for the dissimilarities in voluntary expressions.
Galati and his colleagues investigated the genetic factors in developing facial expressions in greater depth. In two of his studies, participants’ spontaneous facial expressions were filmed in seven specific situations set up to evoke different emotions: joy, sadness, anger, fear, disgust, surprise, and interest. Subjective judgments and objective coding were the two measurements used in analyzing the emotions (Ortega et al., 2003).
Objective coding was measured using the Maximally Discriminative Facial Movement Coding System, which was developed to code patterns of facial movements. Subjective judgments were assessed with four different scales: two categorical scales that required the coders to connect each expression either with one of the seven situations or with an alternative, freely chosen emotion label, and two dimensional scales measuring the levels of activation and the degrees of pleasure or disgust shown by the children’s faces (Ortega et al., 2003).
The outcome of this study was that the facial expressions of the sighted children and the blind children were similar, with few differences. Interestingly, the blind children contracted their eyes and opened their mouths more intensely than the sighted children did. The differences found were that the blind children’s expressions were judged as more intense and as showing less pleasure than the sighted children’s expressions. This was even more obvious in the second study with the older children. Also, some of the sighted children’s negative facial expressions were judged as positive expressions; this was not the case with the blind children. One reason for this variation may be that the sighted children had already learned to control, or disguise, their expressions, mostly negative ones, while the blind children are less able to do this. This is where the environmental factors come into play.
As children get older, they undergo a process of progressive socialization through the acceptance of socially shared communicative codes, influenced by what are called display rules. These rules establish which emotions may be expressed in which situations. People learn to mask emotions and to control when they do or do not show expressions. It is reasonable to presume that blind children do not have this capability, especially since they cannot see the visual feedback from others in different situations where different emotions are displayed. Rinn (1991) believes the control and tone of facial expressions are based on a cortical process and are more likely to be influenced by learning, while spontaneous expressions are triggered by subcortical motor centers that are joined with biologically rooted and natural action schemes. It was noted that blind adults were rarely capable of producing appropriate facial expressions when asked to do so voluntarily (Rinn, 1991). During social interactions, the faces of people who are blind are said to be expressionless. This is known as the phenomenon of the so-called blank face.
Other factors contribute to the way we perceive facial expressions. Children with language impairments experience comprehension deficits. About fifteen inferences must be made for successful comprehension of every statement we hear (Ford & Milosky, 2003). Facial expressions may assist one’s ability to understand another’s mental state. The initial hypothesis was that since children with language impairment need to depend more heavily on other cues to help understand others, they would somehow acquire the skill to understand facial expressions better, or be born with the genetics to make this possible. Surprisingly, Janet Ford and Linda Milosky found interesting results when working with kindergarten children to see whether children with language impairments could label facial expressions showing one of four emotions: happy, surprised, mad, and sad.
Afterwards, the children were asked to infer emotional reactions from stories offered in three modalities: verbal, visual, and combined. Results showed that all of the normally functioning children could identify and label the facial expressions, but the children with language impairments had a hard time putting together emotion knowledge with event context in order to infer a character’s feelings. When they made these mistakes, the children with language impairments were more likely than the children without language impairment to substitute an emotion of a dissimilar valence. Ford and Milosky’s (2003) research shows that children with language impairment may have a difficult time making inferences both from physical events and from mental states. Could these findings downplay the importance of the makeup of the brain when it comes to perceiving facial expressions?
Is spontaneous facial behavior different for children with brain injuries? Children who have lived part of their lives functioning normally may have problems adjusting to life after a brain injury, and they usually do not develop other, stronger senses to make up for what they lack. Susan Kupferberg, Mary Morris, and Roger Bakeman (2001) studied sixteen school-aged children with acquired brain injuries and thirty-two normally developing children of the same gender and age. Results showed that children with brain injuries were less expressive overall and changed facial expressions less often than their peers.
It is thought that autism is characterized in part by dysfunction in emotional and social cognition. The primary developmental pathology and its neural substrates are still poorly understood. However, many studies hypothesize that the amygdala may be responsible for some of the impairments seen in autism, especially the identification of information from faces (Adolphs, Sears, & Piven, 2001). How, then, do people with autism perceive facial expressions? Recent studies have shown that people with autism have an impaired ability to link socially relevant behavior and visual perception. Subjects with autism made abnormal social judgments about the trustworthiness of faces and were not able to judge mental states indicated by the eyes. However, they were able to recognize broader facial expressions such as happiness and sadness.
Adolphs et al. (2001) noted that several studies, using either subjects with amygdala lesions or functional imaging of the amygdala in normal individuals, have shown that the amygdala is essential for identifying certain emotions, such as fear, from facial expressions, and that it is also important for making more intricate social judgments about faces, such as their apparent honesty. Additional support for a probable relationship between amygdala dysfunction and autism comes from a recent functional imaging study, which showed that the amygdala is activated in normal individuals, but not in subjects with autism, on a task in which autistic subjects are impaired. The pattern of impaired identification of fear, disgust, and surprise with fully normal recognition of happiness is exactly what has been reported for subjects with bilateral amygdala injury.
Do children with psychopathic tendencies perceive facial expressions the same way most others do? Stevens, Charman, and Blair (2001) tested children scored on the Psychopathy Screening Device to see how well they understood facial expressions and vocal tones, and whether there was any correlation. The Psychopathy Screening Device catalogs a behavioral condition with two dimensions: affective disturbance, and impulsive and conduct problems. Nine children with psychopathic predispositions and nine comparison children were given two facial expression and two vocal tone subtests from the Diagnostic Analysis of Nonverbal Accuracy. The tests measure the ability to name sad, fearful, happy, and angry facial expressions and vocal affect.
Results showed that children with psychopathic tendencies did differ from children without them (Stevens, Charman & Blair, 2001). Children with psychopathic tendencies showed selective impairments in the recognition of both sad and fearful facial expressions and of sad vocal tone. Boys showed even greater differences in the capacity to distinguish sad and fearful expressions. The classical fear-conditioning impairments demonstrated in psychopathic individuals have also been seen in patients with amygdala lesions, which suggests that the development of psychopathic tendencies may reflect early amygdala dysfunction. It is argued that the amygdala is involved in processing fearful and sad expressions. Also, work with adults who have acquired amygdala injury indicates that these patients are usually impaired in the identification of fearful and sad facial expressions.
Stevens, Charman, and Blair (2001) also note that previous studies have shown that children with psychopathic inclinations and adult psychopathic individuals have reduced skin conductance responses to sad faces compared with control groups. How should these results be interpreted? We should look at them within the framework of empathy, which assumes that the cognitive system that mediates empathy is unitary; that is, the same system mediates the empathic reaction to any emotion felt by others. Feshbach considered the affective empathic reaction to be a function of three factors: the ability to discriminate affective cues in others, the ability to assume the perspective and role of another person, and emotional responsiveness. From this idea, the same system would mediate the empathic reaction to another’s sadness as to another’s anger. This model suggests that the processing of sad and fearful affect in others will be impaired in individuals with psychopathic tendencies (Stevens, Charman & Blair, 2001).
Cognitive neuroscience seeks to explain brain-behavior interactions by describing both general mechanisms across individuals and individual differences among them. Studying the amygdala involves both of these approaches. The amygdala is an important structure for the processing of emotional signals. Using functional magnetic resonance imaging, the amygdala has been studied to see whether its activation to fearful expressions is independent of extraversion and whether its activation to happy expressions varies with extraversion. Fifteen emotionally and mentally healthy people were asked to categorize presentations of facial expressions. The findings from this study suggest there are two processes in the amygdala. The first was found across all people in response to fearful facial expressions; this may show the importance of detecting cues to potentially threatening situations. The second differed from one person to the next and tracked the degree of extraversion in response to happy faces (Canli et al., 2002).
More and more research is being done on the neural interaction of the amygdala with the prefrontal and temporal cortices in the processing of facial expressions. Norihiro Sadato (2001) has done research on this using functional magnetic resonance imaging. Using healthy individuals, he studied how the amygdala interacts with other cortical regions while the subjects judged faces with negative, positive, or neutral emotion.
Overall, significant activation was observed in the bilateral fusiform gyrus, medial temporal lobe, prefrontal cortex, and right parietal lobe during the task. The left amygdala was involved in the processing of negative expressions. The right amygdaloid activity also had an interactive effect with activity in the right hippocampus and middle temporal gyrus. These results may suggest that the left and right amygdalae play differential roles in the affective processing of facial expressions (Sadato, 2001).
In Paul Devereux and Gerald Ginsburg’s (2001) study of sociality effects on the production of laughter, they mention a “behavioral-ecology view of facial displays.” Fridlund uses evolutionary and ethological perspectives to argue that facial displays are communicative acts that are primarily social, occurring within a particular context. He believes facial displays show the intent or future behavior of an actor and have evolved because they facilitate interaction, not because they represent an internal, emotional state of the actor (Devereux & Ginsburg, 2001).
Fridlund also uses the example of talking on a phone while making facial expressions. Subjects talking on the phone did not produce as many facial displays as subjects who were visually available to the person they were talking to (Devereux & Ginsburg, 2001). Although this model is not widely accepted, it may show that one makes facial expressions to help others understand what one is communicating.
So how do we come to understand facial expressions in others? School-aged children are able to learn different emotions from facial expressions. In a study of school-aged children by Grinspan, Hemphill, and Nowicki (2003), children in the intervention group were better able to understand, use, and relate to different emotions after seeing pictures of facial expressions and matching them correctly with emotional labels. Higher feelings of self-worth and lower social anxiety were two benefits of learning to better interpret facial expressions. Interestingly, improvement in identifying facial expressions negatively affected boys’ self-concept.
Even normally functioning people learn and understand facial expressions differently, owing to our cultures, families, and experiences. Collins and Nowicki gave the Diagnostic Analysis of Nonverbal Accuracy to eighty-four African American children to determine whether the children could accurately identify emotion in the facial expressions of European Americans. It was assumed that the African American children would do as well as European American children; however, the African American children performed less accurately on adult and child tones of voice and facial expressions than their European American peers did. One suggestion is that African American children need more time to process and understand European Americans’ facial expressions and emotions (Collins, 2001).
These studies seem to suggest that facial expressions are learned in society as we grow up. It is interesting that some facial expressions are genetic while others are due to environmental factors. I believe environmental factors play a very large part in developing facial expressions in children. If a child is told not to make a face in a certain situation, the child may be conditioned to think it is a “bad” expression and may suppress that face. This may lead the child to think that certain emotions are bad as well, if he or she is not given the freedom to express all kinds of different facial expressions. This could lead to an unhealthy emotional life if one is not allowed to show certain expressions because one is told, or expected, not to. If our brain is telling us one thing and our environment or social interactions are telling us to do another, there is going to be some kind of conflict. I feel it is nurture that plays the bigger role in what one decides to do, rather than the way our brains are wired to handle things.
This is supported by the finding that the expressions that were similar between the blind children and the sighted children were the spontaneous, genetically based ones, while the voluntary expressions produced on request differed. Blind children grow up not truly knowing what facial expression they are making and are less likely to mask their emotions the way their sighted peers may. If environmental causes make us control our emotions, especially our negative ones, who determines what facial expression is bad or inappropriate in different situations? It is most likely different in every family, and some expressions may be appropriate in some cultures but not others. Collins and Nowicki’s (2001) study supports this, since children of one ethnic background did not do as well as children of a similar background at correctly identifying emotion in adults.
As to what part of the brain is responsible for recognizing facial expressions and emotions, it seems that the amygdala is the biggest player. This part of the brain is responsible for understanding emotions, and most studies involving facial expressions show activation there more than anywhere else. The visual cortex is the part of the brain responsible for interpreting what one sees; however, it was rarely mentioned in the studies, and it was interesting that they did not show activity in this region. It was expected that since the visual cortex is activated when looking at a face, this activation would appear in the functional magnetic resonance imaging, and that this part of the brain would play a part in perceiving facial expressions, integrated with, or working together with, the amygdala. In hindsight, since the visual cortex does not associate visual cues with inner feelings or the ability to recognize emotions, it makes sense that these functions are reserved for a different part of the brain.
Adolphs, R., Sears, L., & Piven, J. (2001). Abnormal Processing of Social Information from Faces in Autism. Journal of Cognitive Neuroscience, 13, 232-241. Retrieved September 25, 2003, from EBSCOhost.
Canli, T., Sivers, H., Whitfield, S., Gotlib, I., & Gabrieli, J. (2002). Amygdala Response to Happy Faces as a Function of Extraversion. Science, 296. Retrieved September 28, 2003, from EBSCOhost.
Collins, M., & Nowicki, S. (2001). African American Children’s Ability to Identify Emotion in Facial Expressions and Tones of Voice of European Americans. Journal of Genetic Psychology, 162, 334-348. Retrieved September 25, 2003, from EBSCOhost.
Devereux, P. & Ginsburg, G. (2001). Sociality Effects on the Production of Laughter. Journal of General Psychology, 128, 227-241. Retrieved September 25, 2003, from EBSCOhost.
Ford, J. & Milosky, L. (2003). Inferring Emotional Reactions in Social Situations: Differences in Children With Language Impairment. Journal of Speech, Language & Hearing Research, 46, 21-31. Retrieved September 15, 2003, from EBSCOhost.
Galati, D., Sini, B., Schmidt, S. & Tinti, C. (2003). Spontaneous Facial Expressions in Congenitally Blind and Sighted Children Aged 8-11. Journal of Visual Impairment & Blindness, 97, 418-429. Retrieved September 15, 2003, from EBSCOhost.
Gomez, N. (2001). EEG during different emotions in 10-month-old infants of depressed mothers. Journal of Reproductive & Infant Psychology, 19, 295-313. Retrieved September 25, 2003, from EBSCOhost.
Gosselin, P. (2002). Motivation to Hide Emotion and Children’s Understanding of the Distinction Between Real and Apparent Emotions. Journal of Genetic Psychology, 163, 479-496. Retrieved September 15, 2003, from EBSCOhost.
Grinspan, D., Hemphill, A., & Nowicki, S. (2003). Improving the Ability of Elementary School-Age Children to Identify Emotion in Facial Expression. Journal of Genetic Psychology, 164, 88-101. Retrieved September 15, 2003, from EBSCOhost.
Guerit, J.M. M. (2002). Categorical Perception of Happiness and Fear Facial Expressions: An ERP Study. Journal of Cognitive Neuroscience, 14, 210-228. Retrieved September 25, 2003, from EBSCOhost.
Kupferberg, S., Morris, M., & Bakeman, R. (2001). Spontaneous Facial Expressivity in Children with Acquired Brain Injury. Journal of Head Trauma Rehabilitation, 16, 573-587. Retrieved September 25, 2003, from EBSCOhost.
Prkachin, G. (2003). The effects of orientation on detection and identification of facial expressions of emotion. British Journal of Psychology, 94, 45-63. Retrieved September 25, 2003, from EBSCOhost.
Sadato, N. (2001). Neural Interaction of the Amygdala with the Prefrontal and Temporal Cortices in the Processing of Facial Expressions as Revealed by MRI. Journal of Cognitive Neuroscience, 13, 1035-1048. Retrieved September 25, 2003, from EBSCOhost.
Sternberg, R. J. (2003). Cognitive Psychology. Belmont, CA: Wadsworth/Thomson Learning.
Stevens, D., Charman, T. & Blair, R. (2001). Recognition of Emotion in Facial Expressions and Vocal Tones in Children With Psychopathic Tendencies. Journal of Genetic Psychology, 162, 201-212. Retrieved September 25, 2003, from EBSCOhost.