Is Eye Contact Really Overrated?
In my second post on the autism-related research of Morton Gernsbacher, I critiqued Gernsbacher's claim that Joint Attention (JA) is not impaired in autism, and I concluded by saying:
Diminished eye-contact and diminished JA have serious consequences for linguistic and socio-cognitive development. Where language is concerned, treating eyes as socially salient is crucial—especially for basic word learning.
In this post, I turn to a fourth article co-authored by Gernsbacher, Akhtar and Gernsbacher (2008). The central claim in this article is that autism research has privileged eye gaze and ignored other ways to show social engagement.
Engagement, Akhtar and Gernsbacher point out, can also be expressed through voice and touch. Given everything that tone of voice can convey, and everything that hugging, patting someone's back, squeezing their hand, or (on a darker note) slapping their face or punching their nose can convey, this observation is both reasonable and uncontroversial. Two questions remain open, however:
Do voice and touch convey as much social and emotional information as eyes do?
How much social and emotional information do autistic people extract from voice and touch?
As far as voice goes, there is research indicating that, compared to their non-autistic counterparts:
Autistic children orient less frequently to voices (Adamson et al., 2021; Dawson et al., 1998; Lepistö et al., 2005)
Autistic individuals are less sensitive to the emotional information conveyed by tones of voice (Globerson et al., 2015; Stewart et al., 2013; Taylor et al., 2015)
Autistic individuals convey less emotion through their own voices (Loveall et al., 2021)
As for touch, it is certainly a common way for one person to convey, or enact, certain feelings toward the person they are touching (e.g., affection, support, or hostility). Furthermore, the feel of someone's sweaty or trembling hands can signal worry or distress. But are any of us able to detect emotions more generally, especially emotions that are not reactions to us in particular, from someone's touch? Is it really possible to tell that someone is currently feeling happy, sad, angry, or surprised exclusively from how they are touching us?
Beyond emotion, there’s what people are attending to and thinking about. Here, major cues come from where people are looking. If they are looking at a clock, for example, they may be thinking about what time it is. But there is no vocal, tactile, or postural equivalent to eye gaze. How someone is touching us conveys little to nothing about what they are attending to. Head position is only a crude measure of where people are looking—particularly when their eyes aren’t pointed straight ahead. As for voice, it’s hard to see how that could convey a specific object of thought or attention except through actual words—as in “I wonder what time it is?”
As far as words go, as I discussed in my last post, gaze following is crucial to word learning. And in this article, interestingly, Akhtar and Gernsbacher implicitly contradict what they write in their 2007 paper: they acknowledge that studies show gaze following to predict subsequent vocabulary development. But they go on to claim that gaze direction need not be cued by the eyes themselves but instead can be cued by postural changes (head turns) or by voice direction. They speculate, and cite others who speculate, that these cues can be just as revealing as gaze, especially if children are held on laps or carried on backs such that they're looking away from their caregivers' faces.
Even if all these speculations are accurate, however, learning the meaning of a new word still requires the child to disengage from what they're doing, look up, and look where the speaker is looking, even if not by following the speaker's eyes (cf. Baron-Cohen et al., 1997). This phenomenon, as I discussed in my last post, is an example of the kind of Joint Attention (JA) behavior that occurs less frequently in autistic individuals as compared with their non-autistic counterparts.
When it comes to Joint Attention, Akhtar and Gernsbacher fault the strict standards that some researchers, as well as one of the diagnostic tools for autism (the ADOS), have used for verifying whether something counts as Joint Attention. They write:
Simply gazing at the same referent as the caregiver is not considered evidence of joint attention; the infant must also alternate his or her gaze to the object with gaze to the caregiver’s face.
They then go on to argue that these criteria, however strict, don’t necessarily guarantee that JA is occurring: a child might look up because a face is attractive or to check someone’s reaction to something. In addition, Akhtar and Gernsbacher point out that JA need not involve the visual channel:
For example, a child sitting in her mother’s lap while they both handle a toy would likely sense from her mother’s posture and touch that they are jointly attending to the toy.
The authors conclude that gaze alternation is not “the only, or even the best, means of signaling shared attention.” They provide no empirical support for their speculative aside (“even the best”).
Akhtar and Gernsbacher are on firmer ground in pointing out reasons for low eye contact in autism that have nothing to do with lack of social interest—namely, that eye contact can be stressful or distracting. Indeed, this is true for non-autistic people as well—though generally to a far lesser extent.
As we’ve noted earlier, however, regardless of the initial reasons for reduced eye contact in autism, this proclivity sends autistic individuals down a trajectory of reduced opportunities for learning language, for social and emotional learning, and for social engagement.
REFERENCES:
Adamson, L. B., Bakeman, R., Suma, K., & Robins, D. L. (2021). Autism adversely affects auditory joint engagement during parent-toddler interactions. Autism Research, 14(2), 301–314. https://doi.org/10.1002/aur.2355
Akhtar, N., & Gernsbacher, M. A. (2008). On privileging the role of gaze in infant social cognition. Child Development Perspectives, 2(2), 59–65. https://doi.org/10.1111/j.1750-8606.2008.00044.x
Baron-Cohen, S., Baldwin, D. A., & Crowson, M. (1997). Do children with autism use the speaker's direction of gaze strategy to crack the code of language? Child Development, 68(1), 48–57.
Dawson, G., Meltzoff, A. N., Osterling, J., Rinaldi, J., & Brown, E. (1998). Children with autism fail to orient to naturally occurring social stimuli. Journal of Autism and Developmental Disorders, 28(6), 479–485. https://doi.org/10.1023/a:1026043926488
Globerson, E., Amir, N., Kishon-Rabin, L., & Golan, O. (2015). Prosody recognition in adults with high-functioning autism spectrum disorders: From psychoacoustics to cognition. Autism Research, 8, 153–163. https://doi.org/10.1002/aur.1432
Lepistö, T., Kujala, T., Vanhala, R., Alku, P., Huotilainen, M., & Näätänen, R. (2005). The discrimination of and orienting to speech and non-speech sounds in children with autism. Brain Research, 1066(1-2), 147–157. https://doi.org/10.1016/j.brainres.2005.10.052
Loveall, S. J., Hawthorne, K., & Gaines, M. (2021). A meta-analysis of prosody in autism, Williams syndrome, and Down syndrome. Journal of Communication Disorders, 89, 106055. https://doi.org/10.1016/j.jcomdis.2020.106055
Stewart, M. E., McAdam, C., Ota, M., Peppé, S., & Cleland, J. (2013). Emotional recognition in autism spectrum conditions from voices and faces. Autism, 17(1), 6–14. https://doi.org/10.1177/1362361311424572
Taylor, L. J., Maybery, M. T., Grayndler, L., & Whitehouse, A. J. (2015). Evidence for shared deficits in identifying emotions from faces and from voices in autism spectrum disorders and specific language impairment. International Journal of Language & Communication Disorders, 50(4), 452–466. https://doi.org/10.1111/1460-6984.12146