Open Access Research article

Alexithymia and the labeling of facial emotions: response slowing and increased motor and somatosensory processing

Klas Ihme1, Julia Sacher23, Vladimir Lichev1, Nicole Rosenberg1, Harald Kugel4, Michael Rufer5, Hans-Jörgen Grabe67, André Pampel8, Jöran Lepsien8, Anette Kersting1, Arno Villringer23 and Thomas Suslow19*

Author Affiliations

1 Department of Psychosomatic Medicine and Psychotherapy, University of Leipzig, Semmelweisstrasse 10, 04103 Leipzig, Germany

2 Department of Neurology, Max-Planck-Institute of Human Cognitive and Brain Sciences, Leipzig, Germany

3 Clinic of Cognitive Neurology, University of Leipzig, Leipzig, Germany

4 Department of Clinical Radiology, University of Münster, Münster, Germany

5 Department of Psychiatry and Psychotherapy, University Hospital Zurich, Zurich, Switzerland

6 Department of Psychiatry, University of Greifswald, Greifswald, Germany

7 HELIOS Hospital, Stralsund, Germany

8 Nuclear Magnetic Resonance Unit, Max-Planck-Institute of Human Cognitive and Brain Sciences, Leipzig, Germany

9 Department of Psychiatry, University of Münster, Münster, Germany


BMC Neuroscience 2014, 15:40  doi:10.1186/1471-2202-15-40

The electronic version of this article is the complete one and can be found online at: http://www.biomedcentral.com/1471-2202/15/40


Received: 4 December 2013
Accepted: 7 March 2014
Published: 14 March 2014

© 2014 Ihme et al.; licensee BioMed Central Ltd.

This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Abstract

Background

Alexithymia is a personality trait that is characterized by difficulties in identifying and describing feelings. Previous studies have shown that alexithymia is related to problems in recognizing others’ emotional facial expressions when these are presented with temporal constraints. These problems can be less severe when the expressions are visible for a relatively long time. Because the neural correlates of these recognition deficits are still relatively unexplored, we investigated the labeling of facial emotions and brain responses to facial emotions as a function of alexithymia.

Results

Forty-eight healthy participants had to label the emotional expression (angry, fearful, happy, or neutral) of faces presented for 1 or 3 seconds in a forced-choice format while undergoing functional magnetic resonance imaging. The participants’ level of alexithymia was assessed using self-report and interview. In light of the previous findings, we focused our analysis on the alexithymia component of difficulties in describing feelings. Difficulties describing feelings, as assessed by the interview, were associated with increased reaction times for negative (i.e., angry and fearful) faces, but not with labeling accuracy. Moreover, individuals with higher alexithymia showed increased brain activation in the somatosensory cortex and supplementary motor area (SMA) in response to angry and fearful faces. These cortical areas are known to be involved in the simulation of the bodily (motor and somatosensory) components of facial emotions.

Conclusion

The present data indicate that alexithymic individuals may use information related to bodily actions rather than affective states to understand the facial expressions of other persons.

Keywords:
Alexithymia; Supplementary motor area; Somatosensory cortex; Facial emotion; Labeling; Toronto Structured Interview for Alexithymia

Background

Understanding the emotional expression of another person is thought to require mimicry or simulation of others’ facial expressions [1,2]. Thus, it is likely that neural assemblies exist that are active both when a person is experiencing and expressing an emotion and when the same person is seeing and interpreting the facial emotions of somebody else [3,4]. Recent evidence indicates that interpreting facial expressions is a multi-faceted endeavor that recruits a multitude of cortical and subcortical circuits: the visual system (e.g., occipital gyrus, fusiform gyrus [FFG]) to process the visual information of the face, the motor system (supplementary motor area [SMA] or premotor cortex) for the (covert) physical simulation of the facial movement, somatosensory areas (primary somatosensory cortex, insula) for proprioceptive feedback, and limbic or frontal regions (striatum, ventromedial prefrontal cortex [vmPFC], amygdala [AMG]) for reenacting and feeling the corresponding emotion [3-8].

A personality trait that is related to difficulties in the recognition of emotional facial expression is alexithymia (literally translated as “no words for emotion”). Alexithymia is characterized by deficits in identifying and describing one’s feelings [9]. Alexithymic features can be assessed using the 20-item self-reported Toronto Alexithymia Scale (TAS-20, [10]) or the Toronto Structured Interview for Alexithymia (TSIA, [11]). Both measures of alexithymia include the subscales Difficulties Describing Feelings (DDF), Difficulties Identifying Feelings and Externally Oriented Thinking (the TSIA additionally includes imaginal processing).

It has been repeatedly shown that alexithymia is associated with a decreased ability to identify the facial expressions of others, especially when these expressions are presented under temporal constraints [12-14]. Interestingly, a recent electromyographic (EMG) study demonstrated that highly alexithymic individuals exhibit less facial mimicry when confronted with emotional faces [15]. This could mean that individuals who are high in alexithymia have difficulties in interpreting the emotions of others because they automatically simulate others’ facial expressions to a lesser degree and therefore lack the capability to fully capture the other person’s feelings.

In contrast, when the presentation time is increased, most studies have not revealed a relationship between the degree of alexithymia and recognition accuracy for emotional facial expressions (e.g., [12,16,17]). So far, only one study [16] has investigated brain activation related to facial emotion labeling with longer presentation times (3.75 s) as a function of alexithymia, and it found no differences. However, that study included only 23 participants in a correlational approach, whereas Yarkoni and Braver recommend at least 40 participants for correlational analyses in neuroimaging research [18]. In addition, alexithymic tendencies were assessed only through self-report, although a multi-method approach is recommended [19-21]. Moreover, behavioral evidence [12] suggests that DDF, rather than the TAS-20 total score, is most predictive of facial emotion recognition. Thus, the current study investigated the labeling of facial emotions and brain responses to facial emotions as a function of DDF (as measured with the TAS-20 and TSIA) using functional magnetic resonance imaging (fMRI). Because our design included a relatively long response window after the presentation of the facial stimuli, we hypothesized that DDF would have an adverse effect on response latencies but not on recognition accuracy.

Methods

Participants

Fifty-two healthy young German native speakers (age range: 18 to 29 yrs) participated in the study. All of them were right-handed and had normal or corrected-to-normal visual acuity. None of the participants had any history of neurological or psychiatric illness or contraindications for magnetic resonance imaging. All participants gave written consent to participate and received financial compensation for their participation. The study procedure was approved by the ethics committee of the Medical School of the University of Leipzig and was in accordance with the Declaration of Helsinki. Four participants had to be excluded from data analysis: one had a depression score of BDI > 14 at the time of scanning, one displayed excessive head motion in the magnetic resonance imaging (MRI) scanner (> 3 mm translation), and two responded erroneously, before the intended time window. Thus, 48 participants (23 female, age 24 ± 3 yrs, mean ± SD) entered the final analysis.

Assessment of alexithymia and control variables

Alexithymic tendencies were measured using a questionnaire, the TAS-20 (German version: [22]), and an observer-rated measure, the TSIA (German version: [23]). The complete TSIA was administered by one trained interviewer and rated during the interview according to the manual. Before the study, the interviewer was trained to conduct and score the TSIA by the translators of the German version of the TSIA (coauthors MR and HG). This training included becoming familiar with the alexithymia construct, the manual outlining administration and scoring procedures for the TSIA, discussion of the guidelines, the scoring of the items, and the correct use of the prompts and probes. Moreover, test interviews were supervised until the interviewer was confident in the solo administration and scoring of the interview. Our analysis focused on one subscale of the TAS-20 and TSIA, DDF, which consists of five items in the TAS-20 and six items in the TSIA. To control for depressive symptoms, anxiety and affectivity, participants also completed the Beck Depression Inventory (German version: [24]), the State-Trait Anxiety Inventory (German version: [25]) and the trait version of the Positive and Negative Affect Schedule (German version: [26]).

Task and design

The participants’ task was to label the facial emotion of a target face. Facial stimuli were color photographs taken from the Karolinska Directed Emotional Faces database [27] depicting four different emotions (happy – HA, angry – AN, fearful – FE, and neutral – NE). Pictures of twenty different individuals (ten female) were shown in each of the four emotional conditions, yielding 80 trials in total. Each trial lasted 9 s and was initiated by the presentation of a fixation cross in the center of the screen for 800 ms. In the first 40 trials of the experiment, the target was shown for 1 s; in the second half of the experiment, the target presentation time was 3 s. After presentation of the target, participants had 7.2 s (1-s targets) or 5.2 s (3-s targets) to label the emotion by pressing a button. Participants held one response pad per hand with two buttons each and responded with their index and middle fingers. Each emotion was assigned to one button for the entire experiment, counterbalanced across participants. During the response window, participants saw the four options in the order of button assignment, e.g., the leftmost label on the screen matched the leftmost button (left middle finger). After a button press, the labels vanished and only a gray screen was visible until the next trial started with the presentation of the fixation cross. Participants were instructed to answer as correctly as possible within the given time frame and were aware that the response window was shorter in the second half of the experiment. Trials were shown in two fixed random sequences with the constraints that no two consecutive trials depicted the same person and no more than two consecutive trials showed the same emotion.
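The constrained randomization described above (no identity repeats, at most two same-emotion trials in a row) can be implemented by rejection sampling: shuffle the full trial list and re-shuffle until both constraints hold. The sketch below is a hypothetical reconstruction in Python; the names and seed handling are ours, not the authors' actual stimulus code.

```python
import random

EMOTIONS = ["HA", "AN", "FE", "NE"]

def make_sequence(n_ids=20, seed=0):
    """Return 80 (identity, emotion) trials such that no two consecutive
    trials share an identity and no emotion occurs three times in a row."""
    rng = random.Random(seed)
    trials = [(i, emo) for i in range(n_ids) for emo in EMOTIONS]
    while True:
        rng.shuffle(trials)
        no_id_repeat = all(a[0] != b[0] for a, b in zip(trials, trials[1:]))
        no_emo_triple = all(not (a[1] == b[1] == c[1])
                            for a, b, c in zip(trials, trials[1:], trials[2:]))
        if no_id_repeat and no_emo_triple:
            return list(trials)
```

Rejection sampling is practical here because valid orderings of 80 such trials are common enough that at most a few thousand shuffles are needed.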

MRI acquisition and preprocessing

Structural and functional MR images were obtained on a 3 T scanner (Magnetom Verio, Siemens, Erlangen, Germany). For each participant, structural images were acquired with a T1-weighted 3D MP-RAGE [28]. Magnetization preparation consisted of a non-selective inversion pulse. The imaging parameters were as follows: TI 650 ms, TR 1300 ms, TE 3.5 ms, flip angle 10°, isotropic spatial resolution of 1 mm³, two averages. Blood-oxygen-level-dependent contrast-sensitive images were collected using a T2*-weighted echo-planar imaging (EPI) sequence [matrix 64 × 64; resolution 3 mm × 3 mm × 4 mm; gap 0.8 mm; TR 2 s; TE 30 ms; flip angle 90°; interleaved slice acquisition; 385 images]. The slices were oriented parallel to a line through the posterior and anterior commissures.

MRI data were preprocessed and analyzed using SPM8 (http://www.fil.ion.ucl.ac.uk/spm/). The initial five functional volumes were discarded to allow longitudinal magnetization to reach equilibrium. Functional volumes were slice-time corrected (temporal middle slice as reference), realigned to the first image and corrected for movement-induced image distortions (6-parameter rigid-body affine realignment). The structural T1 images were coregistered to the mean functional EPI image (default in SPM). Anatomical images were segmented, including normalization to a standard stereotaxic space using the T1 MNI template within SPM8. The normalization parameters were then applied to the functional EPI series, resulting in a voxel size of 3 × 3 × 3 mm³ for the functional images. A temporal high-pass filter (128 s) was applied to remove slow signal drifts. For the functional data, spatial smoothing was performed using a three-dimensional Gaussian filter of 6 mm full-width at half-maximum. We chose this rather small smoothing kernel so that potential activation in subcortical areas involved in facial emotion processing would still be detectable and not washed out.

Data analysis

Labeling accuracy was evaluated using the Grier sensitivity index [29], which considers true and false positives. Values of this index range from 0 to 1, where 1 means perfect performance and 0.5 corresponds to chance level. Because accuracy was high, leaving too few error trials for reliable estimation, incorrect trials were discarded prior to the analysis of reaction time and fMRI data. The data were pooled across both presentation time conditions. Originally, we aimed to differentiate between the two temporal conditions (1 and 3 s), similar to the study of Parker et al. [12]. However, accuracy was at ceiling (> .9) with little variance, so we collapsed across temporal conditions for the analysis of reaction time and fMRI data. The high recognition rates in the current study compared to those of Parker et al. seem to be related to our long response window: whereas the participants in Parker et al.’s study had to respond while the picture was presented (1 or 3 s), participants in the current study had more time to respond, most likely resulting in higher accuracy. This is in line with the conclusions of a recent review (Grynberg et al. [14]), published when data collection for this study was almost finished, that alexithymic individuals' difficulties in recognizing facial emotions are most prominent when pictures are presented for less than 300 ms. To investigate associations between the measures of alexithymia and labeling accuracy as well as reaction times (RTs), correlational analyses were performed using Spearman’s rho; Spearman’s rho was also used to check for associations between the measures of alexithymia and the affectivity questionnaires (BDI, STAI, and PANAS). We employed Spearman’s rho because the RT and TSIA-DDF scores were not normally distributed. All associations were tested against a significance threshold of p = .05 (two-tailed).
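For reference, the Grier index [29] has a closed form in the hit rate H and false-alarm rate F. A minimal sketch (our own, using the standard formula together with its complement for the H < F case):

```python
def grier_a_prime(hit_rate: float, fa_rate: float) -> float:
    """Grier's A' nonparametric sensitivity index: 1 = perfect, 0.5 = chance."""
    h, f = hit_rate, fa_rate
    if h == f:  # no discrimination between signal and noise
        return 0.5
    if h > f:
        return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))
    # below-chance responding: symmetric complement
    return 0.5 - ((f - h) * (1 + f - h)) / (4 * f * (1 - h))
```

For example, a hit rate of .96 with a false-alarm rate of .02 yields A' ≈ .985, in the near-ceiling range described above.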

The fMRI data were analyzed by modeling the onset and duration of the presentation times of each facial expression and by convolving these regressors with the hemodynamic response function for the different emotions. Incorrect trials were included in the first-level design matrix as a nuisance regressor. First-level t-contrasts were calculated by contrasting each emotional condition with the neutral one (HA > NE, AN > NE, FE > NE). The contrast images for the first-level contrasts were then transferred to the second-level models for the main effects (HA > NE, AN > NE and FE > NE) and to regression models with TAS-20-DDF and TSIA-DDF as regressors. One second-level model was calculated per alexithymia measure (TAS-20-DDF, TSIA-DDF) and experimental condition. For all models, significance was tested at the cluster level against a family-wise-error-corrected significance threshold of p = .05 at an individual voxel threshold of t = 3.5. As advised in the literature [30], we also report the activations that would survive a more lenient threshold (p = .001, k = 10) in the additional material so that these data can be used in future meta-analyses.
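The first modeling step above (event onsets and durations convolved with the hemodynamic response function) can be illustrated with the canonical double-gamma HRF. This is a generic sketch of the idea, not the SPM8 code used in the study; the 32-s kernel length, the 0.1-s microtime grid and the HRF parameters are illustrative assumptions.

```python
import math

def canonical_hrf(t: float) -> float:
    """Double-gamma HRF approximation: peak near 5 s, late undershoot."""
    if t <= 0.0:
        return 0.0
    return (t ** 5 * math.exp(-t) / math.gamma(6)
            - t ** 15 * math.exp(-t) / math.gamma(16) / 6)

def regressor(onsets, duration, n_scans, tr=2.0, dt=0.1):
    """Convolve a boxcar (onsets/duration in seconds) with the HRF on a fine
    time grid, then sample the result once per TR."""
    n = round(n_scans * tr / dt)
    box = [0.0] * n
    for onset in onsets:
        for i in range(round(onset / dt), min(n, round((onset + duration) / dt))):
            box[i] = 1.0
    kernel = [canonical_hrf(j * dt) * dt for j in range(round(32.0 / dt))]
    conv = [sum(box[i - j] * kernel[j] for j in range(min(i + 1, len(kernel))))
            for i in range(n)]
    return [conv[round(s * tr / dt)] for s in range(n_scans)]
```

A single 1-s face onset at t = 4 s produces a regressor peaking roughly 5 s later; columns of this shape, one per condition, make up a first-level design matrix.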

In a recent paper, Yarkoni and colleagues [31] argued that longer reaction times per se increase brain activation because the time required for preparatory processes for motor activation is increased. Thus, for contrasts yielding significant clusters, we checked whether adding the difference in RT between the two conditions in that contrast (e.g., AN > NE) or the RT for the emotion alone (e.g., AN) as a nuisance covariate changed the results substantially.

Although an association between behavior and TSIA-DDF was revealed for both angry and fearful faces, significant brain activation related to TSIA-DDF emerged only in the contrast AN > NE, not in FE > NE. For FE > NE, the effects on brain activation may be smaller and thus not detectable in a whole-brain approach. Therefore, we additionally tested for an association between TSIA-DDF and brain activation in these clusters in an ROI-based approach using small volume correction for FE > NE. For this, the significant clusters from the model testing for a positive correlation between TSIA-DDF and brain activation in the contrast AN > NE were saved as a mask, which in turn was employed as an ROI to check for activations positively correlating with TSIA-DDF in these brain areas.

Finally, an exploratory analysis was conducted to check whether our measures of alexithymia (TAS-20, TAS-20-DDF, TSIA, TSIA-DDF) were related to brain activations in ROIs that, based on the previous literature, are associated with facial emotion processing. To estimate the activation in these ROIs, the eigenvariates of the activation were extracted for the main contrasts (i.e., HA > NE, AN > NE, FE > NE) using SPM8 and then related to the measures of alexithymia using Spearman’s rho. We employed the following ROIs: amygdala (AMG), ventromedial prefrontal cortex (vmPFC), fusiform gyrus (FFG) and striatum. The masks for AMG, FFG and striatum were defined using the automated anatomical labeling toolbox [32] as implemented in the WFU PickAtlas [33] using SPM8. However, this tool did not include a reasonable mask for the vmPFC, so we defined this region as a sphere of 20 mm around the MNI coordinates xyz = [0 50 -2]. These coordinates were based on the results of a study by Pessoa et al. on facial emotion processing [34]. We also included the clusters (SMA, right S1) positively correlating with TSIA-DDF in the contrast AN > NE as further ROIs.
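The 20-mm vmPFC sphere can be built by thresholding each voxel's world-space distance to the center coordinate, using the image's voxel-to-mm affine. The sketch below is ours; the grid shape and affine are illustrative assumptions (a 3-mm MNI-like grid), not the study's actual image geometry.

```python
import math

def world_coords(affine, i, j, k):
    """Apply a 4x4 voxel-to-world affine to voxel indices (i, j, k)."""
    return tuple(affine[r][0] * i + affine[r][1] * j + affine[r][2] * k + affine[r][3]
                 for r in range(3))

def sphere_mask(shape, affine, center_mm, radius_mm=20.0):
    """Nested-list boolean mask: True where a voxel's world-space (mm)
    position lies within radius_mm of center_mm."""
    mask = []
    for i in range(shape[0]):
        plane = []
        for j in range(shape[1]):
            plane.append([math.dist(world_coords(affine, i, j, k), center_mm) <= radius_mm
                          for k in range(shape[2])])
        mask.append(plane)
    return mask

# Illustrative 3-mm grid roughly covering MNI space
affine = [[3.0, 0.0, 0.0, -90.0],
          [0.0, 3.0, 0.0, -126.0],
          [0.0, 0.0, 3.0, -72.0],
          [0.0, 0.0, 0.0, 1.0]]
vmpfc = sphere_mask((61, 73, 61), affine, (0.0, 50.0, -2.0))
```

At 3-mm resolution a 20-mm sphere covers roughly (4/3)·π·20³ / 27 ≈ 1,240 voxels.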

Results

Alexithymia measures and control variables

The mean scores for the alexithymia subscales were 12.4 ± 4.6 (mean ± standard deviation) for the TAS-20-DDF and 2.9 ± 3.4 for the TSIA-DDF. The TAS-20 total score was 43.0 ± 10.7, and the TSIA total score was 16.9 ± 9.9. Internal consistencies for TAS-20-DDF (Cronbach’s α = .87) and TSIA-DDF (α = .90) were sufficiently high. All measures of alexithymia were significantly correlated with each other (see Table  1). There was no correlation between TAS-20-DDF and depression as assessed by the BDI [35], trait-anxiety as measured by the STAI [36], or positive and negative affect as assessed by the PANAS [37] (all ps > .05). TSIA-DDF was not related to BDI or to STAI and PANAS negative (all ps > .05), but there was a negative correlation between TSIA-DDF and PANAS positive (rho = -.33, p < .05). There was a correlation between STAI and BDI (rho = .49, p < .005).

Table 1. Correlations (Spearman’s rho) between measures of alexithymia

Behavioral data

Labeling accuracy was above 0.9 for all facial emotion conditions (happy [HA]: .99; neutral [NE]: .97; angry [AN]: .96; fearful [FE]: .96), and there was no relationship between TAS-20-DDF or TSIA-DDF and performance as measured using the Grier sensitivity index [29] in any of the emotional conditions (all ps > .1). The fastest reaction times were revealed for happy faces and the slowest for fearful faces (HA: .73 s, NE: .96 s, AN: 1.06 s, FE: 1.17 s; F(3,141) = 36.4, p < .01; post-hocs: HA < NE = AN < FE). TAS-20-DDF did not correlate with reaction time (RT) in any condition (all ps > .1). However, there was a positive correlation between TSIA-DDF and RT for angry (rho = .30, p < .05) and fearful faces (rho = .31, p < .05), but no correlation was observed between TSIA-DDF and RT for happy (rho = -.01, p = .47) and neutral faces (rho = .07, p = .32) (see Table 2).

Table 2. Correlations (Spearman’s rho) between difficulties describing feelings (as assessed by TAS-20-DDF and TSIA-DDF) and reaction times in the four facial expression conditions

fMRI data

Main effects

Happy versus neutral faces elicited significant brain activation in clusters in the left middle occipital gyrus extending to the middle temporal gyrus, in the left middle orbital gyrus extending to both the superior frontal gyrus and the bilateral anterior cingulate gyrus, and a cluster in the middle frontal gyrus extending to the superior frontal gyrus. In the contrast AN > NE, significant clusters were revealed in the right fusiform gyrus, the right inferior occipital gyrus extending to middle occipital and lingual gyrus, the left fusiform gyrus extending to inferior temporal gyrus and the left middle occipital gyrus. The contrast FE > NE activated the left inferior frontal gyrus, left fusiform gyrus extending to inferior occipital gyrus, left middle temporal gyrus, right inferior occipital gyrus and right cerebellar structures (lobule VIIb and VIIa). An overview of the results is presented in Table  3. The activations for the main contrasts are presented at a more lenient threshold (p = .001, k = 10) in the Additional file 1: Table S1.

Table 3. Significant brain activations for all fMRI main contrasts

Additional file 1: Table S1. Brain activation in the three main contrasts at a threshold of t = 3.27, k = 10.


Relationships between brain activation and measures of alexithymia

A significant cluster positively correlating with TSIA-DDF in the contrast AN > NE was revealed in the supplementary motor area (SMA) (Montreal Neurological Institute [MNI] coordinates xyz = [-6 -1 61], cluster extent k = 64, pcluster = .013). The peak of this cluster lay in SMA proper, but the activity clearly extended more rostrally into pre-SMA. A second cluster was found in the post-central gyrus (xyz = [30 -37 40], pcluster = .039, k = 47) (see Figure 1), which could be attributed to the right primary somatosensory cortex (S1). No significant clusters were revealed for TAS-20-DDF or for any other contrast for TSIA-DDF. Because TSIA-DDF was negatively correlated with positive affect, we checked whether entering the PANAS positive score as a nuisance covariate into the model affected the results. We found a small change for the cluster in the SMA (xyz = [-6 -1 61], k = 39, pcluster = .089) and a decrease in cluster size in the somatosensory cortex (xyz = [30 -37 40], k = 14, pcluster = .45, ppeak-uncorrected < .0001).

Figure 1. Activation in the right postcentral gyrus (A) and supplementary motor area (B) in response to angry (vs. neutral) faces (AN > NE) positively correlating with TSIA-DDF.

Entering the difference between the reaction times for AN and NE as a nuisance covariate into our second-level model slightly changed the results (SMA: xyz = [3 2 61], k = 43, pcluster = .05; somatosensory cortex: xyz = [30 -37 40], k = 47, pcluster = .038). Similarly, using only the reaction time in the angry condition as a covariate led to small changes in the results (SMA: xyz = [3 2 61], k = 37, pcluster = .08; somatosensory cortex: xyz = [30 -37 40], k = 45, pcluster = .046). Thus, our findings are highly likely to mainly reflect differences due to alexithymia (TSIA-DDF) and cannot be attributed to (differences in) reaction time. The activations for the models related to the measures of alexithymia are presented at a more lenient threshold (p = .001, k = 10) in the Additional file 2: Table S2.

Additional file 2: Table S2. Brain activation related to measures of alexithymia in the three contrasts at a threshold of t = 3.27, k = 10.


Post-hoc analysis of activation in SMA and S1 positively correlating with TSIA-DDF for contrast FE > NE

A post-hoc region of interest (ROI) analysis revealed a significant small-volume-corrected (SVC) peak voxel activation in the SMA (xyz = [-9 -4 61], pSVC = .019) and a marginally significant peak voxel activation in S1 (xyz = [30 -37 40], pSVC = .069) positively correlating with TSIA-DDF in the contrast FE > NE. The activation in SMA remained marginally significant when controlling for PANAS positive affect (pSVC = .061) and significant when entering the difference in RT between FE and NE (pSVC = .032) or the RT for FE alone (pSVC = .034). Similarly, the significance of the activation in S1 changed only slightly when controlling for PANAS (pSVC = .084), the difference in RT (pSVC = .052) or the reaction time for FE (pSVC = .059).

Exploratory analysis of correlations between brain regions relevant for emotion processing and measures of alexithymia

The results of our exploratory analysis considering associations between measures of alexithymia and brain activity in the AMG, FFG, vmPFC, striatum, SMA and S1 are displayed in Figure 2. Descriptively, our measures of alexithymia are generally positively related to activation in S1 and SMA and show no or negative correlative trends with AMG and vmPFC. The relationships between alexithymia and FFG as well as the striatum depend on the contrast, and no consistent pattern emerges. When thresholding the plot at p = .05 (two-tailed), SMA is strongly associated with TSIA-DDF (all contrasts), while S1 is related to TSIA-DDF (AN > NE, FE > NE) and to TSIA and TAS-20 (AN > NE) in the conditions with negative emotions. For the contrast HA > NE, the activity in vmPFC is negatively related to TAS-20-DDF and TSIA-DDF. Moreover, FFG activity seems to be positively related to TAS-20 in both negative conditions. For AN > NE, activation in the left striatum seems to be positively related to TSIA-DDF. However, it should be noted that the correlations between TSIA-DDF and SMA and S1 for AN > NE are likely to overestimate the true correlations in these areas because we extracted the clusters using a mask defined by voxels that positively correlated with TSIA-DDF in that very contrast (cf. [38,39]). Thus, these correlations are presented here only in an exploratory and descriptive fashion.

Figure 2. Relationship (as calculated with Spearman’s rho) between measures of alexithymia and brain activations in regions of interest that are relevant for facial emotion processing. The left column (A,C,E) depicts the magnitude of rho coded by color; in the right column (B,D,F), rho is only presented as different from zero if the corresponding p < .05. Each row is related to one contrast: HA > NE (happy > neutral) in A and B; AN > NE (angry > neutral) in C and D; FE > NE (fearful > neutral) in E and F. l = left, r = right; AMG = amygdala, FFG = fusiform gyrus, vmPFC = ventromedial prefrontal cortex, STR = striatum, SMA = supplementary motor area, S1 = primary somatosensory cortex.

Discussion

This study investigated the effects of self-reported (TAS-20-DDF) and observer-rated (TSIA-DDF) facets of alexithymia on the labeling and neural processing of facial emotions presented for a rather long time (1 or 3 s). Our analysis of the main contrasts revealed significant clusters of brain activation in the fusiform gyrus and the inferior and middle occipital gyrus (all conditions), in the middle temporal gyrus (fearful faces), the inferior (fearful) and orbital and medial (happy) frontal gyrus, as well as the cerebellum. All of these regions have been reported to be implicated in facial emotion processing (e.g., [7,8,40-42]). Thus, we can assume that our experimental design is suitable for eliciting brain activation related to facial emotion recognition. Considering the specific effects of alexithymia, we found that high TSIA-DDF scores were related to increased reaction times when labeling angry and fearful faces and to increased brain activation in SMA and right S1 during the recognition of these negative faces. A post-hoc exploratory analysis suggests that activity in brain areas important for the affective components of facial emotion processing (AMG, vmPFC, striatum) does not show a particular relationship with alexithymia in the current task.

The increased reaction times indicate that highly alexithymic individuals were slower in labeling negative emotions; they appear to need more time to reach a labeling accuracy similar to that of individuals low in alexithymia. In contrast to previous studies describing a relationship between accuracy and degree of alexithymia [12,13], we used relatively long stimulus presentation times and response windows and found no relationship between alexithymia and recognition accuracy. Thus, it seems that alexithymic individuals have difficulties in recognizing facial expressions that are reflected in decreased accuracy when presentation times and response windows are short (see also [14]). Prolonging presentation times and response windows may improve recognition accuracy, however, at the cost of increased response times.

SMA is part of a brain network that is involved in the processing of motor-related information and motor preparation and has been shown to be involved in the production of facial emotions [43]. Moreover, it has been argued that (especially pre-) SMA is involved in the recognition of facial emotions [44] by playing an important role in the motor components of simulation (see also [6]). Additionally, a cluster in S1 was revealed, which seems to reflect somatosensory aspects of facial emotion processing [3,7,45]. According to Adolphs et al. [46], recognizing emotions from facial expressions requires right primary somatosensory areas. The authors argue that recognition of another individual’s emotional state is mediated by internally generated somatosensory representations that simulate how the other individual would feel when displaying a certain facial expression. Taken together, this mediation could mean that highly alexithymic individuals have difficulties in automatically reenacting the negative facial emotion of others when these are presented briefly [15]. When the presentation time is increased, highly alexithymic individuals can reach a similar performance as less alexithymic individuals, which seem to require an increased activation of motor and somatosensory areas. Interestingly, it has been found that highly (as compared to less) alexithymic individuals also show increased activation in motor-related brain areas when interpreting the directed actions of others in a classical mirror-neuron task and show no differences in interpreting these actions [47]. Thus, highly alexithymic individuals may be more inclined to imitate the actions of others via (covert) motor simulation than are non-alexithymics. A recent meta-analysis by van der Velde et al. [48] reported that high levels of alexithymia are related to decreased activity in the SMA when participants are confronted with negative stimuli. 
However, this meta-analysis included all types of emotional paradigms and tasks (not only facial emotion recognition), so its results may not reflect processes specific to facial emotion recognition.

There seems to be no particular relationship between alexithymia and activity in the amygdala, vmPFC and ventral striatum in the task studied here. This finding is noteworthy because earlier studies on brain function [49-52] and structure [53] reported alterations in these regions in highly alexithymic individuals. In particular, functional studies on the automatic processing of emotional faces (affective priming) [49-51] have revealed decreased activations in these brain areas. The lack of involvement in the current task may be due to the rather long presentation times of the emotional faces in the current study. The amygdala and the ventral striatum are thought to operate in a fast and automatic fashion and may be less relevant when participants are fully aware of the emotional nature of the faces (e.g., [54,55]), as in the current study. Thus, it seems that alexithymic individuals show less automatic activation in brain regions particularly involved in the affective components of face processing (AMG, ventral striatum, vmPFC), which most likely leads to altered processing of, and difficulties in labeling, briefly presented faces. However, alexithymic individuals seem able to simulate the bodily aspects of facial expressions when presentation times and response windows are long enough, which makes correct recognition of the faces possible in this case.

Our study points to deficits limited to the recognition of negative faces in alexithymia: neither behavioral nor neurobiological differences were revealed for happy faces. This finding suggests that alexithymic individuals have fewer problems interpreting positive than negative facial expressions. A recent review on alexithymia and the processing of emotional facial expressions concluded that the difficulties of alexithymic individuals in processing facial emotions are not specific to certain emotions [14]. However, the work of Sonnby-Borgström [15] shows that facial imitation (measured with facial EMG) was decreased in highly alexithymic individuals only for corrugator activity related to negative emotions, but not for zygomaticus activity related to happy faces. Against this background, alexithymic individuals may display fewer deficits in automatically simulating happy faces than negative ones, which possibly renders the recognition of happy faces easier.

It is important to note that in our study the objective measure of alexithymia (TSIA), but not the self-report measure (TAS-20), was predictive of recognition performance. Because some alexithymic individuals may not be aware of their own deficits, self-report instruments could be less suitable than objective tests such as the TSIA for measuring difficulties in describing feelings.

It has been argued that the TAS-20 and the TSIA measure only cognitive aspects of alexithymia and neglect affective parts of the alexithymia construct [56]. A questionnaire that possibly captures these affective components is the Bermond-Vorst Alexithymia Questionnaire ( [57], but see also [58]). Additionally applying this measure might uncover relationships between alexithymia and the brain areas involved in the affective components of emotional face processing. Future studies are needed to determine whether the results of the current study relate only to cognitive alexithymia or generalize to affective alexithymia as well.

Conclusion

In summary, alexithymic individuals have difficulties in labeling facial expressions of emotion even when these are presented under little temporal constraint. Such individuals are slower in labeling angry and fearful facial emotions, and they manifest increased activation in the somatosensory and supplementary motor cortices in response to these negative faces. These cortical regions are involved in simulating the bodily components of facial emotional expressions. Thus, the present data suggest that, to interpret angry and fearful facial expressions, alexithymic individuals may recruit cortical processing resources involved in simulating their bodily components rather than the corresponding affective states.

Abbreviations

AMG: Amygdala; AN: Experimental condition with angry faces; BDI: Beck Depression Inventory; DDF: Difficulties describing feelings (subscale of the TAS-20 and TSIA); EMG: Electromyography; EPI: Echo planar imaging; FE: Experimental condition with fearful faces; FFG: Fusiform gyrus; (f)MRI: (Functional) magnetic resonance imaging; HA: Experimental condition with happy faces; MNI: Montreal Neurological Institute; NE: Experimental condition with neutral faces; PANAS: Positive and Negative Affect Schedule; ROI: Region of interest; RT: Reaction time; S1: Primary somatosensory cortex; SMA: Supplementary motor area; STAI: State-Trait Anxiety Inventory; SVC: Small volume corrected; TAS-20: 20-item Toronto Alexithymia Scale; TSIA: Toronto Structured Interview for Alexithymia; vmPFC: Ventromedial prefrontal cortex.

Competing interests

The authors declare that they have no competing interests.

Authors’ contributions

KI, JS, VL, NR, HK, MR, HG, AP, JL, AK, AV, TS designed the study; MR, HG supervised the alexithymia interviews; KI, VL, NR, TS conducted the psychometric testing of the participants; JS, HK, AP, JL, AV prepared the fMRI sequences; KI, JS, VL, NR, TS, AP ran the fMRI experiments; KI, VL, NR, MR, HG, TS analyzed the psychometric data; KI, JS, TS analyzed the fMRI data; KI, JS, VL, NR, HK, MR, HG, AP, JL, AK, AV, TS interpreted the data; KI, JS, AK, TS wrote the manuscript. All authors read and approved the final manuscript.

Acknowledgements

This work was supported by a grant from the German Research Foundation DFG to Thomas Suslow and Harald Kugel (SU 222/6-1).

We thank Sophie-Luise Lenk and Marc Rupietta for their help in data collection.

References

1. Niedenthal PM: Embodying emotion. Science 2007, 316:1002-1005.

2. Dimberg U, Andréasson P, Thunberg M: Emotional empathy and facial reactions to facial expressions. J Psychophysiol 2011, 25:26-31.

3. Heberlein AS, Adolphs R: Neurobiology of emotion recognition: current evidence for shared substrates. In Social Neuroscience. 1st edition. Edited by Harmon-Jones E, Winkielman P. New York: Guilford Press; 2007:31-55.

4. Heberlein AS, Atkinson AP: Neuroscientific evidence for simulation and shared substrates in emotion recognition: beyond faces. Emot Rev 2009, 1:162-177.

5. Heberlein AS, Padon AA, Gillihan SJ, Farah MJ, Fellows LK: Ventromedial frontal lobe plays a critical role in facial emotion recognition. J Cogn Neurosci 2008, 20:721-733.

6. Van der Gaag C, Minderaa RB, Keysers C: Facial expressions: what the mirror neuron system can and cannot tell us. Soc Neurosci 2007, 2:179-222.

7. Adolphs R: Neural systems for recognizing emotion. Curr Opin Neurobiol 2002, 12:169-177.

8. Fusar-Poli P, Placentino A, Carletti F, Landi P, Allen P, Surguladze S, Benedetti F, Abbamonte M, Gasparotti R, Barale F, Perez J, McGuire P, Politi P: Functional atlas of emotional faces processing: a voxel-based meta-analysis of 105 functional magnetic resonance imaging studies. J Psychiatry Neurosci 2009, 34:418-432.

9. Sifneos PE: The prevalence of alexithymic characteristics in psychosomatic patients. Psychother Psychosom 1973, 22:255-262.

10. Bagby RM, Parker JDA, Taylor GJ: The twenty-item Toronto Alexithymia Scale—I. Item selection and cross-validation of the factor structure. J Psychosom Res 1994, 38:23-32.

11. Bagby RM, Taylor GJ, Parker JDA, Dickens SE: The development of the Toronto Structured Interview for Alexithymia: item selection, factor structure, reliability and concurrent validity. Psychother Psychosom 2006, 75:25-39.

12. Parker PD, Prkachin KM, Prkachin GC: Processing of facial expressions of negative emotion in alexithymia: the influence of temporal constraint. J Pers 2005, 73:1087-1107.

13. Swart M, Kortekaas R, Aleman A: Dealing with feelings: characterization of trait alexithymia on emotion regulation strategies and cognitive-emotional processing. PLoS One 2009, 4:e5751.

14. Grynberg D, Chang B, Corneille O, Maurage P, Vermeulen N, Berthoz S, Luminet O: Alexithymia and the processing of emotional facial expressions (EFEs): systematic review, unanswered questions and further perspectives. PLoS One 2012, 7:e42429.

15. Sonnby-Borgström M: Alexithymia as related to facial imitation, mentalization, empathy, and internal working models-of-self and -others. Neuropsychoanalysis 2009, 11:111-128.

16. Mériau K, Wartenburger I, Kazzer P, Prehn K, Lammers C-H, van der Meer E, Villringer A, Heekeren HR: A neural network reflecting individual differences in cognitive processing of emotions during perceptual decision making. Neuroimage 2006, 33:1016-1027.

17. Montebarocci O, Surcinelli P, Rossi N, Baldaro B: Alexithymia, verbal ability and emotion recognition. Psychiatr Q 2011, 82:245-252.

18. Yarkoni T, Braver TS: Cognitive neuroscience approaches to individual differences in working memory and executive control: conceptual and methodological issues. In Handbook of Individual Differences in Cognition. Edited by Gruszka A, Matthews G, Szymura B. New York, NY: Springer; 2010:87-107. [The Springer Series on Human Exceptionality]

19. Lumley MA, Gustavson BJ, Partridge RT, Labouvie-Vief G: Assessing alexithymia and related emotional ability constructs using multiple methods: interrelationships among measures. Emotion 2005, 5:329-342.

20. Taylor GJ: Recent developments in alexithymia theory and research. Can J Psychiatry 2000, 45:134-142.

21. Lichev V, Rufer M, Rosenberg N, Ihme K, Grabe HJ, Kugel H, Donges U-S, Kersting A, Suslow T: Assessing alexithymia and emotional awareness: relations between measures in a German non-clinical sample. Compr Psychiatry, in press.

22. Bach M, Bach D, de Zwaan M, Serim J, Böhmer J: Validierung der deutschen Version der 20-Item Toronto Alexithymie Skala bei Normalpersonen und psychiatrischen Patienten. [Validation of the German version of the 20-item Toronto Alexithymia Scale in normal subjects and psychiatric patients.]. Psychother Psychosom Med Psychol 1996, 46:23-28.

23. Grabe HJ, Löbel S, Dittrich D, Bagby RM, Taylor GJ, Quilty LC, Spitzer C, Barnow S, Mathier F, Jenewein J, Freyberger HJ, Rufer M: The German version of the Toronto Structured Interview for Alexithymia: factor structure, reliability, and concurrent validity in a psychiatric patient sample. Compr Psychiatry 2009, 50:424-430.

24. Hautzinger M, Keller F, Kühner C: Beck-Depressions-Inventar (BDI). Frankfurt/Main: Harcourt Test Services; 2006.

25. Laux L, Glanzmann P, Schaffner P, Spielberger CD: State-Trait-Anxiety Inventory (STAI). [German version.]. 1981.

26. Krohne HW, Egloff B, Kohlmann C-W, Tausch A: Untersuchungen mit einer deutschen Version der “Positive and Negative Affect Schedule” (PANAS). [Investigations with a German version of the Positive and Negative Affect Schedule (PANAS).]. Diagnostica 1996, 42:139-156.

27. Lundqvist D, Flykt A, Öhman A: The Karolinska Directed Emotional Faces - KDEF. 1998.

28. Mugler JP, Brookeman JR: Three-dimensional magnetization-prepared rapid gradient-echo imaging (3D MP RAGE). Magn Reson Med 1990, 15:152-157.

29. Grier JB: Nonparametric indexes for sensitivity and bias: computing formulas. Psychol Bull 1971, 75:424-429.

30. Lieberman MD, Cunningham WA: Type I and Type II error concerns in fMRI research: re-balancing the scale. Soc Cogn Affect Neurosci 2009, 4:423-428.

31. Yarkoni T, Barch DM, Gray JR, Conturo TE, Braver TS: BOLD correlates of trial-by-trial reaction time variability in gray and white matter: a multi-study fMRI analysis. PLoS One 2009, 4:e4257.

32. Tzourio-Mazoyer N, Landeau B, Papathanassiou D, Crivello F, Etard O, Delcroix N, Mazoyer B, Joliot M: Automated anatomical labeling of activations in SPM using a macroscopic anatomical parcellation of the MNI MRI single-subject brain. Neuroimage 2002, 15:273-289.

33. Maldjian JA, Laurienti PJ, Kraft RA, Burdette JH: An automated method for neuroanatomic and cytoarchitectonic atlas-based interrogation of fMRI data sets. Neuroimage 2003, 19:1233-1239.

34. Pessoa L, McKenna M, Gutierrez E, Ungerleider LG: Neural processing of emotional faces requires attention. Proc Natl Acad Sci U S A 2002, 99:11458-11463.

35. Beck AT, Steer RA, Brown GK: BDI-II Beck Depression Inventory Manual. 2nd edition. San Antonio, TX: Psychological Corporation; 1998.

36. Spielberger CD, Gorsuch R, Lushene PR, Vagg PR, Jacobs AG: State-Trait Anxiety Inventory. 1970.

37. Watson D, Clark LA, Tellegen A: Development and validation of brief measures of positive and negative affect: the PANAS scales. J Pers Soc Psychol 1988, 54:1063-1070.

38. Vul E, Harris C, Winkielman P, Pashler H: Puzzlingly high correlations in fMRI studies of emotion, personality, and social cognition. Perspect Psychol Sci 2009, 4:274-290.

39. Kriegeskorte N, Lindquist MA, Nichols TE, Poldrack RA, Vul E: Everything you never wanted to know about circular analysis, but were afraid to ask. J Cereb Blood Flow Metab 2010, 30:1551-1557.

40. Pizzagalli DA, Lehmann D, Hendrick AM, Regard M, Pascual-Marqui RD, Davidson RJ: Affective judgments of faces modulate early activity (160 ms) within the fusiform gyri. Neuroimage 2002, 16:663-677.

41. Suslow T, Kugel H, Rauch AV, Dannlowski U, Bauer J, Konrad C, Arolt V, Heindel W, Ohrmann P: Attachment avoidance modulates neural response to masked facial emotion. Hum Brain Mapp 2009, 30:3553-3562.

42. Dannlowski U, Stuhrmann A, Beutelmann V, Zwanzger P, Lenzen T, Grotegerd D, Domschke K, Hohoff C, Ohrmann P, Bauer J, Lindner C, Postert C, Konrad C, Arolt V, Heindel W, Suslow T, Kugel H: Limbic scars: long-term consequences of childhood maltreatment revealed by functional and structural magnetic resonance imaging. Biol Psychiatry 2012, 71:286-293.

43. Likowski KU, Mühlberger A, Gerdes ABM, Wieser MJ, Pauli P, Weyers P: Facial mimicry and the mirror neuron system: simultaneous acquisition of facial electromyography and functional magnetic resonance imaging. Front Hum Neurosci 2012, 6:214.

44. Rochas V, Gelmini L, Krolak-Salmon P, Poulet E, Saoud M, Brunelin J, Bediou B: Disrupting pre-SMA activity impairs facial happiness recognition: an event-related TMS study. Cereb Cortex 2012, 23:1517-1525.

45. Haxby JV, Hoffman EA, Gobbini MI: The distributed human neural system for face perception. Trends Cogn Sci 2000, 4:223-233.

46. Adolphs R, Damasio H, Tranel D, Cooper G, Damasio AR: A role for somatosensory cortices in the visual recognition of emotion as revealed by three-dimensional lesion mapping. J Neurosci 2000, 20:2683-2690.

47. Moriguchi Y, Ohnishi T, Decety J, Hirakata M, Maeda M, Matsuda H, Komaki G: The human mirror neuron system in a population with deficient self-awareness: an fMRI study in alexithymia. Hum Brain Mapp 2009, 30:2063-2076.

48. Van der Velde J, Servaas MN, Goerlich KS, Bruggeman R, Horton P, Costafreda SG, Aleman A: Neural correlates of alexithymia: a meta-analysis of emotion processing studies. Neurosci Biobehav Rev 2013, 37:1774-1785.

49. Reker M, Ohrmann P, Rauch AV, Kugel H, Bauer J, Dannlowski U, Arolt V, Heindel W, Suslow T: Individual differences in alexithymia and brain response to masked emotion faces. Cortex 2010, 46:658-667.

50. Duan X, Dai Q, Gong Q, Chen H: Neural mechanism of unconscious perception of surprised facial expression. Neuroimage 2010, 52:401-407.

51. Kugel H, Eichmann M, Dannlowski U, Ohrmann P, Bauer J, Arolt V, Heindel W, Suslow T: Alexithymic features and automatic amygdala reactivity to facial emotion. Neurosci Lett 2008, 435:40-44.

52. Lee B-T, Lee H-Y, Park S-A, Lim J-Y, Tae WS, Lee M-S, Joe S-H, Jung I-K, Ham B-J: Neural substrates of affective face recognition in alexithymia: a functional magnetic resonance imaging study. Neuropsychobiology 2011, 63:119-124.

53. Ihme K, Dannlowski U, Lichev V, Stuhrmann A, Grotegerd D, Rosenberg N, Kugel H, Heindel W, Arolt V, Kersting A, Suslow T: Alexithymia is related to differences in gray matter volume: a voxel-based morphometry study. Brain Res 2013, 1491:60-67.

54. Pessoa L: On the relationship between emotion and cognition. Nat Rev Neurosci 2008, 9:148-158.

55. Pessoa L, Japee S, Sturman D, Ungerleider LG: Target visibility and visual awareness modulate amygdala responses to fearful faces. Cereb Cortex 2006, 16:366-375.

56. Bermond B, Clayton K, Liberova A, Luminet O, Maruszewski T, Ricci Bitti PE, Rimé B, Vorst HH, Wagner H, Wicherts J: A cognitive and an affective dimension of alexithymia in six languages and seven populations. Cogn Emot 2007, 21:1125-1136.

57. Vorst H, Bermond B: Validity and reliability of the Bermond–Vorst Alexithymia Questionnaire. Pers Individ Dif 2001, 30:413-434.

58. Bagby RM, Quilty LC, Taylor GJ, Grabe HJ, Luminet O, Verissimo R, De Grootte I, Vanheule S: Are there subtypes of alexithymia? Pers Individ Dif 2009, 47:413-418.