The representation of facial emotion expands from sensory to prefrontal cortex with development
Abstract
Facial expression recognition develops rapidly during infancy and continues to improve from childhood to adulthood. As a critical component of social communication, this skill enables individuals to interpret others' emotions and intentions. However, the brain mechanisms driving the development of this skill remain largely unclear, owing to the difficulty of obtaining data with both high spatial and high temporal resolution from young children. By analyzing intracranial EEG data collected from a childhood group (5-10 years old) and a post-childhood group (13-55 years old), we find differential involvement of high-level brain areas in processing facial expression information. In the post-childhood group, both the posterior superior temporal cortex (pSTC) and the dorsolateral prefrontal cortex (DLPFC) encode facial emotion features in a high-dimensional space. In children, however, facial expression information is significantly represented only in the pSTC, not in the DLPFC. Furthermore, the encoding of complex emotions in the pSTC increases with age. Taken together, these findings indicate that young children rely more on low-level sensory areas than on the prefrontal cortex for facial emotion processing, suggesting that the prefrontal cortex matures over development to enable a full understanding of facial emotions, especially complex emotions that require social and life experience to comprehend.