The emotional facial recognition performance of Chinese patients with schizophrenia: An event-related potentials study

Zhang, Yangjun; Zhao, Ding; Wu, Jianfan; Lin, Lixin; Ji, Jiawu

Author Information
Indian Journal of Psychiatry 65(3):p 327-333, March 2023. | DOI: 10.4103/indianjpsychiatry.indianjpsychiatry_413_22



Schizophrenia is a chronic mental disease accompanied by several symptoms, such as hallucinations, delusions, behavioral disturbance, abulia, affective flattening, and social withdrawal.[1] These symptoms are associated with poor social cognition, leading to a confused perception of social cues in patients with schizophrenia (SZs).[2] As an indispensable component of social cognition, the recognition of emotional facial expressions has become an independent determinant of patients’ social functioning.[3]

Several lines of research, including meta-analyses, have revealed that SZs have deficits in identifying and recognizing emotional facial expressions.[4,5] However, whether such deficits extend to specific emotional faces remains controversial. One study demonstrated that SZs have impairments across all emotional faces, especially fearful, disgusted, and neutral faces, as well as an error pattern of misinterpreting neutral faces as negative cues.[6] Moreover, another article revealed that SZs had more difficulty recognizing happy faces than fearful ones.[7] Furthermore, a functional magnetic resonance imaging (fMRI) study proposed that SZs had significantly weaker amygdala activation than healthy controls (HCs) when processing fearful faces.[8] Meanwhile, another fMRI study suggested that the amygdala activation of SZs to fearful facial expressions was relatively weaker than to neutral facial expressions.[9]

Researchers have applied event-related potentials (ERPs) to investigate the impairments of emotional face perception in SZs. The N170, a negative component peaking at approximately 150–180 ms after stimulus onset, is regarded as a reflection of the early visual processing of human faces. Numerous studies have found reduced (less negative) N170 amplitudes in SZs compared with HCs during the assessment of emotional faces.[10–13] However, non-significant reductions in N170 amplitude have also been reported.[14–16]

The P300, a positive component peaking at approximately 300 ms after stimulus onset, is an index of attentional resources.[17] Previous findings on the P300 in facial recognition experiments revealed reduced amplitudes in SZs compared with HCs.[5] The results are mixed regarding specific emotional faces, such as fearful, neutral, and happy expressions. A recent study revealed that fearful and happy faces triggered smaller P3a amplitudes in SZs than in HCs.[18] Others reported that, compared with positive faces, negative faces triggered a greater P300 amplitude in HCs but a smaller P300 amplitude in SZs.[19,20]

Moreover, some researchers found attenuated P3 amplitudes in patients with impulsive aggression or violent criminal convictions during oddball tasks.[21,22] Violence was found to be an independent risk factor for impaired cognitive-executive functioning. Meanwhile, others identified particular patterns of P300 amplitude to negative stimuli in aggressive groups. They reported that non-violent participants showed an enhanced P300 amplitude to negative stimuli, such as social and physical threat words, compared with neutral stimuli, whereas such a difference was not found in violent participants.[23,24] However, few studies have concentrated on the ERP response to emotional faces in violent schizophrenic patients. Frommann et al.[25] demonstrated that schizophrenic patients with a history of violence had larger N250 amplitudes to emotional faces than schizophrenic patients without any history of violence, suggesting that the larger N250 amplitudes could be due to the higher arousal of violent patients to emotional faces.

The present study had two main objectives. First, we attempted to replicate previous studies using the Chinese Facial Affective Picture System (CFAPS) to compare the ERP responses of SZs and HCs, including the N170 and P300 components, across three facial expressions (happy, fearful, and neutral). Second, we examined whether the ERP responses of patients with schizophrenia were correlated with their level of violence.



A total of 30 patients diagnosed with schizophrenia according to the 5th edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5) were enrolled as participants. The inclusion criteria were as follows: (i) aged 20–50 years; (ii) right-handed; (iii) normal or corrected-to-normal vision; (iv) no history of epilepsy, serious brain trauma, severe encephalitis, brain tumor, or other organic brain diseases; (v) no history of alcohol consumption or drug abuse in the past 5 years; (vi) no electroconvulsive treatment during the past 3 years; (vii) clinically stable for at least 3 months, with no significant alterations in symptoms or medication. All participants volunteered for the experiment and were recruited from the Psychiatric Outpatient Department and In-patient Department of Fuzhou Neuro-Psychiatric Hospital (Fuzhou, China).

We also included 31 HCs matched to the SZs for age, handedness, and eyesight. They were recruited from the surrounding community through newspaper advertisements, leaflets, and the Internet. They were also evaluated to ensure the absence of (i) a personal or family history of psychiatric illness; (ii) a history of epilepsy, serious brain trauma, severe encephalitis, brain tumor, or other organic brain diseases; (iii) a history of alcohol consumption or drug abuse in the past 5 years; (iv) a history of receiving antiepileptic or psychoactive drugs that could affect the electrical activity of the brain.

Patients’ risk of violence was determined with the Modified Overt Aggression Scale (MOAS), which assesses four domains: verbal aggression, aggression against objects, aggression against self, and aggression against others.[26] Well-trained psychiatrists performed all assessments.

Experimental stimuli

The emotional faces were derived from the CFAPS, compiled by the National Key Laboratory of Cognitive Neuroscience and Learning, Beijing Normal University.[27] Previous studies[28,29] showed that recognition performance for non-native faces is markedly worse than for native faces; hence, Chinese emotional faces might improve participants’ performance and elicit stronger electroencephalic responses. The CFAPS contains seven standard emotional expressions: anger, disgust, fear, sadness, surprise, neutral, and joy, of which three (happy, fearful, and neutral) were selected as experimental stimuli. Examples of the experimental stimuli are shown in Figure 1a.

Figure 1:
Examples of the experimental stimuli (a) and the sequences of events within a single-task trial (b)

Because of cultural and ethnic differences, humans process faces of their own race more readily. Thus, the present study utilized affective pictures of Chinese faces as emotion-elicitation material to avoid the race effect. A total of 36 emotional pictures (18 male, 18 female) were selected from the CFAPS for each facial expression (happy, fearful, and neutral). There was no significant difference in the potency of the three sets of emotional pictures (mean: neutral = 5.67, happy = 5.77, fearful = 5.66; F (2,106) = 0.657, P = 0.521). This study also included 50 inverted neutral facial images as standard stimuli. The background, contrast, and brightness of all facial images were consistent, and the images were resized to 640 × 480 pixels with the Picture Manager software.

Experimental stimuli were displayed on a 17-inch flat color display with a resolution of 1024 × 768, and the screen background was set to black (RGB = 0, 0, 0). The experimental program was written in E-Prime 1.0.

Task procedure

The experiment was conducted in a dedicated laboratory with proper temperature, sound insulation, and enclosure. Participants were seated 100 cm in front of a computer screen to complete the designed task. The task was based on the classical oddball paradigm and contained three rounds of 252 trials each. In each round, one of the three upright facial expressions (happy, fearful, or neutral) was randomly presented as the target stimulus (10% probability) and inverted neutral faces as the standard stimuli (90% probability). A single trial proceeded in the following sequence [Figure 1b]: Step 1, a cross was presented for 1,000 ms at the center of the screen; Step 2, one of the above-mentioned faces appeared in the central area for 1,000 ms, followed by a black screen for 800–1500 ms (to relieve visual fatigue). The sequence then repeated, and participants were asked to press the space bar when they observed the target faces. At the end of each round, participants closed their eyes and rested for 20 min.
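The oddball structure of each round can be sketched in code. This is an illustrative reconstruction only, not the study's actual E-Prime 1.0 script: the function name and trial representation are our own, and the target count is rounded from the 10% probability over 252 trials.

```python
import random

def build_round(target_emotion, n_trials=252, target_prob=0.10, seed=None):
    """Build one oddball round: upright emotional faces as rare targets
    (~10% of trials) and inverted neutral faces as frequent standards (~90%).
    Hypothetical sketch of the design, not the original E-Prime program."""
    rng = random.Random(seed)
    n_targets = round(n_trials * target_prob)  # ~25 targets in 252 trials
    trials = ([{"stim": target_emotion, "type": "target"}] * n_targets +
              [{"stim": "inverted_neutral", "type": "standard"}] * (n_trials - n_targets))
    rng.shuffle(trials)  # randomize target positions within the round
    return trials

# Three rounds, one per upright facial expression
rounds = {e: build_round(e, seed=i) for i, e in enumerate(["happy", "fearful", "neutral"])}
```

Fixing the number of targets per round (rather than drawing each trial independently at 10%) keeps the rare/frequent ratio exact across rounds, which is the usual practice in oddball designs.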

Electroencephalogram (EEG) recording and analysis

Brain electrical activity was recorded with a 64-channel electrode cap positioned according to the international 10-20 system (Compumedics Neuroscan Inc., Charlotte, NC, USA), with the left mastoid as the online reference; the average of the left and right mastoids was used as the reference during offline analysis. Electrodes placed at the outer canthi of both eyes recorded the horizontal electrooculogram, and electrodes above and below the left eye recorded the vertical electrooculogram. All electrode impedances were maintained below 5 kΩ. The EEG and electrooculogram were amplified with a 0.01–100 Hz band-pass filter, and the sampling frequency was set to 1000 Hz/channel. Data processing was performed with the CURRY 8 software (Compumedics Neuroscan Inc.), and the data were re-referenced to the average of the mastoids. A regression method in CURRY 8 was used to remove ocular artifacts, and the data were digitally filtered with a band-pass of 0.1 Hz (12 dB/octave) to 30.0 Hz (24 dB/octave). Segments for each trial extended from 200 ms before the stimulus onset to 1000 ms thereafter. Baseline correction was then performed by subtracting the average activity of the baseline period of each channel from all trials. Trials with extensive artifacts, in which EEG voltages exceeded ± 80 μV, were automatically rejected from the analysis. Consistent with previous literature, N170 peaks were measured at the P7 and P8 electrode positions,[30–32] and P300 peaks at the Cz and Pz electrode positions.[5,14,33]
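The offline epoching steps (segmentation from -200 to 1000 ms, baseline correction, and ±80 μV artifact rejection) can be illustrated with a simplified NumPy sketch. The actual analysis was performed in CURRY 8; the function below is a hypothetical reimplementation for illustration only, and omits filtering, re-referencing, and ocular-artifact regression.

```python
import numpy as np

FS = 1000                    # sampling rate (Hz), as in the recording
PRE_MS, POST_MS = 200, 1000  # epoch window: -200 to +1000 ms around onset
REJECT_UV = 80.0             # artifact rejection threshold (± μV)

def epoch_and_clean(eeg, onsets):
    """eeg: (n_channels, n_samples) array in μV; onsets: stimulus sample indices.
    Returns baseline-corrected epochs with artifact trials rejected."""
    pre = PRE_MS * FS // 1000
    post = POST_MS * FS // 1000
    kept = []
    for t in onsets:
        seg = eeg[:, t - pre:t + post].astype(float)  # copy of the segment
        # baseline correction: subtract each channel's pre-stimulus mean
        seg -= seg[:, :pre].mean(axis=1, keepdims=True)
        # automatic rejection of trials exceeding ±80 μV on any channel
        if np.abs(seg).max() <= REJECT_UV:
            kept.append(seg)
    return np.stack(kept) if kept else np.empty((0, eeg.shape[0], pre + post))
```

Averaging the surviving epochs per condition would then yield the waveforms from which the N170 and P300 peaks are measured at the selected electrodes.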

Statistical analysis was carried out with SPSS 23.0 software (IBM, Armonk, NY, USA). Data are presented as mean ± standard deviation (SD). A 2 (group) × 3 (emotional face) mixed-design repeated-measures analysis of variance (ANOVA), with group as the between-subjects factor and emotional face as the within-subjects factor, was used to compare differences in facial emotion recognition between the two groups, followed by pairwise comparisons. P < 0.05 was considered statistically significant. Pearson’s correlation analysis was used to assess the correlations of patients’ MOAS scores with their N170 and P300 amplitudes in response to emotional faces, respectively.


General data

Overall, a total of 64 participants completed the experiment. However, because of excessive artifacts and disturbances, the data of 2 patients and 1 HC were excluded. Finally, the data of 30 SZs (16 male, 14 female) and 31 HCs (16 male, 15 female) were analyzed. SZs and HCs did not differ in age, years of education, reaction time, or accuracy. The accuracy of nearly 100% indicates that all subjects completed the experimental task attentively. Table 1 summarizes the characteristics of the two groups.

Table 1:
Sample description (mean±SD)

N170 component

The 2 (group) × 3 (emotional face) mixed-design repeated-measures ANOVA exhibited a significant main effect of group (F [1,59] = 225.13, P < 0.001), with patients (M = -3.03 μV, SD = 0.094) eliciting smaller N170 amplitudes than the control group (M = -5.01 μV, SD = 0.092). There was no significant main effect of facial expression (F [2,58] = 1.457, P = 0.237) or interaction between group and facial expression (F [2,58] = 1.809, P = 0.168). Pairwise comparisons showed that the N170 amplitudes of SZs were significantly smaller than those of HCs for all three facial expressions [happy (P < 0.001), fearful (P < 0.001), and neutral (P < 0.001)]. All pairwise comparisons were Bonferroni-corrected. No significant difference was found in N170 latency between the two groups (F [1,59] = 0.011, P = 0.916). The mean N170 amplitudes and latencies for faces in both groups are presented in Table 2.

Table 2:
Comparison of the schizophrenia and healthy control for N170, P300 amplitude and latency for three emotional faces

P300 component

The 2 (group) × 3 (emotional face) mixed-design repeated-measures ANOVA also revealed a significant main effect of group (F [1,59] = 57.286, P < 0.001), in which the P300 amplitudes of SZs (M = 2.911 μV, SD = 0.065) were smaller than those of HCs (M = 3.603 μV, SD = 0.064). A significant main effect was also found for stimulus type (F [2,58] = 10.731, P < 0.001), in which fearful faces triggered more positive P300 amplitudes than neutral faces (P = 0.002) and happy faces (P < 0.001). There was no significant interaction between group and facial expression (F [2,58] = 2.732, P = 0.074). Pairwise comparisons showed that the P300 amplitudes of SZs were smaller than those of HCs for all three facial expressions [happy (P < 0.001), fearful (P < 0.001), and neutral (P < 0.001)]. Moreover, a significantly larger P300 amplitude was observed for fearful faces than for neutral faces (P = 0.002) and happy faces (P < 0.001) in HCs, whereas such a difference was not found in SZs. All pairwise comparisons were Bonferroni-corrected. No significant difference was detected in P300 latency between the two groups (F [1,59] = 0.069, P = 0.794). The mean P300 amplitudes and latencies for faces in both groups are presented in Table 2.

Correlation analysis

Correlations were calculated between patients’ MOAS scores and their N170 and P300 amplitudes, respectively. The MOAS scores of patients were significantly negatively correlated with their P300 amplitudes (r = –0.59, P = 0.001) [Figure 2]. However, there was no significant correlation between MOAS scores and N170 amplitudes (P = 0.189).

Figure 2:
Correlations between MOAS scores and P300 amplitudes of SZs


One primary objective of the present experiment was to determine whether facial emotional recognition is significantly impaired in SZs compared with HCs. To this end, we compared the two groups’ ERP responses elicited by the three upright facial expressions (happy, fearful, and neutral).

The present research revealed that SZs showed lower negative N170 amplitudes than HCs for all three facial expressions (happy, fearful, and neutral). This finding is consistent with several previously reported studies in which SZs showed N170 deficits in the recognition of facial expressions.[10–13] The N170 component, which appears at an early stage of facial coding, can be affected by several factors, such as facial expression, faces of different races, and face orientation (upright or inverted).[34] The results of the present study suggest that SZs have important deficiencies in the structural coding of face recognition. However, one study found no significant difference in N170 amplitude between SZs and HCs.[35] The authors believed that their ERP task was easier than the oddball task, indicating that patients’ facial coding function could cope with easy tasks. Therefore, future studies should examine how patients perform and whether N170 amplitudes improve in easier face recognition tasks.

As previously described, the P300 component reflects the allocation of attentional resources.[36] In the present study, the mean P300 amplitudes were significantly reduced in SZs for all three facial expressions (happy, fearful, and neutral), consistent with previous research,[5,18,37] indicating that SZs may have a deficit in attentional resources. Moreover, compared with neutral or positive stimuli (neutral or happy faces), negative stimuli triggered larger P300 amplitudes in HCs, while there was no significant difference among stimulus types in SZs. This result is highly consistent with An et al.,[19] in which negative stimuli aroused a greater P300 amplitude in HCs but a smaller P300 amplitude in SZs, compared with positive stimuli. A recent fMRI study reported a similar result, demonstrating that HCs showed significantly different activation of the left amygdala for sad compared with neutral stimuli, while SZs did not.[38] As mentioned above, SZs have deficits in recognizing facial expressions;[4,5] thus, one explanation for our result could be attributed to the different types of facial expressions.

Besides, a previous study reported a similar result for the P300 amplitudes of different subtypes of schizophrenia,[39] in which negative stimuli triggered a larger amplitude than positive stimuli in paranoid schizophrenics, whereas the data of non-paranoid schizophrenics matched those of An et al.[19] Ueno et al.[39] demonstrated that these diverse patterns of P300 amplitude in different subtypes of schizophrenia were due to the distinct emotional arousal levels elicited by emotional faces. Therefore, we assume that the patients in our study also had particular emotional arousal levels when assessing facial expressions. Moreover, the allocation of attentional resources varies with the arousal levels of different emotional facial expressions.[36] Thus, the results indicate that patients have different distribution patterns of attentional resources for facial expressions, leading to the misunderstanding of social cues and unacceptable behaviors in public.

Furthermore, the results of the present study showed that patients’ P300 peak amplitudes were significantly negatively correlated with MOAS scores, indicating that patients with more violent behavior had lower positive P300 amplitudes in the face recognition task. This finding is similar to previously reported results,[21,22] in which patients with impulsive aggression or violent criminal convictions had decreased P300 amplitudes compared with non-violent controls during oddball tasks. As mentioned earlier, the P300 component reflects the allocation of attentional resources.[36] Therefore, we suggest that aggressive patients with schizophrenia may have an impaired allocation of attentional resources during facial recognition tasks. Furthermore, preceding research demonstrated larger N250 amplitudes in schizophrenic patients with a history of violence than in those without during an emotion recognition task; the authors argued that the patients’ higher arousal activated by emotional faces might cause the larger N250 amplitudes.[25] Thus, whether the decreased P300 amplitudes in patients with schizophrenia were influenced by higher arousal in the emotional facial recognition task should be further assessed.

One limitation of our study is that the SZ group received antipsychotic medication, with doses differing according to weight, age, severity of symptoms, etc. A meta-analysis in China indicated a significant but small improvement of P300 amplitude in SZs who received antipsychotic medication compared to those without medication.[40] Another meta-analysis found no significant difference in P300 amplitude between medicated and unmedicated SZs.[41] Since our study showed that the P300 amplitudes of SZs were smaller than those of HCs, the effect of medication is unlikely to account for the difference in P300 amplitude.


In summary, our findings replicated previous studies. First, lower negative N170 amplitudes and lower positive P300 amplitudes were found for emotional facial expressions in SZs, indicating that patients may have deficits in the structural coding of face recognition and in available attentional resources. Second, negative stimuli (fearful faces) triggered a larger P300 amplitude in HCs but not in SZs, which could be related to different facial expressions or different distribution patterns of attentional resources. Moreover, patients with more violent behaviors had lower positive P300 amplitudes in the face recognition task, suggesting that aggressive SZs may have an impaired allocation of attentional resources during facial recognition tasks.

Ethics approval

This work was carried out in accordance with the Declaration of Helsinki (2000) of the World Medical Association. The study was approved by the Ethics Committee of Fujian Medical University Affiliated Fuzhou Neuropsychiatric Hospital (approval no. 201802).

Financial support and sponsorship

This research was supported by Startup Fund for scientific research, Fujian Medical University (Grant number: 2018QH1247).

Conflicts of interest

There are no conflicts of interest.


1. American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders. 5th ed. Washington, DC: American Psychiatric Association; 2013.
2. Bliksted V, Videbech P, Fagerlund B, Frith C. The effect of positive symptoms on social cognition in first-episode schizophrenia is modified by the presence of negative symptoms. Neuropsychology 2017;31:209–19.
3. Moran EK, Culbreth AJ, Barch DM. Emotion regulation predicts everyday emotion experience and social function in schizophrenia. Clin Psychol Sci 2018;6:271–9.
4. Savla GN, Vella L, Armstrong CC, Penn DL, Twamley EW. Deficits in domains of social cognition in schizophrenia: A meta-analysis of the empirical evidence. Schizophr Bull 2013;39:979–92.
5. Shah D, Knott V, Baddeley A, Bowers H, Wright N, Labelle A, et al. Impairments of emotional face processing in schizophrenia patients: Evidence from P100, N170 and P300 ERP components in a sample of auditory hallucinators. Int J Psychophysiol 2018;134:120–34.
6. Kohler CG, Turner TH, Bilker WB, Brensinger CM, Siegel SJ, Kanes SJ, et al. Facial emotion recognition in schizophrenia: Intensity effects and error pattern. Am J Psychiatry 2003;160:1768–74.
7. Tsoi DT, Lee K-H, Khokhar WA, Mir NU, Swalli JS, Gee KA, et al. Is facial emotion recognition impairment in schizophrenia identical for different emotions? A signal detection analysis. Schizophr Res 2008;99:263–9.
8. Das P, Kemp AH, Flynn G, Harris AW, Liddell BJ, Whitford TJ, et al. Functional disconnections in the direct and indirect amygdala pathways for fear processing in schizophrenia. Schizophr Res 2007;90:284–94.
9. Hall J, Whalley HC, McKirdy JW, Romaniuk L, McGonigle D, McIntosh AM, et al. Overactivation of fear systems to neutral faces in schizophrenia. Biol Psychiatry 2008;64:70–3.
10. Tsunoda T, Kanba S, Ueno T, Hirano Y, Hirano S, Maekawa T, et al. Altered face inversion effect and association between face N170 reduction and social dysfunction in patients with schizophrenia. Clin Neurophysiol 2012;123:1762–8.
11. Liu T, Pinheiro AP, Zhao Z, Nestor PG, McCarley RW, Niznikiewicz M. Simultaneous face and voice processing in schizophrenia. Behav Brain Res 2016;305:76–86.
12. Maher S, Mashhoon Y, Ekstrom T, Lukas S, Chen Y. Deficient cortical face-sensitive N170 responses and basic visual processing in schizophrenia. Schizophr Res 2016;170:87–94.
13. Zheng Y, Li H, Ning Y, Ren J, Wu Z, Huang R, et al. Sluggishness of early-stage face processing (N170) is correlated with negative and general psychiatric symptoms in schizophrenia. Front Hum Neurosci 2016;10:615.
14. Akbarfahimi M, Tehrani-Doost M, Ghassemi F. Emotional face perception in patients with schizophrenia: An event-related potential study. Neurophysiology 2013;45:249–57.
15. Komlósi S, Csukly G, Stefanics G, Czigler I, Bitter I, Czobor P. Fearful face recognition in schizophrenia: An electrophysiological study. Schizophr Res 2013;149:135–40.
16. Ramos-Loyo J, González-Garrido AA, Sánchez-Loyo LM, Medina V, Basar-Eroglu C. Event-related potentials and event-related oscillations during identity and facial emotional processing in schizophrenia. Int J Psychophysiol 2009;71:84–90.
17. Asanowicz D, Gociewicz K, Koculak M, Finc K, Bonna K, Cleeremans A, et al. The response relevance of visual stimuli modulates the P3 component and the underlying sensorimotor network. Sci Rep 2020;10:1–20.
18. Onitsuka T, Spencer KM, Nakamura I, Hirano Y, Hirano S, McCarley RW, et al. Altered P3a modulations to emotional faces in male patients with chronic schizophrenia. Clin EEG Neurosci 2020;51:215–21.
19. An SK, Lee SJ, Lee CH, Cho HS, Lee PG, Lee C-i, et al. Reduced P3 amplitudes by negative facial emotional photographs in schizophrenia. Schizophr Res 2003;64:125–35.
20. Turetsky BI, Kohler CG, Indersmitten T, Bhati MT, Charbonnier D, Gur RC. Facial emotion recognition in schizophrenia: When and why does it go awry? Schizophr Res 2007;94:253–63.
21. Bernat EM, Hall JR, Steffen BV, Patrick CJ. Violent offending predicts P300 amplitude. Int J Psychophysiol 2007;66:161–7.
22. Žukov I, Hrubý T, Kozelek P, Ptáček R, Paclt I, Harsa P. P300 wave: A comparative study of impulsive aggressive criminals. Neuro Endocrinol Lett 2008;29:379–84.
23. Crago RV, Renoult L, Biggart L, Nobes G, Satmarean T, Bowler JO. Physical aggression and attentional bias to angry faces: An event related potential study. Brain Res 2019;1723:146387.
24. Helfritz-Sinville LE, Stanford MS. Looking for trouble? Processing of physical and social threat words in impulsive and premeditated aggression. Psychol Rec 2015;65:301–14.
25. Frommann N, Stroth S, Brinkmeyer J, Wölwer W, Luckhaus C. Facial affect recognition performance and event-related potentials in violent and non-violent schizophrenia patients. Neuropsychobiology 2013;68:139–45.
26. Li W, Yang Y, Hong L, An F-R, Ungvari GS, Ng CH, et al. Prevalence of aggression in patients with schizophrenia: A systematic review and meta-analysis of observational studies. Asian J Psychiatr 2020;47:101846.
27. Gong X, Huang Y-X, Wang Y, Luo Y-j. Revision of the Chinese facial affective picture system. Chin Ment Health J 2011;25:40–6.
28. Meissner CA, Brigham JC. Thirty years of investigating the own-race bias in memory for faces:A meta-analytic review. Psychol Public Policy Law 2001;7:3.
29. Hugenberg K, Young SG, Bernstein MJ, Sacco DF. The categorization-individuation model: An integrative account of the other-race recognition deficit. Psychol Rev 2010;117:1168–87.
30. Oribe N, Hirano Y, Kanba S, Del Re E, Seidman L, Mesholam-Gately R, et al. Progressive reduction of visual P300 amplitude in patients with first-episode schizophrenia: An ERP study. Schizophr Bull 2015;41:460–70.
31. Hamilton HK, Woods SW, Roach BJ, Llerena K, McGlashan TH, Srihari VH, et al. Auditory and visual oddball stimulus processing deficits in schizophrenia and the psychosis risk syndrome: Forecasting psychosis risk with P300. Schizophr Bull 2019;45:1068–80.
32. Oribe N, Hirano Y, Del Re E, Mesholam-Gately RI, Woodberry KA, Ueno T, et al. Longitudinal evaluation of visual P300 amplitude in clinical high-risk subjects: An event-related potential study. Psychiatry Clin Neurosci 2020;74:527–34.
33. Yang C, Zhang T, Li Z, Heeramun-Aubeeluck A, Liu N, Huang N, et al. Changes in event-related potentials in patients with first-episode schizophrenia and their siblings. BMC Psychiatry 2017;17:1–8.
34. Caharel S, Bernard C, Thibaut F, Haouzir S, Di Maggio-Clozel C, Allio G, et al. The effects of familiarity and emotional expression on face processing examined by ERPs in patients with schizophrenia. Schizophr Res 2007;95:186–96.
35. Sandhya G, Prakash HP, Nayak KR, Behere RV, Bhandary PR, Chinmay AS. Event-related potentials in response to facial affect recognition in patients with schizophrenia. Neurophysiology 2019;51:43–50.
36. Luck SJ. Introduction to the Event Related Potential Technique. Cambridge, MA: MIT Press; 2014.
37. Backes V, Kellermann T, Voss B, Krämer J, Depner C, Schneider F, et al. Neural correlates of the attention network test in schizophrenia. Eur Arch Psychiatry Clin Neurosci 2011;261:155–60.
38. Drucaroff LJ, Fazzito ML, Castro MN, Nemeroff CB, Guinjoan SM, Villarreal MF. Insular functional alterations in emotional processing of schizophrenia patients revealed by Multivariate Pattern Analysis fMRI. J Psychiatr Res 2020;130:128–36.
39. Ueno T, Morita K, Shoji Y, Yamamoto M, Yamamoto H, Maeda H. Recognition of facial expression and visual P300 in schizophrenic patients: Differences between paranoid type patients and non-paranoid patients. Psychiatry Clin Neurosci 2004;58:585–92.
40. Su L, Cai Y, Shi S, Wang L. Meta-analysis of studies in China about changes in P300 latency and amplitude that occur in patients with schizophrenia during treatment with antipsychotic medication. Shanghai Arch Psychiatry 2012;24:200–7.
41. Jeon YW, Polich J. Meta-analysis of P300 and schizophrenia:Patients, paradigms, and practical implications. Psychophysiology 2003;40:684–701.

Keywords: Chinese faces; emotional faces; event-related potential; schizophrenia

Copyright: © 2023 Indian Journal of Psychiatry