Abstract: Perceivers typically process unfamiliar faces according to the multiple social category cues they carry, such as sex, age, and race, in order to identify and understand others quickly. In face-based processing of multiple social categories, the subcategories interact in complex ways. Using the "Who Said What" paradigm, the repetition priming paradigm, the Garner selective attention paradigm, and the mouse-tracking paradigm, researchers have found that implicit processing of the subcategories is mutually attenuating, whereas explicit processing shows asymmetric and biased cross-category influences. The dynamic interactive theory offers a further theoretical analysis and interpretation of these findings. Future work should distinguish the stages of social category processing more rigorously, highlighting the differences and connections between implicit and explicit processing, and should further integrate the research paradigms to overcome the divergent or even contradictory results caused by methodological heterogeneity.
Abstract: Pain is a strong symptom of disease. Being an involuntary unpleasant feeling, it can be considered a reliable indicator of health issues. Pain has traditionally been reported verbally, but in some cases patient self-reporting is not practical. On the one hand, there are patients with neurological disorders who cannot express themselves accurately, as well as patients who suddenly lose consciousness after fainting. On the other hand, medical staff in crowded hospitals need to focus on emergencies and would benefit from automating the monitoring of hospitalized patients throughout their stay, so that any pain-related emergency is noticed. These issues can be tackled with deep learning. Since pain is generally accompanied by spontaneous facial behaviors, facial expressions can serve as a substitute for verbal reporting. In this paper, a convolutional neural network (CNN) model was built and trained to detect pain from patients' facial expressions, using the UNBC-McMaster Shoulder Pain dataset. First, faces were detected with the Haar cascade frontal face detector provided by OpenCV, and the detected face regions were preprocessed through grayscale conversion, histogram equalization, cropping, mean filtering, and normalization. Next, the preprocessed images were fed into a CNN built on a modified version of the VGG16 architecture. The model was then iteratively evaluated and fine-tuned based on its accuracy, which reached 92.5%.
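The pipeline described above can be outlined in code. The following is a minimal, illustrative sketch assuming OpenCV and TensorFlow/Keras; the image size, the frozen convolutional base, the classification head, and the training settings are assumptions for illustration and are not claimed to match the paper's exact modified VGG16 configuration.

```python
# Minimal sketch of the preprocessing + modified-VGG16 pain classifier
# described in the abstract. Hyperparameters are illustrative assumptions.
import cv2
import numpy as np
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

# Haar cascade frontal-face detector shipped with OpenCV.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def preprocess(image_bgr, size=(224, 224)):
    """Grayscale -> histogram equalization -> face crop -> mean filter -> normalize."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None                          # no face found in this frame
    x, y, w, h = faces[0]                    # keep the first detected face
    face = gray[y:y + h, x:x + w]
    face = cv2.blur(face, (3, 3))            # mean (box) filtering
    face = cv2.resize(face, size)
    face = face.astype("float32") / 255.0    # normalize to [0, 1]
    return np.repeat(face[..., None], 3, axis=-1)  # replicate channels for VGG16

def build_model(input_shape=(224, 224, 3)):
    """VGG16 convolutional base with a small binary (pain / no-pain) head."""
    base = VGG16(weights="imagenet", include_top=False, input_shape=input_shape)
    base.trainable = False                   # optionally unfreeze later for fine-tuning
    model = models.Sequential([
        base,
        layers.Flatten(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    return model
```

Freezing the pretrained VGG16 base and training only a small classification head is one common way to adapt the architecture; the base can later be partially unfrozen during fine-tuning.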
Funding: This work was supported by Old Dominion University, Norfolk, Virginia, and the University of Moratuwa, Sri Lanka.
Abstract: Autism spectrum disorder (ASD) is a neurodevelopmental disorder affecting social, communicative, and repetitive behavior. The phenotypic heterogeneity of ASD makes timely and accurate diagnosis challenging, requiring highly trained clinical practitioners. The development of automated approaches to ASD classification, based on integrated psychophysiological measures, may one day help expedite the diagnostic process. This paper makes a novel contribution to classifying ASD using both thermographic and EEG data. The methodology extracts a variety of feature sets and evaluates several learning models. The mean, standard deviation, and entropy of the EEG signals and the mean temperatures of regions of interest (ROIs) in facial thermographic images were extracted as features. Correlation-based feature selection was then performed to filter out less informative features. The classification step used Naive Bayes, random forest, logistic regression, and multi-layer perceptron algorithms. The integration of EEG and thermographic features achieved an accuracy of 94% with both the logistic regression and multi-layer perceptron classifiers. The results show that the classification accuracy of most of the learning models increased after integrating facial thermographic data with the EEG features.
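To make the feature-extraction and classification steps concrete, the following is a minimal sketch assuming scikit-learn, NumPy, and pandas; the entropy estimator, the correlation threshold, the train/test split, and the classifier hyperparameters are illustrative assumptions rather than the study's exact settings.

```python
# Minimal sketch: EEG + thermographic features, correlation-based filtering,
# and the four classifiers named in the abstract. Settings are illustrative.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def shannon_entropy(signal, bins=64):
    """Shannon entropy of a 1-D signal estimated from its histogram."""
    hist, _ = np.histogram(signal, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log2(p))

def eeg_features(eeg):                        # eeg: (n_channels, n_samples)
    """Mean, standard deviation, and entropy per EEG channel."""
    return np.concatenate([
        eeg.mean(axis=1),
        eeg.std(axis=1),
        [shannon_entropy(ch) for ch in eeg],
    ])

def thermal_features(roi_temps):              # dict: ROI name -> pixel temperatures
    """Mean temperature of each facial region of interest."""
    return np.array([temps.mean() for temps in roi_temps.values()])

def drop_correlated(X: pd.DataFrame, threshold=0.95):
    """Correlation filter: drop one of every highly correlated feature pair."""
    corr = X.corr().abs()
    upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
    to_drop = [c for c in upper.columns if (upper[c] > threshold).any()]
    return X.drop(columns=to_drop)

def evaluate(X: pd.DataFrame, y):
    """Fit the four classifiers on a held-out split and return their accuracies."""
    X = drop_correlated(X)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              stratify=y, random_state=0)
    classifiers = {
        "naive_bayes": GaussianNB(),
        "random_forest": RandomForestClassifier(random_state=0),
        "logistic_regression": make_pipeline(StandardScaler(),
                                             LogisticRegression(max_iter=1000)),
        "mlp": make_pipeline(StandardScaler(),
                             MLPClassifier(max_iter=1000, random_state=0)),
    }
    return {name: clf.fit(X_tr, y_tr).score(X_te, y_te)
            for name, clf in classifiers.items()}
```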
Abstract: Deficits in facial emotion processing are features of mild Alzheimer's disease (AD). These impairments are often distressing for carers as well as patients, and such non-cognitive symptoms are often cited as a contributing reason for admission into institutionalised care. The ability to interpret emotional cues is crucial to healthy psychological function and relationships, and impaired emotional facility may lead to antisocial behaviour. Understanding the origins of the non-cognitive aspects of AD may lead to improvements in the management of sufferers and ease the carer burden. In a cross-sectional study we recorded patients' facial processing abilities (emotion and identity recognition) and disease severity (ADAS-cog, Neuropsychiatric Inventory), and investigated the regional cerebral blood flow correlates of facial emotion processing deficits using 99Tcm-HMPAO rCBF SPECT. Using statistical parametric mapping (SPM), we identified decreased blood flow in posterior frontal regions specifically associated with emotion perception deficits, but not with non-emotional facial processing abilities or disease severity. The posterior frontal lobe has been identified in previous studies, in the absence of dementia, as being important in emotion processing. The results suggest that cognitive disease severity, in combination with facial processing ability, does not completely explain facial emotion processing deficits in AD patients, and that the posterior frontal lobe mediates such behaviour.
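As an illustration of this kind of voxel-wise analysis, the sketch below uses nilearn (as a stand-in for SPM) to regress spatially normalized rCBF images on an emotion-recognition score while including identity-recognition and disease-severity covariates; the file paths, covariate names, placeholder scores, and thresholds are hypothetical assumptions, not the study's actual design.

```python
# Minimal sketch of an SPM-style voxel-wise regression of rCBF on behavioural
# scores, done with nilearn. Paths and scores below are placeholders.
import numpy as np
import pandas as pd
from nilearn.glm.second_level import SecondLevelModel
from nilearn.glm import threshold_stats_img
from nilearn import plotting

n = 20
rng = np.random.default_rng(0)

# One spatially normalized rCBF SPECT image per patient (hypothetical paths).
spect_imgs = [f"sub-{i:02d}_rcbf_norm.nii.gz" for i in range(1, n + 1)]

# Design matrix: emotion recognition plus nuisance covariates (placeholder values).
design_matrix = pd.DataFrame({
    "intercept": np.ones(n),
    "emotion_score": rng.normal(size=n),    # facial emotion recognition score
    "identity_score": rng.normal(size=n),   # non-emotional face processing
    "adas_cog": rng.normal(size=n),         # global disease severity
})

# Fit a GLM across patients and test where blood flow covaries with
# emotion-recognition performance, controlling for the other regressors.
model = SecondLevelModel(smoothing_fwhm=8.0).fit(spect_imgs,
                                                 design_matrix=design_matrix)
z_map = model.compute_contrast("emotion_score", output_type="z_score")

# Simple uncorrected voxel-level threshold for display.
thresholded_map, threshold = threshold_stats_img(z_map, alpha=0.001,
                                                 height_control="fpr")
plotting.plot_stat_map(thresholded_map, threshold=threshold,
                       title="rCBF ~ emotion recognition (z map)")
```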