PubMed digest, March 2, 2025

Today's PubMed

1. Chen J, Du S, Zhu Y, Li D, Hu C, Mei L, Zhu Y, Chen H, Wang S, Xu X, Dong X, Zhou W, Xu Q. Facial characteristics description and classification based on 3D images of Fragile X syndrome in a retrospective cohort of young Chinese males. Comput Biol Med;2025 (Mar 2);189:109912.

PURPOSE: Fragile X syndrome (FXS) is a common cause of intellectual disability and autism. FXS presents with abnormal facial features, which in pediatric patients are subtler than those seen in adults. Three-dimensional (3D) facial images, which contain more stereoscopic and subtle information than two-dimensional (2D) photographs, are increasingly used to classify genetic syndromes. Here, we used 3D facial images to describe facial features and construct a classification model for male patients with FXS. METHODS: We registered the 3D facial images of 40 Chinese boys with FXS and 40 healthy boys. We evaluated seven machine learning models with different features extracted from the dense point cloud and from sparse landmarks. A linear regression model related feature reductions of the regional point cloud to genomic and methylation subtypes. RESULTS: The typical and subtle differences between the 3D average faces of patients and controls could be quantitatively visualized. The projections of patients and controls onto Fragile X-like vectors differed significantly. A random forests model using coordinates of regional facial points (chin, eye, forehead, nose and upper lip) outperformed expert clinicians in binary classification. Among the 63 hierarchical facial segments, significant associations with genetic subtypes were found in 8 segments, and with methylation subtypes in 2 segments. CONCLUSION: 3D facial images could assist in distinguishing male patients with FXS by machine learning; the selected regional features performed better than the global features and sparse landmarks. Genetic and methylation status might affect regional facial features differently.
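The binary classification step the abstract describes (a random forests model on coordinates of regional facial points) can be sketched as follows. This is a hypothetical illustration with synthetic data, not the authors' code: the group sizes (40 per group) come from the abstract, but the number of landmarks, the simulated facial offset, and the scikit-learn setup are assumptions.

```python
# Hypothetical sketch of the abstract's random-forest classification:
# 40 patients vs. 40 controls, each represented by flattened 3D coordinates
# of regional facial landmarks. All data here are synthetic stand-ins.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per_group, n_points = 40, 50  # 40 boys per group (from abstract); 50 landmarks (assumed)

# Registered 3D point sets: controls centred at 0; patients shifted slightly
# along one axis to mimic a subtle regional facial difference.
controls = rng.normal(0.0, 1.0, size=(n_per_group, n_points, 3))
patients = rng.normal(0.0, 1.0, size=(n_per_group, n_points, 3))
patients[:, :, 0] += 0.5  # subtle simulated offset

# One row per subject: x, y, z of every landmark flattened into a feature vector.
X = np.vstack([controls, patients]).reshape(2 * n_per_group, -1)
y = np.array([0] * n_per_group + [1] * n_per_group)  # 0 = control, 1 = FXS

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
print(round(scores.mean(), 2))
```

In practice the feature vectors would come from dense-correspondence registration of the 3D scans, and the regional subsets (chin, eye, forehead, nose, upper lip) would be selected before flattening.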

Link to full text (Open Access or by subscription)

2. Han C, Wu D, Tao FB, Gao H. [Sound and acoustic characteristics in children with autism spectrum disorders]. Zhonghua Er Ke Za Zhi;2025 (Mar 2);63(3):316-319.

Link to full text (Open Access or by subscription)

3. Su L, Li Y, Wen M, Zhao Y, Yang F, Wei L. An fNIRS Study of the Effects of Sound in Real Scenes on Joint Attention among Individuals With Autistic Traits in China. Psychol Rep;2025 (Mar 2):332941251323260.

There is no consensus on whether adults with autistic traits show joint attention (JA) deficits. Additionally, previous studies have mostly been conducted in traditional laboratory settings, lacking ecological validity. This study aimed to address these limitations by using real-life scenarios and functional near-infrared spectroscopy (fNIRS) to investigate the impact of sound cues on JA among individuals with autistic traits in China. Twenty-three adults with high autistic traits and 26 with low autistic traits participated in a task examining brain activation during JA in real-life scenarios. The results revealed that adults with high autistic traits showed stronger activation in the dorsolateral prefrontal regions during JA than during non-joint attention, and stronger activation than adults with low autistic traits. Conclusion: Sound cues were found to enhance the performance of high-AQ adults during JA tasks in real-life scenarios.

Link to full text (Open Access or by subscription)