Centre d'Information et de documentation du CRA Rhône-Alpes
CRA
Practical information
-
Address
Centre d'information et de documentation
du CRA Rhône-Alpes
Centre Hospitalier le Vinatier
bât 211
95, Bd Pinel
69678 Bron Cedex
Opening hours
Monday to Friday
9:00-12:00, 13:30-16:00
Contact
Tel: +33 (0)4 37 91 54 65
Mail
Fax: +33(0)4 37 91 54 37
-
Search results
3 results for the keyword 'Computer vision'
Computer Vision Analysis of Reduced Interpersonal Affect Coordination in Youth With Autism Spectrum Disorder / Casey J. ZAMPELLA in Autism Research, 13-12 (December 2020)
[article]
Title: Computer Vision Analysis of Reduced Interpersonal Affect Coordination in Youth With Autism Spectrum Disorder
Document type: Printed and/or digital text
Authors: Casey J. ZAMPELLA; Loisa BENNETTO; John D. HERRINGTON
Pages: p.2133-2142
Language: English (eng)
Keywords: affect/emotion; computer vision; facial expression; interpersonal coordination; social-emotional reciprocity; synchrony
Classification: PER Periodicals
Abstract: Atypical social-emotional reciprocity is a core feature of autism spectrum disorder (ASD) but can be difficult to operationalize. One measurable manifestation of reciprocity may be interpersonal coordination: the tendency to align the form and timing of one's behaviors (including facial affect) with those of others. Interpersonal affect coordination facilitates the sharing and understanding of emotional cues, and there is evidence that it is reduced in ASD. However, most research has not measured this process in true social contexts, due in part to a lack of tools for measuring dynamic facial expressions over the course of an interaction. Automated facial analysis via computer vision provides an efficient, granular, objective method for measuring naturally occurring facial affect and coordination. Youth with ASD and matched typically developing youth participated in cooperative conversations with their mothers and with unfamiliar adults. Time-synchronized videos were analyzed with an open-source computer vision toolkit for automated facial analysis, for the presence and intensity of facial movements associated with positive affect. Both youth and adult conversation partners exhibited less positive affect during conversations when the youth partner had ASD. Youth with ASD also engaged in less affect coordination over the course of conversations. When considered dimensionally across youth with and without ASD, affect coordination significantly predicted scores on rating scales of autism-related social atypicality, adaptive social skills, and empathy. Findings suggest that affect coordination is an important interpersonal process with implications for broader social-emotional functioning. This preliminary study introduces a promising novel method for quantifying moment-to-moment facial expression and emotional reciprocity during natural interactions.
Lay summary: This study introduces a novel, automated method for measuring social-emotional reciprocity during natural conversations, which may improve assessment of this core autism diagnostic behavior. We used computerized methods to measure facial affect and the degree of affect coordination between conversation partners. Youth with autism displayed reduced affect coordination, and reduced affect coordination predicted lower scores on measures of broader social-emotional skills.
Online: http://dx.doi.org/10.1002/aur.2334
Permalink: https://www.cra-rhone-alpes.org/cid/opac_css/index.php?lvl=notice_display&id=434
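The record does not specify how "affect coordination" was computed. A minimal sketch of one common approach, assuming per-frame smile-intensity traces for the two conversation partners (e.g., AU12 intensity exported by a toolkit such as OpenFace) and using the mean peak windowed cross-correlation, allowing a small lead or lag, as the coordination index; all names and parameter values here are illustrative assumptions, not the authors' method:

```python
import numpy as np

def windowed_coordination(a, b, win=90, step=30, max_lag=15):
    """Mean over sliding windows of the peak lagged correlation between
    two per-frame affect traces (win=90 frames is ~3 s at 30 fps)."""
    a = np.asarray(a, float)
    b = np.asarray(b, float)
    peaks = []
    for start in range(max_lag, len(a) - win - max_lag + 1, step):
        wa = a[start:start + win]
        if wa.std() == 0:          # skip flat windows (correlation undefined)
            continue
        best = None
        for lag in range(-max_lag, max_lag + 1):
            wb = b[start + lag:start + lag + win]
            if wb.std() == 0:
                continue
            r = float(np.corrcoef(wa, wb)[0, 1])
            if best is None or r > best:
                best = r
        if best is not None:
            peaks.append(best)
    return float(np.mean(peaks)) if peaks else float("nan")
```

A perfectly mirrored partner yields a coordination index of 1.0, while an unrelated signal yields a value near zero; comparing this index across dyads is what a dimensional analysis like the one described would operate on.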
A pilot study to identify autism related traits in spontaneous facial actions using computer vision / Manar D. SAMAD in Research in Autism Spectrum Disorders, 65 (September 2019)
[article]
Title: A pilot study to identify autism related traits in spontaneous facial actions using computer vision
Document type: Printed and/or digital text
Authors: Manar D. SAMAD; Norou DIAWARA; Jonna L. BOBZIEN; Cora M. TAYLOR; John W. HARRINGTON; Khan M. IFTEKHARUDDIN
Pages: p.14-24
Language: English (eng)
Keywords: ASD; behavioral marker; differential traits; facial action units; computer vision; spontaneous expressions
Classification: PER Periodicals
Abstract: Background: Individuals with autism spectrum disorders (ASD) may be differentiated from typically developing controls (TDC) based on phenotypic features in spontaneous facial expressions. Computer vision technology can automatically track subtle facial actions to gain quantitative insights into ASD-related behavioral abnormalities. Method: This study proposes a novel psychovisual human study to elicit spontaneous facial expressions in response to a variety of social and emotional contexts. We introduce markerless facial motion capture and computer vision methods to track spontaneous and subtle activations of facial muscles. The facial muscle activations are encoded into ten representative facial action units (FAU) to gain quantitative, granular, and contextual insights into the psychophysical development of the participating individuals. Statistical tests are performed to identify differential traits in individuals with ASD by comparison with a cohort of age-matched TDC individuals. Results: The proposed framework revealed significant differences (p < 0.001) in the activation of the ten FAU, with contrasting FAU activations between the group with ASD and the TDC group. Unlike the TDC group, the group with ASD showed an unusual prevalence of mouth frown (FAU 15) and low correlations in the temporal activations of several FAU pairs: 6–12, 10–12, and 10–20. The interpretation of the different FAU activations suggests quantitative evidence of expression bluntness, lack of expression mimicry, and incongruent reactions to negative emotions in the group with ASD. Conclusion: Our generalized framework may be used to quantify psychophysical traits in individuals with ASD and may be replicated in similar studies that require quantitative measurements of behavioral responses.
Online: https://doi.org/10.1016/j.rasd.2019.05.001
Permalink: https://www.cra-rhone-alpes.org/cid/opac_css/index.php?lvl=notice_display&id=401
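The analysis described above has two parts: a between-group test on per-participant FAU activation, and within-participant temporal correlations of FAU pairs. A small sketch on simulated data (the group means, noise levels, and the specific FAUs are illustrative assumptions, not the study's values):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 20  # assumed participants per group

# Simulated per-participant mean activation of one FAU (e.g., FAU 15,
# mouth frown); the assumed group difference mirrors the reported
# higher prevalence in the ASD group.
tdc_au15 = rng.normal(0.2, 0.05, n)
asd_au15 = rng.normal(0.4, 0.05, n)
t_stat, p_val = stats.ttest_ind(asd_au15, tdc_au15, equal_var=False)

# Simulated within-participant temporal coupling of an FAU pair
# (e.g., FAU 6 vs FAU 12): a shared drive plus small independent noise
# produces the high correlation associated here with the TDC group.
drive = rng.standard_normal(300)
au6 = drive + 0.1 * rng.standard_normal(300)
au12 = drive + 0.1 * rng.standard_normal(300)
r_pair, _ = stats.pearsonr(au6, au12)
```

With a shared drive the FAU-pair correlation is close to 1; weakening or removing that shared drive in the simulation reproduces the "low correlation" pattern the abstract attributes to the ASD group.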
Digital Behavioral Phenotyping Detects Atypical Pattern of Facial Expression in Toddlers with Autism / Kimberly L. H. CARPENTER in Autism Research, 14-3 (March 2021)
[article]
Title: Digital Behavioral Phenotyping Detects Atypical Pattern of Facial Expression in Toddlers with Autism
Document type: Printed and/or digital text
Authors: Kimberly L. H. CARPENTER; Jordan HASHEMI; Kathleen CAMPBELL; Steven J. LIPPMANN; Jeffrey P. BAKER; Helen L. EGGER; Steven ESPINOSA; Saritha VERMEER; Guillermo SAPIRO; Geraldine DAWSON
Pages: p.488-499
Language: English (eng)
Keywords: autism; computer vision; early detection; facial expressions; risk behaviors
Declaration of interests (fragment; the beginning is truncated in the record): … Amazon, Google, Cisco, and Microsoft and is a consultant for Apple and Volvo. Geraldine Dawson is on the Scientific Advisory Boards of Janssen Research and Development, Akili, Inc., LabCorp, Inc., Tris Pharma, and Roche Pharmaceutical Company, a consultant for Apple, Inc., Gerson Lehrman Group, Guidepoint, Inc., Teva Pharmaceuticals, and Axial Ventures, has received grant funding from Janssen Research and Development, and is CEO of DASIO, LLC (with Guillermo Sapiro). Dawson receives royalties from Guilford Press, Springer, and Oxford University Press. Dawson, Sapiro, Carpenter, Hashemi, Campbell, Espinosa, Baker, and Egger helped develop aspects of the technology that is being used in the study. The technology has been licensed and Dawson, Sapiro, Carpenter, Hashemi, Espinosa, Baker, Egger, and Duke University have benefited financially.
Classification: PER Periodicals
Abstract: Commonly used screening tools for autism spectrum disorder (ASD) generally rely on subjective caregiver questionnaires. While behavioral observation is more objective, it is also expensive, time-consuming, and requires significant expertise to perform. As such, there remains a critical need to develop feasible, scalable, and reliable tools that can characterize ASD risk behaviors. This study assessed the utility of a tablet-based behavioral assessment for eliciting and detecting one type of risk behavior, namely patterns of facial expression, in 104 toddlers (ASD N = 22) and evaluated whether such patterns differentiated toddlers with and without ASD. The assessment consisted of the child sitting on his/her caregiver's lap and watching brief movies shown on a smart tablet while the embedded camera recorded the child's facial expressions. Computer vision analysis (CVA) automatically detected and tracked facial landmarks, which were used to estimate head position and facial expressions (Positive, Neutral, All Other). Using CVA, specific points throughout the movies were identified that reliably differentiated between children with and without ASD based on their patterns of facial movement and expressions (areas under the curve for individual movies ranging from 0.62 to 0.73). During these instances, children with ASD more frequently displayed Neutral expressions compared to children without ASD, who had more All Other expressions. The frequency of All Other expressions was driven by non-ASD children more often displaying raised eyebrows and an open mouth, characteristic of engagement/interest. Preliminary results suggest that computational coding of facial movements and expressions via a tablet-based assessment can detect differences in affective expression, one of the early, core features of ASD.
Lay summary: This study tested the use of a tablet in the behavioral assessment of young children with autism. Children watched a series of developmentally appropriate movies and their facial expressions were recorded using the camera embedded in the tablet. Results suggest that computational assessments of facial expressions may be useful in early detection of symptoms of autism.
Online: http://dx.doi.org/10.1002/aur.2391
Permalink: https://www.cra-rhone-alpes.org/cid/opac_css/index.php?lvl=notice_display&id=443
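The per-movie discrimination reported above is summarized with the area under the ROC curve (AUC, 0.62 to 0.73). Given one threshold-free score per child, for instance the proportion of Neutral-expression frames during a movie (an illustrative scoring choice, not the authors' pipeline), the AUC reduces to a rank statistic that can be computed directly:

```python
import numpy as np

def auc_from_scores(pos, neg):
    """Rank-based AUC: the probability that a score from the positive
    class exceeds a score from the negative class, counting ties as
    half. Equivalent to the normalized Mann-Whitney U statistic."""
    pos = np.asarray(pos, float)
    neg = np.asarray(neg, float)
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (pos.size * neg.size)
```

An AUC of 0.5 means the score carries no group information and 1.0 means perfect separation, so the reported 0.62-0.73 range indicates modest but above-chance discrimination per movie.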