A3: A Coding Guideline for HCI+Autism Research using Video Annotation

Contacts:

Joshua Hailpern

Guideline Download:

Click <here> to download the A3 Coder's Guide.

Publications:

Hailpern, J., Karahalios, K., Halle, J., DeThorne, L. S. and Coletto, M. A3: HCI Coding Guideline for Research Using Video Annotation to Assess Behavior of Nonverbal Subjects with Computer-Based Intervention. ACM Transactions on Accessible Computing (TACCESS), Volume 2, Issue 2, Article No. 8. pdf

Hailpern, J., Karahalios, K., Halle, J., DeThorne, L. S. and Coletto, M. A3: A Coding Guideline for HCI+Autism Research using Video Annotation. In Proceedings of ACM SIGACCESS ASSETS 2008 (Halifax, Canada, 2008). ACM Press, New York, NY, 2008. pdf

Abstract:

Due to the profile of strengths and weaknesses indicative of autism spectrum disorders (ASD), technology may play a key role in ameliorating communication difficulties within this population. This paper documents coding guidelines established through cross-disciplinary work focused on facilitating communication development in children with ASD using computerized feedback. The guidelines, referred to as A3 (pronounced A-Cubed) or Annotation for ASD Analysis, define and operationalize a set of dependent variables coded via video annotation. Inter-rater reliability data are also presented from a study currently in progress, along with related discussion to help guide future work in this area. The A3 methodology is well-suited for examining and evaluating the behavior of low-functioning subjects with ASD who interact with technology.

Motivation:

ASD-related tools pose new challenges for software developers because their subject demographic differs from the populations served by existing evaluation techniques for other forms of assistive technology. We propose A3 to quantitatively assess a set of dependent variables identified through the digital video annotation process. Because the nature of ASD requires us to rely entirely on observed subject behavior, rather than on feedback provided by subjects, an assessment tool such as A3 is critical for evaluating technology used by the ASD community. Since no comparable coding scheme exists for ASD-related tools, it is incumbent on us to describe our scheme faithfully, relate it to other available tools, and provide data supporting its reliability in the field.

Results:

Taking all of the dependent variables together, the overall inter-rater agreement (IRA) was 88%. Upon closer examination, we determined that eight of the 20 measured variables had IRA exceeding 85%, 12 exceeded 80%, and 16 exceeded 75%. The kappa statistics calculated from the data suggest a high level of agreement, ranging from 0.69 (Good) to 0.81 (Very Good); our interpretation of agreement follows Altman and Byrt. With this set of dependent variables, we have operationalized the coding process through detailed descriptions of the dependent variables and use of the VCode and VData system. As a result, annotation time has been reduced to 20 minutes per 1 minute of footage, while still maintaining adequate reliability.
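To illustrate how a chance-corrected agreement statistic like Cohen's kappa is computed from two coders' annotations, here is a minimal sketch. The behavior labels and data below are hypothetical, purely for illustration, and do not come from the study itself.

```python
# Sketch: Cohen's kappa for two coders labeling the same video segments.
# Labels ("gaze", "vocal", "touch") are hypothetical, not the A3 variables.
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two raters over the same items."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    # Observed agreement: fraction of segments both coders labeled identically.
    p_observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement from each coder's marginal label frequencies.
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_expected = sum(freq_a[label] * freq_b[label] for label in freq_a) / n**2
    return (p_observed - p_expected) / (1 - p_expected)

# Two coders annotate ten hypothetical segments.
a = ["gaze", "gaze", "vocal", "gaze", "touch",
     "vocal", "gaze", "touch", "gaze", "vocal"]
b = ["gaze", "gaze", "vocal", "touch", "touch",
     "vocal", "gaze", "touch", "gaze", "gaze"]
print(round(cohens_kappa(a, b), 2))  # observed agreement 0.8, kappa 0.68
```

Kappa discounts the agreement two coders would reach by guessing from their own label frequencies, which is why it is lower than raw percent agreement and why the paper reports both.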


Acknowledgements:

We would like to thank the National Science Foundation (Grant NSF-0643502) for supporting this project.