CBDA POLYFON

Project Department: Uni Research Computing (group: Center for Big Data Analysis). Period: 05.01.16 - 10.01.16

About the project

Development of algorithms and implementation of Machine Learning analysis for the project POLYFON, "Semi-automatic video analysis of music therapy sessions with children", in collaboration with Uni Research Health.

POLYFON and CBDA POLYFON aim to develop a semi-automatic video analysis tool for therapy sessions to support the assessment and evaluation of therapies. CBDA takes care of:

  • the development of computer vision methods to identify and track the motion of two individuals (one therapist and one client), with application to up to 180 clients (10-30-minute segments per client)
  • the development of Machine Learning algorithms to correlate the motion with each participant's severity of illness

The outcome of the project will be a proof-of-concept for a fully automatic video-based motion analysis system to support video assessment and evaluation of therapies.

The background:

Video recordings are a rich source of information in the assessment and evaluation of therapies and carry relevant information of individual and supra-individual nature. Manual analysis of videos is time-consuming and subjective [1]. Computer vision enables automated analysis, which may help to improve both speed and objectivity, although it needs to be balanced with careful clinical judgment. POLYFON's goal is thus to develop a semi-automatic video analysis tool, which will be applied to 2D videos from the project TIME-A [2]. The tool will quantify the relative movement of therapist and patient and use artificial intelligence to relate it to existing measures of autism severity.
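The quantification of relative movement could, for example, be computed from per-frame detections. Below is a minimal sketch with made-up data; the function names and the bounding-box format are assumptions for illustration, not the project's actual code:

```python
# Sketch: per-frame relative movement between two detected individuals,
# assuming detection already produced one bounding box per person per frame.

def centre(box):
    """Centre (x, y) of a bounding box given as (x, y, w, h)."""
    x, y, w, h = box
    return (x + w / 2.0, y + h / 2.0)

def relative_distances(therapist_boxes, patient_boxes):
    """Per-frame Euclidean distance (in pixels) between the two box centres."""
    dists = []
    for tb, pb in zip(therapist_boxes, patient_boxes):
        tx, ty = centre(tb)
        px, py = centre(pb)
        dists.append(((tx - px) ** 2 + (ty - py) ** 2) ** 0.5)
    return dists

# Toy example: two frames, boxes as (x, y, w, h)
therapist = [(0, 0, 10, 10), (2, 0, 10, 10)]
patient = [(30, 0, 10, 10), (30, 0, 10, 10)]
print(relative_distances(therapist, patient))  # [30.0, 28.0]
```

The resulting time series of distances is one possible input feature for relating movement to severity measures.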

In this way, participants' progress over 5 months of therapy can be quantified. The video analysis is based on computer vision methods [3] to detect humans in videos. After both therapist and patient are detected, the 3D distance is reconstructed by using the apparent size of the individuals in combination with a priori information about the therapist and patient [3]. Finally, artificial intelligence methods are used to correlate the movement characteristics with severity measures (ADOS scores) given by mental health experts [4,5].
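The apparent-size reconstruction can be illustrated with a simple pinhole-camera relation. The numbers and names below are illustrative assumptions; the a priori information in practice is the known approximate height of therapist and patient:

```python
# Sketch: pinhole-camera depth cue. A person of known real height who
# appears pixel_height pixels tall is at distance f * H / h from the camera.

def depth_from_height(focal_px, real_height_m, pixel_height_px):
    """Distance (metres) to a person of known height from their apparent size."""
    return focal_px * real_height_m / pixel_height_px

# Illustrative numbers: ~1.75 m tall person appearing 350 px tall,
# camera focal length 700 px:
print(depth_from_height(700.0, 1.75, 350.0))  # 3.5
```

With a depth estimate for each individual, the pixel distance between them can be converted into an approximate 3D distance.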


The technical details:

A Neural Network (NN) architecture will be designed to analyze the video recordings of the therapy sessions and perform object detection, movement reconstruction and subsequent analysis. To identify objects in the video frames (which are essentially sets of matrices encoding the color of each pixel), the NN will rely on a cascade classifier based on LBPs (Local Binary Patterns), which is more computationally efficient than classifiers based on Haar features or HOG (Histogram of Oriented Gradients). This classifier uses small decision trees to examine the 8 neighbors of each pixel, yielding an 8-digit binary string in which each bit is 1 if the corresponding neighbor's value is greater than or equal to the center pixel's value and 0 otherwise. Patterns that occur consistently on the target object are learned during training: if, for example, the target object often displays the binary string 0100 0100, then any image region producing that string is 'upvoted' as a potential area of interest.
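The LBP computation described above can be sketched in plain Python. This is for clarity only; production detectors, such as OpenCV's LBP cascade classifiers, use optimised implementations:

```python
# Sketch: the 8-bit Local Binary Pattern code for one pixel.
# Each neighbour contributes 1 if its value is >= the centre value,
# 0 otherwise, read clockwise starting from the top-left neighbour.

def lbp_code(image, r, c):
    """8-bit LBP string for pixel (r, c) of a 2D list of intensities."""
    centre_val = image[r][c]
    neighbours = [
        image[r - 1][c - 1], image[r - 1][c], image[r - 1][c + 1],
        image[r][c + 1], image[r + 1][c + 1], image[r + 1][c],
        image[r + 1][c - 1], image[r][c - 1],
    ]
    return ''.join('1' if n >= centre_val else '0' for n in neighbours)

# 3x3 toy patch: centre is 5; neighbours >= 5 become 1
patch = [
    [6, 2, 7],
    [1, 5, 9],
    [4, 5, 3],
]
print(lbp_code(patch, 1, 1))  # '10110100'
```

Histograms of such codes over image regions form the features that the cascade's decision trees evaluate.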


The project relevance:

The automated evaluation of video recordings from music therapy sessions is of high clinical relevance, as it allows quantitative assessment of sessions, assessment of patients' progress and the comparison of different intervention methods. CBDA POLYFON will deliver a proof-of-concept of such an analysis system. It brings together experts from the fields of mental health, computer vision and data analysis, with the potential for further groundbreaking research in this interdisciplinary field.
This will be relevant for the field of child mental health in general and will also be linked to local music therapy practice in Bergen, where some of the therapies were conducted. Other therapies are carried out in other countries collaborating in TIME-A, ensuring international relevance as well.

References:

[1] T. Wosch and T. Wigram. "Microanalysis in music therapy: Methods, techniques and applications for clinicians, researchers, educators and students", Jessica Kingsley Publishers (2007).
[2] C. Gold et al., "TIME-A: Trial of Improvisational Music therapy's Effectiveness for children with Autism", an RCN project operated by the group GAMUT at Uni Research Health, built upon an international collaboration of 9 countries worldwide.
[3] Y. Zheng and C. Shen, "Pedestrian detection using center-symmetric local binary patterns", presented at ICIP 2010, Hong Kong (2010).
[4] A. Pratap, C. S. Kanimozhiselvi, R. Vijayakumar, and K. V. Pramod, "Parallel Neural Fuzzy-Based Joint Classifier Model for Grading Autistic Disorder", in V. E. Balas, L. C. Jain and B. Kovačević (Eds.), Soft Computing Applications: Proceedings of the 6th International Workshop Soft Computing Applications (SOFA 2014), Volume 1, Springer International Publishing, pp. 13-26 (2016).
[5] J. Hashemi et al., "A computer vision approach for the assessment of autism-related behavioral markers", 2012 IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL), San Diego, CA, pp. 1-7 (2012).
