Gestural human-machine interaction using neural networks for people with severe speech and motor impairment due to cerebral palsy
Roy, D. M. (1996). Gestural human-machine interaction using neural networks for people with severe speech and motor impairment due to cerebral palsy. (Unpublished Doctoral thesis, City, University of London)
Abstract
The long-term aim of this research is the development of a robust and appropriate method of high efferent bandwidth gestural human-machine interaction (HMI) that enhances and extends the multimodal expressive abilities of people with severe speech and motor impairment due to cerebral palsy (SSMICP). A human-factors driven approach was adopted to generate and identify candidate behaviour for gestural HMI. Neural methods were applied to investigate the automatic recognition of human movement with a high noise component using spastic-athetoid cerebral palsy arm movement data.
Human-machine interaction was considered as an emergent property, leading to the development of a methodology based on human-human interaction to elicit a wide range of spontaneous or near-spontaneous gestures. Twelve subjects with SSMICP aged five to 18 years took part in a gestural ability pilot study. Between 30 and 141 verbally presented concepts were used to elicit spontaneous or near-spontaneous gestural responses, and subjects were encouraged to express each concept in any way they wished. Gestural ability was frequently beyond that anticipated by therapists, educators, parents and physicians; therapeutic, educational, and medical records did not predict the gestural ability observed in the study. Analysis of video-taped sessions indicated that gestures were frequently articulated using multiple parts of the body. Nine out of ten subjects used either the right or left arm more frequently than any other body part.
Instrumented gestural data comprising a subset of 27 gestures from a 17-year-old subject with spastic-athetoid quadriplegia was used to investigate automatic gesture recognition. Co-articulated dynamic arm gestures were elicited in random order and gestural data recorded at 100 samples/second using a six-degree-of-freedom magnetic tracker attached distally to one forearm. The gestural data stream was examined using a simple body model developed in MATLAB and animated on a Silicon Graphics workstation. In the absence of suitable features to automatically segment the gestural data stream, gestures were manually segmented.
Low-pass filtering was used to remove “jerkiness”, and data reduction was achieved through re-sampling. The use of time-delay feedforward neural networks was investigated, with features extracted over a fixed time interval as input. Neural network classifiers outperformed two k-nearest-neighbour methods. Time windows of 160 ms to 1120 ms were compared; a span of 640 ms comprising four time samples yielded the optimum recognition rate. Feature sets containing measures of position, forearm orientation, scalar and vector velocity, curvature and plane of motion were compared, and a feature set comprising four time intervals of x, y, z position gave the highest recognition rate. Twelve gestures were recognised at or above 80%, with an average recognition rate of 90%; the maximum result over all 26 gestures was 55%. The results suggest that the fixed-time-window approach coupled with low-pass filtering may be a feasible method for the computer recognition of noisy gestural movement. Conversely, the results show that it is possible for people classed as having no functional use of the upper extremities by traditional assessment techniques to produce a repertoire of dynamic arm gestures with sufficient consistency to be recognised by machine.
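The preprocessing pipeline described above (low-pass filtering, then re-sampling a fixed 640 ms window into four x, y, z readings to form a 12-dimensional feature vector) can be sketched as follows. This is a minimal illustration in Python, not the thesis's actual MATLAB implementation: the moving-average filter, the synthetic gesture trajectories, and all function names are assumptions for demonstration only.

```python
import numpy as np

def lowpass(signal, k=5):
    """Simple moving-average low-pass filter to suppress movement
    'jerkiness'. signal: (T, 3) array of x, y, z positions at 100 Hz."""
    kernel = np.ones(k) / k
    return np.apply_along_axis(
        lambda col: np.convolve(col, kernel, mode="same"), 0, signal)

def window_features(signal, fs=100, span_ms=640, n_samples=4):
    """Re-sample a fixed time window down to n_samples equally spaced
    x, y, z readings, yielding a 12-dimensional feature vector."""
    span = int(span_ms / 1000 * fs)              # 64 samples at 100 Hz
    idx = np.linspace(0, span - 1, n_samples).astype(int)
    return signal[idx].ravel()                   # shape (n_samples * 3,)

# Two synthetic noisy gesture trajectories (hypothetical stand-ins for
# the tracker data), each 640 ms long at 100 samples/second.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 0.64, 64)
gesture_a = np.stack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t), t], axis=1)
gesture_b = np.stack([t, t ** 2, np.sin(4 * np.pi * t)], axis=1)

fa = window_features(lowpass(gesture_a + 0.05 * rng.standard_normal((64, 3))))
fb = window_features(lowpass(gesture_b + 0.05 * rng.standard_normal((64, 3))))
print(fa.shape)  # (12,) — four time samples of x, y, z position
```

Feature vectors of this form could then be fed to a small feedforward classifier or a k-nearest-neighbour baseline, as in the comparison reported above.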
Publication Type: Thesis (Doctoral)

Subjects: Q Science > QA Mathematics > QA75 Electronic computers. Computer science; Q Science > QM Human anatomy

Departments: School of Science & Technology > Computer Science > Human Computer Interaction Design; School of Science & Technology > School of Science & Technology Doctoral Theses; Doctoral Theses