Life Sciences

MOTION ANALYSIS/GESTURE RECOGNITION: Therapy system gets physical with Kinect-based motion analysis

Students at the University of Leeds' Department of Mechanical Engineering have developed a Kinect-based system designed to measure the rehabilitation progress of patients who suffer strokes or other neurological disorders.
Oct. 1, 2012

Measuring the rehabilitation progress of patients who suffer strokes or other neurological disorders is currently a subjective process. Assessments often vary from one physician to the next because no definitive means exists to objectively measure how a patient's motion has improved over the rehabilitation period.

Students at the University of Leeds' Department of Mechanical Engineering have developed a Kinect-based system designed to perform just this task. "In the past," says Martin Levesley, professor of dynamics and control at the University of Leeds, "we have used motion capture systems such as the Optotrak Certus system from NDI (www.ndigital.com/index.php) to evaluate patient motion, although such systems are expensive and require outfitting the patient with tracking markers in a custom screening room."

Levesley tasked undergraduate students Dominic Clark, Barnaby Cotter, and Chris Norman with producing an inexpensive, portable solution that could be used at the patient's residence. Built on Microsoft's Kinect, the system is designed to provide mental stimulation to the patient while accurately recording body position and movement, allowing physiotherapists to maintain patient motivation and extract detailed information about the patient's movements. Stroke, for example, can cause movement problems on one side of the body, so the system may offer further insight into the extent of the stroke.

Dubbed Kinesthesia, the system consists of the Kinect motion sensor, which includes an integrated RGB camera and depth sensor communicating with a PC over a USB interface. Microsoft's Kinect software development kit (SDK) was first used to return the x, y, and z coordinates of 20 skeletal joints from the motion sensor to the host PC. C# applications, encapsulated as virtual instruments (VIs) in the LabVIEW graphical programming language from National Instruments (NI), were then developed to create a depth map from the data, generate a user-programmable skeletal model, and simultaneously display the video stream (see figure, part a).
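As a rough illustration, the C# sketch below shows how those 20 joint coordinates could be read with the Kinect for Windows SDK 1.x managed API before being handed off to LabVIEW. The console-application structure is an assumption for clarity; the Leeds team wrapped comparable C# logic in LabVIEW VIs rather than a standalone program.

// Sketch: reading the 20 tracked joint positions with the Kinect for
// Windows SDK 1.x managed API. Structure is illustrative only.
using System;
using System.Linq;
using Microsoft.Kinect;

class SkeletonLogger
{
    static void Main()
    {
        // Pick the first connected Kinect sensor.
        KinectSensor sensor = KinectSensor.KinectSensors
            .FirstOrDefault(s => s.Status == KinectStatus.Connected);
        if (sensor == null) return;

        sensor.SkeletonStream.Enable();
        sensor.SkeletonFrameReady += OnSkeletonFrameReady;
        sensor.Start();

        Console.ReadLine();   // run until Enter is pressed
        sensor.Stop();
    }

    static void OnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
    {
        using (SkeletonFrame frame = e.OpenSkeletonFrame())
        {
            if (frame == null) return;
            Skeleton[] skeletons = new Skeleton[frame.SkeletonArrayLength];
            frame.CopySkeletonDataTo(skeletons);

            foreach (Skeleton skel in skeletons)
            {
                if (skel.TrackingState != SkeletonTrackingState.Tracked) continue;

                // Each tracked skeleton exposes 20 joints with x, y, z
                // positions in metres relative to the sensor.
                foreach (Joint joint in skel.Joints)
                {
                    SkeletonPoint p = joint.Position;
                    Console.WriteLine("{0}: {1:F3} {2:F3} {3:F3}",
                        joint.JointType, p.X, p.Y, p.Z);
                }
            }
        }
    }
}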

Because the skeletal data are embedded directly into an .avi file, the video can be reviewed, edited, and saved, providing a 3-D, rotatable rendering of the patient's skeleton alongside the raw video footage. With the Kinect tracking the user's skeleton, gait can be analyzed and patient metrics computed in LabVIEW and fed back to the operator. To allow physicians to evaluate behavior, patients are prompted to manipulate 3-D objects generated by the system.
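As one example of the kind of metric that can be derived from the tracked skeleton, the sketch below computes the angle at a middle joint (for instance the elbow or knee) from three joint positions using plain vector arithmetic. The class and method names, and the sample coordinates, are hypothetical; the toolkit's actual metrics may differ.

// Sketch of one per-joint metric: the angle at a middle joint (e.g. the
// elbow), computed from three tracked joint positions.
using System;

static class JointMetrics
{
    // Angle (degrees) at joint B formed by segments B->A and B->C,
    // e.g. A = shoulder, B = elbow, C = wrist.
    public static double AngleAtJoint(
        double ax, double ay, double az,
        double bx, double by, double bz,
        double cx, double cy, double cz)
    {
        double ux = ax - bx, uy = ay - by, uz = az - bz;   // B -> A
        double vx = cx - bx, vy = cy - by, vz = cz - bz;   // B -> C

        double dot = ux * vx + uy * vy + uz * vz;
        double mag = Math.Sqrt(ux * ux + uy * uy + uz * uz) *
                     Math.Sqrt(vx * vx + vy * vy + vz * vz);
        if (mag == 0.0) return 0.0;

        // Clamp to guard against rounding just outside [-1, 1].
        double cos = Math.Max(-1.0, Math.Min(1.0, dot / mag));
        return Math.Acos(cos) * 180.0 / Math.PI;
    }

    static void Main()
    {
        // Hypothetical shoulder, elbow, and wrist positions in metres.
        double elbowAngle = AngleAtJoint(
            -0.20, 0.45, 2.10,   // shoulder
            -0.18, 0.20, 2.05,   // elbow
             0.05, 0.15, 1.90);  // wrist
        Console.WriteLine("Elbow angle: {0:F1} degrees", elbowAngle);
    }
}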

a) Researchers at the University of Leeds have used Microsoft's Kinect image sensor and the LabVIEW programming language to develop a system for the rehabilitation of patients suffering from neurological disorders. First, the patient is prompted to virtually manipulate a CAD model.

b) The motion between specific joints can then be calculated, displayed, and recorded in real time along with an image for local or remote analysis.

To accomplish this, any STereoLithography (.STL) file can be loaded into the system. An STL parser then determines the size of the CAD model, and user manipulations are scaled to match it. By asking patients to perform tasks based on the industry-standard Action Research Arm Test (ARAT; see http://bit.ly/QBMomj), which includes sub-tests such as grasping, gripping, pinching, and gross movement functions, a physician-friendly virtual stroke rehabilitation environment is created (see figure, part b).
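The sketch below shows one way such a parser might determine a model's extent from a binary .STL file and derive a scale factor for mapping hand motion onto the model. The file layout follows the standard binary STL format (80-byte header, 32-bit triangle count, then 50 bytes per facet); the one-metre reach assumption and the names used are illustrative, and ASCII STL is not handled.

// Sketch: deriving a model's bounding box from a binary .STL file so
// that user motion can be scaled to the model's size.
using System;
using System.IO;

static class StlBounds
{
    public static void Main(string[] args)
    {
        float[] mins = { float.MaxValue, float.MaxValue, float.MaxValue };
        float[] maxs = { float.MinValue, float.MinValue, float.MinValue };

        using (var reader = new BinaryReader(File.OpenRead(args[0])))
        {
            reader.ReadBytes(80);                    // header (ignored)
            uint facets = reader.ReadUInt32();       // triangle count

            for (uint i = 0; i < facets; i++)
            {
                reader.ReadBytes(12);                // facet normal (ignored)
                for (int v = 0; v < 3; v++)          // three vertices per facet
                    for (int axis = 0; axis < 3; axis++)
                    {
                        float c = reader.ReadSingle();
                        if (c < mins[axis]) mins[axis] = c;
                        if (c > maxs[axis]) maxs[axis] = c;
                    }
                reader.ReadUInt16();                 // attribute byte count
            }
        }

        // Largest extent of the model; hand motion (assumed roughly 1 m of
        // reach) is mapped onto this so manipulations match the model's scale.
        double extent = Math.Max(maxs[0] - mins[0],
                        Math.Max(maxs[1] - mins[1], maxs[2] - mins[2]));
        double scale = extent / 1.0;   // model units per metre of hand travel
        Console.WriteLine("Extent: {0:F3}, scale factor: {1:F3}", extent, scale);
    }
}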

Currently, Levesley and his colleagues are collaborating with a number of medical centers, including the Neurosurgery Department of the University of California-Los Angeles (UCLA; www.ucla.edu) and local Leeds hospitals, to develop gait-analysis and Virtual ARAT software as post-operative and rehabilitation assessment tools. NI and the University of Leeds have also made the Kinesthesia/NI toolkit available free of charge through NI's LabVIEW Tools Network.

With the Kinesthesia system and a rehabilitation regimen in place, a patient could perform these Virtual ARAT tasks at home, with recorded motion data and video sent to a physiotherapist for remote analysis. This would be especially useful for patients who struggle to leave their residence or feel burdened by regular rehabilitation appointments.
