PROJECTS
We wish to create a database of movement trajectories, muscle activation patterns, pressure, and force data for various upper extremity functional tasks. We will use this database to construct generative models of functional tasks. These models will be trained on the collected data and used to build a library of probabilistic models representing each functional task. The models in the library could be used to label new data or to sample an instance of a specific movement strategy. Models can be constructed for both successful and unsuccessful movement strategies, and we will develop distance measures that allow us to modify a person's unsuccessful strategy into a successful one for therapeutic intervention.

This project presented significant challenges: synchronizing different systems; measuring kinetics in unconstrained upper extremity movement; and simulating functional activity. Substantial progress has been made toward integrating data on upper extremity movement from three well-validated instruments: the SensAble Phantom® Premium 1.5 Haptic Device generates force at the hand, the Myomonitor® IV Transmission and Datalogging System measures electrical activity in the muscle, and the MotionStar Wireless® 2 "flock of birds" (FoB) captures 3D human body motion. Distortion in the position data produced by metal in the haptic device prohibits simultaneous data capture from the haptic and the FoB. The research team therefore had to capture the data separately and determine whether the force and position trajectories were similar. They were not homologous. The next approach was to capture position data during functional activity from the FoB only and translate the position trajectories into force profiles. This resulted in reproducible haptic-driven trajectories. The students achieved a breakthrough by employing a body-centered coordinate system as the intermediary for translating the trajectories from the "flock of birds" to the haptic.
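The body-centered translation described above can be sketched as a pair of frame changes: express a FoB sample in a torso-fixed frame, then re-express that point in the haptic's calibrated frame. The `Frame` struct and all function names below are illustrative assumptions, not the project's actual code:

```cpp
// Sketch of the body-centered coordinate translation. Hypothetical names;
// the project's real calibration procedure is not reproduced here.
#include <array>
#include <cassert>
#include <cmath>

using Vec3 = std::array<double, 3>;

struct Frame {
    Vec3 origin;   // frame origin, in world coordinates
    Vec3 x, y, z;  // orthonormal basis vectors, in world coordinates
};

// Express a world-space point in a body-centered frame: subtract the
// origin, then project onto the frame's basis vectors.
Vec3 worldToFrame(const Frame& f, const Vec3& p) {
    Vec3 d{p[0] - f.origin[0], p[1] - f.origin[1], p[2] - f.origin[2]};
    auto dot = [](const Vec3& a, const Vec3& b) {
        return a[0]*b[0] + a[1]*b[1] + a[2]*b[2];
    };
    return {dot(d, f.x), dot(d, f.y), dot(d, f.z)};
}

// Inverse operation: rebuild a world-space point from frame coordinates
// by combining the basis vectors and adding the origin.
Vec3 frameToWorld(const Frame& f, const Vec3& q) {
    return {
        f.origin[0] + q[0]*f.x[0] + q[1]*f.y[0] + q[2]*f.z[0],
        f.origin[1] + q[0]*f.x[1] + q[1]*f.y[1] + q[2]*f.z[1],
        f.origin[2] + q[0]*f.x[2] + q[1]*f.y[2] + q[2]*f.z[2],
    };
}

// A FoB sample is translated world -> body-centered -> haptic workspace.
Vec3 fobToHaptic(const Frame& body, const Frame& haptic, const Vec3& p) {
    return frameToWorld(haptic, worldToFrame(body, p));
}
```

The body-centered frame acts as the common intermediary, so neither device needs to know the other's world coordinates directly.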
The body-centered coordinates were first calculated for the "flock of birds." The haptic's coordinate system was then calibrated to the body-centered system. An additional consideration was whether the force profiles that had been created simulated reasonably normal human movement. As the arm moves to accomplish a task, the forces controlling its motion vary throughout the movement's course: considerable acceleration is needed to initiate a movement, while there is deceleration toward its end. The force profiles generated from the FoB position data take this into account, are adaptive, and simulate realistic movement that subjects perceive as "natural." We are now able to control the trajectory of the head of the gimbal and of the tip, or contact point, of the stylus. We can do this even when the stylus moves relative to the gimbal during writing, which provides realistic trajectories of the tip. The trajectory of the tip is important to know, since the stylus can be slanted, distorting the contact pattern. We have demonstrated that we can generate force profiles for the haptic that approximate positional trajectories obtained from functional, unencumbered activity, and have shown that these appear to simulate natural movement.

Computer Vision Data Collection

We propose to use optical motion capture, with markers placed at anatomical landmarks, to give us accurate positional information. We will next attempt to validate computer vision techniques in order to avoid the use of cumbersome special markers.

Electromyogram (EMG) Data Collection

The existing studies of upper extremity movement either are overly restrictive, focusing only on muscle activity at one joint, or do not take muscle activity into account when modeling upper extremity movement. Nonetheless, electromyographic studies of shoulder muscle activity indicate that there are large differences in muscle activity patterns among healthy individuals performing the same task.
This is significant because it implies that a motion can be achieved in many different ways by altering muscle activity. If simple tasks can be accomplished using different patterns of muscle activity, then weakness or failure in some muscles may be compensated for by using different muscle combinations. Thus, a complete understanding of the combinations of muscles that can accomplish certain tasks may, in turn, be used to improve prostheses and to rehabilitate those with disabilities. Supervised learning techniques are also being utilized to classify movements based on EMG signals, which again could lead to improved rehabilitation.

A C++ framework is being developed that integrates several different library packages into a set of object-oriented classes for expedited coding of virtual environments and other haptic-driven tasks. This package integrates Qt, OpenGL, and SOLID. Maya and Photoshop are used for object construction and texture mapping; constructed objects can then be easily imported into the program. The objects are controlled by an object scene class, which is responsible for collision forces, display, and haptic movement of all objects. Also included in the framework is a trajectory scene class that manages two trajectory objects: a plotting (current) trajectory that the user is creating, and a target trajectory that the user is attempting to emulate. Haptic forces are calculated based on what is being plotted and what the target is. Ongoing work aims at improving the proprioceptive and tactile feedback provided during object manipulation, and the force control used to aid and guide the user along a desired trajectory. Because our programs will be collecting data from human subjects, they need to be as intuitive to use as possible. If the programs are confusing or hard to use, the collected data will not be an accurate representation of the speed and accuracy of the user's performance.
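One simple way the guidance force between the plotted and target trajectories could be computed is as a spring pulling the stylus toward the nearest target sample. The spring model, the gain `k`, and all names below are assumptions for illustration; the framework's actual force law may differ:

```cpp
// Sketch of a trajectory-guidance force: a Hooke's-law spring pulls the
// stylus toward the nearest sample of the target trajectory. The spring
// model and gain are assumptions, not the framework's actual force law.
#include <cassert>
#include <cmath>
#include <vector>

struct Point { double x, y, z; };

// Find the target sample closest to the current stylus position.
Point nearestTarget(const std::vector<Point>& target, const Point& stylus) {
    Point best = target.front();
    double bestD = 1e300;
    for (const Point& t : target) {
        double dx = t.x - stylus.x, dy = t.y - stylus.y, dz = t.z - stylus.z;
        double d = dx*dx + dy*dy + dz*dz;  // squared distance is enough
        if (d < bestD) { bestD = d; best = t; }
    }
    return best;
}

// Guidance force F = k * (nearest_target - stylus): zero on the target
// trajectory, growing linearly with the deviation from it.
Point guidanceForce(const std::vector<Point>& target, const Point& stylus,
                    double k) {
    Point n = nearestTarget(target, stylus);
    return {k * (n.x - stylus.x), k * (n.y - stylus.y), k * (n.z - stylus.z)};
}
```

In practice a haptic servo loop would clamp the force magnitude and add damping to keep the device stable, details omitted here.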
Users' difficulty would then stem not from their skill level or any other personal factor, but from bad interface design. Thus we are examining a variety of human-computer interaction issues when designing the GUIs for our software, and the software is constantly evolving as more people test our demonstrations.

Learning to write is considered a fundamental requirement for literacy and for integration into society. Many children have difficulty learning to write and/or writing legibly. These children are usually taught one-to-one, a labor-intensive approach. Teachers and parents have expressed interest in a device-driven, game-like, neutral method for teaching letters and handwriting, and haptic technology has features that are likely to provide these elements. We designed a virtual handwriting teaching system, using a haptic interface, for children with handwriting difficulties due to attention or motor deficits; the interface can provide a neutral, repetitive, engaging approach to letter writing. Our approach included: (a) using letter primitives; (b) a user-friendly interface for teachers, therapists, subjects, and parents; (c) adjustable force and an assessment mode; and (d) quantitative reports. While other letter-writing approaches record a teacher's movement and use it as a template for a student to follow, we create letters from letter primitives. The advantages of the primitives are that (a) any letter can be created from them, and (b) the shape of a letter can easily be modified by changing scaling and rotation parameters. We generate the writing trajectories of all letters using straight lines, pen lifts (ellipses), and arcs of a circle. We used an existing therapeutic intervention (Handwriting Without Tears) as a template for letter formation, and customized the pen-lift height to permit children with motor deficits to perform the task. The pen lifts are marked in red and the letters in blue.
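Letter construction from primitives might be sketched as follows: samplers for straight lines and circular arcs, plus a scale-and-rotate transform so one template yields letter variants. All names and the sampling scheme are illustrative assumptions, not the system's actual code:

```cpp
// Sketch of building letter trajectories from primitives (lines and
// circular arcs) with scaling and rotation parameters. Illustrative only.
#include <cassert>
#include <cmath>
#include <vector>

struct Pt { double x, y; };

// Sample n points along a straight line from a to b.
std::vector<Pt> line(Pt a, Pt b, int n) {
    std::vector<Pt> out;
    for (int i = 0; i < n; ++i) {
        double t = double(i) / (n - 1);
        out.push_back({a.x + t * (b.x - a.x), a.y + t * (b.y - a.y)});
    }
    return out;
}

// Sample n points along a circular arc centered at c with radius r,
// sweeping from angle a0 to a1 (radians).
std::vector<Pt> arc(Pt c, double r, double a0, double a1, int n) {
    std::vector<Pt> out;
    for (int i = 0; i < n; ++i) {
        double t = a0 + (a1 - a0) * double(i) / (n - 1);
        out.push_back({c.x + r * std::cos(t), c.y + r * std::sin(t)});
    }
    return out;
}

// Scale and rotate a primitive about the origin, so one letter template
// can produce variants of different sizes and slants.
std::vector<Pt> transform(const std::vector<Pt>& pts, double s, double rot) {
    std::vector<Pt> out;
    for (const Pt& p : pts) {
        double x = s * p.x, y = s * p.y;
        out.push_back({x * std::cos(rot) - y * std::sin(rot),
                       x * std::sin(rot) + y * std::cos(rot)});
    }
    return out;
}
```

A letter is then just a concatenation of transformed primitives, with the pen-lift segments flagged separately so they can be rendered in red and raised to the customized lift height.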
Haptic Technology for Persons with Traumatic Brain Injury

Traumatic Brain Injury (TBI) affects 400 per 100,000 people in the US, and 85 per 100,000 are admitted to the hospital. As a group, TBI patients have the highest average number of years lived with disability. Health care costs for TBI in the US are $60 billion, and the indirect costs are also quite high, because TBI tends to occur in young people, predominantly males, and has a substantial impact on the work force. There has been significant recent research into the pathophysiology of brain injury, but much more needs to be done to elucidate the mechanisms of functional loss and recovery. Several key factors have emerged that provide opportunities for predicting outcomes, identifying those at significant risk for death and disability, and suggesting management strategies to prolong life and improve functional outcomes. The impact of the primary injury and its secondary effects is no longer thought to be a single event in time, after which nothing changes. Quite the contrary: the brain is thought to be in a dynamic state of flux during which injury and repair progress simultaneously. Good clinical practice mandates continuous monitoring of the key variables associated with outcomes for the duration of this volatile period. There is agreement on some key variables used to assess severity, including the Glasgow Coma Scale, measures of verbal fluency, motor activity, and CT assessments. Domains relevant to functional outcomes thought to be important to measure in this population include physical, cognitive, and behavioral/emotional status, the presence of pain, and awareness of self. Virtual reality has been applied to both the evaluation and the treatment of persons with TBI. Simulated environments have helped engage patients in therapeutic activities and have also provided "real-life" situations that call for the integration of sensory, cognitive, and motor activities.
Based on this approach, models have been developed that examine both top-down and bottom-up learning. Stimuli that are physical and repetitive are considered bottom-up, and those that are cognitive and interactive are considered top-down. One type of virtual reality employs haptic devices. A haptic device is an instrument that interfaces with the user via touch, using force or vibration as feedback. Haptics have been studied for decades in object recognition (texture and shape); for example, the somatosensory cortex is activated during haptic exploration. Parietal lobe dysfunction has been associated with tactile agnosia, and the lateral occipital complex with visual agnosia; these associations have been confirmed using brain imaging techniques. Because haptic devices can be programmed to simulate a wide variety of fine-motor functional tasks, we believe they can be used both to evaluate fine-motor functional ability in persons with mild to moderate TBI and as a therapeutic intervention. They can also be programmed to capture data in real time and hence serve as an outcome measure for therapeutic interventions.