Laboratory for the Study and Simulation of Human Movement
 

Movement Library Construction

We aim to create a database of movement trajectories, muscle activation patterns, and pressure and force data for a range of upper extremity functional tasks. From this database we will construct generative models of functional tasks: probabilistic models trained on the collected data and assembled into a library, one model per task. The models in the library could be used to label new data or to sample an instance of a specific movement strategy, and models can be built for successful as well as unsuccessful movement strategies. We will develop distance measures that will allow us to modify a person's unsuccessful strategy into a successful one for therapeutic intervention.

 

Translating Movement

This project presented significant challenges: synchronizing different systems; measuring kinetics in unconstrained, upper extremity movement; and simulating functional activity. Substantial progress has been made toward integrating data on upper extremity movement from three well-validated instruments: the SensAble Phantom® Premium 1.5 Haptic Device generates force at the hand, the Myomonitor® IV Transmission and Datalogging System measures electrical activity in the muscle, and the MotionStar Wireless® 2 "flock of birds" (FoB) captures 3D human body motion.

Distortion introduced into the FoB position data by the metal in the haptic device prohibits simultaneous data capture from the haptic and the FoB. The research team therefore captured the data separately and compared the force and position trajectories; they were not homologous. The next approach was to capture position data during functional activity from the FoB only and translate the position trajectories into force profiles. This resulted in reproducible haptic-driven trajectories.

The students achieved a breakthrough by employing a body-centered coordinate system as the intermediary for translating the trajectories from the "flock of birds" to the haptic. The body-centered coordinates were first calculated for the "flock of birds." The haptic's coordinate system was then calibrated to the body-centered system.
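The calibration step above can be sketched as follows. This is a minimal illustration in NumPy; the specific landmarks used to define the body frame (sternum and shoulders) are assumptions, not the lab's published procedure:

```python
import numpy as np

def body_frame(sternum, left_shoulder, right_shoulder):
    """Build a body-centered frame from three landmark positions.

    Origin at the sternum; x toward the right shoulder, z roughly up.
    The landmark choice is illustrative -- the text does not specify it.
    """
    x = right_shoulder - left_shoulder
    x /= np.linalg.norm(x)
    mid = 0.5 * (left_shoulder + right_shoulder)
    z = mid - sternum
    z -= x * np.dot(z, x)            # make z orthogonal to x
    z /= np.linalg.norm(z)
    y = np.cross(z, x)
    R = np.stack([x, y, z], axis=1)  # columns are the frame axes
    return R, sternum                # rotation and origin of the frame

def to_body_coords(p_world, R, origin):
    """Express a point measured in a device's world frame (FoB or, after
    calibration, the haptic) in body-centered coordinates."""
    return R.T @ (p_world - origin)
```

Once both the FoB and the haptic are calibrated to the same body-centered frame, a trajectory measured by one can be replayed on the other by passing through `to_body_coords` and its inverse.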

An additional consideration was whether the force profiles that had been created simulated reasonably normal human movement. As the arm moves to accomplish a task, the forces controlling its motion vary throughout the movement's course. Considerable acceleration is needed to initiate a movement, but there is deceleration toward the movement's end. The force profiles generated from the FoB position take this into account, are adaptive, and simulate realistic movement perceived by the subject as "natural".
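One standard way to produce the accelerate-then-decelerate pattern described above is the minimum-jerk model of reaching (Flash and Hogan, 1985), which yields a bell-shaped speed profile. The sketch below uses that model; choosing it, and the 1 kg effective hand mass, are assumptions for illustration, since the lab's profiles were derived from measured FoB positions:

```python
import numpy as np

def min_jerk(start, goal, duration, n=101):
    """Minimum-jerk point-to-point trajectory with a bell-shaped speed
    profile: acceleration at onset, deceleration toward the end."""
    t = np.linspace(0.0, 1.0, n)                  # normalized time
    s = 10 * t**3 - 15 * t**4 + 6 * t**5          # smooth 0 -> 1 blend
    pos = start + np.outer(s, goal - start)
    dt = duration / (n - 1)
    vel = np.gradient(pos, dt, axis=0)            # numerical derivatives
    acc = np.gradient(vel, dt, axis=0)
    return pos, vel, acc

# Force commanded to the haptic for an assumed effective hand mass of 1 kg
pos, vel, acc = min_jerk(np.array([0., 0., 0.]), np.array([0.3, 0., 0.]), 1.0)
force = 1.0 * acc
```

The resulting force is positive early in the movement and negative near its end, matching the acceleration/deceleration structure the text describes.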

We are now able to control the trajectory of both the head of the gimbal and the tip, or contact point, of the stylus. We can do this even when the stylus moves relative to the gimbal during writing, which yields realistic trajectories of the tip. Knowing the tip trajectory is important because a slanted stylus distorts the contact pattern. We have demonstrated that we can generate force profiles for the haptic that approximate positional trajectories obtained from functional, unencumbered activity, and have shown that these appear to simulate natural movement.
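The gimbal-versus-tip distinction can be illustrated with a small sketch: the tip is a fixed offset along the stylus axis from the gimbal, so a slanted stylus moves the contact point even when the gimbal is stationary. The 0.12 m offset and the ZYX angle convention below are illustrative assumptions, not Phantom specifications:

```python
import numpy as np

def stylus_tip(gimbal_pos, yaw, pitch, roll, tip_offset=0.12):
    """Estimate the stylus tip (contact point) from the gimbal pose.

    The device reports the gimbal position and three orientation angles;
    the tip lies a fixed distance along the stylus axis.  Offset length
    and angle convention are assumed values for this sketch.
    """
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    R = Rz @ Ry @ Rx                         # ZYX rotation (assumed order)
    axis = R @ np.array([0.0, 0.0, -1.0])    # stylus points down its local -z
    return gimbal_pos + tip_offset * axis
```

With zero angles the tip sits directly below the gimbal; pitching the stylus by 90° displaces the tip horizontally by the full offset, which is exactly the slant-induced distortion of the contact pattern noted above.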

 

Computer Vision Data Collection

We propose to use optical motion capture, with markers placed at anatomical landmarks, to provide accurate positional information. We will then attempt to validate markerless computer vision techniques in order to avoid the use of cumbersome special-purpose markers.

 

Electromyogram (EMG) Data Collection

Existing studies of upper extremity movement are either overly restrictive, focusing on muscle activity at a single joint, or do not take muscle activity into account when modeling upper extremity movement. Nonetheless, electromyographic studies of shoulder muscle activity indicate that there are large differences in muscle activity patterns among healthy individuals performing the same task. This is significant because it implies that a motion can be achieved in many different ways by altering muscle activity. If simple tasks can be accomplished using different patterns of muscle activity, then weakness or failure in some muscles may be compensated for by using different muscle combinations. Thus, a complete understanding of the combinations of muscles that can accomplish certain tasks may, in turn, be used to improve prostheses and to rehabilitate those with disabilities. Supervised learning techniques are also being utilized to classify movements based on EMG signals, which again could lead to improved rehabilitation.
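A minimal sketch of EMG-based movement classification follows. The windowed RMS feature and the nearest-centroid learner are assumptions for illustration; the text does not specify which features or classifier the lab uses:

```python
import numpy as np

def rms_features(emg, win=200):
    """Windowed root-mean-square amplitude of multi-channel EMG,
    a standard EMG feature.  emg: (n_samples, n_channels);
    the window length is an assumed value."""
    n = (emg.shape[0] // win) * win
    w = emg[:n].reshape(-1, win, emg.shape[1])
    return np.sqrt((w ** 2).mean(axis=1))         # (n_windows, n_channels)

class NearestCentroid:
    """Minimal supervised classifier: predict the class whose mean
    feature vector is closest.  A stand-in for whatever learner the
    lab actually employs."""
    def fit(self, X, y):
        self.labels_ = np.unique(y)
        self.centroids_ = np.stack([X[y == c].mean(axis=0)
                                    for c in self.labels_])
        return self
    def predict(self, X):
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None], axis=2)
        return self.labels_[d.argmin(axis=1)]
```

Two simulated gestures whose channels differ in amplitude are separated essentially perfectly by this scheme, reflecting the point above that different movements leave distinct muscle-activity signatures.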

 

Haptics/Force Control

A C++ framework is being developed that integrates several library packages into a set of object-oriented classes for expedited coding of virtual environments and other haptic-driven tasks. The package integrates Qt, OpenGL, and SOLID; Maya and Photoshop are used for object construction and texture mapping, and constructed objects can then be easily imported into the program. The objects are managed by an object scene class, which handles collision forces, display, and haptic movement of all objects. The framework also includes a trajectory scene class that manages two trajectory objects: the current trajectory the user is plotting, and the target trajectory the user is attempting to emulate. Haptic forces are calculated based on what is being plotted and what the target is. Ongoing work focuses on improving the proprioceptive and tactile feedback during object manipulation and the force control used to aid or guide the user along a desired trajectory.
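As a sketch of the guidance idea (here in Python rather than the framework's C++), the force pulling the user toward the target trajectory could be a simple spring toward the nearest target sample. The Hooke's-law force model and the stiffness value are assumptions; the framework's actual force law is not described in the text:

```python
import numpy as np

def guidance_force(tip, target_pts, k=200.0):
    """Spring pull toward the nearest sample of the target trajectory.

    tip: (3,) current stylus tip position.
    target_pts: (n, 3) sampled target trajectory.
    k: assumed stiffness in N/m.
    """
    d = np.linalg.norm(target_pts - tip, axis=1)
    nearest = target_pts[d.argmin()]
    return k * (nearest - tip)      # Hooke's-law restoring force
```

A tip hovering beside a straight-line target feels a force perpendicular to the line, pulling it back onto the path, which is the aid/guide behavior the framework provides.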

 

Graphical User Interfaces/HCI

Because our programs collect data from human subjects, they need to be as intuitive to use as possible. If a program is confusing or hard to use, the collected data will not be an accurate representation of the speed and accuracy of the user's performance: the user's difficulty would be due to poor interface design rather than to their skillset or any other factor. We are therefore examining a variety of human-computer interaction issues when designing the GUIs for our software, which is constantly evolving as more and more people test our demonstrations.

 

Letter Writing

Learning to write is considered a fundamental requirement for literacy and for integration into society. Many children have difficulty learning to write and/or writing legibly. These children are usually taught through one-to-one training, a labor-intensive approach. Teachers and parents have expressed interest in a device-driven, game-like, neutral method for teaching letters and handwriting, and haptic technology has features that are likely to provide these elements.

We designed a virtual hand-writing teaching system for children with handwriting difficulties due to attention or motor deficits, using a haptic interface that could provide a neutral, repetitive engaging approach to letter writing.

The approach we took to accomplish this included: (a) using letter primitives, (b) a user-friendly interface for teachers, therapists, subjects, and parents, (c) adjustable force and an assessment mode, and (d) quantitative reports.

While other letter-writing approaches record a teacher's movement and use it as a template for a student to follow, we used letter primitives and created letters from them. The advantages of the primitives are that (a) any letter can be created from them and (b) the shape of letters can be easily modified by changing scaling and rotation parameters. We generate the trajectories for writing all letters using straight lines, elliptical pen lifts, and arcs of a circle. We used an existing therapeutic intervention (Handwriting Without Tears) as a template for letter formation. We customized the pen-lift height to permit children with motor deficits to perform the task. The pen lifts are marked in red and the letters are marked in blue.
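Letter construction from primitives can be sketched as below. The decomposition of a capital "D" into a stem and a half-circle, and the specific parameterization, are illustrative assumptions rather than the system's actual letter definitions:

```python
import numpy as np

def line(p0, p1, n=20):
    """Straight-line primitive sampled as n points."""
    t = np.linspace(0, 1, n)[:, None]
    return (1 - t) * np.asarray(p0, float) + t * np.asarray(p1, float)

def arc(center, radius, a0, a1, n=20):
    """Circular-arc primitive from angle a0 to a1 (radians)."""
    a = np.linspace(a0, a1, n)
    return np.stack([center[0] + radius * np.cos(a),
                     center[1] + radius * np.sin(a)], axis=1)

def transform(pts, scale=1.0, angle=0.0, offset=(0.0, 0.0)):
    """Scale and rotate a primitive, then translate -- how a letter's
    shape is adapted by changing parameters, not by redefining strokes."""
    c, s = np.cos(angle), np.sin(angle)
    R = np.array([[c, -s], [s, c]])
    return scale * pts @ R.T + np.asarray(offset, float)

# An assumed capital 'D': a vertical stem plus a half-circle bowl whose
# endpoints meet the top and bottom of the stem.
stem = line((0, 0), (0, 2))
bowl = arc(center=(0, 1), radius=1, a0=np.pi / 2, a1=-np.pi / 2)
letter_D = [stem, bowl]
```

Because each stroke is parameterized, the same primitives rescaled or rotated yield letter variants without new templates, which is the advantage noted in the text.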


 

Haptic Technology for Persons with Traumatic Brain Injury

Traumatic Brain Injury (TBI) affects 400 per 100,000 people in the US, and 85 per 100,000 are admitted to the hospital. As a group, persons with TBI have the highest average number of years lived with disability. Health care costs for TBI in the US are $60 billion. The indirect costs of TBI are also quite high, because TBI tends to occur in young people, predominantly males, and has a substantial impact on the work force.

There has been significant recent research into the pathophysiology of brain injury, but much more needs to be done to elucidate the mechanisms of functional loss and its recovery. Several key factors have emerged that have provided opportunities for predicting outcomes, identifying those likely to be at significant risk for death and disability, and suggesting management strategies to prolong life and improve functional outcomes.

The impact of the primary injury and its secondary effects is no longer thought to be an event in time, after which nothing changes. Quite the contrary, the brain is thought to be in a dynamic state of flux during which injury and repair are progressing simultaneously. Good clinical practice mandates continuous monitoring of the key variables associated with outcomes for the duration of this volatile period. There seems to be agreement on some key variables used to assess severity, which include the Glasgow Coma Scale, measures of verbal fluency, motor activity and CT assessments. Domains relevant to functional outcomes thought to be important to measure in this population include physical, cognitive, behavioral/emotional status, presence of pain and awareness of self.

Virtual reality has been applied to both the evaluation and treatment of persons with TBI. The application of simulated environments has helped engage patients in therapeutic activities and has also provided "real-life" situations that call for integration of sensory, cognitive, and motor activities. Based on this approach, models have been developed that examine both top-down and bottom-up learning: stimuli that are physical and repetitive are considered bottom-up, and those that are cognitive and interactive are considered top-down. One type of virtual reality employs haptic devices. A haptic device is an instrument that interfaces with the user via touch, using force or vibration as feedback. Haptics have been studied for decades in object recognition (texture and shape); for example, the somatosensory cortex is activated during haptic exploration. Parietal lobe dysfunction has been associated with tactile agnosia, and the lateral occipital complex with visual agnosia. These associations have been confirmed using brain imaging techniques. Because haptic devices can be programmed to simulate a wide variety of fine motor functional tasks, we believe they can be used to evaluate fine motor functional ability in persons with mild/moderate TBI and to serve as a therapeutic intervention. They can also be programmed to capture data in real time and hence serve as an outcome measure for therapeutic interventions.

 

Publications

  • Gerber, Naomi L., Narber, Cody, Vishnoi, Nalini, Johnson, Sidney, Duric, Zoran, Chan, Leighton. "Feasibility of Haptic Use for Patients with Chronic Traumatic Brain Injury (TBI)," abstract accepted for presentation at the American Academy of Physical Medicine and Rehabilitation Annual Assembly, Atlanta, Georgia, November 15-18, 2012.
  • Vishnoi, Nalini, Duric, Zoran, Gerber, Naomi L. "Markerless Identification of Key Events in Gait Cycle Using Image Flow," paper accepted for presentation to Engineering in Medicine & Biology Society 2012 (EMBS'12), San Diego, August 28 - September 1, 2012.
  • Narber, Cody, Duric, Zoran, Gerber, Naomi L. "Haptic Devices as Objective Measures for Motor Skill," presented at the 5th International Conference on Human System Interactions (HSI-2012), Perth, Australia, June 6-8, 2012.
  • Vishnoi, Nalini, Narber, Cody, Duric, Zoran, Gerber, Naomi L. "Methodology for Translating Upper Extremity Motion to Haptic Interfaces," presented at the 5th International Conference on Human System Interactions (HSI-2012), Perth, Australia, June 6-8, 2012.
  • Gerber, Naomi L., Narber, Cody, Vishnoi, Nalini, Johnson, Sidney, Duric, Zoran, Chan, Leighton. "Feasibility of Haptic Use for Patients with Chronic Traumatic Brain Injury (cTBI)," poster presented at the Center for Neuroscience and Regenerative Medicine (CNRM) Annual Meeting, National Institutes of Health, Bethesda, MD, May 21, 2012.
  • Vishnoi, Nalini, Duric, Zoran, Gerber, Naomi L. "Identifying Events of Gait Cycle using Image Flow," presented at the Gait and Clinical Movement Analysis Society (GCMAS) Conference, Grand Rapids, Michigan, May 9-12, 2012.
  • Vishnoi, Nalini, Duric, Zoran, Gerber, Naomi L. Abstract accepted for presentation at the 2011 Annual Meeting of the GCMAS, Bethesda, Maryland, April 26-29, 2011.
  • Narber, Cody, Avramovic, Ivan, Vishnoi, Nalini, Duric, Zoran, Gerber, Naomi L. Application of Haptic Technology to Fine Motor Learning in College Students. Abstract accepted for presentation at the 71st Annual Assembly of the American Academy of Physical Medicine and Rehabilitation, November 4-7, 2010, Seattle Washington.
  • Narber, Cody, Duric, Zoran. Analysis of collision detection algorithms in haptic environments. In: Haptic Audio-Visual Environments and Games (HAVE), 2010 IEEE International Symposium on.; 2010:1-4.
  • Nalini Vishnoi, Wallace Lawson, Naomi Lynn Gerber, Zoran Duric. Identifying Phases of Gait Using Image Flow. The Gait and Clinical Movement Analysis Society (GCMAS) and the European Society of Movement Analysis for Adults and Children, 2nd Joint ESMAC-GCMAS (JEGM) Conference May 12-15, 2010. Miami, Florida, pp. 344-345.
  • N.Vishnoi, C.Narber, N. L. Gerber and Z.Duric. Methodology for Simulating Upper Extremity Functional Activities using Haptic Interfaces. Proceedings of the The Gait and Clinical Movement Analysis Society (GCMAS) and the European Society of Movement Analysis for Adults and Children, 2nd Joint ESMAC-GCMAS (JEGM) Conference May 12-15, 2010. Miami, Florida, pp. 511-512.
  • W. Lawson, N. Vishnoi, N. L. Gerber and Z. Duric. Markerless Identification of Dissimilarities in Gait Sequences. Proceedings of The Gait and Clinical Movement Analysis Society (GCMAS) and the European Society of Movement Analysis for Adults and Children, 2nd Joint ESMAC-GCMAS (JEGM) Conference May 12-15, 2010. Miami, Florida, pp. 513-514

  • Vishnoi,  Nalini, Narber, Cody, Gerber, Naomi Lynn, and Duric, Zoran, “Guiding Hand: A Teaching Tool for Handwriting,” accepted as a demonstration presentation for the ICMI-MLMI 2009 (Eleventh International Conference on Multimodal Interfaces and Workshop on Machine Learning for Multi-modal Interaction) at the MIT Media Lab, Cambridge, Massachusetts, on Nov 2-4, 2009 [pdf]
  • Gene Shuman. Using Forearm Electromyograms to Classify Hand Gestures. 2009 IEEE International Conference on Bioinformatics and Biomedicine IEEE Computer Society, November 4, 2009, Washington D.C., pp 261-264.
  • Younhee Kim, Zoran Duric, Lynn Gerber, Arthur R. Palsbo, and Susan E. Palsbo. "Teaching Letter Writing using a Programmable Haptic Device Interface for Children with Handwriting Difficulties," accepted as poster presentation to IEEE Symposium on 3D User Interfaces, 2009. [pdf] **Best Poster Award**
  • Lawson, Wallace, Duric, Zoran, Wechsler, Harry, “Gait Analysis Using Independent Components of Image Motion,” published in the Proceedings of the 8th International IEEE Automatic Face and Gesture Recognition Conference, September 17-19, 2008, in Amsterdam, The Netherlands [Link]