Virtual Reality Environment for Simulating Tasks With a Myoelectric Prosthesis: An Assessment and Training Tool

Lambrecht, Joris M. MS; Pulliam, Christopher L. MS; Kirsch, Robert F. PhD

JPO Journal of Prosthetics and Orthotics: April 2011 - Volume 23 - Issue 2 - p 89-94
doi: 10.1097/JPO.0b013e318217a30c

Intuitively and efficiently controlling multiple degrees of freedom is a major hurdle in the field of upper-limb prosthetics. A virtual reality myoelectric transhumeral prosthesis simulator has been developed for cost-effectively testing novel control algorithms and devices. The system acquires electromyogram commands and residual limb kinematics, simulates the prosthesis dynamics, and displays the combined residual limb and prosthesis movements in a virtual reality environment that includes force-based interactions with virtual objects. A virtual Box and Block Test is demonstrated. Three normally limbed subjects performed the simulated test using a sequential and a synchronous control method. With the sequential method, subjects moved an average of 6.7 ± 1.9 blocks in 120 seconds, which was similar to the number of blocks that transhumeral amputees are able to move with their physical prostheses during clinical evaluation. With the synchronous method, subjects moved 6.7 ± 2.2 blocks. The virtual reality prosthesis simulator is thus a promising tool for developing and evaluating control methods, prototyping novel prostheses, and training amputees.

JORIS M. LAMBRECHT, MS, AND CHRISTOPHER L. PULLIAM, MS, are affiliated with the Department of Biomedical Engineering, Case Western Reserve University, Cleveland, Ohio.

ROBERT F. KIRSCH, PhD, is affiliated with the Department of Biomedical Engineering, Case Western Reserve University; and Louis Stokes Cleveland VA FES Center of Excellence, Cleveland, Ohio.

Disclosure: The authors declare no conflict of interest.

This study was supported by TATRC W81XWH-07-2-0044.

Correspondence to: Joris M. Lambrecht, MS, 10900 Euclid Avenue, Wickenden 311, Cleveland, OH 44106-7207; e-mail: joris.lambrecht@case.edu.

Within recent years, dramatic improvements have been made in the mechanical design of upper-limb myoelectric prostheses.1 For instance, DEKA Integrated Solutions Corporation has developed the 10 degrees of freedom (dof) “Luke” arm, which has entered clinical trials.2 In contrast, a state-of-the-art commercially available myoelectric transhumeral prosthesis has 3 dof (elbow flexion, wrist rotation, and a terminal device) that are generally operated sequentially, with a switch command used to toggle between individual dofs. A major unsolved problem in upper-limb prosthetics is providing reliable independent command sources for intuitively and efficiently controlling multiple dofs simultaneously. Virtual reality has been suggested as a method to quickly develop and evaluate control strategies, prototype devices, and train subjects.2–6 A recent study also demonstrated that training with a virtual prosthesis was equivalent to training with a physical prosthesis.7

Previous myoelectric prosthesis simulators have included costly or complicated components, making them impractical for widespread use in clinical settings.4,5 In addition, although simulators have accurately modeled the dynamics of the prostheses themselves,4,5 modeling of interactions with the virtual environment has been limited. Modeling the behavior of the prosthesis under variable loads and describing the characteristics of objects to be grasped are necessary, but so far unachieved, goals for demonstrating the functionality of a prosthesis in a virtual environment.5 Advances in computer hardware and the development of real-time physics simulation software, driven by widespread use in commercial video games, have made these goals realizable.

This article describes the software and hardware components of a simulator for evaluating and training novel command and control strategies for transhumeral myoelectric prostheses. The system is composed of portable, easy-to-set-up components that can be taken home by a user, yet it still realistically simulates functional tasks through real-time physics simulation. A virtual version of a common clinical assessment test was implemented as a demonstration, in which two prosthesis control methods were compared.

METHODS

SYSTEM OVERVIEW

The subject (intact-limbed or transhumeral amputee) views an animation of his or her humeral movement and the simulated prosthesis movement in a virtual environment. The system includes several components (illustrated in Figure 1 and described in detail below) that are interfaced to a single laptop or desktop PC running Matlab/Simulink (The MathWorks Inc., Natick, MA) and a custom virtual environment application.

Figure 1.

1. Kinematic Tracking

Kinematic tracking of the upper limb is achieved using a 3DM-GX1 Orientation Sensor (MicroStrain Inc., Williston, VT). This sensor combines a triaxial accelerometer, a triaxial magnetometer, and triaxial gyroscopes to provide an accurate 3D orientation with respect to an earth-referenced coordinate system. Gyro-stabilized “ZYX” Euler angles are polled from the sensor in Simulink at 20 Hz. Before using the simulator, a reference coordinate system (Aref), about which rotations will be calculated, must be determined. The sensor is held near the upper arm in a predefined orientation, and the polled Euler angles are converted to a rotation matrix and stored. Next, the sensor is attached to the upper arm with a strap, and the subject is asked to hold the arm in the anatomic position (humerus parallel to the trunk with the palm facing anteriorly). A rotation matrix (R0) for that posture is similarly calculated and stored. All subsequent rotations are calculated relative to this nominal orientation using Equation (1), where Rout is the rotation matrix used to define the orientation of the upper arm and Rin is the rotation matrix computed from the raw sensor values.
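Equation (1) itself is not reproduced in this text. As a minimal sketch, the MATLAB function below shows one conventional way to build the sensor rotation matrix from ZYX Euler angles and express it relative to the stored nominal orientation R0; the published equation may differ in detail (for example, it may also involve the stored reference frame Aref), and the function and variable names here are illustrative only.

% Illustrative sketch only: one conventional relative-orientation computation.
% eulZYX : [yaw pitch roll] in radians, polled from the sensor ("ZYX" order)
% R0     : rotation matrix stored with the arm held in the anatomic position
function Rout = relativeOrientation(eulZYX, R0)
    cz = cos(eulZYX(1)); sz = sin(eulZYX(1));
    cy = cos(eulZYX(2)); sy = sin(eulZYX(2));
    cx = cos(eulZYX(3)); sx = sin(eulZYX(3));
    Rz = [cz -sz 0; sz cz 0; 0 0 1];      % rotation about Z (yaw)
    Ry = [cy 0 sy; 0 1 0; -sy 0 cy];      % rotation about Y (pitch)
    Rx = [1 0 0; 0 cx -sx; 0 sx cx];      % rotation about X (roll)
    Rin  = Rz * Ry * Rx;                  % sensor orientation from ZYX Euler angles
    Rout = R0' * Rin;                     % orientation relative to the stored nominal posture
end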

2. EMG Acquisition

Electromyogram (EMG) signals are acquired and amplified using disposable surface electrodes and a BioRadio 150 wireless 12-channel physiological signal monitor (Cleveland Medical Devices Inc., Cleveland, OH). Up to eight channels can be configured to record EMG at 960 Hz. EMG data are digitally high-pass filtered to remove motion artifact and processed in packets in Simulink. This setup allows the EMG processing and command algorithms to be fully customized. The device also has an auxiliary input that can be used to interface with an external push button or switch. More details on the specific EMG setup used in the experiment are given in the Experimental Design section.

3. Physics Simulation

Physics simulation is implemented using Newton Game Dynamics (NGD, newtondynamics.com), a deterministic force-based solver. Massera et al.8 used NGD to simulate an anthropomorphic neurorobotic arm to study the ability of evolutionary algorithms to achieve functional grasping patterns.

The prosthesis in the virtual environment was modeled after the Utah Arm 3 (Motion Control Inc., Salt Lake City, UT). The prosthesis has three independently controlled dofs: elbow flexion/extension, wrist pronation/supination, and grasp aperture. The fingers and thumb are subject to a kinematic constraint such that, during grasp, the thumb moves through twice the angle of the fingers. Dynamic properties of the simulated prosthesis were tuned to mimic the behavior of the physical prosthesis. Figure 2 illustrates the joint axes and the collision/inertial hulls for the forearm, hand, finger, and thumb segments. The segment coordinate systems are shown at each segment's center of mass.

Figure 2.

The simulated residual limb segment is kinematically constrained to match the orientation computed from the orientation sensor. Requiring the virtual upper limb to track the subject's measured limb orientation while also simulating physical interactions in the environment can result in a paradox: the virtual environment may block the virtual arm with an obstacle but cannot restrict the user's actual movement, so the user could place his or her arm in an orientation in which the virtual joint constraints cannot be maintained. The kinematically controlled shoulder joint was therefore made less stiff than the prosthesis joints, allowing the virtual shoulder joint to “dislocate” slightly (i.e., not exactly match the user's true posture) and helping ensure that the prosthesis constraints can be maintained. Users were also instructed to avoid upper-limb movements that would “jam” the prosthesis into walls and corners in the virtual environment.
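The compliant shoulder constraint is implemented inside the physics engine, but the idea can be illustrated with a simple proportional-derivative tracking law in which the shoulder stiffness is deliberately lower than that of the prosthesis joints. The gains and names below are assumptions for illustration, not the published implementation.

% Illustrative only: a low-stiffness tracking torque lets the virtual shoulder
% lag or "dislocate" slightly when the measured posture conflicts with an obstacle.
function tau = softTrackingTorque(qMeasured, qVirtual, qdotVirtual)
    Kp = 5;    % shoulder stiffness, chosen lower than the prosthesis joint stiffness
    Kd = 0.5;  % damping to keep the tracking stable
    tau = Kp * (qMeasured - qVirtual) - Kd * qdotVirtual;
end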

Objects in the environment are treated as rigid bodies and can take any shape. NGD supports various collision hulls, including boxes, ellipsoids, cones, cylinders, and arbitrary convex shapes. Compound hulls allow multiple collision hulls to be combined to form complex shapes that include concavity, for example, a mug with a handle (Figure 3). The use of object “materials” allows for varying frictional coefficients between surfaces. For example, the friction between the hand and the blocks was set higher than the friction between the blocks and the box, or between the hand and the box, to more closely reflect the interactions between a myoelectric hand and physical objects.

Figure 3.

4. Visualization

The virtual environment graphics were created using Gamestudio A7 game development system (Conitec Datasystems Inc., La Mesa, CA). The virtual environment includes realistic features expected in modern video games: soft skin deformation, dynamic shadows, and high-quality 3D graphics (Figure 3). The virtual reality simulator is displayed using the NVIDIA 3D Vision System (NVIDIA Corporation, Santa Clara, CA) to provide depth perception and enhanced immersion. The 3D Vision system consists of a 120-Hz computer monitor—which alternately displays left and right eye images—and active shutter glasses that are synchronized to the monitor by an infrared signal to produce a compelling stereoscopic view of the environment and the simulated arm.

The virtual camera is placed at the location of the virtual person's eyes to provide a first-person perspective. A common limitation of simulators is that a single monitor does not provide a sufficient viewing angle to mimic peripheral vision. Flight simulators, for example, commonly use multiple monitors placed around the subject. Other simulators4 and video games, particularly those with head-mounted displays, use head orientation tracking to allow the subject to move the virtual camera to point toward a region of interest. Instead, to simplify our setup, an automatic camera tracking algorithm is used to keep the camera pointed toward the region of interest, assumed to be the hand. The camera direction vector is calculated as a weighted average of the tracking vector and the nominal sight vector, where the tracking vector is a unit vector in the direction from the eye position to the center of the hand, and the nominal sight vector is a unit vector pointing anteriorly and inferiorly. In this manner, the virtual camera always keeps the hand within view while maintaining a natural forward-facing view.
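The weighted-average rule can be written compactly as in the sketch below; the weight w, the assumed coordinate axes, and the nominal sight direction are illustrative assumptions rather than the published values.

% Illustrative camera-tracking rule: blend the hand-tracking vector with a
% fixed nominal sight vector and renormalize to obtain the camera direction.
function camDir = cameraDirection(eyePos, handPos, w)
    trackVec = (handPos - eyePos) / norm(handPos - eyePos);  % unit vector toward the hand
    nominal  = [1 0 -0.4];                                   % anterior and slightly inferior (assumed axes)
    nominal  = nominal / norm(nominal);
    camDir   = w * trackVec + (1 - w) * nominal;             % weighted average of the two vectors
    camDir   = camDir / norm(camDir);                        % unit camera direction
end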

The virtual reality environment and physics simulation are compiled (“published”) together as a standalone executable application. Virtual prosthesis commands and residual limb kinematics are sent to the application from Matlab/Simulink using shared memory mapping.
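One common way to implement such an interface on the MATLAB side is a memory-mapped file; the file name and data layout in this sketch are assumptions for illustration and are not taken from the article.

% Illustrative only: write joint commands and humeral orientation to a shared
% memory-mapped file that the standalone application also maps and reads.
% The file must already exist and be large enough for the mapped format.
elbowCmd = 0.2; wristCmd = 0.0; handCmd = 1.0;   % example prosthesis commands
humeralEuler = [0.1 -0.3 0.0];                   % example residual limb orientation (rad)
m = memmapfile('vr_shared.dat', 'Writable', true, ...
    'Format', {'double', [1 6], 'cmd'});         % assumed layout: 3 commands + 3 angles
m.Data.cmd = [elbowCmd, wristCmd, handCmd, humeralEuler];  % written each control cycle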

EXPERIMENTAL DESIGN

The prosthesis simulator is intended to be a customizable platform for evaluating command and control algorithms and for training and evaluating the performance of amputees using various prostheses and controller algorithms. For this study, as a demonstration, a modified Box and Block Test was simulated, and two control methods were evaluated. In the Box and Block Test of Manual Dexterity,9 subjects are asked to move as many blocks as possible from one side of a box to the other, over a divider, within 1 minute. Dimensions in the virtual world were based on the standardized box and block dimensions,9 except for the depth of the box (2.2 cm instead of 7.5 cm). Because of the difficulty of the task for transhumeral amputees, the time period was extended to 2 minutes, and the task was performed standing rather than seated.10

Table 1 summarizes the command sources used for each prosthesis action under the sequential and synchronous command methods. In the sequential method, EMG signals from an antagonist pair of muscles, the biceps and triceps, were used to control the prosthesis. A push button located on the clavicle was used to switch between the three actions. In the synchronous control method, each prosthesis function was commanded by a different muscle, allowing multiple functions to be controlled simultaneously. Subjects were able to activate the upper trapezius and latissimus dorsi independently by elevating and retracting the shoulder, showing their potential for controlling the hand and wrist. Because of the limited number of additional available EMG sources, the hand was configured to close automatically, and the wrist was limited to rotation in one direction.

Table 1

Three normally limbed subjects gave informed consent to participate in the study. Two disposable snap-type surface electrodes were placed approximately 2.5 cm apart on each muscle (from Table 1), parallel to the fiber direction (estimated using anatomical images as a guide), to record a differential signal. An additional reference electrode was placed on the elbow. All electrodes were connected to the BioRadio 150 with approximately 1-m-long leads. EMG was digitally high-pass filtered (20 Hz, third-order Butterworth) to eliminate motion artifact and notch filtered (59–61 Hz, fourth-order Butterworth) to reduce 60-Hz noise. For both command methods, EMG data were processed by taking the mean rectified value of the EMG over 50-ms packets and applying a gain factor and a minimal activity threshold (“deadzone”). Appropriate EMG gains and thresholds were determined before testing such that a maximum command could be achieved with slightly less than a maximum voluntary contraction and the upper arm could be moved around without eliciting an unwanted command.
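The filter specifications and packet-based envelope described above can be strung together into a single command value per 50-ms packet, as in the MATLAB sketch below. The placeholder signal and the gain and deadzone values are illustrative assumptions, and the exact way the deadzone was applied is not specified in the article.

% Illustrative EMG-to-command pipeline for one channel (values are placeholders).
fs = 960;                                      % BioRadio sampling rate (Hz)
emgRaw = 1e-4 * randn(fs, 1);                  % placeholder: 1 s of raw EMG (not study data)
[bh, ah] = butter(3, 20/(fs/2), 'high');       % 20-Hz third-order high-pass
[bn, an] = butter(2, [59 61]/(fs/2), 'stop');  % 59-61 Hz notch (fourth-order bandstop)
emgFilt = filter(bn, an, filter(bh, ah, emgRaw));
packet = emgFilt(end - round(0.05*fs) + 1:end);% most recent 50-ms packet
mrv = mean(abs(packet));                       % mean rectified value
gain = 2e3; deadzone = 0.05;                   % tuned per subject (assumed values)
cmd = min(max(gain*mrv - deadzone, 0), 1);     % thresholded, gained, saturated command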

Before testing, each subject practiced for at least 20 minutes with both command methods to become familiar with the simulator. The command method tested first was randomly selected for each subject. The virtual Box and Block Test was completed five times using the first command method, with 5-second pauses between each 2-minute test. The subject then rested, was refamiliarized with the second command method, and completed five more tests in the same manner. This set of tests was repeated once in the same order, such that each subject completed a total of 10 virtual Box and Block Tests for each command method.

RESULTS

Three normally limbed subjects each completed the virtual modified Box and Block Test10 10 times with the sequential method and 10 times with the synchronous method. Results are shown in Figure 4 alongside results reported by Miller et al.10 for transhumeral amputees performing the same test with a physical prosthesis using conventional prosthesis command methods. The “conventional” command methods used by each amputee were optimized by prosthetists and occupational therapists and may not have been identical for each subject.10 Although not identical to either command method used in our simulator, the conventional command methods were sequential in nature. On average, in the simulator, subjects moved 6.7 ± 1.9 blocks with the sequential method and 6.7 ± 2.2 blocks with the synchronous method. The amputee subjects moved, on average, 6.8 ± 3.1 blocks using a physical prosthesis.

Figure 4.

A repeated-measures two-way analysis of variance was performed on the Box and Block data, with control method and test set as factors. The test set (first or second set of 10 tests) was included as a factor to determine whether there was a significant improvement (because of learning) or worsening (because of fatigue, attention loss, etc.) in performance. Across both control methods, the average number of blocks moved was 6.5 ± 2.2 in the first set and 6.9 ± 1.9 in the second set. No statistically significant differences were found between the control methods (p = 0.98) or between the test sets (p = 0.64).
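A mixed-model analysis along these lines can be run in MATLAB with anovan, treating subject as a random factor; the placeholder data, factor coding, and model below are illustrative assumptions, not the authors' exact analysis.

% Illustrative only: two factors of interest (method, set) plus subject as a
% random factor; the block counts here are random placeholders, not study data.
[subj, method, set, rep] = ndgrid(1:3, 1:2, 1:2, 1:5);   % 3 subjects x 2 methods x 2 sets x 5 tests
blocks = randi([3 11], numel(subj), 1);                   % placeholder response vector
p = anovan(blocks, {method(:), set(:), subj(:)}, ...
    'random', 3, 'varnames', {'Method', 'Set', 'Subject'});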

Although no statistical claims can be made about the closeness of the sequential (conventional) simulated results to the physical prosthesis results reported by Miller et al.10 (because the number of times each transhumeral subject was tested was not reported), the results are evidently very similar. However, the physical prosthesis results appear to have been substantially more variable than the simulated results, as indicated by larger standard deviations within and between subjects.

DISCUSSION

IMPLICATIONS

A portable, cost-effective, and simple-to-use virtual reality myoelectric prosthesis simulator has been developed that accurately simulates both the dynamics of a transhumeral prosthesis and its interactions with the environment. The simulator allows for assessment in a virtual environment, such as with the Box and Block Test demonstrated in this study. However, virtual objects are not limited to blocks and can take any shape and size. Objects can also vary in weight, elasticity, and surface friction. The simulator allows clinicians and engineers to easily test new command and control algorithms and to prototype novel devices in a functionally relevant manner. Many studies have reported on the classification accuracy of various pattern recognition-based control algorithms, but the functional relevance of their results is unclear. The simulator also allows for functionally relevant training of an amputee before being fitted with and receiving a prosthesis. Furthermore, the portability of the setup would allow the amputee to train outside the clinic. Another major advantage of the virtual simulator is that normally limbed subjects can be used to test algorithms.

In the simulator, kinematic recording of the residual limb is required because most manual tasks require positioning the whole arm, not just the joints of the prosthesis. Positioning of the residual limb also has other implications: the biceps and triceps, commonly used as command sources in transhumeral prostheses, are both biarticular muscles that normally cross the elbow and shoulder joints and may therefore become active when the shoulder is moved. This muscle activity could elicit unintended commands, so it is necessary to test control algorithms at various shoulder angles. Micera et al.11 highlighted the importance of posture for decoding in a recent review. By incorporating kinematic recording of the residual limb, the prosthesis simulator allows evaluation of control algorithms in all possible postures.

LIMITATIONS AND FUTURE WORK

A major limitation of the virtual reality simulator is that the user does not feel the movements of the virtual prosthesis. The weight and inertial effects of the prosthesis alone could affect the muscle activity recorded from the residual limb. In addition, when collisions occur in the virtual environment, the user's real arm is unaffected, which can sometimes lead to a paradox (described in the Physics Simulation section), where the user can force the prosthesis into impossible configurations. Haptic feedback systems, such as the HapticMaster,12 could be used to provide a virtual haptic interface in combination with the virtual visual environment. The device could exert an opposing force on the user's real residual limb when the user's movements or commands cause a collision in the environment, eliminating this paradox. The loading from the weight of the prosthesis and any grasped objects could also be simulated. The requirement to develop a simple, portable, and cost-effective simulator limited the ability to use haptic devices in this study. Some prostheses use tactile vibrators to indicate the pressure being applied to a grasped object. Such devices could easily be incorporated into the simulator as well.

The quality of kinematic tracking depends largely on the three-axis magnetometer output. A room with steel file cabinets, desks, or chairs may have substantial fluctuations in the measured direction of north, even over the small distances that the orientation sensor translates when mounted on the residual limb. This results in inaccuracies in the recorded rotation about the vertical axis. Therefore, it was important to ensure that the volume in which the residual limb moved had a relatively constant magnetic field by monitoring the output of the sensor as it was translated through this volume with a fixed orientation. It was also necessary to define the reference coordinate system within this volume. In this study, the subject stood at least 1.5 m away from the nearest ferrous metal object.

Because of the real-time requirement of the simulation, physical interactions between objects must be limited to some extent. On a Dell Dimension dual-core 2.4-GHz PC with an NVIDIA 9800 GTX graphics card, the simulation ran near the maximum 100 frames per second except when multiple objects were being constrained simultaneously. For example, if multiple blocks were pushed into the divider at the same time, the frame rate dropped dramatically, causing instability in the simulation. Therefore, only one block at a time was placed, in a random position and orientation. When the user successfully moved the block to the other side, a new block appeared. Because the box is normally filled with 150 blocks,9 the top layer of blocks is near the top edge of the box. The virtual box was given a decreased depth to mimic this and to reduce the likelihood of impossible-to-resolve collisions.

The currently simulated prosthesis has a conventional hand with a single dof, grasp aperture. The advent of multiarticulated hands with multiple grasp patterns, such as the commercially available iLimb (Touch Bionics, Edinburgh, Scotland), the BeBionic hand (RSLSteeper, Rochester, England), and the soon-to-be-available Michelangelo hand (Otto Bock, Duderstadt, Germany), has revolutionized prostheses. Future work will incorporate these sophisticated hands into the virtual environment to aid in the development of control methods. In addition, for a transradial prosthesis simulation, the elbow flexion and forearm pronation angles will need to be measured to kinematically constrain the distal residual forearm segment. This can be achieved with an additional orientation sensor.

Both the prosthesis and the objects with which it interacts can be modified in the simulator. Future work may involve developing more virtual functional assessment tasks such as the clothespin relocation task,4 nine-hole peg test of finger dexterity,13 Grooved Pegboard Test,14 or possibly the Southampton Hand Assessment Procedure,15 which incorporates bimanual tasks.

CONCLUSION

The feasibility of using a virtual reality prosthesis simulator for training and assessment has been demonstrated with a virtual Box and Block Test. The virtual test results were compared with data from amputees performing the same test with a physical prosthesis. Our simulator approach should allow for functionally relevant testing of the command and control algorithms presented in the literature. The transhumeral prosthesis simulator presented here could easily be modified to incorporate transradial and shoulder disarticulation prostheses or any novel multi-dof devices. Simulated interactions with virtual objects of various shapes, sizes, and weights allow virtual representation of almost any task. Furthermore, the simulator system is simple to set up, portable, and relatively inexpensive, which may prove important to its acceptance for widespread clinical use in amputee training.

ACKNOWLEDGMENTS

The authors thank the subject volunteers for their time and patience and Joyce Tyler, OTR/L, CHT, for her advice.

REFERENCES

1. Adee S. The revolution will be prosthetized-DARPA's prosthetic arm gives amputees new hope. IEEE Spectrum 2009;46:45–48.
2. Kuiken TA, Li G, Lock BA, et al. Targeted muscle reinnervation for real-time myoelectric control of multifunction artificial arms. JAMA 2009;301:619–628.
3. Churko JM, Mehr A, Linassi AG, et al. Sensor evaluation for tracking upper extremity prosthesis. In: Proceedings of the 31st Annual International Conference of the IEEE EMBS, Minneapolis, MN, September 2–6, 2009.
4. Hauschild M, Davoodi R, Loeb GE. A virtual reality environment for designing and fitting neural prosthetic limbs. IEEE Trans Neural Syst Rehabil Eng 2007;15:9–15.
5. Jimenez GG, Ryuhei O, Akazaea K. Upper limb-hand 3D display system for biomimetic myoelectric hand simulator. In: Proceedings of the 23rd Annual EMBS International Conference, Istanbul, Turkey, October 25–28, 2001.
6. Li G, Schultz AE, Kuiken TA. Quantifying pattern recognition-based myoelectric control of multifunctional transradial prostheses. IEEE Trans Neural Syst Rehabil Eng 2010;18:185–192.
7. Bouwsema H, van der Sluis CK, Bongers RM. Learning to control opening and closing a myoelectric hand. Arch Phys Med Rehabil 2010;91:1442–1446.
8. Massera G, Cangelosi A, Nolfi S. Evolution of prehension ability in an anthropomorphic neurorobotic arm. Front Neurorobotics 2007;1:1–9.
9. Mathiowetz V, Volland G, Kashman N, et al. Adult norms for the Box and Block Test of manual dexterity. Am J Occup Ther 1985;39:386–391.
10. Miller LA, Stubblefield KA, Lipschutz RD, et al. Improved myoelectric prosthesis control using targeted reinnervation surgery: a case series. IEEE Trans Neural Syst Rehabil Eng 2008;16:46–50.
11. Micera S, Carpaneto J, Raspopovic S. Control of hand prostheses using peripheral information. IEEE Rev Biomed Eng 2010;3:48–68.
12. Van der Linde RQ, Lammertse P, Frederiksen E, et al. The HapticMaster, a new high-performance haptic interface. In: Proceedings of Eurohaptics, Edinburgh, UK, July 2002.
13. Mathiowetz V, Weber K, Kashman N, et al. Adult norms for the Nine Hole Peg Test of finger dexterity. Occup Ther J Res 1985;5:24–38.
14. Matthews CG, Haaland KY. The effect of symptom duration on cognitive and motor performance in parkinsonism. Neurology 1979;29:951–956.
15. Light CM, Chappell PH, Kyberd PJ. Establishing a standardized clinical assessment tool of pathologic and prosthetic hand function: normative data, reliability, and validity. Arch Phys Med Rehabil 2002;83:776–783.
Keywords:

simulator; transhumeral; amputee; myoelectric prosthesis; virtual reality; assessment; Box and Block Test

© 2011 American Academy of Orthotists & Prosthetists