
Motion Capture

Overview / Introduction

The 3D Bio-Motion Research Lab (3DBMRL) at the Center for Computer-Aided Design, The University of Iowa, performs applied and basic research in human motion analysis. The lab is equipped with a Vicon motion capture system with 12 SV cameras and a Motion Analysis system. The Motion Analysis system features 4-megapixel resolution with only 1-2 frames of latency. It is a real-time system with Eagle-4 Digital cameras that can collect at up to 500 Hz with a shutter speed ranging from 0 to 2000 µs; the focal length can be adjusted from 18 to 52 mm. Sophisticated models for biomechanics and animation applications are available for the whole body and the hand. Visual3D software is used in the 3DBMRL to analyze and share data with collaborators for various testing scenarios. Visual3D data (collected from human subjects) can be used in human validation studies and to enhance simulation capabilities.

motion-capture-vicon.jpg
Vicon system

   

motion-capture-motion-analysis1.jpg motion-capture-motion-analysis2.jpg
Motion Analysis system

 

motion-capture-motion-analysis3.jpg motion-capture-motion-analysis4.jpg
Motion Analysis system

 

The 3DBMRL is committed to providing industry and government with innovative testing capabilities and analysis tools to study human motion and human response to external loading. Research activities include human response to whole body vibration, design and control of structures under dynamic loading, real-time data collection, and human motion validation.

Methods / Current Research

Marker placement protocols 

Marker placement protocols have been developed for a variety of testing environments, including standing and seated whole-body scenarios as well as the hand.

motion-capture-marker1.jpg motion-capture-marker2.jpg motion-capture-marker3.jpg
Marker placement protocols

 

Inverse Kinematics

VSR has developed an inverse kinematics (IK) code that uses the Denavit-Hartenberg (DH) skeletal geometry and joint relations of Santos. Traditionally in inverse kinematics software, the measured locations of the joint centers and the end effectors are used to calculate the corresponding body joint angles. This can be done using several commercial software packages, such as Vicon, Motion Analysis, and Visual3D. However, the complexity of the Santos skeleton model makes commercial software difficult to use for comparison purposes because of incompatibilities in the direction and orientation of the joint axes. The in-house IK uses the DH method to correlate the local coordinate systems of adjacent joints, based on the geometry of the skeleton and expressed by a DH table. For a fair comparison between simulation and experiment, the joint angles calculated by the IK should be as accurate as possible; the proposed IK therefore enforces a tight error tolerance that does not exceed 5 mm at each point. Because of the large number of degrees of freedom in the model under investigation (55 DOF at this time, and potentially more in the near future) and the redundancy in the joint-angle calculations, a nonlinear optimization problem must be solved to find a realistic, feasible solution. An optimization-based scheme minimizes the error between the joint-center locations obtained from the experiments and those predicted by the inverse kinematics software, subject to the natural limits on the various joint angles. Due to the high degree of redundancy, a large-scale sequential quadratic programming (SQP) approach in SNOPT [31] is used to solve the optimization problem.
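The optimization-based IK described above can be sketched in a few lines. This is a minimal illustration, not the VSR code: a hypothetical two-link planar chain stands in for the 55-DOF Santos skeleton, the DH parameters and joint limits are invented for the example, and SciPy's SLSQP solver stands in for the large-scale SQP in SNOPT.

```python
import numpy as np
from scipy.optimize import minimize

def dh_transform(theta, d, a, alpha):
    """Homogeneous transform between adjacent links from one row of a DH table."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def joint_centers(q, dh_params):
    """Forward kinematics: world position of each joint center along the chain."""
    T = np.eye(4)
    pts = []
    for qi, (d, a, alpha) in zip(q, dh_params):
        T = T @ dh_transform(qi, d, a, alpha)
        pts.append(T[:3, 3].copy())
    return np.array(pts)

# Toy two-link planar chain (fixed DH parameters d, a, alpha) and joint limits.
DH = [(0.0, 0.30, 0.0), (0.0, 0.25, 0.0)]
LIMITS = [(-np.pi, np.pi), (-2.5, 2.5)]

# "Measured" joint-center locations, standing in for motion capture data.
target = joint_centers(np.array([0.6, -0.4]), DH)

def tracking_error(q):
    """Sum of squared distances between predicted and measured joint centers."""
    return np.sum((joint_centers(q, DH) - target) ** 2)

# Bound-constrained SQP-style solve, analogous in spirit to the SNOPT setup.
res = minimize(tracking_error, x0=np.zeros(2), bounds=LIMITS, method="SLSQP")
```

For this toy problem the solver drives the per-point error well below the 5 mm tolerance mentioned above; the real formulation adds many more degrees of freedom and redundancy, which is what motivates the large-scale solver.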

The IK developed by VSR is capable of computing joint angles for complicated scenarios, including walking forward, ascending a ladder, and entering a cab. The optimization scheme converges for the 55-DOF Santos skeleton, so the experimental data is expressed on the same skeleton used for simulation and the resulting motion can be animated on Santos.

Real Time

The real-time motion capture tools developed by Clarkson University and VSR allow the Santos environment to be driven by a motion capture subject in real time. Animators rely on instant feedback, but IK rigs often slow down character animation, so artists tend toward simple skeletal representations of characters. The development of the Santos real-time engine focused on proper scaling of the Santos avatar to the subject and on the stability of the animation throughout the motion capture session. The animation occurs in real time, and the Santos avatar adds realism to the scene.
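One plausible form of the avatar-scaling step mentioned above is a per-segment scale factor: the ratio of the subject's marker-derived segment length to the avatar's default segment length, computed from a calibration pose. This is a hypothetical sketch, not the Santos engine's actual scaling; the segment names, default lengths, and marker positions are all invented for illustration.

```python
import numpy as np

# Hypothetical default segment lengths (metres) for the avatar skeleton.
AVATAR_SEGMENTS = {"upper_arm": 0.28, "forearm": 0.25, "thigh": 0.42}

# Marker pairs that bound each segment (hypothetical marker names).
SEGMENT_MARKERS = {"upper_arm": ("shoulder", "elbow"),
                   "forearm":   ("elbow", "wrist"),
                   "thigh":     ("hip", "knee")}

# Marker positions from one calibration (e.g. T-pose) frame of the subject.
calib_markers = {
    "shoulder": np.array([0.0, 1.40, 0.0]),
    "elbow":    np.array([0.0, 1.10, 0.0]),
    "wrist":    np.array([0.0, 0.84, 0.0]),
    "hip":      np.array([0.2, 0.95, 0.0]),
    "knee":     np.array([0.2, 0.50, 0.0]),
}

def segment_scales(markers):
    """Per-segment scale factor: subject segment length / avatar default length."""
    scales = {}
    for seg, (m0, m1) in SEGMENT_MARKERS.items():
        length = float(np.linalg.norm(markers[m0] - markers[m1]))
        scales[seg] = length / AVATAR_SEGMENTS[seg]
    return scales

scales = segment_scales(calib_markers)
```

Scaling from a single calibration frame keeps the per-frame work during the live session limited to joint-angle retargeting, which matters when the goal is stable real-time animation.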

VSR conducts real-time motion capture sessions with a twelve-camera Vicon motion capture system, Vicon IQ 2.0 software, and Virtools 4.

Real-time interactive motion capture with Santos™ results in the accurate and simultaneous reproduction of a human subject's motion. This process allows Santos™ to demonstrate real-life applications, such as maintenance or equipment testing, and to provide real-time feedback to users.

motion-capture-realtime.jpg
Real Time

 

Performance Measure Evaluation 

The inverse kinematics code computes joint angles in Santos's skeletal structure from positional data and provides a means of animating Santos directly from motion capture. Geometries, such as the Caterpillar cab, can be inlaid into Santos's environment based on the position and orientation of tracking markers relative to the subject during the experiment. In simulation, performance measures are minimized or maximized to obtain joint-angle profiles, but obtaining joint angles in the DH space from motion capture data allows for the forward calculation of Santos's performance measures, such as biomechanical discomfort.

The discomfort objective function accounts for the sequential movement of segments and the tendency toward a comfortable posture by normalizing joint terms based on deviation from the neutral position and applying weights to the normalized joint terms. A penalty associates movement of joints near their limits with increased biomechanical discomfort.
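The structure described above, weighted normalized deviation from neutral plus a joint-limit penalty, can be sketched as follows. This is an illustrative form, not the published Santos discomfort function: the quadratic penalty shape and the `penalty_gain` and `margin` parameters are assumptions made for the example.

```python
import numpy as np

def discomfort(q, q_neutral, q_min, q_max, weights,
               penalty_gain=100.0, margin=0.1):
    """Weighted sum of normalized deviations from the neutral posture, plus a
    penalty that grows quadratically as a joint approaches either limit."""
    q = np.asarray(q, float)
    q_neutral = np.asarray(q_neutral, float)
    q_min = np.asarray(q_min, float)
    q_max = np.asarray(q_max, float)
    rng = q_max - q_min
    # Normalized joint terms: deviation from neutral as a fraction of joint range.
    base = np.sum(np.asarray(weights, float) * np.abs(q - q_neutral) / rng)
    # Penalty terms activate only inside a fractional margin of each limit.
    band = margin * rng
    near_upper = np.clip((q - (q_max - band)) / band, 0.0, 1.0)
    near_lower = np.clip(((q_min + band) - q) / band, 0.0, 1.0)
    return base + penalty_gain * np.sum(near_upper ** 2 + near_lower ** 2)

# A joint driven near its limit scores far worse than mid-range motion.
neutral = [0.0, 0.0]
lo, hi, w = [-1.0, -1.0], [1.0, 1.0], [1.0, 1.0]
mid_range = discomfort([0.5, 0.0], neutral, lo, hi, w)
near_limit = discomfort([0.95, 0.0], neutral, lo, hi, w)
```

Because the joint terms are normalized by each joint's range, joints with very different ranges of motion contribute on a comparable scale before the weights are applied.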

As an example, discomfort in an ingress/egress task has been evaluated to demonstrate the enhanced feedback capabilities of the virtual human, and is shown in Figure 2. Phases are indicated by the color scheme shown in Figure 1, where the complex ingress/egress scenario is separated into dynamic tasks with transitions. Discomfort is considered for the upper body and is presented over the time history of the trial. Key frames are indicated to highlight how the performance measure captures the complexities of human motion.

motion-capture-color-scheme.gif
Figure 1: Color scheme for key phases of the ingress/egress tasks.

 

motion-capture-door-open.gif
Figure 2: Discomfort throughout the Door Open task.

 

Discomfort has been normalized to a meaningful scale and reflects the increased biomechanical discomfort experienced in extreme reaching motions (e.g., grabbing the handrails) and in awkward postures (e.g., reaching for the handrails).

The sophisticated feedback mechanisms of the virtual human, Santos, provide valuable insight into the performance measures associated with complicated tasks, and the work done here with discomfort could be expanded to include vision, fatigue, effort, etc. 

Hand

There is significant interest at the 3DBMRL in creating a hand model for Santos that is capable of performing various tasks, such as grasping and moving objects. While the research to build a hand model at VSR is at a mature level, ongoing research is validating the analysis behind the model.

A validation framework has been established in which Santos's simulated data is compared with human subjects' data. An apparatus was designed to experimentally collect data on reaching postures. A home position has been identified to match data between Santos and the motion capture subject, allowing joint angles to be related for a valid comparison.
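One simple way such a comparison can be set up, assuming simulated and measured joint-angle histories resampled to the same frames, is to offset both trajectories so they agree at the shared home posture and then report a per-joint RMSE. The function and data layout here are hypothetical, not the lab's actual validation pipeline.

```python
import numpy as np

def aligned_rmse(sim, exp, home_frame=0):
    """Per-joint RMSE between simulated and measured joint-angle histories
    (arrays of shape frames x joints), after offsetting both so they agree
    at the shared home-position frame."""
    sim = np.asarray(sim, float)
    exp = np.asarray(exp, float)
    sim = sim - sim[home_frame]
    exp = exp - exp[home_frame]
    return np.sqrt(np.mean((sim - exp) ** 2, axis=0))

# Synthetic check: a constant per-joint offset between the two systems
# disappears once both are referenced to the home posture.
t = np.linspace(0.0, 1.0, 50)
sim_angles = np.column_stack([np.sin(t), np.cos(t)])
exp_angles = sim_angles + np.array([0.2, -0.1])
errors = aligned_rmse(sim_angles, exp_angles)
```

Referencing both datasets to the home position removes constant definitional offsets between the two joint-angle conventions, so the RMSE reflects genuine motion differences.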

motion-capture-hand.jpg
Hand 

 

Contact Info

Salam Rahmatalla, Ph.D., Assistant Professor, Civil and Environmental Engineering, Center for Computer-Aided Design (CCAD), The University of Iowa, Iowa City, IA 52242, USA. Tel: 319-335-5614, Fax: 319-384-0542

E-Mail: salam-rahmatalla@uiowa.edu

Related Publications

  1. S. Rahmatalla, H. Kim, M. Shanahan, C.C. Swan, "Effect of Restrictive Clothing on Balance and Gait Using Motion Capture and Dynamic Analysis," Paper #2005-01-2688, SAE 2005 Transactions Journal of Passenger Cars - Electronic and Electrical Systems, March 2006.
  2. S. Rahmatalla, T. Xia, M. Contratto, G. Kopp, D. Wilder, L. Frey-Law, J. Ankrum, "3D Motion Capture Protocol for Seated Operators in Whole Body Vibration," International Journal of Industrial Ergonomics 38, pp. 425-433, 2008.
  3. R.T. Marler, S. Rahmatalla, M. Shanahan, K. Malek, "A New Discomfort Function for Optimization-Based Posture Prediction," SAE Digital Human Modeling for Design and Engineering Conference, Iowa City, Iowa, June 14-16, 2005.