Motion capture technology has been used for over 20 years in games, film, simulation, and entertainment.

One of the motion capture solutions we have used comes through a partnership with the Ball State University Biomechanics Laboratory, which allows us to record a performer’s natural motion to animate a digital character. The research objective of the Biomechanics Laboratory is to further the understanding of the mechanical and neuromuscular characteristics of human movement. The same technology used in research can also be employed for simulation, animation, and visualization. Our lab has employed motion capture in its animation workflow on a host of student and commercial projects, applying motion data from actors or dancers to three-dimensional animated characters.

The Vicon process involves placing reflective markers on performers and tracking the markers as they move in real time. The recorded marker data is applied to a virtual human skeleton in software such as MotionBuilder, and that skeleton in turn drives a 3D character rigged to it, so the character inherits the exact movements of the motion capture artist. We are also experimenting with the Kinect as a low-end but immediate workflow, providing real-time puppeteering of virtual characters as well as gestural input.
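The retargeting step described above can be sketched in miniature: each captured frame supplies a rotation per joint, and the rig inherits the motion by composing those rotations down the joint hierarchy (forward kinematics). The sketch below is illustrative only; the joint names, 2D offsets, and frame format are assumptions, not a real Vicon or MotionBuilder data structure.

```python
import math

# Hypothetical rig: joint -> (parent, local offset from parent in rest pose).
# Parents are listed before children so they are resolved first.
SKELETON = {
    "hips":     (None,       (0.0, 0.0)),
    "spine":    ("hips",     (0.0, 1.0)),
    "shoulder": ("spine",    (0.5, 0.5)),
    "elbow":    ("shoulder", (1.0, 0.0)),
    "wrist":    ("elbow",    (1.0, 0.0)),
}

def pose(frame):
    """Compute world-space joint positions from per-joint rotations (radians)."""
    world = {}  # joint -> (world position, accumulated rotation)
    for joint, (parent, (ox, oy)) in SKELETON.items():
        if parent is None:
            world[joint] = ((ox, oy), frame.get(joint, 0.0))
            continue
        (px, py), prot = world[parent]
        # Rotate the rest-pose offset by the parent's accumulated rotation,
        # then add this joint's own rotation for its children to inherit.
        c, s = math.cos(prot), math.sin(prot)
        position = (px + c * ox - s * oy, py + s * ox + c * oy)
        world[joint] = (position, prot + frame.get(joint, 0.0))
    return {j: p for j, (p, _) in world.items()}

# One "captured" frame: raise the arm 90 degrees at the shoulder.
frame = {"shoulder": math.radians(90)}
positions = pose(frame)
```

Because rotations accumulate down the chain, bending only the shoulder moves the elbow and wrist with it, which is exactly how a rigged character inherits a performer's motion from a sparse set of joint rotations.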

Motion capture using MotionBuilder