Dancer to perform with distant computer-generated character
Pairing a real dancer with an animated dance partner is nothing new – it’s a technique used in any number of movies or television shows. But collaborating artists and engineers at the University of Illinois at Urbana-Champaign are putting a new spin on the idea.
Working with engineers at the Beckman Institute for Advanced Science and Technology at Illinois, visiting Beckman scholar Yu Hasegawa-Johnson is the visionary behind a real-time, high-tech pas de deux performed by dancers located thousands of miles apart. The performance, scheduled for Oct. 29 at the University of Southern California’s Bing Theater in conjunction with the fall meeting of the Internet2 advanced-networking consortium, will feature a live dancer at USC sharing the dance floor with “a fully-articulated avatar.”
The avatar, a computer-generated character that will assume various appearances during the dance – a baby butterfly, a fairy and a robot among them – will represent the movements of a second dancer performing live in Beckman’s Integrated Systems Laboratory. That dancer’s movements will be transmitted in real time over the Internet. The animated images, created by Beckman’s Lance Chong, will be projected onto a semi-transparent screen placed between the dancer and the audience in the theater at USC. The two dancers will see each other’s images and be able to dance in sync.
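The real-time link described above amounts to streaming per-frame motion data from Illinois to the renderer at USC. The sketch below illustrates the idea in miniature; the article does not describe the actual transport, so the UDP socket, JSON packet format, and marker names here are illustrative assumptions, not the production’s real protocol.

```python
# Hedged sketch of streaming one motion-capture frame over the network.
# The actual performance ran over Internet2; the UDP transport, JSON
# packet layout, and marker names below are assumptions for illustration.

import json
import socket

def send_frame(sock, addr, frame_id, markers):
    """Serialize one frame of marker positions and send it as a UDP datagram."""
    packet = json.dumps({"frame": frame_id, "markers": markers}).encode()
    sock.sendto(packet, addr)

# Loopback demo: a receiver standing in for the remote renderer.
recv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
recv.bind(("127.0.0.1", 0))          # let the OS pick a free port
addr = recv.getsockname()

send = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_frame(send, addr, 1, {"wrist_l": [0.3, 1.1, 0.2]})

data, _ = recv.recvfrom(65535)
frame = json.loads(data)
print(frame["frame"], frame["markers"]["wrist_l"])
```

UDP is sketched here because a live performance favors low latency over guaranteed delivery: a dropped frame is quickly superseded by the next one.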
“As far as we know, this is the first time something like this has been attempted,” said Hasegawa-Johnson, the production’s co-producer and art director, and a filmmaker with a passion for tapping into online technologies to create new art forms. For the upcoming production, Hasegawa-Johnson recruited dancers Chih-Chuh Huang and Cho-Ying Tsai and enlisted the technical support of Hank Kaczmarski, director of the Beckman Institute’s Integrated Systems Laboratory. Kaczmarski is co-producer and technical director for the production, which uses motion-capture technology.
“The motion-capture technology that will be used,” he said, “is called ‘multiple-camera optical tracking.’ An array of 10 cameras surrounds the performer, who wears retro-reflective markers to tell the cameras the exact position of up to 500 locations on the performer’s body. Those body locations are mapped onto an avatar, which is then animated by the movement of the performer’s markers.”
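The mapping Kaczmarski describes – tracked marker positions driving the joints of an avatar – can be sketched conceptually as follows. This is a minimal illustration of the retargeting idea, not the ISL’s actual pipeline; the marker names, the joint-to-marker mapping, and the averaging scheme are all assumptions.

```python
# Minimal sketch of optical motion-capture retargeting: marker
# positions drive avatar joints. Marker names, the joint mapping,
# and the averaging are illustrative assumptions.

def retarget(markers, joint_map):
    """Map tracked marker positions (name -> (x, y, z)) onto avatar
    joints by averaging each joint's cluster of markers."""
    pose = {}
    for joint, names in joint_map.items():
        pts = [markers[n] for n in names if n in markers]
        if not pts:
            continue  # all markers occluded: skip this joint this frame
        n = len(pts)
        pose[joint] = tuple(sum(p[i] for p in pts) / n for i in range(3))
    return pose

# Example: three markers around the head drive one avatar joint.
frame = {
    "head_front": (0.0, 1.70, 0.10),
    "head_left": (-0.08, 1.72, 0.0),
    "head_right": (0.08, 1.72, 0.0),
}
joint_map = {"head": ["head_front", "head_left", "head_right"]}
print(retarget(frame, joint_map))
```

Averaging a small cluster of markers per joint also gives some robustness when a single marker is briefly hidden from the cameras, which is one reason optical systems track many more markers than the avatar has joints.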
Kaczmarski said the dance project demonstrates how technology originally intended for scientific purposes can be adapted and used to explore human creativity. “We are taking a valuable tool used in our lab by kinesiologists to study human motion and adapting that tool to serve the arts,” he said. “The ISL provides the environment for non-computer-savvy researchers to conduct research using ultra-state-of-the-art computer-based tools. My role in this project is to ‘tame’ the technology so that it is a servant to the performers, not the other way around.”
The Beckman crew’s production is one of several collaborative works on the Oct. 29 program, created by individuals at Internet2 member institutions nationwide to showcase ways in which networking technologies can be harnessed for artistic exploration.