
Thursday, April 15, 2010

Motion Sensing Technology and Dance

For my final blog post, I wanted to talk about how technology is being used in my field, because it is an area I feel most people know little about. So I am going to tell you how motion sensing technology is used in dance today.

It sounds like fiction, because the two just do not seem to go together: a computer programmer working with a choreographer. Technology and art seem like completely different worlds, but when you put them together, wonderful things start to happen. Together, a programmer and a choreographer created MidiDancer, a wireless movement-sensing outfit that transmits a dancer's positions on stage to a computer. This information is then used to control video, audio, lighting, and the set. It allows an orchestra to be conducted with just a flick of a finger, and you can set sounds to play when certain movements are executed. For example, when a dancer rolls a shoulder or lifts an arm, electronic drums and cymbals can crash and echo. The computer used for Plane, the first piece to use this technology, was also programmed to recognize the dancers' movement phrases and to detect when to begin the visual projection.
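
To give a sense of how this kind of trigger works, here is a minimal sketch in Python. It simply checks incoming sensor readings against thresholds and fires a sound cue when a movement crosses one. The sensor names, thresholds, and play_sound() function are all made up for illustration; they are not MidiDancer's actual interface.

SHOULDER_ROLL_THRESHOLD = 0.7   # normalized sensor reading, 0.0 to 1.0
ARM_LIFT_THRESHOLD = 0.8

def play_sound(cue_name):
    # Stand-in for whatever audio engine the performance uses.
    print("Playing cue:", cue_name)

def process_sensor_frame(frame):
    """Map one frame of wireless sensor readings to audio cues."""
    if frame.get("shoulder_roll", 0.0) > SHOULDER_ROLL_THRESHOLD:
        play_sound("electronic_drums")
    if frame.get("arm_lift", 0.0) > ARM_LIFT_THRESHOLD:
        play_sound("cymbal_crash")

# A few frames arriving from the dancer's suit during a performance.
for frame in [{"shoulder_roll": 0.2}, {"shoulder_roll": 0.9}, {"arm_lift": 0.85}]:
    process_sensor_frame(frame)
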
Another form of motion sensing technology that also uses projection is Isadora. Isadora, named after the pioneer of modern dance in America, provides interactive control over digital media in real time. With Isadora, you can build a series of interactive effects, and those effects can then be manipulated by lighting, music, and other stage cues. A moment in the choreography, such as a deep plié, will trigger a projected image. This image can be projected in real time, but what separates it from MidiDancer is that the real-time projection can be played back slower, faster, or repeated in loops.
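
The difference described here, between simply showing the live image and being able to replay it slower, faster, or in loops, can be pictured with a small frame-buffer sketch. This is only an illustration of the idea, using made-up frame data; it is not how Isadora is actually built.

from collections import deque

class FrameLoop:
    def __init__(self, max_frames=120):
        # Keep only the most recent frames of the live feed.
        self.buffer = deque(maxlen=max_frames)

    def capture(self, frame):
        self.buffer.append(frame)

    def replay(self, speed=1.0, loops=1):
        """Yield buffered frames faster (skipping), slower (repeating), or looped."""
        frames = list(self.buffer)
        for _ in range(loops):
            if speed >= 1.0:
                step = int(round(speed))
                for i in range(0, len(frames), step):
                    yield frames[i]
            else:
                repeats = int(round(1.0 / speed))
                for frame in frames:
                    for _ in range(repeats):
                        yield frame

loop = FrameLoop()
for t in range(10):
    loop.capture("frame_%d" % t)            # pretend these come from a live camera
print(list(loop.replay(speed=2, loops=2)))  # play back at double speed, twice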

The problem with the kind of projection that MidiDancer and Isadora use is that it can take the focus away from the dancers. People tend to watch the screen more than the actual dancer on stage. So new ways of using motion sensing technology in dance needed to be created to better integrate the dancers and the visual effects. Snappy Dance Theater and the Atlanta Ballet have done just that with their use of tracking cameras.

Snappy Dance Theater uses a camera that can track where people are on stage. When a dancer moves, the projection is a series of glowing strings that form the shape of the body, so the focus is not on a projected image of the dancer but on the shapes the dancer is making. The Atlanta Ballet has also used tracking cameras: its dancers have worn infrared emitters, which are invisible to the audience but detected by special tracking cameras. The locations were fed through a graphics computer in real time, the animation was sent to a high-brightness video projector, and animated particle trails of the dancers' hand movements were projected onto a sheet of see-through mesh. This is an even better way of integrating the technology with the dancers, because it doesn't ask the audience to watch a giant screen. Instead, the focus stays on the dancers and what they are doing.
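
The particle-trail effect can be imagined with a toy sketch like the one below: each tracked hand position adds a bright particle, and older particles fade, leaving a glowing trail behind the movement. The positions and fade rate here are invented for illustration; the actual production ran on a graphics computer feeding a high-brightness projector.

FADE_PER_FRAME = 0.2  # how much brightness each particle loses per new frame

particles = []  # each particle is (x, y, brightness)

def update_trail(hand_position):
    """Fade the existing particles and add a new one at the tracked position."""
    global particles
    particles = [(x, y, b - FADE_PER_FRAME)
                 for (x, y, b) in particles if b - FADE_PER_FRAME > 0]
    particles.append((hand_position[0], hand_position[1], 1.0))
    return particles

# Simulated hand positions from the infrared tracking cameras.
for pos in [(0.1, 0.5), (0.2, 0.55), (0.3, 0.6), (0.4, 0.62)]:
    trail = update_trail(pos)
print(trail)  # the newest point is brightest; earlier ones have faded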

The Interactive Media Technology Center has done many dance technology projects. One used a supercomputer connected over fiber-optic telecommunications and featured motion-tracked balls that were tossed around, causing tumbling 3D objects such as an elephant, a house, and a space shuttle to appear in their place. A dancer wearing a motion tracking system danced on stage next to her cyber re-embodiment.


Motion sensing technology has also been used in dance off the stage. Influential modern dancer and choreographer Merce Cunningham, who passed away this past summer at the age of 90, worked with Credo Interactive to help develop software called LifeForms. LifeForms Studio is a 3D character animation and motion capture editing tool used by professional animators, game developers, and film and broadcast specialists, and Cunningham used it to choreograph. It is now known as DanceForms, a choreography tool designed with dance educators, students, choreographers, and notators in mind. You start with a digital dancer, which can appear as a series of circles, a skeletal figure, or a human one, and then move the figure using commands. It lets you sketch out your choreographic ideas, mix, match, and blend sequences, use the existing libraries and palettes, animate single figures or large groups, and bring your dance ideas to 3D life.
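
The "mix, match, and blend" idea can be sketched very simply: if a pose is just a set of joint angles, two poses can be blended by interpolating between them. The joint names and angles below are invented for illustration; this is not DanceForms' actual data format.

def blend_poses(pose_a, pose_b, weight):
    """Linearly interpolate joint angles; weight=0 gives pose_a, weight=1 gives pose_b."""
    return {joint: (1 - weight) * pose_a[joint] + weight * pose_b[joint]
            for joint in pose_a}

arabesque = {"hip": 90.0, "knee": 180.0, "arm": 45.0}   # degrees, made up for the example
plie      = {"hip": 45.0, "knee": 120.0, "arm": 30.0}

# Sketch a short transition phrase from one pose to the other.
for step in range(5):
    print(blend_poses(arabesque, plie, step / 4))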

Wednesday, March 10, 2010

Human Computer Interaction and the Future

Is technology becoming so integrated with us that it will eventually cause major issues in how we live everyday life? For instance, take a look at the cartoon below:
This cartoon may seem funny, but is this really where life is heading? Computers and humans are growing closer as the relationship between users and computers becomes stronger. The goal of Human Computer Interaction is to improve this interaction by making computers more usable and more receptive to the user's needs. Just think about what we have seen in the past twenty years. Computers used to need a separate mouse, keyboard, speakers, and so on; now everything is combined into one. We are seeing the popularity of touch screens in the cell phone market, and pretty soon computers are all going to be touch screens as well so that we can interact with them more easily. But how far will this go? As we witnessed in class on Tuesday (March 9th), there is technology out there that relies on motion rather than touching a screen. We have already seen this with the Nintendo Wii (although it requires a controller), but Project Natal for Xbox is taking motion sensing for video games to another level without the use of controllers. Here is a video of what that might be like: http://www.youtube.com/watch?v=oACt9R9z37U

So what does this mean for our personal computers, and how is it going to affect the future of technology? Remember, the goal is to make the interaction between humans and computers easier, so will we soon live in a world that requires nothing more than motion sensing and tracking devices? We saw a video in class of a person who looked like he was shoplifting, but actually everything was paid for automatically when he walked out the door. A world like this would make everything easier, and it would be amazing to live in. However, what happens if the technology fails, if systems crash, or if we lose power? Is a technologically based world going to come to a standstill every time that happens?


As you can see in the picture above, there are a lot of components that make up Human Computer Interaction and work to make our lives easier. As we develop a deeper understanding of each of these areas, technology is only going to get stronger and become more integrated with each of them. Just think about how technology relates to one area, such as engineering. How might it improve everyday life there? How about linguistics? Will technology allow us to communicate quickly and efficiently with people who speak other languages across the world? We have already seen what technology is doing in areas such as neuroscience and artificial intelligence, as well as plans for future technology in these areas (during class discussions and videos). Imagine what our world will be like once all of these areas are improved and enhanced through advances in technology.