
Experiments in Motion Capture for Gaming and Urban Sensing


NEWS: Winter Show demos this Thursday (Dec 15) at 5pm; see ClassProjects


Participants at Ars Electronica experimenting on September 3rd, 2010, with the NYU Movement Lab's latest motion capture system.


Class meets every Wednesday from 5pm-6:50pm in WWH 101.

Office Hours (Chris Bregler) in his 12th floor office, 719 Broadway: Thursday 3:00pm-4:00pm or by appointment

Office Hours (Mayank Rana, TA) at 1227-Vision Lab, 12th Floor, 719 Broadway: Tuesday: 2pm-4pm

Class Code: G22.3033-009

This is a project-based class on motion capture and vision-based input technologies applied to gaming and urban sensing.

What is motion capture?

http://en.wikipedia.org/wiki/Motion_capture
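Optical motion capture typically reconstructs a marker's 3D position by triangulating its 2D pixel coordinates across several calibrated cameras. A minimal sketch of the core idea, using a simplified rectified two-camera setup (the focal length, baseline, and pixel coordinates below are made-up illustration values, not from any real capture system):

```python
# Depth from disparity in a rectified stereo pair: two cameras with
# parallel optical axes separated by a known baseline. A marker seen at
# horizontal pixel positions xL and xR has depth Z = f * B / (xL - xR),
# where f is the focal length in pixels and B the baseline in meters.
# All numbers here are illustrative assumptions.

def triangulate(x_left, x_right, focal_px, baseline_m):
    """Return (X, Z): the marker's horizontal offset and depth in meters."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("marker must appear further left in the left image")
    z = focal_px * baseline_m / disparity
    x = x_left * z / focal_px  # back-project the left-image x to world units
    return x, z

# A marker seen at x=400 px (left camera) and x=380 px (right camera),
# with an 800 px focal length and a 0.5 m baseline:
x, z = triangulate(400, 380, focal_px=800, baseline_m=0.5)
print(z)  # depth = 800 * 0.5 / 20 = 20.0 m
print(x)  # X = 400 * 20 / 800 = 10.0 m
```

Real mocap systems solve the general multi-camera version of this with calibrated, non-parallel cameras, but the principle is the same: more views of the same marker pin down its 3D position.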

But we are also interested in experimenting with computer vision toolkits such as OpenCV: http://opencv.willowgarage.com/wiki/
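One of the simplest vision-based input techniques such libraries support is motion detection by frame differencing: subtract consecutive video frames and flag the pixels that changed. A minimal pure-Python sketch of that idea on tiny synthetic grayscale frames (with real video, OpenCV's absolute-difference and thresholding routines do the same thing much faster; the frames and thresholds below are illustrative assumptions):

```python
# Frame differencing: mark pixels whose brightness changed by more than a
# threshold between two consecutive grayscale frames. Frames here are tiny
# synthetic 4x4 grids (lists of lists of 0-255 values) for illustration.

def diff_mask(prev_frame, curr_frame, threshold=30):
    """Return a binary mask: 1 where a pixel changed, 0 where it did not."""
    return [
        [1 if abs(c - p) > threshold else 0 for p, c in zip(prow, crow)]
        for prow, crow in zip(prev_frame, curr_frame)
    ]

def motion_detected(mask, min_pixels=2):
    """Declare motion if enough pixels changed (crude noise rejection)."""
    return sum(sum(row) for row in mask) >= min_pixels

frame1 = [[10] * 4 for _ in range(4)]      # static background
frame2 = [row[:] for row in frame1]
frame2[1][1] = frame2[1][2] = 200          # a bright "object" appears

mask = diff_mask(frame1, frame2)
print(motion_detected(mask))  # True: two pixels changed by more than 30
```

Everything beyond this (tracking blobs over time, recognizing gestures, rejecting lighting changes) builds on top of such a changed-pixel mask.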

What is motion-capture-based gaming?

More generally, motion-capture-based or vision-based computer input has been a fruitful research agenda in many labs for some time now. It first went prime time on the consumer market with the Sony EyeToy, then with the Nintendo Wii, and this fall with Microsoft Kinect: http://www.xbox.com:80/en-US/kinect

Our research team is also currently at Ars Electronica, experimenting with Wii/Kinect-style interaction for the masses using our latest motion capture system: http://crowd2cloud.org

What is urban sensing?

We are not just interested in gaming; we are also interested in bringing motion capture into every aspect of society and everyday life. Urban sensing is a hot new research area that empowers entire communities to participate in and influence environmental monitoring, social studies, and more. For concrete examples, check out one of the leading centers, at UCLA: http://urban.cens.ucla.edu/

As part of this class we would like to come up with new ideas and technologies that let us discover NYC and its vibrant street dynamics through sensing technologies. Part of this also fits into a new NYU/CUNY/IBM project called "Smarter City" that is part of Mayor Bloomberg's PlaNYC 2030. We are also very interested in visualizing urban sensing with the Google Earth and Google Street View APIs, and on mobile platforms like the iPhone, iPad, and other mobile devices.


Instructor:

Chris Bregler will be teaching most of this class.

If you have more questions about this class, feel free to email Chris at chris.bregler@nyu.edu

We will also have guest lectures and class participants giving presentations in this class.


Course Mechanics:

This class is very informal. The main goal is to expose students to these exciting areas of cutting-edge research and let them explore how to push the envelope of what is possible with current technology.

We will cover basic motion capture technologies and principles each week. This is a very hands-on class: be prepared to wear the motion capture suit in the studio, mount IP cams all over NYC, and run all kinds of innovative experiments.

Class participants are also expected to give a few lectures on topics of their interest. The main part of the class will be the execution of a group project. There will be no final and no midterm, just milestone presentations of project progress and a final project demo.


Who should take this class?

Everybody who is interested in motion capture and its many applications. You should be able to program in a language like C, Matlab, or even Processing (processing.org), and you should be able to pick up new concepts quickly.


Chris has been teaching variants of this class over the past 10 years at Stanford and NYU. Every year we choose a different focus. For instance, in 2008 the focus was animation and art, and the class was co-taught at NYU Tisch: http://movement.nyu.edu/mocap08f/. Another version is here: http://cims.nyu.edu/~bregler/classes/mocap_spring06/.

Page last modified on December 12, 2011, at 11:04 AM