SBIR Phase II: Plug and Play Characters for 3D Virtual Environments

Award Information
Agency: National Science Foundation
Branch: N/A
Contract: 1127499
Agency Tracking Number: 1127499
Amount: $500,000.00
Phase: Phase II
Program: SBIR
Solicitation Topic Code: Phase II
Solicitation Number: N/A
Timeline
Solicitation Year: 2011
Award Year: 2011
Award Start Date (Proposal Award Date): 2011-09-01
Award End Date (Contract End Date): 2013-08-31
Small Business Information
146 Montelena Ct.
Mountain View, CA 94040-1088
United States
DUNS: 832775675
HUBZone Owned: No
Woman Owned: No
Socially and Economically Disadvantaged: No
Principal Investigator
 Okan Arikan
 (510) 730-1212
 okarikan@gmail.com
Business Contact
 Okan Arikan
Title: PhD
Phone: (510) 730-1212
Email: okarikan@gmail.com
Research Institution
 Stub
Abstract

This Small Business Innovation Research (SBIR) Phase II project will complete the development of reusable, self-encapsulated animated characters for use in 3D virtual environments. 3D characters are difficult to develop because of the inflexibility of current motion representations. Presently, animations are compiled into characters using a static data structure, which makes adding new animations an offline, time-consuming process. Using extensive motion annotation, our technology allows applications to link animations together at run-time. The end-product objective is a network of 3D mobile applications that run on multitouch-enabled devices such as smartphones. The technology enables 1) transfer of characters within a growing network of applications (i.e., 'plug and play' characters), 2) user selection of the animations to use in each application, and 3) character control through a novel multitouch-based interface. The intellectual merits involve the creation of a character authoring and control interface, and the analysis of alternative, flexible representations of character animation at the semantic level.

This project will have broader impact in three areas. First, multitouch is a new interaction paradigm that will become ubiquitous through the proliferation of smartphones and tablets. This project will investigate multitouch schemes for intuitive control of complex, articulated models such as 3D humanoid figures. Second, this project will advance our understanding of semantic categories for human motion. Such labels are important for motion synthesis and motion recognition. Third, this project will develop methods for building virtual environments incrementally. Virtual environments are used widely in entertainment, training simulations, virtual worlds, and other 3D applications. The company will develop technology for adding new assets in a scalable manner.
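To make the contrast with a static, compiled-in clip table concrete, the sketch below illustrates the general idea of semantically annotated animation clips that a character can accept and resolve at run-time. This is a minimal, hypothetical illustration only; the class and method names are assumptions for exposition and do not reflect the awardee's actual implementation or API.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AnimationClip:
    name: str
    tags: frozenset        # semantic annotations, e.g. {"walk", "fast"}
    duration: float        # length in seconds

@dataclass
class Character:
    name: str
    clips: list = field(default_factory=list)

    def add_clip(self, clip: AnimationClip) -> None:
        # Register a new clip while the application is running;
        # no offline recompilation of the character is required.
        self.clips.append(clip)

    def find_clip(self, *wanted_tags: str) -> Optional[AnimationClip]:
        # Resolve an animation by its semantic labels rather than a
        # fixed index baked into a static data structure.
        wanted = set(wanted_tags)
        for clip in self.clips:
            if wanted <= clip.tags:
                return clip
        return None

# Usage: an application links a clip to a character after both are loaded.
hero = Character("hero")
hero.add_clip(AnimationClip("walk_fast", frozenset({"walk", "fast"}), 1.2))
clip = hero.find_clip("walk")
print(clip.name if clip else "no matching animation")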

* Information listed above is at the time of submission. *
