Playtime Computing: an environment to support Blended Reality
The Playtime Computing System is a technological platform that supports a blended reality interactive and collaborative media experience, treating the on-screen and real-world environments as one continuous space. On-screen audiovisual media (e.g., virtual environments and characters that portray the story world) extend into the physical environment using digital projectors, robotics, real-time behavior capture, and tangible interfaces. Player behavior is tracked using 3D motion capture as well as other sensors such as cameras and audio inputs. Physical objects can be instrumented or tracked so that they serve as additional tangible interfaces that affect the behavior of characters and objects both on screen and off. A digital paint interface is under development to let players add digital assets to the story world, both on-screen and in the projected real-world space. These digital assets can be used to add interactivity, to author the world and its characters, or simply to embellish it aesthetically.
Characters in this system can transition seamlessly between the physical world and the virtual on-screen world through a physical enclosure that metaphorically acts as a portal between the real and the virtual. Any events or changes that happen to the physical character in the real world carry over to the virtual world, and digital assets can likewise transition from the virtual world to the physical. These blended reality characters can either be programmed to behave autonomously, or their behavior can be controlled by players.
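The portal handoff described above can be sketched in miniature: when the character enters the enclosure, its current state is released by one embodiment (e.g., the robot) and adopted by the other (the on-screen avatar), so changes made in the real world carry over. This is a hypothetical illustration, not the system's actual code; all class and field names (`CharacterState`, `Embodiment`, `portal_transition`) are assumptions.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a blended reality character's portal handoff.
# The state is whatever must persist across the transition: mood, assets
# attached by players, and so on.

@dataclass
class CharacterState:
    name: str
    mood: str                       # e.g. set by player interaction
    accessories: list = field(default_factory=list)

class Embodiment:
    """One side of a blended reality character: robot or on-screen avatar."""
    def __init__(self):
        self.state = None
        self.active = False

    def adopt(self, state: CharacterState) -> None:
        self.state = state
        self.active = True

    def release(self) -> CharacterState:
        self.active = False
        return self.state

def portal_transition(src: Embodiment, dst: Embodiment) -> None:
    # Hand the character's state from one embodiment to the other,
    # so exactly one embodiment is active at a time.
    dst.adopt(src.release())

# Example: a robot wearing a player-given hat enters the portal, and its
# avatar appears on screen carrying the same state.
robot, avatar = Embodiment(), Embodiment()
robot.adopt(CharacterState("Nebby", mood="happy", accessories=["hat"]))
portal_transition(robot, avatar)
```

The single-active-embodiment invariant is what makes the character read as one continuous being rather than two copies.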
The Playtime Computing System supports co-creation of the blended reality story world through Creation Station workbenches and experiences that occur in the blended reality play space. The Creation Stations can either be co-located with or remote from the blended reality play space. A live audio and video feed (or asynchronous image capture) from the play space is transmitted into the Creation Station workspace (via display or projection). Simultaneously, a camera captures and streams video/images and audio from the Creation Station workspace into the play space. In this way, local and remote players interact with each other and with physical and digital assets (either asynchronously or in real time) at different spatial scales: room-scale in the play space, and desktop-scale at the Creation Station. Players can also create and share object-based media and digital assets between the play space and the Creation Station workspace. Multiple Creation Stations can be connected to a single play space, and multiple blended reality play spaces can be linked to create a “multi-chamber” play space that is geographically distributed.
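The many-to-many linking described above can be sketched as a hub that fans media events out from any connected endpoint to all the others, whether those endpoints are Creation Stations or play spaces. This is a minimal illustration under assumed names (`Endpoint`, `BlendedRealityHub`), not the system's actual networking code.

```python
# Hypothetical sketch: a hub relays media events (frames, audio, shared
# digital assets) from any endpoint to every other connected endpoint.

class Endpoint:
    """A play space or Creation Station connected to the hub."""
    def __init__(self, name: str):
        self.name = name
        self.inbox = []   # events received from other endpoints

    def receive(self, event: dict) -> None:
        self.inbox.append(event)

class BlendedRealityHub:
    def __init__(self):
        self.endpoints = []

    def connect(self, endpoint: Endpoint) -> None:
        self.endpoints.append(endpoint)

    def publish(self, sender: Endpoint, event: dict) -> None:
        # Fan out to every endpoint except the sender, so a shared asset
        # appears everywhere else in the distributed play space.
        for ep in self.endpoints:
            if ep is not sender:
                ep.receive(event)

hub = BlendedRealityHub()
play_space = Endpoint("play-space")
station_a = Endpoint("creation-station-a")
station_b = Endpoint("creation-station-b")
for ep in (play_space, station_a, station_b):
    hub.connect(ep)

# A player at one Creation Station shares a digital asset; the play space
# and the other station both receive it.
hub.publish(station_a, {"type": "asset", "payload": "painted-tree"})
```

The same fan-out pattern covers the "multi-chamber" case: linked play spaces are simply additional endpoints on the hub.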
The Playtime Computing space was developed to be accessible and educational for children. Its design elements elicit imaginative play through interactions with both physical and virtual elements, a design principle we call Blended Reality. The space has been used in the Fonduephone project to teach children second languages by incorporating virtual play elements into interactions with a physical robot, Dragonbot. It has also been used to study children's perceptions of character continuity in mixed-media interactions, as well as to play physical and virtual games with robots and virtual agents through the Tofulandia project.
- D. Robert, C. Breazeal (2012) Blended reality characters. In Proceedings of 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI ’12), pp. 359-366.
- D. Robert, R. Wistort, J. Gray, C. Breazeal (2011) Exploring mixed reality robot gaming. Tangible and Embedded Interaction 2011, pp. 125-128.
- D. Robert. (2011) Imaginative Play with Blended Reality Characters. S. M. Media Arts and Sciences, MIT.