We will develop methods and a pilot system for context-dependent search and presentation of information by means of augmented reality. Information associated with physical objects and situations can be accessed and then aligned with the real environment for visual and auditory display. The user's context and foci of interest are measured with wearable cameras and eye tracking. Novel statistical machine learning methods are used for multimodal information retrieval and for taking the context into account.
The main applications considered in this project are urban planning, design, and construction; the study and enhancement of social interaction; and a personal assistant that supports information retrieval, media access, and memory.
For more information, see the main project page.
Last updated on 9 Jun 2008 by Antti Ajanki - Page created on 20 May 2008 by Antti Ajanki