OL21 Mixed Reality modeling component for nudging AI+AR glasses
The latest developments in HMD-based Augmented Reality and context-aware systems point towards a potential future where people could wear nudging AI+AR glasses that almost invisibly help them perform both professional and everyday tasks. Apart from ethical and privacy challenges (do we really want our perception to be invisibly manipulated in this way?), a number of advanced system functionalities would need to be engineered. Core mechanisms include manipulation of how the individual perceives the surrounding environment (subtle visual attention guidance), driven by a story generation process which, on the fly and using available real-world and digital objects, gradually leads users of the device towards a better understanding of a phenomenon and/or simply completion of the task at hand. An even more fundamental mechanism for these envisioned AI+AR nudging glasses would be a system component which constructs and maintains a model of the user's immediate surroundings, e.g. determining which physical/digital objects are present in front of the user, which of these are currently attended to, etc.
The aim of this project is to develop a first version of such a component running on a State-of-the-Art Augmented Reality headset.
Iterative/agile development of a prototype which is able to identify and track the location of a given small set of real-world objects, in relation to the user's body as well as to digital objects placed in the environment, using suitable available APIs and software libraries. The outcome should be a continuously updated model of the situation the user is in, which can be queried by other components of the AI+AR nudging glasses system, such as the story generation component developed in a different student project.
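As a rough illustration of what such a queryable situation model could look like, the sketch below keeps a registry of tracked real-world and digital objects with positions in a body-centred coordinate frame and an attention flag, and exposes simple queries for other components. All names, the coordinate convention, and the flat dictionary store are hypothetical design assumptions, not part of the project specification.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Dict, List, Tuple
import time

class ObjectKind(Enum):
    REAL = "real"       # physical object recognised in the environment
    DIGITAL = "digital" # virtual object placed by the AR system

@dataclass
class TrackedObject:
    object_id: str
    kind: ObjectKind
    position: Tuple[float, float, float]  # (x, y, z) in metres, body-centred frame
    attended: bool = False                # is the user currently attending to it?
    last_seen: float = field(default_factory=time.monotonic)

class SituationModel:
    """Continuously updated model of the user's immediate surroundings."""

    def __init__(self) -> None:
        self._objects: Dict[str, TrackedObject] = {}

    def update(self, object_id: str, kind: ObjectKind,
               position: Tuple[float, float, float],
               attended: bool = False) -> None:
        """Insert or refresh a tracked object (called by the tracking loop)."""
        self._objects[object_id] = TrackedObject(object_id, kind, position, attended)

    def attended_objects(self) -> List[TrackedObject]:
        """Objects the user is currently attending to."""
        return [o for o in self._objects.values() if o.attended]

    def objects_within(self, radius: float) -> List[TrackedObject]:
        """Objects within a given distance (metres) of the user's body."""
        return [o for o in self._objects.values()
                if sum(c * c for c in o.position) ** 0.5 <= radius]
```

A client component (e.g. the story generator) could then ask questions such as `model.attended_objects()` or `model.objects_within(1.0)` without knowing anything about the underlying tracking APIs.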
The developed prototype will be evaluated through user testing in which both qualitative User Experience parameters and more technical, quantitative aspects (time lag, modeling errors) will be measured.
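The two quantitative measures mentioned above could, for instance, be computed as follows; this is only one plausible operationalisation (mean Euclidean position error and mean event-to-detection delay), and the function names and data shapes are assumptions made for illustration.

```python
import statistics

def modeling_error(tracked, ground_truth):
    """Mean Euclidean distance (metres) between tracked object positions
    and manually measured ground-truth positions, pairwise."""
    dists = [sum((a - b) ** 2 for a, b in zip(t, g)) ** 0.5
             for t, g in zip(tracked, ground_truth)]
    return statistics.mean(dists)

def time_lag(event_times, detection_times):
    """Mean delay (seconds) between a real-world event (e.g. an object
    appearing) and the moment the situation model reflects it."""
    return statistics.mean(d - e for d, e in zip(detection_times, event_times))
```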
The testing scenario will be one in which users are to learn about climate change and the complex relationships between human behaviour and its effects on global warming.
Pederson, T., Witzner Hansen, D., Mardanbegi, D. (2011). Investigations of the Role of Gaze in Mixed-Reality Personal Computing. Short paper and poster at the ACM International Conference on Intelligent User Interfaces, IUI 2011.
Parts of the prototype development activities require access to our most advanced Mixed Reality headset (Varjo XR-1) and need to be performed in the Open Lab (J337). Other parts can be developed on less advanced, more mobile headsets.