Open Lab

OL22 Immersive Climate Data Visualization in VR and AI-Enhanced Nudging

Aim/goal/research question

In the developing field of AI-enhanced Augmented Reality (AR) and Virtual Reality (VR), the concept of "nudging" (subtle guidance intended to influence human cognition and behavior) presents both promise and challenge. Can these methods help us understand and manage complex issues, improving lives or helping us address societal problems? And do we really want our perception to be invisibly manipulated in this way? Investigating such questions properly requires a range of advanced functionalities: systems that subtly guide how individuals perceive their environment, and techniques that generate and steer evolving narratives so that users can build a better understanding of complex subjects in real time. This project investigates these dynamics in the context of immersive VR-based climate data visualization. The objective is to develop a VR component that uses AI-driven nudging to help users navigate and interpret complex information. A central focus is the creation of narratives within the data: as users move through the visualization, the system will direct their attention to key elements, forming narratives that enhance understanding. In exploring these issues, the project aims to contribute to ongoing research on AI-enhanced nudging techniques for educational and decision-making applications in AR/VR environments.

Method

An iterative/agile process will be used to develop a prototype that presents complex climate-science information within an immersive VR environment. The immersive visualization will leverage spatial and embodied interaction to provide a more intuitive and engaging user experience. The prototype will incorporate not only data points but also explanations and illustrations to support a comprehensive understanding of the subject. AI-driven nudging techniques will be explored as a way to guide users through this multifaceted, immersive information landscape. Evaluation will focus on qualitative measures of user experience, such as ease of use and how effectively the system aids user understanding. Technical performance metrics will also be collected to identify performance or implementation issues, highlighting both promising approaches and areas for further research. In the testing scenario, users will navigate a VR environment that presents information about climate science, human behavior, and their impact on global warming; the aim is to explore how AI-driven nudging can enhance user understanding of these complex interrelationships.
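To make the nudging idea concrete, the selection of the next data point to highlight could be sketched as a scoring heuristic: each unvisited point is scored by its AI-assigned narrative relevance, weighted by how closely it aligns with the user's current gaze direction. This is a minimal, hypothetical C++ sketch only; a real implementation in Unreal Engine would use FVector and headset gaze data, and the `DataPoint` fields, weighting scheme, and function names are illustrative assumptions, not part of the project specification.

```cpp
#include <array>
#include <cmath>
#include <string>
#include <vector>

// Hypothetical sketch of a nudge-target selector. Plain arrays stand in for
// Unreal's FVector so the logic is self-contained and testable.

struct DataPoint {
    std::string label;
    std::array<double, 3> position;  // world-space position in the VR scene
    double relevance;                // assumed AI-assigned importance, 0..1
    bool visited = false;            // already part of the user's narrative?
};

static double Dot(const std::array<double, 3>& a,
                  const std::array<double, 3>& b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

static std::array<double, 3> Normalized(std::array<double, 3> v) {
    double len = std::sqrt(Dot(v, v));
    for (double& c : v) c /= len;
    return v;
}

// Score each unvisited point by narrative relevance blended with how close
// it lies to the user's gaze direction; return the best index, or -1 when
// every point has already been visited. gazeDir is assumed normalized.
int NextNudgeTarget(const std::vector<DataPoint>& points,
                    const std::array<double, 3>& eyePos,
                    const std::array<double, 3>& gazeDir,
                    double gazeWeight = 0.5) {
    int best = -1;
    double bestScore = -1.0;
    for (int i = 0; i < static_cast<int>(points.size()); ++i) {
        if (points[i].visited) continue;
        std::array<double, 3> toPoint = {
            points[i].position[0] - eyePos[0],
            points[i].position[1] - eyePos[1],
            points[i].position[2] - eyePos[2]};
        // Cosine similarity in [-1, 1], remapped to [0, 1].
        double alignment = (Dot(Normalized(toPoint), gazeDir) + 1.0) / 2.0;
        double score = points[i].relevance * (1.0 - gazeWeight)
                     + alignment * gazeWeight;
        if (score > bestScore) {
            bestScore = score;
            best = i;
        }
    }
    return best;
}
```

Blending relevance with gaze alignment keeps nudges subtle: the system prefers important points that are already near the user's line of sight, rather than yanking attention across the scene. The `gazeWeight` parameter is one obvious knob to vary during the planned user-experience evaluation.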

Recommended past experience/interest

Skills in Unreal Engine, C++, data visualization, computer graphics, and human-computer interaction, as well as a basic understanding of VR/MR technologies, would be beneficial for participating in this project.

Related Work

  • Pederson, T., Janlert, L.-E., & Surie, D. (2011). Setting the Stage for Mobile Mixed-Reality Computing – A Situative Space Model based on Human Perception. IEEE Pervasive Computing, 10(4), 73–83. DOI: 10.1109/MPRV.2010.51
  • Jalaliniya, S., & Mardanbegi, D. (2016). EyeGrip: Detecting Targets in a Series of Uni-directional Moving Objects Using Optokinetic Nystagmus Eye Movements. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (CHI '16). Association for Computing Machinery, New York, NY, USA, 5801–5811. https://doi.org/10.1145/2858036.2858584

Other comments

The project is part of the larger Nudging AI+AR glasses project (https://openlab.hv.se/project/nudging-aiar-glasses/) run by the Open Lab.

Resource limitations

The project will require access to high-end VR headsets for development and testing. Most of the work will be conducted in the Open Lab to ensure access to the necessary hardware and software resources.

Contact