Understanding Immersive Environments for Visual Data Analysis
Overview
This dissertation explores how Augmented Reality (AR) can be leveraged for visual data analysis, with a focus on human-centered design. This includes the influence that a multitude of parameters, spanning users, environmental factors, and integration with existing systems, has on the human user of current and future AR systems. The research addresses the growing need to analyze complex data in immersive environments, utilizing AR’s potential for 3D visualization, natural interaction, and spatial integration.
Research Questions
Since the number of possible factors that can influence the design of future AR systems is vast, in my thesis I investigate exemplary factors across three different groups. The corresponding research questions are:
- RQ1: What factors should be considered for designing AR applications in a human-centered way?
- RQ2: What influence does the environment have on AR applications?
- RQ3: How can AR applications be combined with existing systems and devices?
Human Ergonomics and my Research
To answer these research questions, I explore how human properties and real-world environments affect the augmentations placed within those environments. For human factors, I investigate the competence in working with and understanding visualizations (i.e., visualization literacy), the visual perception of visualizations, and physical ergonomics such as head movement. Regarding the environment, I examine two main factors: the influence of the visual background on reading and working with immersive visualizations, and the possibility of using alternative placement areas in Augmented Reality. Lastly, to explore future Augmented Reality systems, I designed and implemented Hybrid User Interfaces and authoring tools for immersive environments.
Research Projects
Moderating Effect of Visualization Literacy and Visual Adaptation
This chapter describes a research project focused on understanding how user competence (i.e., visualization literacy) interacts with different visualization styles applied to scatter plots and bar charts. As the competence to work with and understand visualizations is independent of the device on which they are presented, I decided not to use AR for this project. This allowed a first look at the fundamental issues caused by the increasing number of visualization users and visualization types. This chapter presents an empirical online study and the evaluation of the collected data via linear mixed models.
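The analysis approach described above can be sketched as follows. This is an illustrative example only: the data, column names, and effect sizes are invented here and are not the study's actual material. It shows how a linear mixed model with a per-participant random intercept could test whether literacy moderates the effect of visualization style:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic repeated-measures data (hypothetical, for illustration):
# each participant completes trials in a plain and a styled condition,
# and has one visualization-literacy score.
rng = np.random.default_rng(0)
rows = []
for p in range(12):
    participant_offset = rng.normal(0, 0.5)   # random intercept per person
    literacy = rng.uniform(0, 1)              # hypothetical literacy score
    for t in range(8):
        style = t % 2                         # 0 = plain, 1 = styled
        rt = (4.0 + participant_offset + 0.6 * style
              - 1.2 * literacy * style       # literacy moderates the style effect
              + rng.normal(0, 0.3))
        rows.append({"participant": f"p{p}", "literacy": literacy,
                     "style": style, "rt": rt})
data = pd.DataFrame(rows)

# Linear mixed model: fixed effects for style, literacy, and their
# interaction; random intercept grouped by participant.
model = smf.mixedlm("rt ~ style * literacy", data, groups=data["participant"])
result = model.fit()
print(result.summary())
```

The `style:literacy` coefficient captures the moderating effect: a negative estimate would indicate that the style effect shrinks as literacy increases.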
Influence of Real-World Backgrounds on the Perception of AR Visualizations
Next, a research project is presented that centers around one of the basic features of AR devices: the integration of virtual visual content into real-world scenes. Of particular interest is the potential influence of the real-world background as an environmental parameter, including its visual features, e.g., luminance or color. Two studies were conducted: the first focused on the influence of the visual background combined with the visual complexity of the visualization, while the second focused on the impact of the visual background in a split-focus task design. Both studies were evaluated through various statistical tests.
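To make the luminance parameter concrete, the following sketch (not taken from the studies themselves) shows how the relative luminance of a background color and its contrast against AR content could be quantified, using the standard ITU-R BT.709 luma coefficients and the WCAG contrast-ratio formula:

```python
def relative_luminance(r, g, b):
    """Relative luminance of a linear RGB color in [0, 1],
    using ITU-R BT.709 coefficients."""
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(l1, l2):
    """WCAG contrast ratio between two relative luminances (1:1 to 21:1)."""
    lighter, darker = max(l1, l2), min(l1, l2)
    return (lighter + 0.05) / (darker + 0.05)

# A bright real-world background behind dark AR content yields high contrast;
# two mid-gray surfaces yield much less.
white = relative_luminance(1.0, 1.0, 1.0)   # -> 1.0
black = relative_luminance(0.0, 0.0, 0.0)   # -> 0.0
print(contrast_ratio(white, black))          # -> 21.0
```

Such a measure could serve as one way to operationalize "background luminance" when comparing conditions across real-world scenes.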
Understanding AR Content Placement on Ceiling and Floor
This chapter investigates alternative placement areas for virtual AR content beyond eye level, i.e., on the ceiling and the floor. By exploring these placement areas as environmental parameters, it is possible to reduce the amount of content in the user’s field of view (FoV) by placing it on the available and accessible spaces above and below the FoV. This chapter presents a small survey on the use of the ceiling and floor in the current literature, a definition of the placement areas, and the physical ergonomic parameters for placing content in either area. Subsequently, two studies were conducted: one explored how the ceiling and floor can be used in future AR applications, while the other defined optimal and preferred placement parameters for content in both areas. Lastly, a list of design recommendations on how both areas should be used in future AR systems is presented.
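One basic ergonomic parameter for ceiling and floor placement is the pitch angle the head or eyes must adopt to view the content. The following sketch is purely illustrative (the geometry and example values are assumptions, not the dissertation's model) and computes that angle from eye height, content height, and horizontal distance:

```python
import math

def pitch_angle_deg(eye_height, content_height, distance):
    """Signed vertical viewing angle from the horizontal gaze direction,
    in degrees. Positive = looking up (ceiling), negative = looking
    down (floor). All lengths in meters."""
    return math.degrees(math.atan2(content_height - eye_height, distance))

# Assumed example: 1.7 m eye height, 2.5 m ceiling, content 2 m ahead.
print(pitch_angle_deg(1.7, 2.5, 2.0))   # ceiling content: positive pitch
print(pitch_angle_deg(1.7, 0.0, 2.0))   # floor content: negative pitch
```

In such a model, floor content at close range demands a steeper pitch than ceiling content in a typical room, which is one reason placement parameters for the two areas would need to be studied separately.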
Hybrid User Interfaces for Immersive Visualizations
Lastly, I explored how AR head-mounted displays (HMDs) can co-exist and work in unison with other existing device classes in the current device ecology. I implemented such Hybrid User Interfaces (HUIs) for visualizations, focusing on either an analysis or an authoring workflow. The former presents handcrafted visualizations spanning the different devices, while the latter enables in-situ visualization creation in any given immersive environment. Both workflows and their systems demonstrate how a symbiosis of varying device types can be imagined and realized. The chapter then investigates the term HUI and the two prototype systems in more detail.
