In this project you will develop an immersive Unity application for visualizing attributed graph data. The attributes are mapped to the spatial extents of the graph elements, for example the node size. Because perceiving and comparing size differences in VR/AR can be challenging, different techniques supporting the perception of size-mapped attributes should be developed. To compare the virtual-environment setting (with and without perceptual improvements) against a 2D display setting, a web application for visualizing and exploring graph data should also be developed. Finally, an extensive user study is required to examine the practicability of the developed methods for improving size perception in VR and to compare the VR setting to a classical, non-immersive setup.
The accurate perception and comparison of sizes can be challenging in immersive settings, yet it is often desirable to map data attributes to size. The aim of this project is therefore to develop and compare different techniques for improving size perception in an immersive VR/AR setting.
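Even before perceptual aids come into play, the choice of scaling function affects how size-mapped attributes are read: mapping a value linearly to a sphere's diameter makes the perceived volume grow cubically. A minimal sketch of a volume-proportional mapping, in plain Java for illustration (class name, diameter bounds, and the cube-root choice are assumptions, not prescribed by the project; a Unity implementation would apply the resulting diameter to `transform.localScale`):

```java
// Sketch: map a node attribute to a sphere diameter so that the
// sphere's VOLUME, not its diameter, is proportional to the value.
public class SizeMapping {
    static final double MIN_DIAMETER = 0.05; // hypothetical bounds, in meters
    static final double MAX_DIAMETER = 0.50;

    /** Maps value in [min, max] to a diameter in [MIN_DIAMETER, MAX_DIAMETER]. */
    public static double diameterFor(double value, double min, double max) {
        double t = (value - min) / (max - min); // normalize to [0, 1]
        double s = Math.cbrt(t);                // cube root -> volume-proportional
        return MIN_DIAMETER + s * (MAX_DIAMETER - MIN_DIAMETER);
    }
}
```

Whether such a perceptually motivated scaling actually helps in VR is exactly the kind of question the user study can address.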
- Work alone or as a team of two students.
- Get familiar with Unity and the technical requirements for developing a VR/AR application for graph exploration.
- Create a framework for visualizing graph data and develop different techniques to improve the perception of size-mapped graph attributes.
- Create synthetic graph data with different properties to test your implementation.
- Evaluate the developed techniques by means of a user study.
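For the synthetic-data task, a simple random-graph generator with a controllable attribute distribution is usually enough to vary graph properties systematically. A sketch in Java (Erdős–Rényi-style generation with a fixed seed for reproducible study conditions; class and field names are illustrative, not prescribed by the project):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Sketch: synthetic attributed graph. Each pair of nodes is connected
// with probability edgeProb; each node gets a random attribute value
// in [0, 1) that can later be mapped to node size.
public class SyntheticGraph {
    public final int n;
    public final List<int[]> edges = new ArrayList<>();
    public final double[] attribute;

    public SyntheticGraph(int n, double edgeProb, long seed) {
        this.n = n;
        this.attribute = new double[n];
        Random rng = new Random(seed); // fixed seed -> reproducible test data
        for (int i = 0; i < n; i++)
            attribute[i] = rng.nextDouble();
        for (int i = 0; i < n; i++)
            for (int j = i + 1; j < n; j++)
                if (rng.nextDouble() < edgeProb)
                    edges.add(new int[]{i, j});
    }
}
```

Varying `n`, `edgeProb`, and the attribute distribution yields graphs with different densities and size contrasts, which is useful for stress-testing both the VR and the web visualization.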
- Basic knowledge of visual analytics
- Advanced programming skills in Java or C#
- Good conceptual skills (software architectures)
- Useful: Git, Unity, VR experience, experience with user studies
- Scope: Bachelor/Master
- Park, Hannah, et al. "Judgments of object size and distance across different virtual reality environments: A preliminary study." Applied Sciences 11.23 (2021): 11510.
- Kelly, Jonathan W., Lucia A. Cherep, and Zachary D. Siegel. "Perceived space in the HTC vive." ACM Transactions on Applied Perception (TAP) 15.1 (2017): 1-16.