In this project you will develop an immersive Unity application for visualizing attributed graph data. Since exploring large, cluttered graphs can be challenging, the idea is to incorporate the user's eye gaze to continuously adapt the visual representation. In this way, the user's intended actions should be supported without distracting or annoying them. To this end, you will design and implement different eye-tracking-based methods that support users during exploration. Furthermore, suitable synthetic data with varying properties has to be generated. An extensive user study is required to examine the practicability of the developed methods.
The exploration of networks in AR/VR offers many opportunities and much potential. However, the classical interaction techniques currently used for the visual exploration of networks in immersive settings do not fully exploit the capabilities of this technology. This project therefore aims to incorporate eye tracking to improve the interaction and the experience of users working with network data in an immersive setup.
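As a rough, engine-agnostic illustration of one such gaze-based support technique, the classic dwell-time selection described by Majaranta and Bulling could be sketched as follows. All class and method names here are hypothetical and not part of any Unity or eye-tracker API; a real implementation would feed in the node hit by the headset's gaze ray each frame.

```java
/**
 * Sketch of dwell-time gaze selection (all names hypothetical):
 * a node is "selected" once the gaze has rested on it for a
 * configurable number of seconds.
 */
class DwellSelector {
    private final double dwellThresholdSec;
    private String currentTarget = null;   // node id the gaze currently rests on
    private double accumulated = 0.0;      // seconds spent on currentTarget

    DwellSelector(double dwellThresholdSec) {
        this.dwellThresholdSec = dwellThresholdSec;
    }

    /**
     * Called once per frame with the node currently hit by the gaze ray
     * (null if the gaze hits empty space). Returns the id of the node whose
     * dwell time just exceeded the threshold, else null.
     */
    String update(String gazedNode, double deltaTimeSec) {
        if (gazedNode == null || !gazedNode.equals(currentTarget)) {
            currentTarget = gazedNode;   // gaze moved: restart the timer
            accumulated = 0.0;
            return null;
        }
        accumulated += deltaTimeSec;
        if (accumulated >= dwellThresholdSec) {
            accumulated = 0.0;           // fire once, then re-arm
            return currentTarget;
        }
        return null;
    }
}
```

In a graph-exploration context, the returned node id could trigger an adaptation such as highlighting the node's neighborhood or fading out unrelated edges; the dwell threshold is a parameter worth tuning in the user study.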
- Work alone or as a team of two students
- Get familiar with Unity and the technical requirements for developing a VR/AR application for graph exploration
- Create a framework for visualizing graph data and develop different techniques to support users by incorporating their eye gaze
- Create synthetic graph data with different properties to test your implementation
- Evaluate the developed techniques by means of a user study
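For the synthetic test data, one simple starting point (an assumption, not a prescribed method) is an attributed Erdős–Rényi random graph G(n, p), where the edge probability p and a per-node attribute can be varied to produce graphs with different densities and value distributions:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

/** Sketch: generate an attributed Erdős–Rényi random graph G(n, p). */
class SyntheticGraph {
    final int n;                         // number of nodes
    final double[] attribute;            // one numeric attribute per node
    final List<int[]> edges = new ArrayList<>();

    SyntheticGraph(int n, double p, long seed) {
        this.n = n;
        this.attribute = new double[n];
        Random rng = new Random(seed);   // seeded for reproducible test data
        for (int i = 0; i < n; i++) {
            attribute[i] = rng.nextDouble();   // e.g. a node weight in [0, 1)
        }
        for (int u = 0; u < n; u++) {          // each unordered pair (u, v)
            for (int v = u + 1; v < n; v++) {  // becomes an edge independently
                if (rng.nextDouble() < p) {    // with probability p
                    edges.add(new int[]{u, v});
                }
            }
        }
    }
}
```

More structured generators (e.g. planted clusters or scale-free degree distributions) would likely be needed as well, since the usefulness of gaze-based support may depend on the graph's topology.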
- Basic knowledge about visual analytics, graphs, and eye tracking
- Advanced programming skills in Java or C#
- Good conceptual skills (software architectures)
- Useful: Git, Unity, VR experience, experience with user studies
- Scope: Bachelor/Master
- Okoe, Mershack, Sayeed Safayet Alam, and Radu Jianu. "A gaze-enabled graph visualization to improve graph reading tasks." Computer Graphics Forum, Vol. 33, No. 3, 2014.
- Majaranta, Päivi, and Andreas Bulling. "Eye tracking and eye-based human-computer interaction." Advances in Physiological Computing, Springer, London, 2014, pp. 39-65.