Telling a story effectively in Virtual Reality, Mixed Reality, or interactive 3D environments in general can be challenging: the user can look wherever they want and may miss important narrative events or items.
Some 3D applications, such as games, address this by constraining the view (controlling the camera), constraining the environment (e.g. blocking off where users can go), or by different degrees of suggestion, e.g. a big arrow pointing at an object, an object highlighted with a halo, or sounds to attract the user's attention.
This project looks at subtle gaze direction cues in realistic virtual environments. The project should take in a model of a real-world 3D environment surrounding the viewer and apply rendering techniques to emphasize certain objects or locations.
The environment model can comprise different types of data (depending on availability, the student's abilities, and available hardware):
- a 360-degree static image or video of the environment (see the mapping sketch after this list)
- a 3D capture of the environment, e.g. with Gaussian splatting
- a live optical see-through view of the environment viewed through AR glasses, e.g. HoloLens
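For the 360-degree option, the basic geometric step is mapping a target direction in the viewer's frame to a pixel location in the equirectangular panorama, so that a cue can be applied at that location. The sketch below is a minimal illustration of that standard projection; the function name, coordinate convention (y up, z forward), and use of NumPy are assumptions for illustration, not part of any prescribed toolkit.

```python
import numpy as np

def direction_to_equirect_pixel(direction, width, height):
    """Map a 3D viewing direction to (u, v) pixel coordinates in an
    equirectangular panorama, so a gaze cue can be placed at that spot."""
    x, y, z = np.asarray(direction, dtype=float) / np.linalg.norm(direction)
    lon = np.arctan2(x, z)   # azimuth in [-pi, pi]
    lat = np.arcsin(y)       # elevation in [-pi/2, pi/2]
    u = (lon / (2 * np.pi) + 0.5) * width
    v = (0.5 - lat / np.pi) * height
    return int(u) % width, int(np.clip(v, 0, height - 1))
```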
The gaze direction cues can take different forms:
- enhancing the salience of regions in the image, e.g. edge enhancement or increasing saturation/contrast, or conversely de-emphasis by contrast reduction or blurring/smoothing (a sketch of this follows the list)
- abstracted rendering, e.g. more detailed points/meshes in some areas, more salient brushstrokes, or changing focal depth
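As a concrete illustration of the first form of cue, the sketch below subtly boosts saturation inside a soft circular region and mildly blurs the rest of an RGB frame. It assumes the environment arrives as a single image in a NumPy array and that the target region is given as a pixel centre and radius; the function and parameter names are illustrative only, and the same mask-and-blend idea would extend to contrast or edge enhancement. In a full system the mask would be recomputed per frame as the target's projected position changes, and the strengths kept low so the cue stays subtle.

```python
import cv2
import numpy as np

def emphasize_region(frame_bgr, centre_xy, radius_px,
                     sat_boost=1.15, blur_ksize=11, falloff=0.5):
    """Subtly raise saturation inside a soft circular region and
    blur the rest of the frame to de-emphasize it."""
    h, w = frame_bgr.shape[:2]

    # Soft mask: 1.0 at the target centre, fading to 0.0 beyond the radius.
    ys, xs = np.mgrid[0:h, 0:w]
    dist = np.sqrt((xs - centre_xy[0]) ** 2 + (ys - centre_xy[1]) ** 2)
    mask = np.clip(1.0 - (dist - radius_px) / (falloff * radius_px), 0.0, 1.0)
    mask = mask[..., None]  # broadcast over colour channels

    # Emphasized version: small saturation boost in HSV space.
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    hsv[..., 1] = np.clip(hsv[..., 1] * sat_boost, 0, 255)
    emphasized = cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)

    # De-emphasized version: mild Gaussian blur of the original frame.
    deemphasized = cv2.GaussianBlur(frame_bgr, (blur_ksize, blur_ksize), 0)

    # Blend the two according to the soft mask.
    out = (mask * emphasized.astype(np.float32)
           + (1.0 - mask) * deemphasized.astype(np.float32))
    return out.astype(np.uint8)
```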
THIS PROJECT IS SUITABLE FOR an FYP or Masters project.