This hand-tracked AR interface, made for Introduction to Human-Computer Interaction (Fall 2024), was rated best overall in the class by the professor and course staff.
It communicates all key information at a glance, offers maximum functionality with minimal button presses, and mimics haptics through visual and auditory feedback.
Virtual reality · Unity · team project
- Developed in 10 days
- Made in Unity for the Meta Quest 2
- Rated best project by course staff
- Created with David Yuan and Grace Hu
This video explains all the functions of the interface.
Much of the interface design was collaborative, with most decisions brought to group meetings for discussion, so exact credit is hard to pinpoint. That said, my roles in the project included:
- Designing the wristwatch interface for key information such as oxygen level, interface battery, and diving depth
- Implementing and helping design the toggleable arm menu
  - After many iterations, anchoring the menu above the forearm so that it always faces the user's head
- Designing and implementing the entire crew info menu
  - Creating 3D arrows that point toward crew members (showing name and distance), as well as a "Go To" navigational feature that enlarges a selected arrow and keeps it forward and in view for the dive
  - Creating crew call buttons to instantly dial any crew member
- Designing and implementing an emergency button anchored on the right wrist so that divers can press it even while blinded
- Filming and editing the project explanation video
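As a rough illustration of the arm-menu behavior described above, a Unity component might follow the forearm with a fixed offset and billboard toward the head every frame. This is a minimal sketch, not the project's actual code; all names here (`forearmAnchor`, `headCamera`, `menuOffset`) are assumptions.

```csharp
using UnityEngine;

// Hypothetical sketch: keep a menu anchored above the tracked forearm
// and always rotated to face the user's head.
public class ArmMenuAnchor : MonoBehaviour
{
    public Transform forearmAnchor;                      // tracked forearm pose (assumed name)
    public Transform headCamera;                         // user's head / main camera (assumed name)
    public Vector3 menuOffset = new Vector3(0f, 0.12f, 0f); // offset above the forearm, in meters

    void LateUpdate()
    {
        // Follow the forearm, offset in the forearm's local frame.
        transform.position = forearmAnchor.position
                           + forearmAnchor.TransformDirection(menuOffset);

        // Billboard: face the user's head at all times.
        transform.rotation = Quaternion.LookRotation(
            transform.position - headCamera.position, Vector3.up);
    }
}
```

Using `LateUpdate` keeps the menu in sync after hand-tracking poses have been updated for the frame.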
All implementation of the interface was done in Unity with C#.
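The crew arrows could be sketched along similar lines: rotate each arrow toward its crew member and update a name-and-distance label each frame. Again, this is an illustrative sketch under assumed names (`crewMember`, `label`, `crewName`), not the project's implementation.

```csharp
using UnityEngine;

// Hypothetical sketch: a 3D arrow that points at a crew member
// and shows their name and distance.
public class CrewArrow : MonoBehaviour
{
    public Transform crewMember;   // tracked crew member position (assumed name)
    public TextMesh label;         // name/distance readout (assumed name)
    public string crewName = "Diver";

    void Update()
    {
        Vector3 toCrew = crewMember.position - transform.position;

        // Point the arrow at the crew member.
        transform.rotation = Quaternion.LookRotation(toCrew, Vector3.up);

        // Update the "Name (distance m)" label.
        label.text = $"{crewName} ({toCrew.magnitude:F1} m)";
    }
}
```

A "Go To" mode could then scale the selected arrow up and clamp it into the user's view frustum, matching the navigation feature described above.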