Designing AR for Open Image Guided Surgery with RHLab and Stanford University
My Responsibilities
Researched, designed, and prototyped AR for surgeons
Timeline
18 months
Lab Leader
Rania Hussein
The Problem
Surgeons currently experience uncomfortable setups as well as inefficient and redundant workflows in accessing real-time data during surgery
The Solution
The solution, VisiOR, improves how surgeons retrieve and interact with patient data by using the Apple Vision Pro
Place CT Scans directly on the area of care
Surgeons want patient CT scans to be readily accessible and simple to place wherever the situation requires
Turn 2D scans into 3D models for closer inspection
Surgeons can expand a 2D scan into a 3D model for closer inspection, rather than mentally reconstructing the anatomy from individual slices
Segment and save 2D and 3D CT Scans for reference
Surgeons who need to inspect individual pieces of a CT scan rather than the entire scan can pinch and drag segments out of the 2D or 3D model to view them (see the sketch after this feature list)
Toggle between various OR monitor screens
Surgeons struggle with the current fixed monitor setup in the OR. By augmenting these screens, they can move around freely
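The prototype itself was built in ShapesXR, so the following is only a rough sketch of the segment interaction described above. On visionOS, a SwiftUI DragGesture is driven by looking at content and pinching, which maps onto the pinch-and-drag behavior surgeons asked for; the view and asset names below are hypothetical.

```swift
import SwiftUI

// Sketch of the "pinch and drag out a segment" interaction.
// "ct_segment_liver" is a hypothetical image asset standing in for one
// segmented region of a CT scan.
struct DraggableSegmentView: View {
    @State private var position: CGSize = .zero          // committed offset
    @GestureState private var dragOffset: CGSize = .zero // in-flight offset

    var body: some View {
        Image("ct_segment_liver")
            .resizable()
            .scaledToFit()
            .frame(width: 240)
            .offset(
                x: position.width + dragOffset.width,
                y: position.height + dragOffset.height
            )
            .gesture(
                DragGesture()
                    .updating($dragOffset) { value, state, _ in
                        state = value.translation
                    }
                    .onEnded { value in
                        position.width += value.translation.width
                        position.height += value.translation.height
                    }
            )
    }
}
```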
Prototype
I prototyped a live demo in ShapesXR that was shared 50+ times
Users can place CT scans in the OR and adjust their opacity (sketched after this list)
They can convert their 2D CT Scan into a 3D model and save segments
They can toggle between different OR monitor screens
They can check vitals, with low readings emphasized
They can record the surgery for training purposes and further professional development
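The demo lived in ShapesXR rather than code, but a minimal SwiftUI sketch of the first interaction, placing a CT scan and adjusting its opacity, might look like the following; the asset name and slider range are assumptions.

```swift
import SwiftUI

// Minimal sketch: a CT slice whose transparency the surgeon can adjust.
// "ct_axial_01" is a hypothetical image asset for one axial CT slice.
struct CTScanOpacityView: View {
    @State private var opacity: Double = 0.8

    var body: some View {
        VStack(spacing: 16) {
            Image("ct_axial_01")
                .resizable()
                .scaledToFit()
                .opacity(opacity)   // surgeon-adjustable transparency

            Slider(value: $opacity, in: 0.1...1.0) {
                Text("Scan opacity")
            }
        }
        .padding()
    }
}
```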
Research
VisiOR began as a continuation of Drs. Alami and Paderno's research at Stanford University on XR for Open Image Guided Surgery. Their findings include...
1. AR supports surgeons with early detection, judgment, and timely processing
2. Most real-time monitoring machines and displays are wired, which inconveniences surgeons who need to adjust positions during surgery
Participant Pool
Our research also spanned the globe, with 11 clinicians from Asia, Italy, and the U.S.
We compared needs across different hospitals and evaluated how AR might be accepted within the operating room
Design Concepts
As I iterated on my design concepts with these surgeons and clinicians, their feedback caught me by surprise
Surgeons prefer hand and eye gestures over voice commands
The operating room is loud with discussion among OR staff and machinery, so voice commands would be drowned out
AR headsets are a comfortable alternative to current headwear
Surgeons prefer to wear the headset throughout the entire procedure because it is much more comfortable than current monitor setups and headgear
Surgeons have to continuously interpret CT scan data
They continuously interpret and process segments of the CT scans in their heads (tacit knowledge) instead of relying on faster visual processing to read patient data
Design Concepts
Surgeons were particularly interested in the concept of turning 2D Scans into 3D models
They already have to file through “segments” of the scan to gain a 3D perspective, so why not do the work for them?
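The case study stops at the concept level, but as a hedged sketch of how a prepared 3D model could be shown on the Apple Vision Pro, a RealityKit entity can be loaded into a RealityView. The asset name is hypothetical, and the actual 2D-to-3D reconstruction pipeline is out of scope here.

```swift
import SwiftUI
import RealityKit

// Sketch only: display a pre-built 3D model reconstructed from CT slices.
// "CTVolumeModel" is a hypothetical USDZ asset bundled with the app.
struct CTVolumeView: View {
    var body: some View {
        RealityView { content in
            // Load the prepared model and place it in the scene.
            if let model = try? await Entity(named: "CTVolumeModel", in: .main) {
                content.add(model)
            }
        }
    }
}
```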
Initial Design
RHLab engineered window resizing via gaze and hand gesture as an MVP (sketched below)
This most closely tied back to Drs. Alami and Paderno's research, which made it a priority for the paper we plan to publish under Stanford
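The MVP interaction was built by RHLab; the sketch below is not their implementation, only an illustration of how gaze-plus-pinch scaling could be expressed in SwiftUI. With indirect input on visionOS, a MagnifyGesture targets whatever the user is looking at, so gaze and hand gesture together drive the resize. The view and asset names are hypothetical.

```swift
import SwiftUI

// Sketch of resizing scan content with gaze plus a pinch (magnify) gesture.
// "ct_axial_01" is a hypothetical CT slice asset.
struct ResizableScanView: View {
    @State private var scale: CGFloat = 1.0          // committed scale
    @GestureState private var pinch: CGFloat = 1.0   // in-flight pinch factor

    var body: some View {
        Image("ct_axial_01")
            .resizable()
            .scaledToFit()
            .scaleEffect(scale * pinch)
            .gesture(
                MagnifyGesture()
                    .updating($pinch) { value, state, _ in
                        state = value.magnification
                    }
                    .onEnded { value in
                        scale *= value.magnification
                    }
            )
    }
}
```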
Publication
Our paper is being reviewed for publication under RHLab and Stanford University