Case Study: Data Visualization
Analytics Platform Redesign for Surgical Residents
Backend Software Engineers, Surgical Residents, Residency Programs, Director of Clinical Affairs
The Analytics Platform Redesign for desktop was created to capture surgical residents' proficiency data and help them better prepare for future surgical cases.
Released to surgical residents for evaluation
70% response rate from the study and survey
Prior to the redesign, an analytics performance dashboard had been built but was rarely seen or used by Osso VR's customers. Driven by requests from surgeons and surgical residents, along with the need to harness and visually represent data collected from virtual reality run-throughs, I was placed in charge of identifying and displaying data that end users could draw on to improve their surgical abilities in preparation for future cases.
Surgical residents requested four main types of data to help them evaluate their skills on surgical techniques and see where to improve:
Speed of their VR run-through on a given technique
How they compare to expert surgeons on a given technique
Where in the technique they can improve
How they compare to colleagues of the same year
Prioritizing the data
Through user interviews, we got a glimpse of how residents were assessed for surgical proficiency in the operating room. This helped us prioritize the data we wanted to show, reflecting the current assessment's priorities while incorporating the data that surgical residents requested to see.
Attendings are the only ones who evaluate a resident's competency.
Residents are evaluated on a 1–5 scale: 1 means you've never picked up a knife; 5 means you're exceptional at the surgery.
The assessment process can be very subjective and depends heavily on how well a resident works with a given attending.
What kind of graphs should be used to visualize this data?
I mocked up several different ways to visualize the above data while keeping the original style and surgical assessment in mind. We were trying to show more concrete data than what was already being assessed in the OR. I established the idea of performance and proficiency levels to encompass the requested data, and I incorporated ideas from the original dashboard to expand on what we had already built.
Where and when should this data be presented?
Two Major Use Cases to Consider:
How do we evaluate the designs so far?
In collaboration with the Director of Clinical Affairs, we organized a beta study with two residency programs to gather feedback on surgical residents' overall perception of the dashboard, and to determine whether the data shown was valuable or what other aspects were needed to make the dashboard more useful.
What were our findings?
The use of stars for proficiency on a procedure was confusing. A number of users reported associating stars with a rating rather than proficiency.
Few users clicked the survey link, which made it challenging to gather feedback initially. As a result, I followed up with residents to collect further feedback. Future surveys either need to be conducted in person or made more visible.
Updated Designs Prepared for Initial Release
What could have been done differently?
Instead of grouping all residents together, creating slightly different dashboards for intern, junior, and senior residents
Scheduling an in-person walk-through of the prototypes to receive feedback more efficiently and immediately following the experience