Analytics Platform For Surgeons

Overview

The analytics platform redesign was prompted by three things: our existing dashboard had become outdated, customers and surgical residents were asking for more data after completing a run through one of our VR simulations, and retention among our users was low. This case study focuses on the design process behind the insights section of the dashboard.

I mapped our current versus ideal future dashboard experience, conducted user research, created wireframes and clickable prototypes, iterated on the designs, and coordinated the release internally and externally.

The analytics design was released to multiple residency programs for feedback and improvement.

Example of the Immediate Modal Presented to the User Following the Completion of a Technique.

Target Audience

Surgical Residents

Programs Used

Sketch, InVision

Role

UX/UI Designer


The Design Process

Defining the Problem

Due to low Osso VR usage among surgical residents and requests to provide more data following the completion of a VR run, I redesigned our analytics platform.

User Interviews

I conducted user interviews with surgical residents at the American Academy of Orthopaedic Surgeons conference in 2018. From these interviews I summarized what residents wanted to know:

  • How quickly they made it through the technique

  • How they compared to an expert surgeon

  • Where they could improve to be more efficient and accurate

  • How they compared to the rest of their cohort (other residents in their year)
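
The cohort comparison residents asked for can be illustrated with a small sketch. This is purely hypothetical code written for this write-up; the platform's actual metrics and implementation are not shown in the case study.

```python
def cohort_percentile(resident_time, cohort_times):
    """Return the percentage of the cohort this resident was faster than.

    resident_time: this resident's completion time in seconds.
    cohort_times: completion times for the other residents in their year.
    (Hypothetical metric; not the actual platform implementation.)
    """
    if not cohort_times:
        return None  # no cohort data yet, e.g. a first-time user
    slower = sum(1 for t in cohort_times if t > resident_time)
    return round(100 * slower / len(cohort_times))

# A resident finishing in 540 s against a cohort of five:
cohort_percentile(540, [600, 580, 620, 510, 700])  # faster than 4 of 5 -> 80
```

A percentile like this maps naturally onto the "compared to the rest of their cohort" request without exposing individual classmates' times.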

Original Dashboard - Version 1.0 used to showcase data to the user.

Updated Dashboard - Version 2.0 used to showcase data to the user.

Current Model versus Future Model

I created an overview of how our customers use the dashboard today and how their experience would change once the new dashboard was in place. This helped us get a general sense of the bigger work items needed.

Overview of User Flow: captures the clickable experience and where the updated dashboard should focus the user on an insight associated with a specific technique.

Part 1: The Dashboard Design

I began with a series of mock-ups exploring the placement and content for the insights dashboard. I also noted several use cases we had to design for:

  1. A first-time user who had never done a run-through of a VR simulation

  2. A return user who had done multiple runs

  3. A user who was just practicing a technique

  4. A user who was testing themselves on a technique

As we developed the vision, we expanded the set of designs under evaluation. Below is a series of high-fidelity iterations for the insights tab of the dashboard, done in Sketch. The focus was on how to represent the data and display it effectively on the dashboard.

Each iteration below shows a different way of representing proficiency for the overall technique (tibial nail, in this case) and for a section of the technique (e.g., entry reaming, nail assembly). The designs were kept in black and white to focus users on the content.

Iteration 1. Shows the hypothetical user's progress as a circle that fills. This representation wasn't received well by our users: the shape took up a lot of real estate on the page, and the circles were tough to glance at and understand quickly.

Iteration 2. Shows the hypothetical user's progress as vertical bars. This representation was received well by our users: the bars were easy to read, though users were slightly confused by their various shades of grey.

Iteration 3. Shows the hypothetical user's progress as stars. This representation was received well by our product owner and users: the stars were easy to read, and more stars clearly meant more progress. My worry as a UX designer was that users might assume the stars meant they were rating the technique rather than indicating their proficiency, similar to giving five stars in an Amazon review.

Exploring New Layout Options
After several iterations, we converged on content and placement that captured our users' initial requests. Next, we explored keeping the insights menu on the same page as the dashboard so that users wouldn't need to leave the homepage. Below is a mock-up of both a modal design (left) and an inlaid insights/performance tab (right).

Final Insight Tab Design

We had the designs built out and used by our customers and target users (surgical residents), and we reached a final consensus on the design and color scheme for the first updated version of the dashboard. The colors still needed refinement, but that work was slated to follow the first updated release.

Version 1 of the Updated Dashboard With Color. The final page consolidates the information so that users don't feel they are being taken away from the main page. The limitation of this design is the amount of space available to show the data.

Part 2: The Scoring Algorithm

Defining the Scoring Algorithm

After defining the general information needed for the dashboard and the immediate feedback modal, it was important to develop a scoring algorithm that residents would find fair.

User Interviews

We scheduled 2-3 user interviews with both junior and senior surgical residents in residency programs. Several of the questions asked during one session (with a 5th-year resident) were:

  1. How were you evaluated in residency? Is there a concrete measurement or more subjective measurement?

  2. Do you get to choose your attending?

  3. How do attendings know you’re ready for the OR?

  4. Is there a file or evaluation that an attending fills out on residents?

  5. What is the breakdown of the 1-5 scale used by attendings?

  6. How does the information or skills you need to know as a 5th year compare to that of an intern or junior resident?

  7. How do you prepare for a case that you haven’t seen in a while or before?

After creating the scoring algorithm, we were ready to release the analytics dashboard internally.
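
The case study doesn't show the shipped algorithm, but a "fair" score of this kind is typically a weighted blend of the metrics residents asked for in interviews: speed against an expert baseline and accuracy. A purely illustrative sketch, where the weights, caps, and metric names are all assumptions rather than the production values:

```python
def proficiency_score(time_s, expert_time_s, errors, max_errors=10):
    """Blend speed (vs. an expert baseline) and accuracy into a 0-100 score.

    All weights and caps here are illustrative, not the shipped algorithm.
    """
    # Speed component: 1.0 at or under the expert's time, decaying toward 0.
    speed = min(1.0, expert_time_s / time_s)
    # Accuracy component: fewer errors is better, floored at 0.
    accuracy = max(0.0, 1 - errors / max_errors)
    # Weight accuracy more heavily than speed, reflecting resident
    # feedback that raw speed alone would feel unfair.
    return round(100 * (0.4 * speed + 0.6 * accuracy))

proficiency_score(600, 480, 2)  # slower than the expert, few errors -> 80
```

Capping the speed component at 1.0 keeps a resident from "gaming" the score by rushing, one of the fairness concerns the interview questions above were probing.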

Feedback

This initial internal release was extremely important for finding the bugs that impacted the user experience and fixing them before releasing the platform to multiple residency programs.

Releasing the Dashboard and Receiving Feedback

We released the dashboard to 2 residency programs for feedback.

We received feedback from approximately 50% of the residents given access to the analytics dashboard. The feedback was as follows:

  • Love the ability to see a summary of stats right away

  • Enjoyed seeing more than just a score; liked seeing progress over the course of the technique

  • Scoring felt reasonable; residents would love future iterations of the dashboard to show the specific places where they hesitated, or their economy of motion, if we collect that kind of data

Lessons Learned

  • Product requirements weren’t solidified early in the design process due to uncertainty from the product owner. To avoid this situation in the future, the designer needs to determine or request more concrete requirements before starting work.

  • Initially, a lot of time was spent making the dashboard pretty at the cost of solidifying content and placement. To avoid this in the future, content and placement should be determined before jumping into Sketch to polish visuals.

  • It was difficult to get feedback from residents. To avoid this problem in the future, a trusted contact or someone from the company should be on site at the program, running residents through the techniques.