Tracking User Data
Problem To Solve
As more techniques are created, we need ways to showcase a user’s data so that they can receive immediate feedback, learn from their mistakes, and visually track their progress over time. Our goals:
Create a dashboard that an orthopedic resident can access via laptop/computer
Create immediate feedback for the user on the laptop/computer
Create a scoring algorithm that fairly assesses a resident’s performance on a VR technique
Provide the user with an immediate feedback screen followed by a dashboard where they can visualize their time spent in the simulation, their accuracy, and how they compare to an expert surgeon on that technique.
The Design Process
Defining the Problem
After completing Osso VR techniques, users consistently asked for their results. This need was surfaced through informal conversations with residents and medical device representatives at the American Academy of Orthopaedic Surgeons (AAOS) conference in 2018, as well as during demos regularly conducted by the CEO and Marketing Director at Osso VR. Beneficial feedback, according to users, would include:
How quickly they made it through the technique
How they compared to an expert surgeon
Where they could improve to be more efficient and accurate
How they compared to the rest of their cohort (other residents in their year)
User Flow: Current Model versus Future Model
To understand how we wanted our users (residents) to use the dashboard, we started by mapping out how our customers use the dashboard today, and how we imagined their experience once the new dashboard was in place. This gave us a general sense of the bigger items needed without getting stuck in the details.
Part 1: The Dashboard Design
To begin, I was tasked not with completely revamping the dashboard but with using the data we already had and organizing it in a way that would make it easier for our users to find what they were looking for. This immediately prompted the question: what is the purpose of this dashboard? Based on conversations with the product manager, product owner, and users, the purpose was to provide residents with relevant information following their runs in VR (an analytical dashboard), so that they could tell:
a.) How they were currently performing on any given technique (Proficiency Level)
b.) How they compared to experts in the technique (Goal Time)
c.) How and where they could improve to be more efficient on the technique (Improvement Opportunity)
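To make these three metrics concrete, here is a minimal sketch of how they might be derived from a run’s per-step timings. All names, thresholds, and labels here are hypothetical illustrations for discussion, not Osso VR’s actual data model or logic:

```python
from dataclasses import dataclass

@dataclass
class StepResult:
    name: str            # e.g. "entry reaming", "nail assembly"
    duration_s: float    # resident's time on this step, in seconds
    expert_s: float      # expert baseline time for this step
    errors: int          # mistakes recorded during the step

def proficiency_level(steps: list[StepResult]) -> str:
    """Bucket overall performance into a coarse proficiency label.

    Thresholds are illustrative placeholders only.
    """
    total_errors = sum(s.errors for s in steps)
    time_ratio = sum(s.duration_s for s in steps) / sum(s.expert_s for s in steps)
    if total_errors == 0 and time_ratio <= 1.25:
        return "Proficient"
    if total_errors <= 3 and time_ratio <= 2.0:
        return "Developing"
    return "Novice"

def goal_time_delta(steps: list[StepResult]) -> float:
    """Seconds slower (+) or faster (-) than the expert baseline."""
    return sum(s.duration_s for s in steps) - sum(s.expert_s for s in steps)

def improvement_opportunity(steps: list[StepResult]) -> str:
    """The step with the largest slowdown relative to the expert."""
    return max(steps, key=lambda s: s.duration_s - s.expert_s).name
```

The point of the sketch is that all three headline numbers fall out of the same per-step records, which is why the insights tab could surface them together.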
Below are a series of mid-fidelity mockups for the insights tab of the dashboard, done in Sketch, capturing the aspects of the insights experience the product owner and our current users were keen on keeping. The focus was on how to choose the best representation of the data and display it on the dashboard. Other parts of the dashboard, including the immediate feedback modal and the procedure page, are available to view if that’s of interest!
The first version of the dashboard aimed to include all three requests stated above. Each iteration shows different options for representing proficiency on the overall technique (tibial nail in this case) and on a section of the technique (e.g., entry reaming, nail assembly). The design was kept in black and white to keep users from getting caught up in the colors and to focus them on the content.
Following a Number of Iterations
Over the course of several iterations, drawing inspiration from Netflix to combine pages so that our procedure/technique page and insights page were all in one, and after reviewing the designs with customers, we reached consensus on the design and color scheme for the first updated version of the dashboard. The colors would still need updating, but that work was deferred until after the first release; in the meantime we continued with the color scheme originally used in the dashboard design.
Part 2: The Scoring Algorithm
Defining the Scoring Algorithm
After developing the general information needed for the dashboard and immediate feedback, it was important to spend time developing a scoring algorithm that residents would find fair. To do this, we started by conducting user interviews with several residents.
We were able to schedule 2-3 user interviews with both junior and senior residents to dig into what would be considered a fair or unfair score. Several of the questions asked during one session (with a 5th-year resident) are as follows:
How were you evaluated in residency? Is there a concrete measurement or more subjective measurement?
Do you get to choose your attending?
How do attendings know you’re ready for the OR?
Is there a file or evaluation that an attending fills out on residents?
What is the breakdown of the 1-5 scale used by attendings?
How does the information or skills you need to know as a 5th year compare to that of an intern or junior resident?
How do you prepare for a case that you haven’t seen in a while or before?
By diving into the residents’ experience and understanding how they are currently evaluated, we could identify ways to capture those same metrics in our own scoring algorithm for our VR simulations.
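As one illustration of how VR metrics could map onto the 1-5 scale attendings already use, consider a sketch like the following. The weights, the cap on the speed term, and the blend of speed and accuracy are assumptions for discussion, not the algorithm we shipped:

```python
def score_run(resident_time_s: float, expert_time_s: float,
              steps_correct: int, steps_total: int) -> int:
    """Map a VR run onto a 1-5 scale (hypothetical weights).

    Blends accuracy (fraction of steps done correctly) with speed
    relative to an expert baseline. Accuracy is weighted higher so
    that rushing through a technique cannot outscore doing it
    correctly -- a property residents flagged as essential to a
    "fair" score.
    """
    accuracy = steps_correct / steps_total             # 0.0 .. 1.0
    speed = min(expert_time_s / resident_time_s, 1.0)  # capped: 1.0 = expert pace
    combined = 0.6 * accuracy + 0.4 * speed
    return 1 + round(combined * 4)                     # map 0..1 onto 1..5
```

For example, matching the expert's pace with every step correct yields a 5, while taking twice as long and missing half the steps yields a 3 under these placeholder weights.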
Following the implementation of both the designs and the scoring algorithm, we were interested in getting an initial round of feedback on our dashboard and immediate feedback modal. To do so, we released the analytics dashboard internally.
Feedback from Osso VR colleagues and team members focused on both the look and feel of the app, as well as bugs found while using the launcher. This initial release was extremely important for finding and fixing the highest-impact bugs before we began testing the dashboard with a small set of beta users.
To test the new dashboard design with our end users, residents, we chose 2 residency programs in our Osso Training Network (residency programs that have opted to incorporate Osso VR to help train their orthopedic residents on techniques) to provide feedback.
To do so, I worked with our Director of Clinical Affairs and the Product Owner to develop a step-by-step PDF explaining the tasks we were asking these residents to complete with the Osso software. I also helped drive conversations between the program directors at these residency programs, our customer support team, and the residents, both to maximize the feedback we received and to limit how long the programs had the beta version of the dashboard.
We received feedback from approximately 7 residents to help us update the dashboard experience. The feedback included:
Love the ability to see a summary of stats right away
Enjoyed the ability to see more than just a score, liked seeing progress over the course of the technique
Scoring felt alright; residents would love to see the specific places where they hesitated, or their economy of motion, in future iterations of the dashboard if we collect that kind of data.
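Economy of motion, one of the metrics residents asked about, is commonly measured in surgical skill assessment as the total path length of the tracked instruments or controllers: less wasted movement suggests a smoother, more practiced technique. If we were to collect positional samples, the computation itself is simple. This is a sketch of the idea, not a shipped feature:

```python
import math

def economy_of_motion(positions: list[tuple[float, float, float]]) -> float:
    """Total path length (in meters) traced by a tracked VR controller.

    `positions` is a time-ordered list of (x, y, z) samples from the
    headset's tracking data; for the same technique, a lower total
    suggests less wasted movement.
    """
    return sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))
```

The real cost of this feature is not the computation but logging controller positions at a sufficient sample rate, which is why it was framed as conditional on collecting that kind of data.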
Lessons Learned
It’s important to get product requirements solidified earlier in the design process so that the designer knows what to design instead of needing to guess.
When developing UI, it’s okay to start with paper sketches before jumping directly into Sketch to make the UI look pretty. Content should trump aesthetics if you’re looking to create a meaningful experience for users. Skipping this step slows down your process and puts you at risk of delivering something that isn’t actually needed.
Design should be completed and approved before engineering starts to implement it. This means giving design a week or two of lead time ahead of engineering to define the user experience and interface. Also, if you have a gut feeling as a designer that something is off, leverage your ability to conduct user research to reach an answer and convince your product owner to change direction (e.g., the star rating for proficiency).
It’s tough to get feedback from residents; you really need someone at the program helping to get them to actually run through the software. It helps to host a single day where the program director runs a number of residents through the techniques.