Tracking User Data

High Fidelity Version 1 of Feedback Screen for a Technique

Problem To Solve

As more techniques are created, we need to find ways to showcase a user's data so that they can receive immediate feedback, learn from their mistakes, and visually track their progress over time to improve.

Tasks

Design:

  • Create a dashboard that an orthopedic resident can access via laptop/computer

  • Create immediate feedback for the user on the laptop/computer

  • Create a scoring algorithm that fairly assesses a resident’s performance on a VR technique

Solution

Provide the user with an immediate feedback screen followed by a dashboard where they can visualize their time spent in the simulation, their accuracy, and how they compare to an expert surgeon on that technique.

Programs Used

Sketch, InVision

Role

UX Designer


The Design Process

Defining the Problem

After completing Osso VR techniques, users consistently asked for their results. This need surfaced through informal conversations with residents and medical device representatives at the American Academy of Orthopaedic Surgeons (AAOS) conference in 2018, as well as during demos regularly conducted by the CEO and Marketing Director at Osso VR. Beneficial feedback, according to users, would include:

  • How quickly they made it through the technique

  • How they compared to an expert surgeon

  • Where they could improve to be more efficient and accurate

  • How they compared to the rest of their cohort (other residents in their year)

User Flow: Current Model versus Future Model

To understand how we wanted our users (residents) to use the dashboard, we started by creating an overview of how our customers use the dashboard currently, and how we imagined their experience once the new dashboard was in place. This gave us a general sense of the bigger items needed without getting too stuck in the details.

Overview of User Flow: captures the end-to-end experience, from finishing the technique to logging into the dashboard to reviewing results for a given procedure, and highlights where we want the updated dashboard to strengthen the user experience.

Part 1: The Dashboard Design

To begin, I was tasked not with completely revamping the dashboard, but with using the data we already had and organizing the information in a different way that would make it easier for our users to find what they were looking for. This immediately prompted the question: what is the purpose of this dashboard? Based on conversations with the product manager, product owner, and users, the purpose was to provide residents with relevant information following their runs in VR (an analytical dashboard), so that they could tell:

a.) How they were currently performing on any given technique (Proficiency Level)

b.) How they compared to experts in the technique (Goal Time)

c.) How and where they could improve to be more efficient on the technique (Improvement Opportunity)
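To make the three insights above concrete, here is a purely illustrative sketch of how they might be derived from a single run's data. Every name here (the `TechniqueRun` data model, the `EXPERT_TIMES` benchmark, the step names and numbers) is a hypothetical assumption, not Osso VR's actual data model or algorithm.

```python
from dataclasses import dataclass

@dataclass
class TechniqueRun:
    """One resident run through a VR technique (hypothetical data model)."""
    technique: str    # e.g. "Tibial Nail"
    step_times: dict  # seconds per section, e.g. {"Entry Reaming": 140, ...}
    errors: int       # mistakes flagged by the simulation

# Hypothetical expert benchmark times per section, in seconds.
EXPERT_TIMES = {"Entry Reaming": 90, "Nail Assembly": 60}

def insights(run: TechniqueRun) -> dict:
    total = sum(run.step_times.values())
    # Goal Time: what an expert would take across the same sections.
    goal = sum(EXPERT_TIMES.get(s, t) for s, t in run.step_times.items())
    # Proficiency Level: closeness to the expert goal time, clamped to [0, 1].
    # A real version would also weigh accuracy, not just speed.
    proficiency = min(1.0, goal / total) if total else 0.0
    # Improvement Opportunity: the section with the largest gap vs. the expert.
    slowest = max(run.step_times,
                  key=lambda s: run.step_times[s] - EXPERT_TIMES.get(s, run.step_times[s]))
    return {"proficiency": proficiency, "goal_time": goal, "improve": slowest}

run = TechniqueRun("Tibial Nail", {"Entry Reaming": 140, "Nail Assembly": 70}, errors=2)
print(insights(run))
```

The point of the sketch is only that each of the three dashboard insights can be computed per run from timing data the simulation already records.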

Below is a series of mid-fidelity mockups for the insights tab of the dashboard, done in Sketch, each trying to capture different aspects of the insights dashboard that the product owner and our current users were keen on keeping. The focus was choosing the best representation of the data and how to display it on the dashboard. Other parts of the dashboard, including the immediate feedback modal and procedure page, are available to view if that's of interest!

The first version of the dashboard aimed to include all three requests stated above. Each iteration shows different options for representing proficiency on the overall technique (tibial nail in this case) and on a section of the technique (e.g., entry reaming, nail assembly). The design was kept in black and white to keep users from getting caught up in the colors and to focus them on the content.

Iteration 1. Shows a hypothetical user's progress as a filling circle. This representation wasn't received well by our users: the shape took up a lot of real estate on the page, and the circles were tough to glance at quickly and understand.

Iteration 2. Shows a hypothetical user's progress as vertical bars. This representation was received well by our users. The bars were easy to read, but the various shades of grey slightly confused users.

Iteration 3. Shows a hypothetical user's progress as stars. This representation was received well by our product owner and users; the stars were easy to read, and users understood that more stars meant more progress. As a UX designer, my worry was that users would assume the stars meant they were rating the technique (as with a 5-star Amazon review) rather than reflecting how proficient they were.

Following a Number of Iterations
Over the course of several iterations, drawing inspiration from Netflix to combine pages so that the procedure/technique page and insights page lived in one place, and reviewing the designs with customers, we reached a consensus on the design and color scheme for the first updated version of the dashboard. For this first release we would continue with the color scheme originally used in the dashboard; updating the colors would be a focus after the release.

Version 1 of the Updated Dashboard With Color. The final page consolidates the information so that users don't feel they are being taken away from the main page. The limitation of this design is the amount of space we have to show the data, which may cause issues we'll need to tackle if we continue down this path. Otherwise, the data captures the residents' initial requests: show current performance, show where they can improve, and show how they compare to both experts and other residents in their year.


Part 2: The Scoring Algorithm

Defining the Scoring algorithm

After developing the general information needed for the dashboard and immediate feedback, it was important to spend time figuring out and developing a scoring algorithm that residents would find to be fair. To do this, we started by conducting user interviews with several residents.

User Interviews

We were able to schedule two to three user interviews with residents at both earlier and later stages of residency to help dive into what would be considered a fair or unfair score. Several of the questions asked during one session (with a 5th-year resident) are as follows:

  1. How were you evaluated in residency? Is there a concrete measurement or more subjective measurement?

  2. Do you get to choose your attending?

  3. How do attendings know you’re ready for the OR?

  4. Is there a file or evaluation that an attending fills out on residents?

  5. What is the breakdown of the 1-5 scale used by attendings?

  6. How does the information or skills you need to know as a 5th year compare to that of an intern or junior resident?

  7. How do you prepare for a case that you haven’t seen in a while or before?

By diving into residents' experiences and understanding how they are currently evaluated, we as a company can think of ways to capture those metrics in our own scoring algorithm for our VR simulations.
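One way the interview findings could translate into code is a score that maps VR performance onto the same 1-5 scale the attendings use. The sketch below is purely illustrative: the weights, the penalty, and the idea of tapering the time component to zero at twice the expert goal time are my assumptions, not the algorithm Osso VR shipped.

```python
def technique_score(elapsed_s: float, goal_s: float, errors: int,
                    time_weight: float = 0.6, error_penalty: float = 0.5) -> float:
    """Hypothetical scoring sketch mapping a VR run onto a 1-5 scale.

    - Time component: full marks at or under the expert goal time,
      tapering linearly to zero at 2x the goal time.
    - Accuracy component: each flagged mistake subtracts a fixed penalty
      (assuming roughly 0-5 errors per run is typical).
    """
    time_ratio = max(0.0, min(1.0, 2.0 - elapsed_s / goal_s))
    accuracy = max(0.0, 1.0 - error_penalty * errors / 5.0)
    raw = time_weight * time_ratio + (1.0 - time_weight) * accuracy
    return round(1.0 + 4.0 * raw, 1)  # scale [0, 1] onto the 1-5 range

print(technique_score(elapsed_s=150, goal_s=150, errors=0))  # at goal time, no errors
```

Keeping the output on a 1-5 scale is deliberate: it mirrors the evaluation scale residents already know from their attendings, which should make the score feel fairer and more familiar.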

Following the implementation of both the designs and the scoring algorithm, we were interested in getting an initial round of feedback on our dashboard and immediate feedback modal. To do so, we released the analytics dashboard internally.

Feedback

Feedback from Osso VR colleagues and team members focused on both the look and feel of the app and on bugs found while using the launcher. This initial release was extremely important for finding the most impactful bugs and fixing them before we began testing the dashboard with a small set of beta users.

Beta Users

To begin testing the new dashboard design with our end users, residents, we chose two residency programs in our Osso Training Network (residency programs that have opted to incorporate Osso VR to help train their orthopedic residents on techniques) to provide feedback.

To do so, I worked with our Director of Clinical Affairs and the Product Owner to develop a step-by-step PDF explaining the tasks we were asking these residents to complete with the Osso software. I also helped drive conversations between the program directors at these residency programs, our customer support team, and the residents, to maximize the feedback we received while limiting how long the programs held the beta version of the dashboard.

Feedback

We received feedback from approximately 7 residents to help us update the dashboard experience. The feedback included:

  • Love the ability to see a summary of stats right away

  • Enjoyed the ability to see more than just a score, liked seeing progress over the course of the technique

  • Scoring felt fair, though they would love to see the specific places where they hesitated, or their economy of motion, in future iterations of the dashboard, if we collect that kind of data

Learnings

  • It’s important to get product requirements solidified earlier in the design process so that the designer knows what to design instead of needing to guess.

  • When developing UI, it's okay to start with rough sketches before jumping directly into Sketch to make the UI look pretty. Content should trump aesthetics if you're looking to create a meaningful experience for users; otherwise you slow down your process and risk delivering something that isn't actually needed.

  • Design should be completed and approved before engineering starts to implement it, which means giving design a week or two ahead of engineering to define the user experience and interface. Also, if you have a gut feeling as a designer that something is off, leverage your ability to conduct user research to reach an answer and convince your product owner to change direction (e.g., the star rating for proficiency).

  • It's tough to get feedback from residents; you really need someone at the school helping to get them to actually run through the software. It helps to host a single day where the program director runs a number of residents through the techniques.