Customer feedback suggested our users were struggling to analyse their results in one of our products, preventing them from getting the best possible outcomes.
Translating customer feedback into product insights to better understand the problem
As a team, we identified that we needed a clearer picture of which parts of our product were contributing to the problem. Since the feedback was centered around our tree-testing product, we then needed to narrow down what about it was causing confusion.
I began with desk research, reviewing research previously done on the tool and working with our data team to gather usage statistics on different parts of the product, to better understand the bigger picture.
We still needed more detail, so the next step was to write a discussion guide and meet with customers to discuss how they used the tool and which parts were confusing for them. I created a spreadsheet to track which features each participant used most, and to correlate that with the feedback we had originally gathered.
After we had concluded the interviews with our participants, I tagged and analysed the results, deriving further insights to take to the wider team.
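To illustrate the kind of tally this produced, here is a minimal sketch of counting tagged notes by feature and theme. The data shapes, feature names, and tags are hypothetical stand-ins, not our actual research data:

```typescript
// Hypothetical shape of one tagged interview note.
interface TaggedNote {
  participant: string;
  feature: string; // e.g. "pie-tree", "results-overview"
  theme: string;   // e.g. "confusing", "useful-for-stakeholders"
}

// Count how often each (feature, theme) pair appears across all notes —
// roughly what the tracking spreadsheet captured.
function tallyThemesByFeature(notes: TaggedNote[]): Map<string, Map<string, number>> {
  const tally = new Map<string, Map<string, number>>();
  for (const note of notes) {
    const themes = tally.get(note.feature) ?? new Map<string, number>();
    themes.set(note.theme, (themes.get(note.theme) ?? 0) + 1);
    tally.set(note.feature, themes);
  }
  return tally;
}

// Example: two participants both found the pie-tree confusing.
const notes: TaggedNote[] = [
  { participant: "P1", feature: "pie-tree", theme: "confusing" },
  { participant: "P2", feature: "pie-tree", theme: "confusing" },
  { participant: "P1", feature: "results-overview", theme: "useful-for-stakeholders" },
];
console.log(tallyThemesByFeature(notes)); // pie-tree → { confusing: 2 }, ...
```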
Facilitating ideation workshops with the team to identify potential solutions
Once we had gathered the necessary information, the next step was to bring our findings to the team, discuss them in more depth, and identify which areas we could improve to remedy the issues. This session was also invaluable for deciding how to measure success: which product metrics we should monitor, and what follow-up research might be worthwhile.
Through this process, we categorized our solutions into three areas, which shaped the work described in the sections below.
Documenting our existing product ecosystem to scope the potential impact of solutions
To help my team better understand this part of our product, I began documenting previous iterations of it through screenshots and other artifacts. This helped us understand the overall scope of our changes, and led to the realization that our product carried a lot of legacy decisions that may have been contributing to the confusion.
Re-thinking how customers analyse their tasks
We found through our research that customers primarily analyse their results on a per-task basis, which our tool didn't support well: in many cases we showed all of the tasks at once for a single analysis method.
We decided that shifting the analysis workflow to focus on per-task analysis was a good way forward, and would hopefully reduce cognitive load when using the tools. Another relevant insight from our research was that an overview of the results still had a lot of utility for researchers when presenting back to stakeholders, so we found ways to balance the two.
Here is a small example of how our new approach enabled us to improve the workflow for one of our most well-known analysis methods:
The original iteration of the pie-tree opened in a new window, meaning users had to move back and forth between that window and the rest of the tool. It also had to be opened separately for each task, so there was no easy way to switch between tasks in that view.
The updated version has been moved into a new analysis tab with the rest of the visualizations in our tool, and uses a drop-down component to switch between tasks easily.
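As a rough illustration of this pattern (a sketch only, with hypothetical component and prop names rather than our production code), a React/TypeScript drop-down that swaps which task's results are rendered might look like this:

```tsx
import { useState } from "react";

interface Task {
  id: string;
  label: string;
}

// Placeholder for the real per-task visualization.
function PieTree({ taskId }: { taskId: string }) {
  return <p>Pie-tree results for task {taskId}</p>;
}

// A drop-down that switches which task's pie-tree is shown,
// without leaving the analysis tab.
function TaskSwitcher({ tasks }: { tasks: Task[] }) {
  const [taskId, setTaskId] = useState(tasks[0]?.id ?? "");

  return (
    <div>
      <select value={taskId} onChange={(e) => setTaskId(e.target.value)}>
        {tasks.map((task) => (
          <option key={task.id} value={task.id}>
            {task.label}
          </option>
        ))}
      </select>
      <PieTree taskId={taskId} />
    </div>
  );
}
```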
We consulted users on this new workflow, and they all felt it was an improvement on the original design that simplified much of the work of analysing tasks and presenting back to stakeholders. Hurrah!
Creating a plan for delivering significant design changes over time, to minimize impact and risk
The team identified early on that this wouldn't be as simple as improving a single feature, as the problem appeared to be rooted in how customers understood our data and workflow.
This meant that, to make some of the larger information architecture (IA) changes, we needed to be more strategic about how we planned delivery, to lower the risk of breaking parts of our product experience in unintended ways.
I worked closely with the Product Manager and Senior Engineers on my team to balance these changes correctly, and to make sure we could measure each one as independently as possible.
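One common way to stage changes like this is behind feature flags, so each piece can be rolled out to a small cohort and measured on its own. The sketch below shows the general idea under that assumption; the flag names and rollout logic are hypothetical, not a description of our actual implementation:

```typescript
// Hypothetical flag keys, one per independent IA change, so each can be
// rolled out to a cohort and measured separately.
type FlagKey = "per-task-analysis" | "pie-tree-in-tab" | "new-results-nav";

interface FlagService {
  isEnabled(flag: FlagKey, userId: string): boolean;
}

// A trivial in-memory rollout: enable a flag for a fixed percentage of users.
function percentageRollout(enabled: Partial<Record<FlagKey, number>>): FlagService {
  return {
    isEnabled(flag, userId) {
      const percent = enabled[flag] ?? 0;
      // Stable hash of the user id into a 0–99 bucket, so a user
      // keeps the same cohort across sessions.
      let hash = 0;
      for (const char of userId) hash = (hash * 31 + char.charCodeAt(0)) % 100;
      return hash < percent;
    },
  };
}

const flags = percentageRollout({ "per-task-analysis": 10 }); // 10% of users
if (flags.isEnabled("per-task-analysis", "user-123")) {
  // Render the new per-task workflow; otherwise fall back to the old one.
}
```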
Partnering with product marketing to bring our changes to market
Working in tandem with the product marketing team, we wrote entirely new help center content covering our changes to the product. Along the way, we set up a new process for auditing and cataloguing content gaps, which other teams can now utilize.
Continuous monitoring of feedback & usage data to identify areas for improvement
To measure our success continually, we review inbound feedback from Intercom and Pendo, alongside regular reviews of the BI dashboards we set up in Holistics.
As a result of these mechanisms, we have seen overall improvement in some areas, and identified others that still need work. Balancing qualitative and quantitative data is critical to building the best possible picture of how we are tracking.
We are currently running further discovery testing with customers, to capture anything we may have missed and to validate the improvements.