The process
Goal
After I advocated for a more user-centred, data-based design process, my team manager gave me the go-ahead to create a "v1" process that we could test and iterate on.
User-focused, iterative design was still a new concept for many on our executive team, so we needed to focus on team- and marketing-oriented metrics that would speak to our leadership's current priorities and prove the value of this approach. With that in mind, we set targets to improve build time and conversion rates for Gifts of Compassion.
Research
To kick things off, I gathered as much research as I could. I met with colleagues from the design industry working at startups and larger tech companies to discuss their typical team workflows. I also studied standard, industry-leading processes used by companies like IDEO and Google.
I synthesized all the ideas into a living document in our project management tool, Asana, then organized the different methods under six categories: Define the vision, Observation/Prep, Ideation, Rapid prototyping, Implementation, and Post-launch Research & Iteration. This document was not meant to represent a strictly linear process, nor would we use every method on every project. Rather, it was built as a toolbox we could use to quickly construct an effective process for any given project, whatever its unique user and business needs.
Collaboration
Once I had the prototype process built out, I brought it to my team to get input from our UX researcher, developers and team manager. Some of the feedback that came out of the session was that user research was too front-loaded in the process; it should be present throughout design, testing and iteration. So we made additions to each section to include it. We committed to iterating on the process during team retros and got ready to put it to use.
Constraints and trade-offs
Before we got started, though, we had to embrace some constraints and trade-offs to make the process work this time around. First of all, our directors had a strong preference for quantitative over qualitative data, being much more motivated by the data gathered through Google Analytics and A/B testing than by what we could bring to the table through UX research. This, combined with a heavy preference for having a product in market, meant that we were limited to testing and iterating post-launch.
It would have been helpful to quickly test and iterate at the low-fidelity design stage, immediately incorporating feedback from real users between each iteration. That being said, because iteration and data-based decision making were new concepts for the organization, any chance to demonstrate their value was a huge opportunity.
With this in mind, we moved forward with plans to run multiple A/B tests on launch and to incorporate heat-mapping and user recordings into our post-launch analysis. We also continued to seek out opportunities with both colleagues and leaders to advocate for the value of user-centred research and design.
IRL: Component-based iteration
[Images: e-commerce page, e-commerce heatmap, mobile layout, and mobile product page]
Initial A/B tests
Round 1: Small iterations
Because we're a small team, we decided to focus on small iterations to begin with, in order to allow space for us to design and develop larger changes for the third test. Each test brought about incremental improvement.
Results: Remove video CTA from hero section
4.8% increase in conversion rate
1.2% decrease in bounce rate
Results: Change hero image to match ads
2% increase in conversion rate
4% decrease in bounce rate
Round 2: Mobile focus
After launching the initial tests, we saw that, though our desktop conversion rate was great (13.09%), our mobile conversion rate was lagging far behind (3.03%). Some variance between desktop and mobile conversion is to be expected, but this gap was much larger than industry benchmarks. Our heatmaps also showed that, when it came to scroll depth, the majority of our mobile users were not getting beyond the initial hero section. So for our major iteration, we decided to focus on improving the mobile experience.
In this case, our design changes focused on making content more scannable and reducing the depth of scroll necessary to reach the item section of our shop page. Because we were making multiple changes to content and layout, we decided to run a multivariate test.
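For readers curious about the mechanics, here is a minimal sketch in Python of how conversion counts from a test like this can be checked for statistical significance with a chi-square test. The variant names and the visitor/conversion numbers are illustrative placeholders, not our actual campaign data.

```python
# Minimal sketch: compare conversion rates across test variants.
# All numbers below are hypothetical placeholders for illustration.
from scipy.stats import chi2_contingency

# (conversions, non-conversions) per variant
variants = {
    "control":           (91, 2909),   # original mobile layout
    "scannable_content": (118, 2882),  # shorter, more scannable copy
    "reduced_scroll":    (104, 2896),  # items moved closer to the top
    "combined":          (131, 2869),  # both changes together
}

# Build the contingency table and test whether conversion differs
# across variants.
table = [list(counts) for counts in variants.values()]
chi2, p_value, dof, expected = chi2_contingency(table)

for name, (conv, non_conv) in variants.items():
    rate = conv / (conv + non_conv)
    print(f"{name:>18}: {rate:.2%} conversion")

print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Conversion rates likely differ between variants.")
else:
    print("No significant difference detected at the 95% level.")
```

In practice, a testing tool like Google Optimize reports this kind of significance for you; the sketch is just to show what is being evaluated under the hood.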
Results: Mobile focus
16% decrease in conversion rate
1% decrease in bounce rate
37% decrease in revenue
Overall results
Using our new process for the first time brought a lot of wins. During team retros we noted that cross-team communication was much better. We were able to run multiple rounds of A/B testing for the first time and, because of this, saw measurable improvements in both bounce rate and conversion rate: 5.2% and 6.8%, respectively.
Focusing on iterative, component-based design meant that project completion time was cut in half, from 12 to 6 weeks.
The biggest change, however, was the 74% increase in digital revenue over our previous year's campaign. At this point, I should call out the work of our Marketing team. They increased ad spend across our digital platforms and worked hard to improve the effectiveness of our ads, stories and social posts, enlisting their own testing methodologies. This metric wouldn't have been hit without them funnelling a lot more people to our site and priming them to give.
That being said, I can't sell short the impact that our team's new approach had on the campaign's success. Our focus shifted to the user from the beginning of the project onwards, and that created an experience that made giving feel good. The increase in revenue confirmed our hypothesis that data-based, iterative design could help us better understand our users' needs and, in turn, give them a barrier-free experience when it came time to donate.
Overall learnings
Problem solving: While the obvious benefit of component-based design and development was faster build times, we saw some unexpected benefits as well. It helped us break down the design problems themselves and target specific metrics. Because of this, we saw improved communication across teams and reduced scope creep.
We need more testing: As we suspected at the beginning of the project, more varied research and testing are required to gain insight into more complex problems. A/B testing and heatmaps weren't enough to understand why users on phones weren't following through on their intent to donate. Among other options, user testing the current "Add to cart" and "Checkout" flows might have provided the qualitative data we needed for deeper insight into the problem.
Relationships are key: Helping an organization grow in design maturity is as much about understanding and educating as it is about executing. In order to get buy-in to test this new approach, I first had to listen to understand what kind of progress my team members and leaders were targeting for the future. Next, I invested a lot of time in conversations, getting the organization pumped up about how iterative, user-focused design could help us achieve those goals. It was only after that groundwork that I was able to move forward with solving the process- and design-oriented problems themselves.