How I developed a new, cross-functional design process that decreased project completion time by 50% and helped increase digital revenue by 74%.

My role

I researched and built out a flexible design process, initiating collaboration sessions with our UX and development team members. I also led design and frontend development for the pilot project and contributed to quantitative research (heatmap and A/B tests).

Summary

In the fall of 2020, Compassion Canada’s design and development team moved from an ad-hoc product development approach to one that baked in user-centred, data-informed decision making. During the pilot project, our most important “giving season” campaign, we saw digital revenue increase by 74%. We also saw the project completion timeline shrink from months to weeks, a decrease of more than 50% over the previous year.

Context

Compassion Canada knew it needed to do things differently. Projects were run in an ad-hoc way, making it hard to scope timelines and leading to lots of bottlenecks as teams tried to figure out “what to do next”. We also didn’t have good data to discern whether the changes we were making were helping or hindering our users. With our biggest campaign, Gifts of Compassion, on the horizon, we needed to be better.

The process

Goal

After I advocated for a more user-centred, data-based design process, my team manager gave me the go-ahead to create a “v1” process that we could test and iterate on.

User-focused, iterative design was still a new concept for many on our executive team. So we needed to focus on team- and marketing-oriented metrics that would speak to our leadership's current priorities and prove the value of this approach. With that in mind, we set targets to improve build time and conversion rates for Gifts of Compassion.

Research

To kick things off, I gathered as much research as I could. I met with colleagues from the design industry working at startups and larger tech companies to discuss their typical team workflows. I also studied standard, industry-leading processes used by companies like IDEO and Google.

I synthesized all the ideas into a living document in our project management tool, Asana, then organized all the different methods under six categories: Define the vision, Observation/Prep, Ideation, Rapid prototyping, Implementation, and Post-launch Research & Iteration. This document was not supposed to represent a strictly linear process, nor would we use every method for every project. Rather, it was built to be a toolbox we could use to quickly construct an effective process for any given project, regardless of unique user and business needs.

Key categories

  1. Define the vision

  2. Observation/Prep

  3. Ideation

  4. Rapid prototyping

  5. Implementation

  6. Post-launch research and iteration

Collaboration

Once I had the prototype process built out, I brought it to my team to get input from our UX researcher, developers and team manager. Some of the feedback that came out of the session was that user research was too front-loaded in the process; it should be present throughout design, testing and iteration. So we made some additions to each section to include it. We committed to iterating on the process during team retros and got ready to put it to use.

Constraints and trade-offs

Before we got started though, we had to embrace some constraints and trade-offs to make the process work this time around. First of all, our directors had a strong preference for quantitative over qualitative data, being much more motivated by the data gathered through Google Analytics and A/B testing than by what we could bring to the table through UX research. This, combined with a heavy preference for having a product in market, meant that we were limited to testing and iterating post-launch.

It would have been helpful to quickly test and iterate at the low-fidelity design stage, immediately incorporating feedback from real users between each iteration. That being said, because iteration and data-based decision making were new concepts for the organization, any chance to demonstrate their value was a huge opportunity.

With this in mind, we moved forward with plans to run multiple A/B tests on launch and to incorporate heatmapping and user recordings into our post-launch analysis. We also continued to seek out opportunities with both colleagues and leaders to advocate for the value of user-centred research and design.

IRL: Component-based iteration

Images: e-commerce page, e-commerce heatmap, mobile layout, mobile product page

In previous years we always rebuilt our Gifts of Compassion e-commerce microsite from the ground up. This time we used our design process toolbox to skip the rebuild and focus on incrementally improving our conversion rate and digital revenue: we added a “fundraise” feature to allow users to fundraise for a specific item, we improved both real and perceived load times, and we made detail-oriented improvements to our filter and mobile “add to cart” process. We also re-skinned the microsite to ensure consistent branding across the entire campaign experience.
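
On the perceived-load-time front, one common technique for an image-heavy shop page is lazy-loading below-the-fold product imagery. As a minimal sketch of that idea (illustrative only, not our production code; the `data-src` attribute and `.gift-card` selector are hypothetical names), an IntersectionObserver-based loader in TypeScript could look like this:

```typescript
// Minimal lazy-image loader: defer full-resolution product images until
// they approach the viewport, so the initial render feels fast.
// `data-src` and the `.gift-card` selector are hypothetical names.
const observer = new IntersectionObserver(
  (entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target as HTMLImageElement;
      const fullSrc = img.dataset.src;
      if (fullSrc) {
        img.src = fullSrc; // swap the low-res placeholder for the real asset
      }
      obs.unobserve(img); // each image only needs to load once
    }
  },
  { rootMargin: "200px" } // start loading slightly before the image scrolls in
);

document
  .querySelectorAll<HTMLImageElement>(".gift-card img[data-src]")
  .forEach((img) => observer.observe(img));
```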

Initial A/B tests

Round 1: Small iterations

Because we’re a small team, we decided to focus on small iterations to begin with, in order to allow space for us to design and develop larger changes for the third test. Each test brought about incremental improvement.

Results: Remove video CTA from hero section

4.8%

increase in conversion rate

1.2%

decrease in bounce rate

Key learning: Though larger changes often lead to larger shifts in metrics, small changes are worthwhile too. Marginal gains can add up to be significant gains over multiple tests.

Results: Change hero image to match ads

2%

increase in conversion rate

4%

decrease in bounce rate

Key learning: Using consistent imagery across ads and pages can lead to better performance. Furthermore, ad metrics can be used across platforms to improve the performance of e-commerce products.
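
For context on how we judged results like these: a conversion-rate A/B test boils down to comparing two proportions, and a two-proportion z-test is the standard way to check that a lift isn’t just noise. Here’s a minimal sketch in TypeScript, using hypothetical visitor counts rather than our actual traffic:

```typescript
// Two-proportion z-test: is variant B's conversion rate significantly
// different from variant A's? All counts here are hypothetical examples.
function zTest(
  convA: number, visitorsA: number,
  convB: number, visitorsB: number
): { z: number; pValue: number } {
  const pA = convA / visitorsA;
  const pB = convB / visitorsB;
  // Pooled rate under the null hypothesis of "no difference"
  const pPool = (convA + convB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pPool * (1 - pPool) * (1 / visitorsA + 1 / visitorsB));
  const z = (pB - pA) / se;
  return { z, pValue: 2 * (1 - normalCdf(Math.abs(z))) }; // two-tailed
}

// Standard normal CDF, Abramowitz & Stegun approximation (valid for x >= 0)
function normalCdf(x: number): number {
  const t = 1 / (1 + 0.2316419 * x);
  const d = 0.3989423 * Math.exp(-(x * x) / 2);
  const tail =
    d * t * (0.3193815 + t * (-0.3565638 + t * (1.7814779 + t * (-1.821256 + t * 1.3302744))));
  return 1 - tail;
}

// Hypothetical: 10,000 visitors per variant, 4.8% vs 5.05% conversion
console.log(zTest(480, 10000, 505, 10000)); // p < 0.05 would suggest a real lift
```

In practice an A/B testing tool reports this significance for you, but knowing what’s underneath helps when deciding how long to let a test run.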

Round 2: Mobile focus

After launching the initial tests we saw that, though our desktop conversion rate was great (13.09%), our mobile conversion rate was lagging significantly behind (3.03%). Some variance between desktop and mobile conversion is to be expected, but this gap was much larger than industry benchmarks. Our heatmaps also showed us that, when it came to scroll depth, the majority of our mobile users were not getting beyond the initial hero section. So for our major iteration we decided to focus on improving our mobile experience.

In this case, our design changes focused on making content more scannable and reducing the depth of scroll necessary to reach the item section of our shop page. Because we were making multiple changes to content and layout, we decided to run a multivariate test.

Results: Mobile focus

16%

decrease in conversion rate

1%

decrease in bounce rate

37%

decrease in revenue

Key learning: Multivariate (MVT) tests require a lot of traffic. Our changes weren’t major but we still had 9 variants to test. It took a long time to gather statistically significant metrics across all variants.

Furthermore, we found that some of our hypotheses were incorrect. The original performed best. It seems that content visibility can take precedence over scroll depth or reducing cognitive load.
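
To make the traffic requirement concrete, the standard two-proportion sample-size formula shows how quickly the numbers multiply: every extra variant needs its own full sample. A rough sketch, using a hypothetical 3% mobile baseline rather than our exact figures:

```typescript
// Rough per-variant sample size for detecting a relative lift in conversion
// rate at 95% confidence and 80% power. Inputs below are hypothetical.
function sampleSizePerVariant(baselineRate: number, relativeLift: number): number {
  const zAlpha = 1.96; // two-tailed, 95% confidence
  const zBeta = 0.84;  // 80% power
  const p1 = baselineRate;
  const p2 = baselineRate * (1 + relativeLift);
  const pBar = (p1 + p2) / 2;
  const numerator =
    zAlpha * Math.sqrt(2 * pBar * (1 - pBar)) +
    zBeta * Math.sqrt(p1 * (1 - p1) + p2 * (1 - p2));
  return Math.ceil(numerator ** 2 / (p2 - p1) ** 2);
}

// Hypothetical: 3% baseline, aiming to detect a 10% relative lift
const n = sampleSizePerVariant(0.03, 0.1);
console.log(`~${n} visitors per variant`);            // ≈ 53,000
console.log(`~${n * 9} visitors across 9 variants`);  // why our MVT took so long
```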

Overall results

Using our new process for the first time brought about a lot of wins. During team retros we noted that cross-team communication was much better. We were able to run multiple rounds of A/B testing for the first time and, because of this, saw measurable improvements in both bounce rate and conversion rate: 5.2% and 6.8% respectively.

Focusing on iterative, component-based design meant that project completion time was cut in half, from 12 to 6 weeks.

The biggest change, however, was the 74% increase in digital revenue over our previous year’s campaign. At this point I should call out the work of our Marketing team. They increased ad spend across our digital platforms and worked hard to improve the effectiveness of our ads, stories and social posts, enlisting their own testing methodologies. This metric wouldn’t have been hit without them funnelling a lot more people to our site and priming them to give.

That being said, I can’t sell short the impact that our team’s new approach had on the campaign’s success. Our focus shifted to the user, from the beginning of the project onwards, and that created an experience that made giving feel good. The increase in revenue confirmed our hypothesis that data-based, iterative design could help us better understand our users’ needs and, in turn, give them a barrier-free experience when it came time to donate.

50%

Decrease in project completion time

74%

Increase in campaign revenue

Overall learnings

Problem solving: While the obvious benefit of component-based design and development was faster build times, we saw some unexpected benefits as well. It helped us break down the design problems themselves and target specific metrics. Because of this we saw improved communication across teams and reduced scope creep.

We need more testing: As we suspected at the beginning of the project, more varied research and testing are required to gain insight into more complex problems. A/B testing and heatmaps weren’t enough to understand why users on phones weren’t following through on their intent to donate. Among other options, user testing the current “Add to cart” and “Checkout” flows might have provided us with the qualitative data we needed for deeper insights into the problem.

Relationships are key: Helping an organization grow in design maturity is as much about understanding and educating as it is about executing. In order to get buy-in to test this new approach, I first had to listen to understand what kind of progress my team members and leaders were targeting for the future. Next, I invested a lot of time in conversations, getting the organization pumped up about how iterative, user-focused design could help us achieve those goals. It was only then that I was able to start solving the process- and design-oriented problems that would help us move forward.
