TL;DR
I planned, conducted, synthesized, and communicated research to assess user adoption of a new ad effectiveness metric. Along the way, I established the foundations for a UX Research practice, educated internal stakeholders on its value, and strengthened the case for investing in future research.
CLIENT — Cuebiq
TIMELINE — July 2020 - October 2020
TEAM — 1 designer and a consultant
ROLE — UX Research
TOOLS — Airtable, Confluence
Challenge
In 2020, Cuebiq released a new ad effectiveness research metric called the “incrementality effect”. If people visited a store more often after seeing an ad, those additional visits were attributed to the ad campaign as “incremental visits” - a metric that users of the reporting could point to in proving their campaign’s success. But while product stakeholders were confident in the metric’s potential impact, six months after its release, our client-facing teams surfaced concerns that clients weren’t adopting it.
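In rough terms, the metric compares the store-visit rate of an ad-exposed group against a control baseline. A minimal sketch of that arithmetic (the function, group sizes, and figures here are illustrative assumptions, not Cuebiq's actual methodology):

```python
# Hypothetical sketch of the "incremental visits" idea: compare the
# store-visit rate of users exposed to the ad against a control baseline.
# (Figures and function are illustrative, not Cuebiq's real methodology.)

def incremental_visits(exposed_visits, exposed_size, control_visits, control_size):
    """Visits above the baseline rate, attributed to the campaign."""
    exposed_rate = exposed_visits / exposed_size
    baseline_rate = control_visits / control_size
    # Extra visits the exposed group made beyond what the baseline predicts
    return (exposed_rate - baseline_rate) * exposed_size

# 1,200 of 10,000 exposed users visited vs. 1,000 of 10,000 in control:
print(incremental_visits(1200, 10_000, 1000, 10_000))  # roughly 200 incremental visits
```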
We decided to conduct some research to better understand the problem, theorizing that a revised incrementality data visualization might improve user adoption. Being a discovery-oriented research project, this initiative didn’t pose too many initial constraints, but we knew any design changes would need to fit well within the existing framework of reporting. Ultimately, we uncovered impactful findings to influence the product roadmap and set the stage for future research.
PROBLEM STATEMENT
We suspected that users struggled to understand incrementality, and to derive next steps for their media planning, because of how the metric was visualized.
Approach
RESEARCH
METHODS
After aligning with Product and Data Science on a project brief, we started by interviewing a group of customer-facing stakeholders. This collaboration surfaced important feedback to guide external user discovery - predominantly, a hunch that customers lacked confidence in the data, which kept them from adopting new metrics. I synthesized these early results and shared them with the project stakeholders. From there, we conducted discovery research with 4 user organizations.
Goals
1. Assess users’ understanding of incrementality
2. Gain clarity on how incrementality could be actionable to users
3. Inform any revisions to incrementality’s presentation in the platform
PROTOCOL
We interviewed 4 user organizations, divided between day-to-day users and higher-level decision-makers. The interview protocol consisted of open-ended questions about not only incrementality, but also the ad effectiveness research landscape and users’ reporting process.
Moving to user recruitment, I took the opportunity to establish a foundation for similar future initiatives, creating a research hub in our internal documentation tool to educate and get customer-facing teams excited about involving their clients in UX research.
BUILDING RESOURCES FOR INTERNAL COLLABORATORS
This initiative was an opportunity to educate internal stakeholders on the purpose and impact of UX research.
SYNTHESIS
I treated interview synthesis as a second phase of establishing a UX Research foundation, creating an Airtable research base to house quotes and feedback points tagged by categories like organization, feedback, and theme. This method let me draw out common conclusions across all our interviews.
This research base was eventually synced to a project management base, with the two being connected by a synced base of key terms common to both research and project planning. I came to use the project management base for project scoping and to maintain roadmap alignment with the product team. This infrastructure was not built overnight - it evolved over time to the format it lives in today, and will continue to evolve to meet the organization’s needs.
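The tagging scheme can be sketched in a few lines of Python (the records, organizations, and quotes below are invented placeholders; the actual synthesis lived in Airtable):

```python
from collections import defaultdict

# Hypothetical feedback records, tagged the way the Airtable base was:
# each quote carries an organization and a theme tag. All entries are made up.
feedback = [
    {"org": "Org A", "theme": "education", "quote": "We can't explain incrementality to clients yet."},
    {"org": "Org B", "theme": "education", "quote": "We need more context before recommending this."},
    {"org": "Org C", "theme": "benchmarks", "quote": "We build our own benchmarks from past campaigns."},
    {"org": "Org A", "theme": "benchmarks", "quote": "Historical context makes the numbers meaningful."},
]

# Group quotes by theme, then count distinct organizations per theme
# to surface which themes recur across interviews.
by_theme = defaultdict(list)
for item in feedback:
    by_theme[item["theme"]].append(item)

for theme, items in by_theme.items():
    orgs = {i["org"] for i in items}
    print(f"{theme}: {len(orgs)} org(s), {len(items)} quote(s)")
```

Grouping by tag rather than by interviewee is what makes cross-organization patterns (like the education theme below) visible.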
PROCESS DEVELOPMENT
Key feedback was logged in an Airtable base and tagged by categories like organization, feedback, and theme to draw out common conclusions.
SYNTHESIS THROUGH AIRTABLE
By synthesizing the feedback in Airtable, I was able to draw common conclusions and connect the findings to other UX initiatives.
FINDINGS
Going into our research, we suspected the metric was not presented in a way that users could understand and use. What we found was that adoption takes time, and our customers are people responsible for educating their own customers in turn. They need to feel confident in what they’re presenting to their own clients before they’ll be willing to adopt a new KPI.
Before we put in the resources to change the way incrementality looks and conveys information, we need to give users the context they need to feel comfortable using it.
theme #1
Education and organizational readiness are the biggest hurdles to user adoption of incrementality.
3 of 4 organizations spoke to difficulty understanding, explaining, or applying the concept of incrementality to their media. And they noted that because media companies act as advisors to their own clients, they need to feel confident in what they choose to recommend.
As a result, none of the active organizations had yet broached incrementality with their clients.
Bandwidth to self-educate also poses a challenge, with 50% of organizations lacking data science resources to leverage on their accounts.
theme #2
Users turn to familiar ways of validating their success.
Users need benchmarks and historical context to make their KPIs mean something in their data stories.
3 of 4 organizations noted that they build personalized benchmarks outside of the platform based on their historical performance.
Results
DESIGN RECOMMENDATIONS
After synthesizing the research, I sat down with the entire product and data science team (including those further removed from this topic) to share my findings and communicate design recommendations.
- Users aren’t comfortable explaining what incrementality means to their end clients. We can improve educational resources to address this.
- Users don’t have a way to present incrementality to their clients that feels natural. We can make it easier for them by providing a reporting template or generating a downloadable report (e.g. “week in review”).
- Users rely on benchmarks and historical context built outside the platform to validate their success. We can take several approaches to address this, including improving how historical performance is tracked, allowing individual campaigns to be grouped for holistic metrics, or showing benchmarks calculated at the vertical level within campaign reporting.
REFLECTION
NEXT STEPS
Based on the findings from this initiative, our team decided not to move forward with the initial idea of changing how incrementality was visualized. Instead, to address user needs directly, the design recommendations were added to our roadmap and backlog for future design and development.
This project also served as an example of how user research can drive product development. As a result, next steps include plans to conduct additional discovery research for Cuebiq’s most strategic initiatives.
KEY LEARNING
Setting up the infrastructure for regular research with customer account leads ahead of time can reduce lead time when discovery research projects do come around.
LESSONS LEARNED
This initiative uncovered some of the unique challenges that B2B user research can pose. Notably, recruiting and scheduling time with users who are high-level decision-makers at their organizations can be a long process - one that pre-built research infrastructure with customer account leads can shorten considerably.
Ultimately, one of the biggest lessons learned was also that research is not always linear. We might form hypotheses, come to high impact conclusions, and find at the end of research that design and development resources aren’t yet merited. Instead, by uncovering what users really need to make their lives easier, we can direct our resources to design changes that are more likely to have an immediate impact on the product experience.