Solving Participant Quality Issues: How I Designed a Customer-Driven Research Channel to Improve UX Research Recruitment

Objectives

Traditional recruitment platforms weren’t delivering genuine, high-quality participants for UX research on business insurance.

 

This case study explores how I improved recruitment by implementing a multi-channel approach - enhanced screening on our specialist platform, a pilot direct outreach programme to real customers, and agency-supplemented recruitment.

 

The result? Stronger insights, better business decisions, and a repeatable framework for future research.

Role: UX Designer

Timeline: 3 months

Tools: Google Analytics (GA), UserTesting.com, Figma, Microsoft Forms

WHO?

SME business owners and decision-makers in business insurance, recruited for UX research.

WHAT?

A new customer-driven research recruitment channel, combining enhanced screening on a specialist platform and direct outreach to real customers.

WHEN?

The initiative ran over Q3 2024, solving ongoing participant quality issues that had delayed research for months.

WHERE?

Research recruitment was conducted via email outreach, an external agency, and a specialist recruitment platform, with interviews scheduled remotely.

 

WHY?

The existing specialist platform was delivering unreliable participants, making it difficult to complete research rounds on time. The new approach improved participant quality, streamlined scheduling, and enabled better product insights.

The Process

DISCOVER

DEFINE

DELIVER

MEASURE

Understanding The Problem

Discover

The Challenge:

  • Recruiting high-quality, genuine participants was extremely difficult using the specialist platform.

  • Many participants were disingenuous, signing up with multiple accounts and misrepresenting demographics (e.g., claiming business insurance expertise they didn’t have).

  • Screening issues made it impossible to plan research reliably, delaying testing for months and undermining confidence in insights - around 50% of research findings were unreliable.

The Business Impact:

  • Delays in research cycles, impacting roadmaps and decision-making.

  • Wasted time and resources screening participants who turned out to be unsuitable.

  • Poor confidence in insights - business decisions were being made based on questionable data.


Define

Defining a Solution

A Multi-Channel Recruitment Approach

Enhancing the Specialist Platform with Video Screening

  • Documented issues with fake participants and worked with the platform provider.

  • Introduced short video tasks for screening - allowing me to review responses in my own time.

  • Improved candidate filtering while still leveraging the platform we were already contracted with.

Piloting a Direct Customer Outreach Channel

  • Why Customers? They were our exact target audience.

  • Email Campaign: Collaborated with the CRM team to reach a 5,000-person audience pool.

  • Screening & Scheduling: Designed a fully automated system:

    • Email ➝ Screener ➝ Calendar booking system linked to my work schedule.

  • This gave participants confidence that the research was legitimate.

Bringing in an External Agency for Additional Participants

  • Worked with Marketing’s recruitment agency to supplement participants.

  • Handled costings, scheduling, and payments.

  • Pros: Higher quality candidates, pre-screened.

  • Cons: Less control over scheduling; one participant accidentally joined another's session.


Research Flow

Outreach Flow

Deliver

Running & Managing the Pilot

End-to-End Process

  • Managed two separate scheduling calendars to prevent conflicts.

  • Ensured session attendance & correct incentive distribution (briefed colleagues on session details).

  • Ran two rounds of testing - one with customer-recruited participants, one with agency-recruited participants.

  • Validated that business hours were better for participation (backed by GA data).


Customer Outreach Campaign

Measure

The Measurable Impact

  • 44% open rate (well above company average).

  • 36% screener completion rate.

  • Recruited 5 participants from outreach + 5 from the agency, enabling two full rounds of user testing.
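The outreach numbers above can be read as a simple recruitment funnel. Here is a minimal sketch of that arithmetic, assuming (for illustration only - the case study doesn't specify the base) that the 36% screener completion rate applies to email opens:

```python
def recruitment_funnel(pool: int, open_rate: float, screener_rate: float) -> dict:
    """Estimate how many people survive each stage of the outreach funnel.

    Illustrative assumption: screener_rate is applied to opens,
    not to the whole pool.
    """
    opens = round(pool * open_rate)
    completions = round(opens * screener_rate)
    return {"pool": pool, "opens": opens, "screener_completions": completions}

# Figures from the case study: 5,000-person pool, 44% open rate,
# 36% screener completion rate.
print(recruitment_funnel(5000, 0.44, 0.36))
```

Even under this generous reading, only a handful of completions convert into scheduled sessions, which is why the two recruitment channels together yielded ten participants.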

Business Outcomes

  • Research insights improved significantly - stronger data led to better product decisions.

  • Validated that direct customer outreach is a viable, repeatable method for participant recruitment.

  • Created an internal research database, allowing teams to recruit future participants without relying solely on external sources.

Retrospective & Next Steps

What went well

  • Successfully created a repeatable participant recruitment process that could be expanded.

  • Demonstrated that customers were willing to participate, showcasing that the company was listening to feedback.

  • Solved a long-standing recruitment issue, which other teams later adopted.

Challenges & Areas for Improvement

  • Agency scheduling issues - some mix-ups due to lack of direct control.

  • Customer outreach candidates were harder to follow up with - some no-shows, and participants less accustomed to user research.

  • A potential improvement would be targeted follow-ups - such as a personalised email or a second, softer CTA - to re-engage those who completed the screener but didn't accept the invite to a session.


Final Thought

A strong UX research process isn’t just about gathering insights - it’s about making research sustainable and scalable for the business.

Thanks for reading.

Get in touch!

© 2025 by Kate Hayes UX Design.
