
Improving Qualtrics Survey Data Collection with Advanced Testing

Agency: U.S. Department of Health & Human Services (HHS)
Op/Div: Office of the Assistant Secretary for Public Affairs (ASPA)
URL: https://www.hhs.gov/
Services: Strategic test planning and hypothesis formulation; implementation of A/B tests using Qualtrics; detailed analytics to measure response rates and satisfaction; technical adjustments to survey designs based on user feedback.

Introduction

Project Context: ASPA Digital faced challenges with the invasive nature of popover survey intercepts on HHS.gov, which negatively impacted user satisfaction scores. To address this, Analytics Logic was tasked with optimizing the feedback collection process through strategic A/B testing.

Challenges

Key Challenges:

  • Disproportionate negative feedback from popover surveys due to their intrusive nature.
  • Difficulty in optimizing survey intercepts to balance response rates and user satisfaction.

Solutions

Solution Implementation: Analytics Logic implemented a rigorous A/B testing framework:

  • Hypothesis Development: Proposed that while popover intercepts might increase response rates, they could lower satisfaction because of their disruptiveness.
  • Experimentation: Conducted A/B tests comparing popover and slider intercepts across desktop and mobile platforms, adjusting variables such as intercept type, activation conditions, and survey design elements to determine the optimal setup for user interaction (a simplified variant-assignment sketch follows this list).
  • Analysis and Implementation: Analyzed the results to identify which intercepts struck the best balance between response rate and user satisfaction. The findings led to deploying the slider intercept on desktop and a dialog variant on mobile, both of which proved less intrusive and more effective.
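
As an illustration only, the sketch below shows one way such a test could assign visitors to a control popover or a device-appropriate treatment intercept. The function name, storage key, and 50/50 split are assumptions for this example, not details of the actual Qualtrics configuration.

```typescript
// Illustrative only: assign each visitor to a control (popover) or a
// device-appropriate treatment (slider on desktop, dialog on mobile).
// Names and the storage key are hypothetical, not Qualtrics' API.

type Variant = "popover" | "slider" | "dialog";

function assignVariant(isMobile: boolean): Variant {
  const key = "intercept_variant";
  const stored = window.localStorage.getItem(key) as Variant | null;
  if (stored) return stored; // keep the same variant across page views

  const control: Variant = "popover";
  const treatment: Variant = isMobile ? "dialog" : "slider";
  const variant = Math.random() < 0.5 ? control : treatment;
  window.localStorage.setItem(key, variant);
  return variant;
}
```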

Team Expertise: Led by data scientists and UX specialists, the project used advanced analytical tools and user experience testing to ensure precise and actionable outcomes.

Services Rendered

  • Strategic planning and hypothesis testing
  • Implementation of A/B tests using front-end tools like Optimizely
  • Detailed analytics to measure response rates and satisfaction (a simplified metric calculation is sketched after this list)
  • Technical adjustments to survey designs based on user feedback
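
As a rough illustration of the analytics step, the sketch below computes a response rate and mean satisfaction score per variant from logged intercept events. The event shape and field names are assumptions for this example, not the project's actual data model.

```typescript
// Illustrative per-variant metrics; the event shape is an assumption.
interface InterceptEvent {
  variant: string;        // e.g. "popover", "slider", "dialog"
  responded: boolean;     // visitor submitted the survey
  satisfaction?: number;  // e.g. a 1-5 score, present only for responders
}

function summarize(events: InterceptEvent[], variant: string) {
  const shown = events.filter((e) => e.variant === variant);
  const responses = shown.filter((e) => e.responded);
  const scores = responses
    .map((e) => e.satisfaction)
    .filter((s): s is number => s !== undefined);

  return {
    responseRate: shown.length ? responses.length / shown.length : 0,
    meanSatisfaction: scores.length
      ? scores.reduce((a, b) => a + b, 0) / scores.length
      : NaN,
  };
}
```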

Results

Measurable Outcomes:

  • A 15% improvement in the overall feedback rate through the introduction of less intrusive survey methods.
  • A significant enhancement of the on-site user experience, making feedback collection easier and more effective.

Detailed Findings and Recommendations:

  • Desktop Testing: Slider intercepts drew a 19.69% higher response rate than popovers but a 37.90% lower satisfaction score, indicating that the survey design itself needs further optimization to improve satisfaction.
  • Mobile Testing: Initial tests with slider intercepts revealed poor mobile responsiveness, leading to lower satisfaction and response rates. Subsequent testing with a dialog intercept improved response rates by up to 316.38% (the relative-lift calculation behind figures like these is sketched below).
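
For readers unfamiliar with how such percentages are typically reported, the snippet below shows the standard relative-lift formula, (treatment − control) / control. The rates in the example are placeholders, not the study's raw numbers.

```typescript
// Relative lift as commonly reported: (treatment - control) / control.
function relativeLift(controlRate: number, treatmentRate: number): number {
  return (treatmentRate - controlRate) / controlRate;
}

// Placeholder example: a 2.0% control response rate against an 8.3%
// treatment rate would be reported as roughly a 315% improvement.
console.log((relativeLift(0.02, 0.083) * 100).toFixed(2) + "%"); // "315.00%"
```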

Recommendations for Further Optimization:

  • Test different activation depths for surveys (a scroll-depth activation sketch follows this list) and reduce cognitive friction by simplifying survey questions.
  • Consider design modifications such as text field changes, progress indicators, and survey navigation improvements to enhance user interaction.
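
As a minimal sketch of what varying activation depth could look like on the front end, the example below fires the intercept once a visitor has scrolled past a configurable fraction of the page. The 0.5 threshold and the show callback are illustrative values under this assumption, not the configuration actually tested.

```typescript
// Show the intercept only after the visitor scrolls past `threshold`
// (a fraction of total page height). Threshold values would be the
// variable under test; 0.5 below is purely illustrative.
function activateAtDepth(threshold: number, show: () => void): void {
  let fired = false;
  window.addEventListener("scroll", () => {
    if (fired) return;
    const depth =
      (window.scrollY + window.innerHeight) / document.body.scrollHeight;
    if (depth >= threshold) {
      fired = true;
      show();
    }
  });
}

activateAtDepth(0.5, () => console.log("display survey intercept"));
```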

Conclusion

Summary of Success: This case study illustrates how targeted A/B testing can refine user feedback mechanisms on government websites, enhancing both the quantity and quality of user interactions. The adjustments made from these tests significantly improved the user experience, demonstrating the value of data-driven website optimization.