Introducing AB Group: Run A/B Experiments Inside a Single Survey

We're excited to announce AB Group — a new question group type that lets you run A/B experiments inside a single survey, no second link required.

Until now, testing two versions of a question meant building two surveys, distributing two links, and stitching results together afterward. With AB Group, you add multiple questions to one group, assign each a display probability, and Zonka serves only one of them to each respondent — automatically.


What's New?

🎲 Probability-Based Question Display

Add multiple questions to an AB Group and assign each a display weight — 50/50, 70/30, 60/20/20, or any split that sums to 100%. Zonka handles the random assignment per respondent.
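Conceptually, the assignment works like a weighted random draw per respondent. Here's a minimal sketch of that idea in Python — the question names and the exact validation rule are illustrative assumptions, not Zonka's actual data model:

```python
import random

# Illustrative: each question in the group carries a display weight,
# and one question is drawn per respondent in proportion to those weights.
questions = ["nps_question", "ces_question"]  # hypothetical variant IDs
weights = [70, 30]                            # a 70/30 split

def assign_variant(questions, weights):
    """Pick one question for a respondent, weighted by display probability."""
    if sum(weights) != 100:
        raise ValueError("Display weights must add up to 100")
    return random.choices(questions, weights=weights, k=1)[0]

# Over many respondents, roughly 70% would see the NPS variant.
counts = {q: 0 for q in questions}
for _ in range(10_000):
    counts[assign_variant(questions, weights)] += 1
```

The key point: the split is enforced probabilistically across respondents, so each individual respondent sees exactly one variant and the overall traffic converges on the weights you set.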

📊 Built-In Comparison

Because both variants live in one survey under one response stream, results sit side-by-side in your reports — no manual reconciliation, no cross-survey math.


Why It Matters

Survey experimentation usually carries enough overhead that teams skip it entirely. AB Group removes the friction:

  • Test the metric, not just the wording — run NPS against CES on the same audience to see which one your customers actually engage with
  • Find phrasings that get clearer answers — test "How likely are you to recommend us?" against "Would you recommend us to a friend?" without splitting your sample
  • Compare response richness — buttons vs. open-text key driver questions, side by side
  • Make incremental survey decisions with real data — not opinions

Example Use Cases

📈 Metric Selection

Test NPS vs. CES on a post-purchase survey to see which one your e-commerce audience answers more consistently.

✍️ Question Phrasing

Compare two wordings of the same key question to find which gets higher completion and clearer responses.

🎛️ Answer Format Testing

Pit a button-choice key driver question against an open-text version to see which one yields richer qualitative input.


Plan Availability

AB Group is available on all plans.