Yesterday, we shared a powerful prompt for analyzing A/B test results. But reading a prompt and seeing it work on real data are two very different things.

That’s why today, our Premium readers get to see it in action—with real data, real analysis, and real takeaways.

If you’re on the free plan, here’s the prompt again:

I’m [mention the problem you’re facing in detail with background context]. Act as a data analyst specializing in A/B testing. I will give you test results in this format: [insert sample data, e.g., impressions, clicks, conversions for A vs. B].

Your tasks are:

1. Analyze the results and determine whether the difference is statistically significant.
2. Explain which variation is better and why.
3. Highlight behavioral insights revealed by the data.
4. Provide 3 actionable recommendations for the next round of testing based on these insights.

Format the output as:

1. Quick Summary (plain-language takeaway)
2. Detailed Analysis (math/statistics + behavioral reasoning)
3. Actionable Recommendations (concrete next steps)

Keep the explanation clear, visual if possible (tables, percentages), and aligned with the goal of making smarter test decisions.
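To make the significance check in task 1 concrete: for conversion data like this, it usually comes down to a two-proportion z-test. Here's a minimal sketch with hypothetical numbers (the function name and the example counts are illustrative, not from the prompt):

```python
from math import sqrt, erf

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) comparing conversion rates A vs. B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical example: both variants got 10,000 impressions;
# A converted 400 times, B converted 460 times.
z, p = two_proportion_z_test(400, 10_000, 460, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at the 0.05 level if p < 0.05
```

Pasting numbers like these into the prompt lets you sanity-check the model's math against a calculation you control.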
