Author Archives: Georgi Georgiev

Analysis of 115 A/B Tests: Average Lift is 4%, Most Lack Statistical Power

[Chart: Observed percent change vs. statistical significance]

What can you learn from 115 publicly available A/B tests? Usually, not much, since in most cases you would be looking at case studies with very basic data about what was tested and the outcome of the test. Confidence intervals, p-values, and other measures of uncertainty are often missing, and when present they […] Read More…
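To make the lack of statistical power concrete, here is a minimal Python sketch, not taken from the article above, that estimates the sample size per arm needed to detect a 4% relative lift using a standard two-proportion normal approximation; the 2% baseline conversion rate, the 80% power target, and the 0.05 significance level are all assumptions chosen purely for illustration.

```python
from scipy.stats import norm

def sample_size_per_arm(baseline_cr, relative_lift, alpha=0.05, power=0.80):
    """Approximate per-arm sample size for a two-sided two-proportion z-test
    (normal approximation). All inputs are illustrative assumptions."""
    p1 = baseline_cr
    p2 = baseline_cr * (1 + relative_lift)
    z_alpha = norm.ppf(1 - alpha / 2)   # critical value for the two-sided test
    z_beta = norm.ppf(power)            # quantile corresponding to target power
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return int(round((z_alpha + z_beta) ** 2 * variance / (p2 - p1) ** 2))

# Detecting a 4% relative lift on a hypothetical 2% baseline conversion rate
print(sample_size_per_arm(0.02, 0.04))  # roughly 490,000 users per arm
```

A requirement of roughly half a million users per arm under these assumptions shows why tests run on modest traffic and aimed at single-digit lifts so often end up underpowered.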

Posted in A/B Testing, Conversion Optimization

Confidence Intervals & P-values for Percent Change / Relative Difference

In many controlled experiments, including online controlled experiments (a.k.a. A/B tests), the result of interest, and hence the inference made, is about the relative difference between the control and treatment groups. In A/B testing as part of conversion rate optimization, and in marketing experiments in general, we use the term “percent lift” (“percentage lift”), while in […] Read More…
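As a companion sketch, and not necessarily the exact derivation in the article, one common way to build a confidence interval for percent lift is to apply the normal approximation to the log of the ratio of the two proportions (the delta method) and transform back; the conversion counts below are made up for the example.

```python
import math
from scipy.stats import norm

def percent_lift_ci(conv_a, n_a, conv_b, n_b, confidence=0.95):
    """CI for the relative difference (percent lift) of B vs. A using the
    log-ratio (delta method) approximation. Inputs are illustrative."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    lift = p_b / p_a - 1
    # Standard error of log(p_b / p_a) under the normal approximation
    se_log = math.sqrt((1 - p_a) / conv_a + (1 - p_b) / conv_b)
    z = norm.ppf(1 - (1 - confidence) / 2)
    log_ratio = math.log(p_b / p_a)
    lower = math.exp(log_ratio - z * se_log) - 1
    upper = math.exp(log_ratio + z * se_log) - 1
    return lift, lower, upper

# Hypothetical data: 1,000 conversions out of 50,000 (A) vs. 1,080 out of 50,000 (B)
lift, lower, upper = percent_lift_ci(1000, 50000, 1080, 50000)
print(f"lift = {lift:+.1%}, 95% CI = [{lower:+.1%}, {upper:+.1%}]")
```

Working on the log scale makes the interval asymmetric around the observed lift, which is usually more appropriate for a ratio than a naive symmetric interval.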

Posted in A/B Testing, Conversion Optimization, Statistical Significance, Statistics

Affordable A/B Tests: Google Optimize & AGILE A/B Testing

The problem most often faced by website owners who want to take a scientific approach to improving their sites through A/B testing is that they might have relatively small revenue. Thus, when the ROI calculation for an A/B test is done, it might turn out that testing is economically infeasible. In some cases, […] Read More…
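As a rough sketch of the kind of ROI arithmetic referred to above (all numbers and the break-even logic are hypothetical assumptions, not figures from the post), one can compare the cost of running a test against the expected revenue gain from a winning variant.

```python
def ab_test_roi(monthly_revenue, expected_lift, prob_of_winner,
                test_cost, months_of_benefit=12):
    """Very rough expected ROI of running an A/B test.
    Every input here is a hypothetical assumption for illustration."""
    expected_gain = (monthly_revenue * expected_lift
                     * prob_of_winner * months_of_benefit)
    return (expected_gain - test_cost) / test_cost

# A small site: $5,000/month revenue, hoping for a 5% lift with roughly a
# 1-in-3 chance of a winning variant, and $3,000 in tooling and labor costs.
print(f"Expected ROI: {ab_test_roi(5000, 0.05, 1/3, 3000):.0%}")
```

With these made-up numbers the expected ROI comes out around -67%, which is exactly the economically infeasible situation that small-revenue sites run into.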

Posted in A/B Testing, AGILE A/B Testing, Analytics-Toolkit.com, Conversion Optimization

The Google Optimize Statistical Engine and Approach

[Image: Frequentist vs. Bayesian A/B testing in Google Optimize]

Google Optimize is the latest attempt from Google to deliver an A/B testing product. First there was “Google Website Optimizer”, then “Content Experiments” within Google Analytics, and now we have the latest iteration: Google Optimize. While working on the integration of our A/B Testing Calculator with Google Optimize, I was curious to see […] Read More…

Posted in A/B Testing, Conversion Optimization, Statistics

20-80% Faster A/B Tests? Is it real?

[Chart: Percent of runs and stopping stage by delta]

I got a question today about our AGILE A/B testing calculator and the statistics behind it, and realized that I have yet to write a dedicated post explaining the efficiency gains from using the method in more detail. This is despite the fact that these speed gains are clearly communicated and verified through the simulation results presented in our AGILE […] Read More…
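Purely as a generic illustration of where such speed gains come from, and emphatically not a reproduction of the AGILE method itself, the following Monte Carlo sketch checks a two-proportion z-statistic at five equally spaced interim looks against a Pocock-style constant boundary (critical value of about 2.413 for five looks at a two-sided α of 0.05) and reports what fraction of the maximum sample is used on average; the baseline rate, the lift, and the sample sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def average_sample_fraction(p_a=0.02, p_b=0.023, n_max=50_000, looks=5,
                            boundary=2.413, runs=1_000):
    """Monte Carlo estimate of the average fraction of the maximum per-arm
    sample used when stopping at the first interim look whose z-statistic
    crosses a Pocock-style constant boundary. All parameters are illustrative."""
    checkpoints = [int(n_max * k / looks) for k in range(1, looks + 1)]
    fractions = []
    for _ in range(runs):
        a = rng.binomial(1, p_a, n_max)   # control arm conversions (0/1)
        b = rng.binomial(1, p_b, n_max)   # treatment arm conversions (0/1)
        stopped_at = n_max
        for n in checkpoints:
            pooled = (a[:n].sum() + b[:n].sum()) / (2 * n)
            se = np.sqrt(2 * pooled * (1 - pooled) / n)
            z = (b[:n].mean() - a[:n].mean()) / se if se > 0 else 0.0
            if abs(z) > boundary:
                stopped_at = n            # early stop at this interim look
                break
        fractions.append(stopped_at / n_max)
    return np.mean(fractions)

print(f"Average fraction of the maximum sample used: {average_sample_fraction():.2f}")
```

Under a true lift the boundary is often crossed at an early look, so the average run uses noticeably less than the full sample; an honest comparison would also account for the somewhat larger maximum sample a sequential design typically needs in order to preserve power.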

Posted in A/B Testing, AGILE A/B Testing, Statistics

Risk vs. Reward in A/B Tests: A/B Testing as Risk Management

[Image: Risks vs. rewards in A/B testing]

What is the goal of A/B testing? How long should I run a test for? Is it better to run many quick tests, or one long one? How do I know when it is a good time to stop a test? How do I choose the significance threshold for a test? Is there something special about 95%? […] Read More…

Posted in A/B Testing, Conversion Optimization, Statistical Significance, Statistics