
The #1 CRO Principle: Remove, Don't Add

The highest-impact CRO tests consistently involve removing something — fields, clicks, visual weight. Adding more rarely wins. Here's the pattern and why subtraction is your best conversion lever.

Atticus Li · Applied Experimentation Lead at NRG Energy (Fortune 150) · Creator of the PRISM Method
1 min read

Editorial disclosure

This article lives on the canonical GrowthLayer blog path for indexing consistency. Review rules, sourcing rules, and update rules are documented in our editorial policy and methodology.

A/B Testing · Experimentation Strategy · Statistical Methods · CRO Methodology · Experimentation at Scale

This pattern reframes CRO from an additive craft to a subtractive one, and the test results back it up:

  • The biggest wins all came from removal:
    • Removing non-essential form fields → ~12 percentage point absolute lift in completion.
    • Removing an intermediate confirmation click → 3x increase in the primary metric.
    • Reducing the visual weight of a secondary option → meaningful lift in primary conversions.
  • The addition tests mostly underperformed or were flat:
    • "Recommended" badges → zero win rate, sometimes slightly negative.
    • Extra value-prop copy on CTAs → negligible effect (Cohen's h < 0.01).
    • Transparency/reassurance copy → flat or directionally negative, likely because it signaled new concerns.
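For readers unfamiliar with the effect-size threshold cited above, Cohen's h is the arcsine-transformed difference between two proportions. A minimal sketch of the standard formula, using hypothetical conversion rates (the 12.0% / 11.8% figures below are illustrative, not from the tests described):

```python
import math

def cohens_h(p1: float, p2: float) -> float:
    """Cohen's h effect size for two proportions.

    Uses the arcsine transform: h = 2*asin(sqrt(p1)) - 2*asin(sqrt(p2)).
    Conventional benchmarks: ~0.2 small, ~0.5 medium, ~0.8 large.
    """
    return 2 * math.asin(math.sqrt(p1)) - 2 * math.asin(math.sqrt(p2))

# Hypothetical example: variant converts at 12.0%, control at 11.8%.
# The raw lift looks real, but the standardized effect is tiny (|h| < 0.01).
h = cohens_h(0.120, 0.118)
```

An effect this small is a useful sanity check: even if a test of this kind reached statistical significance at a huge sample size, the practical impact would be negligible.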

About the author

Atticus Li

Applied Experimentation Lead at NRG Energy (Fortune 150) · Creator of the PRISM Method

Atticus Li leads applied experimentation at NRG Energy (Fortune 150), where he and his team run more than 100 controlled experiments per year on customer-facing surfaces. He is the creator of the PRISM Method, a framework for high-velocity experimentation programs at large enterprises. He writes regularly about the statistical and operational details of A/B testing — the parts most CRO content skips.
