AMA with Andres Glusman of Do What Works

Peter connected with Andres Glusman, Co-Founder/CEO of DoWhatWorks.io and experimentation veteran, to surface the insights that matter most from a vast library of observed live experiments. Each insight is a low-friction nudge you can test this quarter to boost win rates, cut noise, and keep your roadmap focused on what actually moves the needle.

5 key takeaways below. Join the conversation in the LinkedIn Group here.

1. Most tests miss, so stack the odds before you launch.

Optimizely and DoWhatWorks’ own data show that ~89% of A/B tests don’t move the needle. With typical traffic, teams get only about 12 shots a year to win, so front-load learning: mine other brands’ experiments, pair quantitative signals with quick qualitative research, and pre-filter ideas until you’re betting on near-certainties, not hunches.
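A quick back-of-the-envelope sketch makes the stakes concrete. The ~11% baseline win rate and ~12 tests per year come from the figures above; the "pre-filtered" win rate is purely an illustrative assumption, not a DoWhatWorks benchmark.

```python
# Back-of-the-envelope math on annual testing capacity.
# Baseline figures come from the takeaway above (~89% of tests miss,
# ~12 tests per year); the pre-filtered win rate is an illustrative
# assumption, not a measured benchmark.

TESTS_PER_YEAR = 12

def expected_wins(win_rate: float, tests: int = TESTS_PER_YEAR) -> float:
    """Expected number of winning tests in a year."""
    return win_rate * tests

baseline = expected_wins(0.11)      # betting on hunches: ~1.3 wins/year
prefiltered = expected_wins(0.30)   # hypothetical lift from mining prior experiments

print(f"Baseline:     {baseline:.1f} wins/year")
print(f"Pre-filtered: {prefiltered:.1f} wins/year")
```

At roughly one winning test per quarter on the baseline rate, every idea that enters the queue unvetted is an expensive bet.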

2. Sometimes subtraction beats addition.

One of the easiest, highest-impact moves Andres sees is removing clutter rather than adding new elements. Stripping out weak CTAs, redundant copy, or “nice-to-have” features often lifts conversion faster than any addition, yet most teams never test it because it feels emotionally risky.

3. Conventional “best practices” are often losers.

The data shows sacred cows like logo bars for B2B social proof frequently underperform, and blindly copying a competitor’s live page can backfire (you might be cloning their losing variant). Treat every borrowed idea as a hypothesis and validate before you imitate.

4. Bundle high-potential changes in a “Unified Variable Set.”

While single-variable tests delight data purists, they burn precious time. Andres recommends grouping several thematically aligned tweaks (headline, imagery, layout) into one bigger variant: fewer tests, bigger swings, faster learnings. Accept a bit of analytical fuzziness in exchange for meaningful business impact.
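One way to see why this trades analytical purity for speed: under standard power-analysis assumptions, the required sample size shrinks with the square of the expected lift, so a bundled variant that moves the metric several times more than any single tweak can be read in a fraction of the traffic. The sketch below uses an assumed 4% baseline conversion rate and illustrative lifts; none of these numbers come from the AMA.

```python
# Why bundling tweaks can shorten the testing calendar: a standard
# two-proportion sample-size calculation. The 4% baseline conversion
# rate and the per-tweak vs. bundled lifts are illustrative assumptions.
from scipy.stats import norm

def visitors_per_arm(p_base: float, p_variant: float,
                     alpha: float = 0.05, power: float = 0.8) -> int:
    """Approximate visitors needed per arm for a two-proportion z-test."""
    z_alpha = norm.ppf(1 - alpha / 2)
    z_beta = norm.ppf(power)
    variance = p_base * (1 - p_base) + p_variant * (1 - p_variant)
    return int(((z_alpha + z_beta) ** 2 * variance) / (p_variant - p_base) ** 2)

base = 0.04
print(visitors_per_arm(base, 0.042))  # one small tweak (+5% relative lift)
print(visitors_per_arm(base, 0.048))  # bundled variant (+20% relative lift): ~15x fewer visitors
```

Under these assumed numbers, the bundled variant needs roughly fifteen times less traffic per arm, which is the practical argument for accepting some attribution fuzziness.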

5. AI is a tool, not the value prop.

Tests across industries show that shouting “AI” in your H1 rarely wins; customers care about outcomes, not algorithms. Use AI behind the scenes for idea generation or rapid mock-ups, but keep human judgment (and clean data) in the loop to avoid shipping polished, high-velocity junk.
