April 13, 2026 · 6 min read

Dave McClure coined AARRR (“Pirate Metrics”) in 2007 as a simple way to think about the five stages of a user lifecycle. Nearly two decades later it’s still the most widely used mental model for mapping where users are in your funnel and where intervention has the highest ROI.

Here’s how each stage applies specifically to SaaS onboarding, what to measure at each step, and the common pitfalls teams hit when using AARRR in practice.

Acquisition: getting users to sign up

What it measures: The volume and quality of users arriving at your product. Not just traffic — traffic from channels that generate users who actually convert.

Onboarding relevance: Acquisition quality determines your activation ceiling. High-intent traffic (organic search for your product name, direct referral) activates at 2–5x the rate of low-intent traffic (display ads, cold outbound). If your activation rate looks bad, audit your acquisition channels before redesigning your onboarding flow.

What to measure: Signup rate per channel, cost per signup, and — critically — activation rate per acquisition channel. Segment your funnel by referrer / UTM source and compare.
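That channel segmentation reduces to a simple group-by. A minimal sketch in plain Python (the record shape, channel names, and numbers are hypothetical, not from any particular analytics tool):

```python
from collections import defaultdict

# Hypothetical signup records: (user_id, utm_source, activated) tuples.
signups = [
    ("u1", "organic",  True),
    ("u2", "organic",  True),
    ("u3", "display",  False),
    ("u4", "display",  False),
    ("u5", "referral", True),
    ("u6", "display",  True),
]

def activation_rate_by_channel(signups):
    """Return {channel: (signup_count, activation_rate)} per channel."""
    totals = defaultdict(int)
    activated = defaultdict(int)
    for _, channel, did_activate in signups:
        totals[channel] += 1
        if did_activate:
            activated[channel] += 1
    return {ch: (totals[ch], activated[ch] / totals[ch]) for ch in totals}

rates = activation_rate_by_channel(signups)
# In this toy data, organic activates every signup while display activates
# one in three: a signal the "activation problem" may really be an
# acquisition-quality problem.
```

The same comparison works with any grouping key you can attach at signup: referrer, campaign, or ad creative.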

Activation: the first moment of value

What it measures: The percentage of signups who reach your product’s core value for the first time. Activation is the single highest-leverage metric in most SaaS products.

Onboarding relevance: This is where onboarding makes or breaks the product experience. Everything your onboarding flow does — signup, setup, feature discovery, first-use experience — lands or fails at activation. A 5-point activation lift typically produces a 10–20% lift in downstream retention and revenue.

What to measure: Activation rate (% of signups who activate), time to activation (how fast), activation rate by cohort (are new signups activating better or worse than last month?), and drop-off points in your signup-to-activation funnel.
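Both activation rate and time to activation fall out of two timestamps per user. A minimal sketch (the record shape and dates are made up; `activated_at` is whatever specific event your team has defined as activation):

```python
from datetime import datetime
from statistics import median

# Hypothetical per-user timestamps; activated_at is None if the user
# never reached the activation event.
users = [
    {"signed_up_at": datetime(2026, 4, 1, 9, 0),
     "activated_at": datetime(2026, 4, 1, 9, 40)},
    {"signed_up_at": datetime(2026, 4, 1, 10, 0),
     "activated_at": datetime(2026, 4, 3, 10, 0)},
    {"signed_up_at": datetime(2026, 4, 2, 8, 0),
     "activated_at": None},
]

activated = [u for u in users if u["activated_at"] is not None]
activation_rate = len(activated) / len(users)

# Time to activation in hours, computed over activated users only.
hours_to_activate = [
    (u["activated_at"] - u["signed_up_at"]).total_seconds() / 3600
    for u in activated
]
median_hours = median(hours_to_activate)
```

Run this per signup cohort (e.g. per week) and compare: a falling activation rate or a rising time-to-activation in newer cohorts is the drop-off signal to chase.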

Retention: whether users come back

What it measures: Whether users continue to use your product over time. Measured as day-N retention curves and monthly retention rates.

Onboarding relevance: The highest-risk retention window is days 0–7. Users who don’t reach activation in week 1 almost never retain. Users who do activate are candidates for retention — but retention is won or lost through repeat value, not just first value. Onboarding doesn’t end at activation; it extends through the first 1–2 weeks of use.

What to measure: Day-1, day-7, and day-30 retention by signup cohort. Compare activated vs. non-activated cohort retention separately — they behave very differently. A retention curve that flattens after day 30 indicates habit formation; one that keeps declining signals ongoing churn risk.
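The activated-vs-non-activated split can be sketched like this (the activity-log shape and values are invented for illustration):

```python
# Hypothetical activity log: for each user, whether they activated and
# the set of day offsets from signup on which they were active.
users = [
    {"activated": True,  "active_days": {0, 1, 6, 7, 30}},
    {"activated": True,  "active_days": {0, 1, 2, 7}},
    {"activated": False, "active_days": {0}},
    {"activated": False, "active_days": {0, 1}},
]

def day_n_retention(cohort, n):
    """Fraction of the cohort active on day n after signup."""
    if not cohort:
        return 0.0
    return sum(1 for u in cohort if n in u["active_days"]) / len(cohort)

activated = [u for u in users if u["activated"]]
non_activated = [u for u in users if not u["activated"]]

d7_activated = day_n_retention(activated, 7)
d7_non_activated = day_n_retention(non_activated, 7)
# In this toy data, day-7 retention is 100% for activated users and 0%
# for non-activated users -- the gap the prose above describes.
```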

Referral: users bringing other users

What it measures: The rate at which existing users invite, refer, or share your product with others. Often measured via viral coefficient (k-factor) or per-user invite rate.

Onboarding relevance: Most referral programs underperform because they trigger referral prompts before users have hit their aha moment. Asking a user to refer your product on day 2 — before they’ve experienced value — gets declined. Trigger referral prompts after activation, ideally tied to a moment of success (completed a project, shipped a feature, etc.).

What to measure: Invites per activated user, invite acceptance rate, and — crucially — activation rate of referred users compared to cold signups. Referred users typically activate at 1.5–2x the rate of cold signups.
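The viral coefficient itself is simple arithmetic: invites per user times acceptance rate. A sketch with made-up numbers:

```python
# Hypothetical counts for one cohort; all figures are illustrative.
activated_users = 1000
invites_sent = 400       # total invites sent by those users
invites_accepted = 120   # invites that turned into signups

invites_per_user = invites_sent / activated_users  # 0.4
acceptance_rate = invites_accepted / invites_sent  # 0.3

# k-factor: new users generated per existing user via referral.
k_factor = invites_per_user * acceptance_rate      # 0.12
# k > 1 means self-sustaining viral growth; most SaaS products
# sit well below 1, so referral supplements acquisition rather
# than replacing it.
```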

Revenue: conversion to paid

What it measures: The percentage of users who convert to paid plans, upgrade to higher tiers, or take revenue-generating actions (for marketplace products).

Onboarding relevance: Freemium SaaS revenue funnels are downstream of activation. Users who activate convert to paid at 10–20x the rate of users who don’t. Paywalls shown before activation churn users; paywalls shown after activation (or at contextual moments — exceeding free tier limits, accessing premium-only features) convert well.

What to measure: Free-to-paid conversion rate, time to first paid conversion, MRR by cohort, and — for B2B — expansion revenue (users upgrading over time).
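A sketch of free-to-paid conversion by cohort plus time to first paid conversion (record shape, cohort labels, and dates are hypothetical):

```python
from datetime import datetime
from statistics import median

# Hypothetical users: signup cohort, signup date, and first paid
# conversion date (None if still on the free plan).
users = [
    {"cohort": "2026-03", "signed_up_at": datetime(2026, 3, 2),
     "paid_at": datetime(2026, 3, 16)},
    {"cohort": "2026-03", "signed_up_at": datetime(2026, 3, 10),
     "paid_at": None},
    {"cohort": "2026-04", "signed_up_at": datetime(2026, 4, 1),
     "paid_at": datetime(2026, 4, 29)},
    {"cohort": "2026-04", "signed_up_at": datetime(2026, 4, 5),
     "paid_at": None},
    {"cohort": "2026-04", "signed_up_at": datetime(2026, 4, 6),
     "paid_at": None},
]

def conversion_by_cohort(users):
    """Return {cohort: free-to-paid conversion rate}."""
    cohorts = {}
    for u in users:
        cohorts.setdefault(u["cohort"], []).append(u)
    return {
        c: sum(1 for u in members if u["paid_at"]) / len(members)
        for c, members in cohorts.items()
    }

rates = conversion_by_cohort(users)

# Days from signup to first paid conversion, paid users only.
days_to_convert = [
    (u["paid_at"] - u["signed_up_at"]).days
    for u in users if u["paid_at"]
]
median_days = median(days_to_convert)
```

Segmenting the same calculation by activated vs. non-activated users makes the 10–20x conversion gap described above directly visible in your own data.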

Why AARRR is misused

Three common pitfalls:

  1. Treating stages as parallel instead of sequential. Users move through AARRR in order. Fixing Revenue without fixing Activation first is building on sand — you’re optimizing conversion for users who never reached activation in the first place.
  2. Over-indexing on Acquisition. Many teams treat the top of the funnel as the primary lever because it’s the most visible metric (“we need more leads!”). Activation is usually 5–10x higher leverage per point of improvement.
  3. Measuring without defining. AARRR only works if each stage has a specific, measurable definition. “Activation” has to be a specific event your team agrees on — not a vague concept.

The practical upshot: AARRR is most useful as a diagnostic tool. When a metric looks bad, trace it back to the upstream stage. Revenue problem? Probably an Activation problem. Retention problem? Probably an Activation or Acquisition-quality problem. The earlier you fix it, the more compounding effect downstream.

AI drop-off diagnosis runs this upstream tracing automatically. For more, see how to calculate activation rate and how activation relates to engagement, adoption, and retention.

Onboardics measures your activation rate automatically and uses AI to diagnose what's blocking it.

Try the interactive demo →

Frequently asked questions

What does AARRR stand for?

Acquisition, Activation, Retention, Referral, Revenue. Coined by Dave McClure in 2007 (originally called "Pirate Metrics" because AARRR sounds like a pirate’s call). The five stages map sequentially through a user’s lifecycle with a product.

Why is AARRR called Pirate Metrics?

Because "AARRR" sounds like a pirate’s exclamation ("Arrr!"). Dave McClure used the memorable label to make the framework stick in product teams’ heads. The name is just a mnemonic — the framework itself is a straightforward sequential lifecycle model.

Is AARRR still relevant in 2026?

Yes. Despite newer frameworks (AAARRR with two A’s for Awareness + Acquisition, HEART), AARRR remains the most widely used lifecycle mental model in product teams because it’s sequential, memorable, and maps cleanly onto funnel dashboards. Newer frameworks tend to add specificity to individual stages (e.g., separating Awareness from Acquisition) rather than replace the overall structure.