

by Jordan Lee
Growth notes on activation design, retention analysis, lifecycle messaging, and pricing experiments.
Before scaling a growth strategy, I want to see a stable event taxonomy, a known activation moment, and evidence that at least one onboarding or pricing change improved retention rather than only sign-up conversion. Otherwise the team may just be moving churn forward in time.
Three evaluation axes to compare:
- speed to user value
- durability of the retention effect
- clarity of experiment attribution
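The bar above (retention improving, not just sign-up conversion) can be made concrete with a tiny cohort readout. This is a minimal sketch, not from any tool mentioned in this post; the cohort numbers and field names are hypothetical:

```python
# Minimal sketch: compare a cohort before and after an onboarding change
# on sign-up conversion vs. week-4 retention. All numbers are hypothetical.

def rate(numerator, denominator):
    """Simple proportion, guarded against an empty cohort."""
    return numerator / denominator if denominator else 0.0

def cohort_readout(cohort):
    """Reduce one cohort to the two rates that matter for this check."""
    return {
        "signup_conversion": rate(cohort["signed_up"], cohort["visitors"]),
        "week4_retention": rate(cohort["active_week4"], cohort["signed_up"]),
    }

# Hypothetical before/after cohorts for an onboarding change.
before = {"visitors": 10_000, "signed_up": 800, "active_week4": 160}
after = {"visitors": 10_000, "signed_up": 1_100, "active_week4": 176}

b, a = cohort_readout(before), cohort_readout(after)

# Here sign-up conversion rose (8% -> 11%) while week-4 retention fell
# (20% -> 16%): the change may just be moving churn forward in time.
improved_retention = a["week4_retention"] > b["week4_retention"]
```

A readout like this is what "evidence that a change improved retention" means in practice: both rates side by side, per cohort, before anyone declares a win.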
Review materials:
- Intercom on user onboarding: intercom.com/blog/user-onboarding/
Helpful for teams redesigning the first-run experience around actual user value.
- PostHog docs: posthog.com/docs
A good product-and-instrumentation reference for teams trying to clean up their event model.
- PostHog source: github.com/PostHog/posthog
Useful if you want to see how an open product analytics stack is assembled.
Save the strongest examples, scorecards, and decision memos in this folio so future teammates can see what good evaluation looked like at the time.
The live debates are about where product-led growth should hand off to sales, how much onboarding friction is acceptable, and whether packaging or pricing creates more durable leverage. The right answer changes with ACV, buyer complexity, and the speed of value realization.

Three questions worth debating:
- when product-led growth should hand off to sales
- how much friction is acceptable during onboarding
- whether packaging or pricing creates more durable leverage
Background reading before you take a strong stance:
- PostHog growth handbook: posthog.com/handbook/growth
A rare public handbook that shows how a product team talks about growth in practice.
- Stripe SaaS pricing guide: stripe.com/resources/more/saas-pricing-guide
Solid framing for packaging, monetization models, and pricing tradeoffs.
- PostHog video archive: youtube.com/@PostHog/videos
Product, analytics, and growth discussions from a team that ships in public.
When you respond, include the environment you are optimizing for. Advice changes a lot across stage, regulation, team size, and user expectations.
A genuinely useful SaaS growth pack should contain one instrumentation playbook, one pricing guide, one onboarding reference, and one experimentation framework. That combination is enough to keep a team honest about what is product work versus campaign work.
The kinds of materials worth saving in this space:
- operator writeups about activation redesigns
- pricing case studies with before-and-after metrics
- retention frameworks tied to actual user behavior
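A stable event taxonomy, the first item in that pack, is mostly discipline plus a small enforcement step at the point of tracking. Here is a minimal sketch of the idea; the event names, property names, and `track` wrapper are hypothetical, not the API of any tool linked above:

```python
# Minimal sketch of enforcing an agreed event taxonomy at capture time.
# Event and property names below are hypothetical examples.

ALLOWED_EVENTS = {
    "signed_up": {"plan", "source"},
    "project_created": {"template"},
    "report_shared": {"channel"},
}

def track(event, properties):
    """Reject events or properties outside the agreed taxonomy, so the
    event model stays clean enough to support attribution later."""
    if event not in ALLOWED_EVENTS:
        raise ValueError(f"unknown event: {event!r}")
    unknown = set(properties) - ALLOWED_EVENTS[event]
    if unknown:
        raise ValueError(f"unknown properties for {event!r}: {sorted(unknown)}")
    return {"event": event, "properties": dict(properties)}

# track("signed_up", {"plan": "free", "source": "blog"})  # accepted
# track("sign_up", {})  # raises ValueError: taxonomy drift caught early
```

The design choice is to fail loudly at the source rather than reconcile near-duplicate event names in the warehouse months later.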
Read:
- PostHog growth handbook: posthog.com/handbook/growth
A rare public handbook that shows how a product team talks about growth in practice.
- Stripe SaaS pricing guide: stripe.com/resources/more/saas-pricing-guide
Solid framing for packaging, monetization models, and pricing tradeoffs.
- Intercom on user onboarding: intercom.com/blog/user-onboarding/
Helpful for teams redesigning the first-run experience around actual user value.
Documents and downloadable guides:
- PostHog docs: posthog.com/docs
A good product-and-instrumentation reference for teams trying to clean up their event model.
- GrowthBook docs: docs.growthbook.io/
Helpful when experimentation needs to stay grounded in flags, metrics, and rollout mechanics.
Watch:
- PostHog video archive: youtube.com/@PostHog/videos
Product, analytics, and growth discussions from a team that ships in public.
Build or inspect:
- PostHog source: github.com/PostHog/posthog
Useful if you want to see how an open product analytics stack is assembled.
- GrowthBook source: github.com/growthbook/growthbook
A practical open-source reference for experimentation infrastructure.
Image references:
- PostHog product analytics reference: posthog.com/product-analytics
Useful screenshots and concepts for thinking about funnels, activation, and retention visually.
The recurring mistake is optimizing acquisition while the first-run experience remains muddy. The next one is running pricing experiments without a packaging hypothesis, which usually means the test is teaching less than the dashboard suggests.
Common traps to watch:
- optimizing acquisition while activation remains fuzzy
- running pricing tests without clear packaging hypotheses
- calling every onboarding step a growth lever
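One cheap guard against the second trap is a plain two-proportion z-test before reading anything into the dashboard. A minimal sketch using only the standard library; the sample sizes and conversion counts are hypothetical:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for the difference between two conversion rates,
    using the pooled standard error."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical pricing-page test: 240/4000 vs 282/4000 paid conversions.
z = two_proportion_z(240, 4000, 282, 4000)

# |z| below ~1.96 means the lift is not distinguishable from noise at
# the usual 95% level, whatever story the packaging hypothesis tells.
significant = abs(z) >= 1.96
```

A test that fails this check is still useful, but as input to the packaging hypothesis, not as a verdict on it.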
References that help correct the drift:
- Stripe SaaS pricing guide: stripe.com/resources/more/saas-pricing-guide
Solid framing for packaging, monetization models, and pricing tradeoffs.
- PostHog product analytics reference: posthog.com/product-analytics
Useful screenshots and concepts for thinking about funnels, activation, and retention visually.
This folio post is meant to be saved and revised. Add examples from your own work whenever one of these mistakes keeps resurfacing.