![Qayyum Rajan](https://cdn.feather.blog?src=https%3A%2F%2Fwww.notion.so%2Fimage%2Fhttps%3A%252F%252Fprod-files-secure.s3.us-west-2.amazonaws.com%252F03b09b06-3bbf-4428-bbbe-6803462a0b1b%252F0635f71f-3d23-4d19-bdf5-afac0f674091%252F1_-_Qayyum_Rajan_-_HeadshotPro_(1).png%3Ftable%3Dblock%26id%3Dc99d9b71-ee15-4992-af21-b8ab5609b352%26cache%3Dv2&optimizer=image&quality=80&width=280)
Qayyum ("Q") is a serial builder with more than five startups to his name and three exits. He specializes in shipping products fast and early, with a focus on driving traffic across the marketing funnel.
Table of Contents
- What is SaaS customer activation and CTR?
- How A/B testing can improve your SaaS CTR:
- The A/B testing process
- How to source test ideas
- Prioritizing tests
- Other prioritization questions
- Keys to setting up tests
- Tools for running tests
- Statistically validate tests
- Sample sizes and revenue
- Track tests
- When each test is finished, make note of:
- Then ask: What can we learn from the test?
- Conclusion
![How to use A/B test to skyrocket your SaaS CTR](https://cdn.feather.blog?src=https%3A%2F%2Fwww.notion.so%2Fimage%2Fhttps%3A%252F%252Fprod-files-secure.s3.us-west-2.amazonaws.com%252F03b09b06-3bbf-4428-bbbe-6803462a0b1b%252F05890fe1-32ef-4d5b-8911-4b7a57eb2f67%252F45.png%3Ftable%3Dblock%26id%3D63ffc25a-fb8a-448b-bfe9-e19cba0a6d8a%26cache%3Dv2&optimizer=image&quality=80&width=280)
What is SaaS customer activation and CTR?
Customer activation is the point where a new user first experiences your product's core value. CTR (click-through rate) is the share of people who click after seeing an asset: clicks divided by impressions. Depending on the platform, a "click" can include:
- Clicking on the post and performing any action (liking, commenting, sharing, etc.)
- Link click
- Clicking on the profile pic or name
- Clicking to expand the creative
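Whichever click types you count, they all feed the same ratio. A minimal sketch (the function name is mine):

```python
def ctr(clicks: int, impressions: int) -> float:
    """Click-through rate as a fraction; 0.0 when there are no impressions."""
    if impressions == 0:
        return 0.0
    return clicks / impressions

# e.g. 45 clicks on 1,500 impressions
print(f"{ctr(45, 1500):.1%}")  # 3.0%
```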
How A/B testing can improve your SaaS CTR:
- Deciding what to A/B test
- Prioritizing valuable tests
- Tracking and recording your results
The A/B testing process
- Decide on and prioritize high-leverage changes
- Show some % of your visitors the change
- Run it until you reach a statistically significant sample size
- Implement changes that improve conversion
- Log design/results to inform future tests.
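The "show some % of visitors the change" step is usually done with deterministic bucketing, so a given visitor always sees the same variant. A sketch of one common approach (all names here are illustrative, not a specific tool's API):

```python
import hashlib

def assign_variant(user_id: str, experiment: str, traffic_share: float = 0.5) -> str:
    """Deterministically bucket a user into variant 'A' or 'B'.

    Hashing user_id together with the experiment name keeps the
    assignment stable across sessions and independent across experiments.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # uniform in [0, 1]
    return "B" if bucket < traffic_share else "A"

# The same user always lands in the same bucket for a given experiment:
assert assign_variant("user-42", "cta-color") == assign_variant("user-42", "cta-color")
```

Because the bucket is a pure function of the user and experiment IDs, a visitor who returns tomorrow, or on another device where you can still identify them, sees the same variant.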
How to source test ideas
- Survey users. Ask what they love about your product.
- Use tools like Hotjar or FullStory to find engagement patterns: What are they clicking vs ignoring?
- Your best ads have value props, text, and imagery that can be repurposed for A/B tests.
- Mine competitors' sites for inspiration. Do they structure their content differently? Do they talk to visitors differently?
- Your support/sales teams interact with customers & know best what appeals to them.
- Revisit past A/B tests for new ideas.
Prioritizing tests
- Micro variants are small, quick changes: Changing a CTA button color
- Macro variants are significant changes: Completely rewriting your landing page
- Steps earlier in the funnel have larger sample sizes, and you need a sufficient sample size to finish a test.
- It's easier to change ads, pages, and emails than to change down-funnel assets like the in-product experience.
Other prioritization questions
- How confident are you the test will succeed?
- If a test succeeds, will it significantly increase conversion?
- How easy is it to implement?
- Is your test similar to an old test that failed?
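One common way to turn these questions into a ranking is an ICE score (impact × confidence × ease). The article doesn't prescribe a formula, so treat this as one possible sketch:

```python
from dataclasses import dataclass

@dataclass
class TestIdea:
    name: str
    impact: int      # 1-10: how much conversion would rise if it succeeds
    confidence: int  # 1-10: how likely the test is to succeed
    ease: int        # 1-10: how cheap it is to implement

    @property
    def ice(self) -> float:
        return (self.impact * self.confidence * self.ease) / 10.0

ideas = [
    TestIdea("Rewrite landing page hero", impact=9, confidence=5, ease=3),
    TestIdea("Change CTA button color", impact=3, confidence=6, ease=9),
]
for idea in sorted(ideas, key=lambda i: i.ice, reverse=True):
    print(f"{idea.ice:5.1f}  {idea.name}")
```

Note how a cheap, likely-to-win micro variant can outrank a high-impact macro rewrite once confidence and ease are priced in.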
Keys to setting up tests
- Run one A/B test at a time. Otherwise, visitors can criss-cross through multiple tests when switching devices (e.g. mobile to desktop) across sessions.
- Run A/B variants in parallel. Otherwise, the varying traffic sources will invalidate your results.
Tools for running tests
- Google Optimize: a free A/B testing tool that integrated with Google Analytics and Ads (note: Google sunset Optimize in September 2023).
- Optimizely: a paid platform with more flexibility and deeper insights.
Statistically validate tests
- 1,000+ visits to validate a 6.3%+ conversion increase
- 10,000+ visits to validate a 2%+ increase
Sample sizes and revenue
Track tests
- Conversion target you're optimizing for: Clicks, views, etc.
- Before & after: Screenshots + descriptions of what's being tested.
- Reasoning: Why is this test worth running? Use your prioritization framework here.
When each test is finished, make note of:
- Start & end dates
- Sample size reported by your tool
- Results: The change in conversion, and whether the result was neutral, success, or failure.
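The fields above fit naturally into a simple log record. A sketch using a dataclass (field names are mine, not a prescribed schema):

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ABTestRecord:
    """One entry in the A/B test log."""
    name: str
    metric: str                       # conversion target, e.g. "signup clicks"
    hypothesis: str                   # why the test was worth running
    start: date
    end: Optional[date] = None
    sample_size: int = 0              # as reported by your testing tool
    lift: Optional[float] = None      # relative change in conversion
    outcome: str = "running"          # "success" | "failure" | "neutral" | "running"

log = [
    ABTestRecord(
        name="Shorter signup CTA",
        metric="signup clicks",
        hypothesis="Heatmaps show visitors ignore the long CTA",
        start=date(2024, 3, 1),
        end=date(2024, 3, 18),
        sample_size=12_400,
        lift=0.041,
        outcome="success",
    ),
]
```

Even a flat list like this is enough to revisit past tests for new ideas, per the sourcing section above.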
Then ask: What can we learn from the test?
- Use heatmaps to figure out why your variant won, e.g. users may have been distracted by a misleading CTA in one variant.
- Survey customers who were impacted by the test.
Conclusion