Theme
83. A/B Testing Landing Pages - Conversion Optimization
Level: Intermediate | Time: 8 min
A/B testing = systematically testing changes. A 1% improvement per week compounds to roughly 67% per year. Small changes compound.
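The compounding claim can be verified in two lines of Python:

```python
# 1% improvement compounded weekly over 52 weeks
growth = 1.01 ** 52 - 1
print(f"{growth:.1%}")  # ≈ 67.8% improvement over a year
```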
What is A/B Testing?
VERSION A (control): VERSION B (variant):
Oryginalny page Zmieniony element
↓ ↓
50% traffic 50% traffic
↓ ↓
5% conversion 7% conversion
WINNER: Version B (40% improvement!)
KEEP Version B → Test the next change
What to test (in order of IMPACT):
1. Headline (BIGGEST IMPACT):
VERSION A:
"Welcome to Our Coaching Program"
Conversion: 3%
VERSION B:
"Transform Your Business in 90 Days (Guaranteed)"
Conversion: 7%
RESULT: 133% improvement from a single headline!
WHY: Headline = first thing seen = decides WHETHER visitors read on
2. Call-to-Action (CTA):
BUTTON TEXT:
A: "Submit" → CR: 4%
B: "Get My Free Guide" → CR: 6.5%
C: "Send It To Me Now!" → CR: 7%
WINNER: Version C (+75% vs A)
BUTTON COLOR:
A: Blue → CR: 5%
B: Green → CR: 5.2%
C: Orange → CR: 6.8%
WINNER: Orange (high contrast = more visible)
3. Form Length:
A: 5 fields (Name, Email, Phone, Company, Message)
CR: 3%
B: 2 fields (Name, Email)
CR: 8%
RESULT: Fewer fields = 167% higher conversion
BUT: Lead quality may be lower (less qualified)
BALANCE: Collect the minimum, enrich later
4. Social Proof:
A: No testimonials
CR: 4%
B: 3 testimonials with photos
CR: 6%
C: "Join 10,523 customers" counter
CR: 7.5%
WINNER: Social proof counter (+87%)
5. Images/Video:
A: Stock photo
CR: 5%
B: Real customer photo
CR: 6.5%
C: 2-min explainer video
CR: 8%
WINNER: Video (+60% vs stock photo)
People resonate with real > fake
GHL A/B Testing Setup
Built-in Split Testing:
Sites → Funnels → [Your Funnel] → Split Test
STEP 1: Create Variations
- Version A: Original landing page
- Version B: Variant (clone and modify)
STEP 2: Traffic Split
○ 50/50 split (gives fast results)
○ 70/30 split (if cautious)
○ 90/10 split (minor test)
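Under the hood, a traffic split is typically a deterministic hash of a visitor ID, so the same visitor always sees the same variant. A generic sketch (not GHL's actual implementation):

```python
import hashlib

def assign_variant(visitor_id: str, split_b: float = 0.5) -> str:
    """Deterministically bucket a visitor: the same id always maps
    to the same variant, with split_b fraction of traffic going to B."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # uniform in [0, 1]
    return "B" if bucket < split_b else "A"

print(assign_variant("visitor-123"))  # stable across repeat page loads
```

A sticky assignment like this matters: if a returning visitor sees a different page each visit, the variants contaminate each other's data.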
STEP 3: Primary Goal
• Form submission
○ Button click
○ Page visit
○ Purchase
STEP 4: Launch
Duration: Run until statistical significance
Minimum: 100 conversions per variant
Statistical Significance
PROBLEM: Was the improvement luck or real?
EXAMPLE:
Short test:
A: 10 conversions / 100 visits = 10%
B: 15 conversions / 100 visits = 15%
Difference: 50% improvement!
BUT: Too small sample → Could be luck
Proper test:
A: 100 conversions / 2,000 visits = 5%
B: 140 conversions / 2,000 visits = 7%
Difference: 40% improvement
95% confidence = REAL difference
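The "proper test" numbers above can be checked with a standard two-proportion z-test. A stdlib-only sketch (normal approximation):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test (normal approximation)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# The "proper test" above: 100/2,000 (5%) vs 140/2,000 (7%)
z, p = two_proportion_ztest(100, 2000, 140, 2000)
print(f"z = {z:.2f}, p ≈ {p:.4f}")  # p well below 0.05 → real difference
```

For these numbers p comes out below 0.01, i.e. over 99% confidence, which matches the result analysis later in this lesson.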
CALCULATION TOOL:
Google "A/B test calculator"
Enter: Visitors + conversions each variant
Output: "95% confidence" → VALID WINNER
Minimum Sample Size:
BASELINE CR: 5%
DESIRED IMPROVEMENT: +20% (to 6%)
SIGNIFICANCE LEVEL: 95%
NEEDED: ~4,000 visitors/variant (~8,000 total)
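The ~4,000/variant figure can be reproduced with the standard two-proportion sample-size formula. Note it implicitly assumes only 50% statistical power; the conventional 80% power roughly doubles the requirement. A stdlib-only sketch:

```python
import math
from statistics import NormalDist

def sample_size_per_variant(p1, p2, alpha=0.05, power=0.5):
    """Visitors needed per variant to detect a lift from p1 to p2
    at the given significance level and power (normal approximation)."""
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha / 2)   # 1.96 at 95% significance
    z_b = nd.inv_cdf(power)           # 0 at 50% power, ~0.84 at 80%
    p_bar = (p1 + p2) / 2
    numerator = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(numerator / (p2 - p1) ** 2)

# 5% baseline, +20% relative lift (to 6%), 95% significance
print(sample_size_per_variant(0.05, 0.06))             # ≈ 4,000 per variant
print(sample_size_per_variant(0.05, 0.06, power=0.8))  # ≈ 8,200 at 80% power
```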
IF low traffic:
→ Test BIGGER changes (easier detect)
→ OR Run longer (gather more data)
Test Duration:
❌ TOO SHORT: 2 days
Not enough data; day-of-week effects skew results
✅ GOOD: 2-4 weeks
Full business cycle
Accounts for weekly patterns
Enough data
❌ TOO LONG: 3+ months
Market changes, offer expires
What NOT to Test (Common Mistakes):
❌ Multiple changes simultaneously
Changed headline + CTA + image
Win? Don't know WHICH change worked!
✅ Test ONE element at a time
Headline only → Find winner → THEN test CTA
❌ Too small traffic
10 visitors/day → 800 days to reach the ~8,000-visitor sample!
✅ Minimum 100 visitors/day for practical testing
❌ Stopping test too early
B leading after 1 day → Declare winner (TOO FAST)
✅ Wait for statistical significance
❌ Ignoring segmentation
Mobile CR: 2% | Desktop CR: 8% → Different experiences needed!
✅ Separate tests for mobile vs desktop
Multivariate Testing (Advanced):
A/B Test: 1 element at a time
Multivariate: Multiple elements simultaneously
EXAMPLE:
ELEMENTS TESTED:
- Headline: 2 variations
- CTA Button: 2 variations
- Image: 2 variations
TOTAL COMBINATIONS: 2 × 2 × 2 = 8 variants
TRAFFIC SPLIT: 12.5% each
REQUIRES: LOTS of traffic (thousands/day)
BENEFIT: Finds BEST combination faster
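The 2 × 2 × 2 combination count maps directly onto a Cartesian product (the variant copy below is illustrative, not from a real test):

```python
from itertools import product

# Two variations per element (illustrative copy)
headlines = ["Welcome to Our Program", "Transform Your Business in 90 Days"]
ctas = ["Submit", "Get My Free Guide"]
images = ["stock photo", "explainer video"]

# Every combination of one headline, one CTA, and one image
variants = list(product(headlines, ctas, images))
print(len(variants), "variants,", f"{100 / len(variants):.1f}% traffic each")
# 8 variants, 12.5% traffic each
```

Each added element multiplies the variant count, which is why multivariate testing demands so much more traffic than a simple A/B test.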
GHL: Use an external tool for multivariate tests (note: Google Optimize was sunset in 2023; see VWO or Optimizely)
A/B Testing Checklist:
☑ Hypothesis defined:
"Changing headline to benefit-driven will increase CR by 20%"
☑ ONE element isolated for test
☑ Two distinct variants created (clear difference)
☑ Traffic split: 50/50
☑ Primary goal: Form submission
☑ Tracking verified (test submissions yourself)
☑ Minimum sample size calculated (e.g., 4,000/variant)
☑ Test duration: 2-4 weeks planned
☑ Review ritual: Check daily, decide when significant
☑ Document results: What worked, what didn't
Testing Priority Framework:
START HERE (Highest impact, easiest test):
1. Headline (1 week test)
↓ winner
2. CTA button (1 week)
↓ winner
3. Form length (2 fields vs 5 fields) (1 week)
↓ winner
4. Social proof (add vs none) (2 weeks)
↓ winner
5. Hero image (stock vs real vs video) (2 weeks)
↓ winner
6. Page length (short vs long copy) (2 weeks)
↓ winner
7. Colors/design (Optional, lower impact)
TIMELINE: ~10 weeks of testing
RESULT: Potentially 2-5X improvement in CR!
Analyzing Results:
TEST COMPLETE:
Version A:
2,000 visitors → 100 conversions = 5% CR
Cost per lead: $50
Version B:
2,000 visitors → 140 conversions = 7% CR (+40%)
Cost per lead: $35.71 (-28.6%)
Statistical significance: 99% confidence
DECISION:
✅ Keep Version B (clear winner)
✅ Document the change
✅ Move to NEXT test (CTA button)
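The decision numbers above follow from simple arithmetic. A sketch (the $5,000 spend per variant is inferred from the $50 cost per lead, not stated in the source):

```python
def analyze(spend, visitors, conversions):
    """Return conversion rate and cost per lead for one variant."""
    return conversions / visitors, spend / conversions

# Test numbers from above; $5,000/variant ad spend is an assumption
cr_a, cpl_a = analyze(5000, 2000, 100)
cr_b, cpl_b = analyze(5000, 2000, 140)
print(f"A: CR {cr_a:.1%}, CPL ${cpl_a:.2f}")  # A: CR 5.0%, CPL $50.00
print(f"B: CR {cr_b:.1%}, CPL ${cpl_b:.2f}")  # B: CR 7.0%, CPL $35.71
print(f"Lift: {(cr_b - cr_a) / cr_a:.0%}")    # Lift: 40%
```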
IF no clear winner:
→ Try a more dramatic change
→ OR Move to the next element
Real-World A/B Test Examples:
Example 1: Coaching Business
ORIGINAL Headline:
"Professional Business Coaching Services"
CR: 3.2%
TESTED Headline:
"Add $50K to Your Revenue in 90 Days"
CR: 7.8%
IMPROVEMENT: +144%
LESSON: Specific outcome > vague description
Example 2: SaaS Landing Page
ORIGINAL CTA:
"Start Free Trial"
CR: 5%
TESTED CTA:
"Get Full Access Free for 14 Days"
CR: 6.9%
IMPROVEMENT: +38%
LESSON: Emphasize "full access" + duration
Example 3: E-commerce Product Page
ORIGINAL:
No reviews visible
CR to cart: 8%
TESTED:
Display "3,421 5-star reviews" badge
CR to cart: 11.5%
IMPROVEMENT: +43.75%
LESSON: Social proof = trust = higher conversion
Advanced: Heatmap Analysis
BEFORE A/B test, USE heatmaps:
TOOLS:
- Hotjar (external)
- Microsoft Clarity (free!)
SHOWS:
- Where users click (red = most clicks)
- How far users scroll (where they drop off)
- Mouse movement (attention areas)
USE DATA:
"50% don't scroll to the form → Move the form up"
"No clicks on CTA → Not visible enough → Test a bigger button"
Informed testing > guessing
Tools for A/B Testing:
BUILT-IN GHL:
✅ Split test funnels/pages
✅ Track conversions
✅ Traffic distribution
EXTERNAL (Advanced):
- Google Optimize (was free, integrated with GA; sunset in 2023)
- VWO (Visual Website Optimizer)
- Optimizely (Enterprise)
For GHL: Built-in is sufficient for most use cases
Common A/B Test Mistakes:
❌ Testing without a hypothesis
✅ "I believe [change] will improve [metric] because [reason]"
❌ Declaring a winner after 1 day
✅ Wait for statistical significance (95%+)
❌ Testing tiny changes (blue vs navy blue button)
✅ Test meaningful differences (you can eyeball the difference)
❌ Not documenting results
✅ Keep test log: Date, test, result, decision
❌ Testing without sufficient traffic
✅ Calculate the minimum sample size needed before starting
❌ Testing during atypical period (holidays)
✅ Normal business weeks only
Continuous Optimization:
NEVER STOP TESTING:
Win Rate: 50-60% of tests (many fail!)
MINDSET:
"Every page can improve"
"1% weekly improvement = 67% yearly"
"Testing = knowing > guessing"
CULTURE:
Test → Learn → Implement → Repeat
Quick A/B Test Launch Plan:
WEEK 1:
☑ Choose element to test (headline)
☑ Create hypothesis
☑ Build variant B
☑ Launch 50/50 split test
WEEK 2-3:
☑ Monitor daily
☑ Check for significance
WEEK 4:
☑ Analyze results
☑ Keep winner
☑ Document learning
☑ Plan next test (CTA)
REPEAT continuously
Next step: 84. Facebook Pixel Integration
