Expert Guide Series

What Data Tells Me My Growth Strategy Isn't Working?

I've seen dozens of apps that looked successful on paper but were quietly bleeding users and money. The download numbers would be climbing, the press coverage seemed positive, and the team would be celebrating milestones—but something fundamental was broken underneath. It's a bit mad really, how easy it is to fool yourself with vanity metrics whilst the actual health of your app deteriorates. After building apps across healthcare, fintech, and e-commerce for close to a decade now, I can tell you that knowing which data points actually matter is what separates apps that scale from those that burn through their funding and quietly shut down.

The problem is that most founders and product teams focus on the wrong numbers. They obsess over total downloads or monthly active users without understanding the quality of those users or how long they stick around. I mean, what's the point of 100,000 downloads if 95% of users never open your app a second time? And yet I've watched clients spend tens of thousands on acquisition campaigns without ever looking at their day-7 retention rate. It's like filling a bucket with a massive hole in the bottom—you can keep pouring water in, but the level never rises.

The data that tells you your growth strategy isn't working is usually the data you're least excited to look at

This guide breaks down the specific warning signs I look for when auditing an app's performance. These aren't theoretical metrics from some textbook; they're the actual indicators that have helped me identify problems early enough to fix them before things got catastrophic. Some of these will be uncomfortable to confront, but that discomfort is precisely why they're so valuable.

When Your Download Numbers Look Great But Everything Feels Wrong

I've worked on apps that hit 50,000 downloads in the first month and the client was over the moon—until we looked deeper at what was actually happening. Downloads are a vanity metric, honestly, and it's something I've had to explain more times than I care to remember. Sure, they look brilliant in a board presentation but they don't tell you anything about whether your app is actually working for people.

The thing is, you can spend a fortune on user acquisition campaigns and rack up impressive download numbers whilst your app is quietly bleeding users through the door. I worked on a fintech app where we hit our download targets three months running; the marketing team was celebrating but I couldn't shake the feeling something was off. When we dug into the data we found that 78% of users never completed the signup process. They downloaded the app, opened it once, and disappeared forever. That's not growth—that's expensive noise.

What to Look For Instead

You need to track metrics that actually matter for your app's health. Here's what I monitor within the first week of any launch:

  • Day 1, Day 7, and Day 30 retention rates (if Day 1 is below 25% something's seriously wrong)
  • Time to first meaningful action (not just app opens but actual feature usage)
  • Completion rate of your onboarding flow (under 40% means you're losing people before they even start)
  • Crash rates and app performance scores (users won't tolerate buggy experiences anymore)
  • The ratio between downloads and active users after 30 days

One e-commerce client came to me after spending £40,000 on Facebook ads that generated 15,000 downloads. Sounds great? Only 600 users were still active after two weeks and just 89 had made a purchase. The maths didn't work—we were paying nearly £450 per actual customer. The app looked successful on paper but the unit economics were completely broken. We had to completely rebuild the onboarding experience and simplify the checkout flow before those download numbers meant anything real. This is exactly the kind of situation where understanding app lifecycle management becomes crucial for long-term success.
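The arithmetic behind that "£450 per actual customer" figure is worth making explicit, because it's the calculation most teams skip. Using the numbers from the example above:

```python
# Figures from the e-commerce example: £40,000 of ads, 15,000 downloads,
# 600 users still active after two weeks, 89 purchasers
ad_spend = 40_000
downloads = 15_000
active_after_two_weeks = 600
purchasers = 89

cost_per_download = ad_spend / downloads              # looks cheap on paper
cost_per_active = ad_spend / active_after_two_weeks   # already alarming
cost_per_customer = ad_spend / purchasers             # the number that matters

print(round(cost_per_download, 2))   # 2.67
print(round(cost_per_active, 2))     # 66.67
print(round(cost_per_customer))      # 449
```

Same campaign, three wildly different costs. Which one you report decides whether the board celebrates or panics.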

The Retention Rate Reality Check

Right, let's get into something that honestly makes or breaks apps more than anything else I've seen—retention rates. You could have a million downloads but if nobody's coming back after day one, you've basically built an expensive digital paperweight. I've worked on apps that got featured in the App Store, pulled in 50,000 downloads in the first week, and then completely died because only 8% of users came back the next day. It's a bit mad really, but it happens all the time.

The industry benchmarks aren't pretty either; most apps lose about 75% of their users within the first three days. I mean, that's mental when you think about it. But here's the thing—if your retention rates are worse than these already-terrible averages, that's your data screaming that something fundamental is broken. I worked on a fintech app where day-7 retention was sitting at 4% and the client kept wanting to spend more on advertising. We had to stop everything and fix the onboarding experience first because what's the point of pouring water into a leaky bucket? Often, issues stem from onboarding UX problems that make users abandon the process before they see real value.

What Your Retention Numbers Should Look Like

These benchmarks vary by category, but here's what I typically see with successful apps versus struggling ones:

Time Period | Struggling Apps | Healthy Apps
Day 1       | Below 20%       | 35-45%
Day 7       | Below 8%        | 15-25%
Day 30      | Below 3%        | 8-15%
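If you want these bands in your weekly dashboard rather than in your head, a small classifier does the job. A sketch using the thresholds from the table above (remember they vary by category, so treat them as rough guides, not hard rules):

```python
# Benchmark bands from the table: day -> (struggling_below, healthy_from)
BENCHMARKS = {
    1: (0.20, 0.35),
    7: (0.08, 0.15),
    30: (0.03, 0.08),
}

def retention_health(day, rate):
    """Classify a Day-N retention rate against the rough benchmark bands."""
    struggling_below, healthy_from = BENCHMARKS[day]
    if rate < struggling_below:
        return "struggling"
    if rate >= healthy_from:
        return "healthy"
    return "middling"

print(retention_health(1, 0.08))   # struggling
print(retention_health(7, 0.18))   # healthy
print(retention_health(30, 0.05))  # middling
```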

If your numbers are consistently in the left column, no amount of marketing spend is going to save you. I've seen healthcare apps with brilliant concepts fail because the initial user experience was confusing—people downloaded them, opened them once, got frustrated, and never came back. The data was telling us the growth strategy wasn't working, but it took looking at retention cohorts to actually see it clearly.

Track retention by cohort, not just overall averages. Users who found you through paid ads might behave completely differently than organic users, and lumping them together hides the real problems with your acquisition strategy.

Cost Per Acquisition That Doesn't Make Sense

The numbers started looking weird about three months into a fitness app campaign I was managing. We were acquiring users at £4.50 each, which seemed decent for the health and wellness space, but when I dug into the lifetime value data it became clear we were spending £4.50 to acquire users worth about £2.80. It's the kind of realisation that makes you want to throw your laptop out the window, honestly. The thing is, this happens more often than you'd think—and most app owners don't catch it until they've burned through a frightening amount of their marketing budget.

When your CPA doesn't make sense, it usually means one of three things is broken. First, you're targeting the wrong audience entirely; I've seen e-commerce apps waste thousands advertising to people who never intended to buy anything, they just liked browsing. Second, your conversion funnel has massive leaks somewhere between install and first value action—maybe your onboarding is confusing or your paywall appears too early. Third, and this is the one that stings, your app simply doesn't deliver enough value to justify what you're paying to acquire users. Building pre-launch marketing momentum can help reduce these acquisition costs significantly.

Warning Signs Your CPA Is Broken

  • Your CPA is more than 30% of your average user lifetime value
  • Cost per install keeps rising month over month without corresponding LTV increases
  • Different ad channels show wildly inconsistent conversion rates (suggests poor targeting)
  • Users from paid channels have significantly worse retention than organic users
  • You're hitting your install targets but missing revenue projections
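The first warning sign in that list is the easiest to automate. A sketch of the check, using the 30% rule of thumb from the list (the threshold is the assumption here, not gospel):

```python
def cpa_is_broken(cpa, ltv, max_ratio=0.30):
    """Flag acquisition spend where CPA exceeds max_ratio of lifetime value.

    The 30% default comes from the rule of thumb above; tune it for
    your category and margin structure.
    """
    return cpa > ltv * max_ratio

# The fitness app example: paying £4.50 for users worth about £2.80
print(cpa_is_broken(4.50, 2.80))   # True - spending more than users are worth
print(cpa_is_broken(2.00, 10.00))  # False - 20% of LTV, inside the threshold
```

Run this per channel, not just overall: a healthy blended number can hide one channel quietly torching the budget.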

I worked with a fintech startup that was celebrating 10,000 installs in their first quarter. Brilliant numbers, right? Except they'd spent £87,000 on acquisition and their average user generated £6.20 in revenue. The maths just didn't work. We had to completely rebuild their targeting strategy and improve their onboarding flow before the unit economics started making any sense. Sometimes you need to pause your acquisition spend entirely and fix the product first—it's not what investors want to hear, but burning money on users who won't stick around is far worse.

Session Length and Engagement Patterns That Should Worry You

Session length is one of those metrics that people obsess over for the wrong reasons. I mean, everyone wants users spending ages in their app, right? But here's what I've learned from building apps across different sectors—session length only matters when you understand what it actually means for your specific app type.

A banking app I worked on had average session lengths of 90 seconds and the client was panicking, thinking users weren't engaged. But when we dug into the data, we realised that was perfectly fine; people were logging in, checking their balance, maybe transferring some money, then leaving. Job done. Compare that to a meditation app where 90-second sessions would be a disaster because the core value requires at least 10-15 minutes of use.

What should worry you is when session patterns change without explanation. If your e-commerce app suddenly drops from 8-minute sessions to 3 minutes, something's broken—maybe the search function isn't working properly or your checkout process has introduced friction. I saw this happen with a retail client where a seemingly minor update caused session times to plummet by 60%. Turned out a new loading screen was frustrating users enough that they'd just close the app. This is often a sign that your visual design needs updating or core functionality has become confusing.

The real warning sign isn't short sessions, it's when engagement patterns shift dramatically without any obvious reason behind it

You also need to watch for what I call "zombie sessions" where users open the app, stare at the home screen for a few seconds, then leave without taking any action. If more than 30% of your sessions involve zero meaningful interactions—no taps, no scrolls, no conversions—that's a massive red flag that your value proposition isn't clear or your onboarding failed to explain what users should do next. Track the ratio of active sessions to passive ones; if that ratio starts shifting towards passive, your growth strategy needs serious attention. You're losing people who've already downloaded your app, which is honestly worse than not acquiring them in the first place.
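The zombie-session ratio is trivial to compute once you log an interaction count per session. A sketch, assuming you can export sessions as a simple list of interaction counts (taps + scrolls + conversions per session):

```python
def zombie_ratio(sessions):
    """sessions: list of meaningful-interaction counts, one per session.
    A session with zero interactions is a 'zombie' session."""
    if not sessions:
        return 0.0
    zombies = sum(1 for interactions in sessions if interactions == 0)
    return zombies / len(sessions)

# Illustrative week of data: 5 of 10 sessions had no interaction at all
sessions = [0, 5, 0, 12, 3, 0, 0, 8, 1, 0]
ratio = zombie_ratio(sessions)
print(ratio)         # 0.5
print(ratio > 0.30)  # True - well above the 30% red-flag line
```

Watch the trend week over week rather than a single snapshot; a ratio drifting upward is the signal even before it crosses 30%.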

Conversion Funnel Drop-offs Nobody Wants to Talk About

You know what makes me uncomfortable when I'm reviewing analytics with clients? When we get to the conversion funnel and there's a massive drop-off that nobody wants to acknowledge. I've seen it hundreds of times—30% of users make it to the checkout screen, then 80% of them bail. It's not a small problem; it's a catastrophic leak in your growth strategy that's costing you real money every single day.

The worst culprit I see is the registration process. We built an e-commerce app where users had to create an account before they could even browse products properly—seemed logical at the time, right? Wrong. We lost 65% of users at that exact step. When we moved registration to after they'd added items to their basket, conversion jumped by 40%. Sometimes the friction you think is necessary is actually killing your business.

Payment Screen Abandonment

Here's something that genuinely frustrates me; developers who don't test their payment flows on real devices with real network conditions. I worked on a fintech app where the payment confirmation screen took 8 seconds to load on 3G connections. Eight seconds. Users thought the app had frozen and closed it, then we'd hit them with a confusing "payment pending" notification later. That single technical issue was costing the client £200k per month in lost transactions. This is where ensuring your development tools are up to standard becomes absolutely critical for performance.

Form Field Failures

Look at your form completion rates—I mean really look at them. If you've got a form field with a 50% drop-off rate, something's broken. Maybe it's asking for information users don't have (I've seen apps request National Insurance numbers at signup, which is mad). Maybe your validation is too strict and rejecting valid inputs. I once found an address field that rejected any postcode with a space in it... half of UK postcodes have spaces. Basic stuff, but it was losing thousands of conversions weekly.

User Feedback Signals You're Probably Ignoring

Look, most app teams obsess over star ratings and ignore everything else. I've done it myself; it's human nature to focus on the number that's most visible. But here's the thing—the really valuable signals are buried in places most people don't even check. After working on apps that have collected millions of user feedback points, I can tell you that app store reviews are just the tip of the iceberg; the stuff that actually predicts churn lives in your support tickets, in-app feedback forms, and weirdly enough, in feature request patterns.

Support ticket volume is probably the most underrated metric I track. When I worked on a healthcare booking app, we saw downloads climb but engagement was dropping off a cliff. Nobody could figure out why until I pulled support data and realised we were getting the same three complaints over and over—just worded differently. People weren't leaving bad reviews, they were just quietly uninstalling after contacting support about confusing appointment confirmation flows. The ratio of support tickets to active users jumped from 2% to 8% in six weeks. That's your canary in the coal mine right there.

Feature requests tell you what users wish your app did, sure, but the pattern of requests tells you something more important—what your app is failing to do right now. If you're getting repeated requests for "better search" or "easier navigation," that's not really a feature request, it's a usability complaint dressed up nicely. I've seen fintech apps ignore this signal for months because the requests seemed scattered across different features, when actually they all pointed to the same core problem: people couldn't find their transaction history quickly enough. Often this happens when teams haven't properly aligned on what features to prioritise in the first place.

The Silence Problem

Dead serious here, sometimes the absence of feedback is the biggest red flag. When we launched an e-commerce app feature and got... nothing? No complaints, no praise, no support tickets? That meant nobody was using it. Zero feedback often means zero adoption, and I'd rather have complaints than silence because at least complaints mean people care enough to tell you what's wrong.

Set up a weekly report that tracks support ticket categories, in-app feedback submission rates, and feature request themes. If any of these spike or drop suddenly, dig into it immediately—don't wait for your monthly review. The apps I've seen recover fastest from declining growth are the ones that treat user feedback as real-time performance data, not quarterly research material.

What Different Channels Actually Mean

App store reviews skew negative because angry people are more motivated to write them. In-app feedback tends to be more balanced but lower volume. Support tickets? Those are your most engaged users trying to make your app work for them—losing these people hurts twice as much because they actually wanted to stick around. I always weight support feedback heavier in my analysis because these users invested time in reaching out rather than just deleting your app. When a payment processing app I worked on saw support requests drop by 40% while active users only dropped 15%, we knew we had a serious problem—people had stopped bothering to ask for help, they'd just moved to competitors.

Revenue Metrics That Don't Match Your Projections

This one hits different because it affects the bottom line directly—and I've seen it enough times to know when the numbers are telling you something's seriously wrong. You built a fintech app, projected £50k in monthly revenue by month six based on your user numbers, and you're sitting at £8k. Or you've got an e-commerce app with thousands of active users but the average order value keeps dropping month after month. The disconnect between what you expected and what's actually happening? That's your data screaming that your growth strategy needs a complete rethink.

The most common culprit I see is when teams focus too much on vanity metrics and not enough on the revenue path. Sure, you've got 10,000 monthly active users—but how many of them are actually converting to paid features? I worked on a healthcare app where the client was convinced they'd hit their revenue targets because sign-ups were through the roof. Problem was, less than 2% were converting from the free tier to the subscription model. The onboarding flow wasn't showing enough value before asking for payment, and the pricing tiers didn't match what users actually needed. We restructured the entire value proposition and moved the paywall further into the user journey... revenue jumped 340% in two months. Understanding how different monetisation strategies affect fundability is crucial at this stage.

Key Revenue Indicators Worth Tracking

  • Average revenue per user (ARPU) compared to your customer acquisition cost—if you're spending £12 to acquire a user who generates £8 lifetime value, you've got a fundamental problem
  • Conversion rate from free to paid users, broken down by user segment and acquisition channel
  • Payment failure rates and subscription cancellations (payment issues alone can cost you 5-10% of expected revenue)
  • Time to first purchase—if it's taking users 30 days to make their first transaction when you projected 7 days, something in your value communication is broken
  • Revenue retention rate (not just user retention)—are your paying customers spending more or less over time?
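The first indicator in that list (ARPU against acquisition cost) is the one I'd automate first. A sketch using the fintech example from earlier in this section; the paying-user count and revenue split here are illustrative figures, not the client's actual breakdown:

```python
def revenue_health(users, paying_users, revenue, acquisition_spend):
    """Summarise the basic unit economics for a period."""
    arpu = revenue / users                 # average revenue per user
    cac = acquisition_spend / users        # blended acquisition cost per user
    return {
        "arpu": arpu,
        "cac": cac,
        "free_to_paid": paying_users / users,
        "unit_economics_ok": arpu > cac,   # must earn more than you spend
    }

# Fintech example: £87,000 spend, 10,000 installs, £6.20 average revenue.
# 180 paying users is an assumed figure for illustration.
report = revenue_health(users=10_000, paying_users=180,
                        revenue=62_000, acquisition_spend=87_000)
print(report["arpu"])               # 6.2
print(report["cac"])                # 8.7
print(report["unit_economics_ok"])  # False - spending £8.70 to earn £6.20
```

Segment this by acquisition channel and cohort month and the "wildly inconsistent conversion rates" problem from the previous section usually jumps straight out.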

Another thing that catches people out is seasonal variance they didn't account for. An education app client of mine projected consistent revenue growth, but they didn't factor in that university students (their main audience) basically disappear during summer holidays. Their revenue dropped 60% for three months straight, and they panicked thinking the app was failing when really they just needed to diversify their user base or build features that worked for summer usage patterns.

When The Maths Just Doesn't Add Up

Sometimes the issue is your monetisation model itself doesn't fit user behaviour. I've seen subscription models fail because users only needed the app occasionally, not monthly. One-time purchases fail because there's no recurring revenue to sustain development. In-app purchases don't work if the core experience isn't compelling enough to make users want more. You need to match your revenue model to how people actually use your app—not how you wish they'd use it.

The data doesn't lie about this stuff. If your revenue per user is declining month over month, if your payment conversion rates are below industry benchmarks (typically 2-5% for freemium apps, higher for apps with clear value propositions), or if your lifetime value calculations aren't trending upward... your growth strategy isn't working. And the longer you wait to acknowledge that and make changes, the more expensive the fix becomes. I mean, you can keep throwing money at user acquisition, but if the fundamental revenue model is broken? You're just burning cash faster. Creating shareability features can help reduce your dependence on paid acquisition by encouraging organic growth.

Cohort Analysis Shows Users Aren't Sticking Around

Cohort analysis is where the real story lives—and honestly, it's often the data that makes clients go quiet on calls. I've had finance app clients celebrating 50,000 downloads only to discover that their Week 4 retention sits at 3%. Three percent. That means 97 out of every 100 users who downloaded their app a month ago have completely abandoned it. The numbers don't lie, and cohort analysis makes it impossible to hide behind vanity metrics.

What I look for first is the shape of the retention curve; if you're losing more than 70% of users in the first week, something fundamental is broken. Maybe the onboarding is confusing. Maybe the core value proposition isn't clear enough. I worked on an e-commerce app where cohort data showed us that users who didn't complete a purchase within 48 hours had a 91% chance of never returning. That single insight changed our entire push notification strategy and how we designed the first-time user experience. Often, these issues are compounded when your app starts to lose visibility in search results, making it even harder to replace churning users with new ones.
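Building those retention curves doesn't need a fancy analytics suite to start with. A minimal sketch, assuming you can export activity as (user, install week, active week) tuples; real pipelines would read this from your event store:

```python
from collections import defaultdict

def cohort_retention(events):
    """events: iterable of (user_id, install_week, active_week) tuples.
    Returns {install_week: {weeks_since_install: retention_fraction}}."""
    cohort_users = defaultdict(set)
    active = defaultdict(set)  # (install_week, offset) -> users seen then
    for user, installed, seen in events:
        cohort_users[installed].add(user)
        active[(installed, seen - installed)].add(user)
    return {
        week: {
            offset: len(active[(week, offset)]) / len(users)
            for (w, offset) in active if w == week
        }
        for week, users in cohort_users.items()
    }

events = [
    ("a", 0, 0), ("b", 0, 0), ("c", 0, 0), ("d", 0, 0),  # week-0 cohort
    ("a", 0, 1), ("b", 0, 1),                            # 50% back in week 1
    ("a", 0, 4),                                         # 25% back in week 4
]
curves = cohort_retention(events)
print(curves[0][1])  # 0.5
print(curves[0][4])  # 0.25
```

Plot one curve per install week: healthy cohorts flatten out into a plateau of habitual users, while broken ones keep sliding towards zero.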

The difference between a cohort that retains at 40% versus 15% at Day 30 isn't just about better features—it's about whether you've actually solved a real problem for real people.

Here's what makes this tricky though... cohort analysis requires patience and proper tracking setup from day one. You can't retrofit this data easily. I've seen teams wait three months before implementing proper cohort tracking, which means they burned through their marketing budget without understanding which user segments actually stuck around. The healthcare apps I've built typically show much stronger Month 2-3 retention than social apps because the problem they solve is ongoing, not momentary. Your cohort curves should reflect the natural usage pattern of your app's purpose—if they don't, you've got a mismatch between what you built and what users actually need. When retention issues persist, you might need to consider improving your app's search ranking to attract higher-quality users who are more likely to stick around.

Conclusion

Look, I've spent years pulling apart growth strategies that seemed solid on paper but fell apart in practice, and the one thing I've learned is that the data always tells you the truth—even when you don't want to hear it. If your retention curves are dropping off sharply after day seven, if your CAC has crept up 40% over three months, if users are opening your app but barely touching core features... these aren't minor hiccups. They're your app screaming at you that something fundamental isn't working.

The thing about mobile apps is that bad data compounds quickly; a 5% weekly drop in retention might not seem like much until you realise it compounds to losing over a third of your users in two months. I've seen this happen with a fitness app where downloads were great but actual workout completions were abysmal—turned out the onboarding was confusing and nobody understood how to start their first session. We fixed it in a week and retention jumped 23%. Simple stuff, but the data had been sitting there telling us for months.

So here's what you need to do: pick the three metrics that matter most for your app's success (probably retention, engagement time, and conversion rate), set up proper tracking if you haven't already, and actually look at them every week. Not once a month. Weekly. And when they start trending the wrong way? Don't explain it away or blame seasonality or whatever else feels comfortable. Dig in, talk to users, run tests, and fix what's broken. Your growth strategy isn't set in stone—it's a living thing that needs constant adjustment based on what the data's actually showing you, not what you hoped it would show.

Frequently Asked Questions

What's a realistic Day 1 retention rate I should aim for with my new app?

From my experience auditing hundreds of apps, you want to see at least 35-45% Day 1 retention for a healthy app—anything below 25% suggests something's seriously broken in your onboarding or core value proposition. I've worked with apps that had 8% Day 1 retention and wondered why their marketing spend wasn't working; the problem wasn't acquisition, it was that users couldn't figure out what to do after downloading.

How do I know if my customer acquisition cost is too high?

Your CAC should never exceed 30% of your user's lifetime value, and ideally it's much lower than that. I once worked with a fintech client spending £87,000 on acquisition for users who only generated £6.20 each—the unit economics were completely broken and no amount of scale would fix it.

Should I be worried if users only spend 90 seconds in my app per session?

It depends entirely on your app type—90 seconds is perfect for a banking app where users check balances and leave, but disastrous for a meditation app that needs 10-15 minute sessions for core value. What should worry you is when session patterns change dramatically without explanation, like dropping from 8 minutes to 3 minutes suddenly, which usually indicates something's broken in your user experience.

How many of my users should be converting from free to paid features?

For most freemium apps, you're looking at 2-5% conversion rates, though this varies massively by category and value proposition. I've seen healthcare apps with 12% conversion because they solve urgent problems, and social apps struggling to hit 1% because the free version meets most user needs—the key is matching your monetisation model to actual user behaviour patterns.

What's the biggest red flag in app analytics that most founders miss?

Cohort analysis showing retention curves that fall off a cliff—I've seen apps celebrate 50,000 downloads while their Week 4 retention sits at 3%, meaning they've essentially lost 97% of their users permanently. Most founders focus on vanity metrics like total downloads rather than tracking whether people who downloaded last month are still using the app today.

How often should I be checking my app's performance metrics?

Weekly, not monthly—bad data compounds quickly in mobile apps, and a 5% weekly retention drop compounds to losing over a third of your users in two months. I've seen too many apps burn through their marketing budget for months before realising their fundamental metrics were broken, when weekly monitoring would have caught the problems early enough to fix them.

When should I stop spending on user acquisition and fix my app instead?

When your retention rates are consistently below benchmarks (under 20% Day 1, under 8% Day 7) or when you're spending more to acquire users than they'll ever generate in revenue. I've had to tell clients to pause £40k monthly ad spends because they were acquiring users at £8.70 each who generated £2.80 lifetime value—fixing the product experience first was the only sensible move.

What should I do if I'm getting zero feedback from users about new features?

Zero feedback usually means zero adoption, which is actually worse than complaints because at least complaints indicate people care enough to engage. When we launched an e-commerce feature and got complete silence—no support tickets, no reviews, no usage data—it meant nobody was using it at all, and we had to completely rethink the feature's visibility and value proposition.
