Can Your App Launch Without Regulatory Clearance?
A pet care startup spent nine months building an app that helps dog owners track their pet's medication schedules and health symptoms. The interface looked beautiful, the features worked perfectly, and they had secured meetings with three major veterinary chains that wanted to trial it. Then someone asked if the app needed medical device approval. The launch stopped dead, and the team spent another six months working through regulatory submissions they didn't know existed. They burned through most of their runway before a single customer could use the product.
Launching an app without proper regulatory clearance can result in fines ranging from £50k to several million pounds, depending on the severity and jurisdiction
Over the past ten years, I've watched the regulatory landscape for apps become more complex and, to be honest, quite a bit harder to work through. What used to be a fairly straightforward process (build it, submit it to the app stores, launch it) now involves multiple layers of legal and regulatory approval depending on what your app actually does. The rules keep changing too, which makes planning for compliance a moving target that catches a lot of founders off guard.
The tricky bit is that many apps fall into regulatory grey areas where it's not immediately obvious whether you need approval or not. I've worked with clients who were convinced their app was just a simple information tool, only to discover halfway through development that their features triggered medical device regulations. That's an expensive mistake to make.
Understanding Which Apps Need Regulatory Approval
Most apps don't need special regulatory clearance beyond the standard app store review process. If you're building a game, a productivity tool, or an e-commerce platform, you can generally go straight to launch once Apple and Google approve your submission. But there are specific categories where you absolutely cannot launch without proper legal approval, and the consequences of getting this wrong are severe enough that it's worth understanding the boundaries clearly.
Medical and health apps represent the biggest regulatory minefield. If your app diagnoses conditions, recommends treatments, monitors vital signs, or manages chronic diseases, you're probably looking at medical device regulations. I worked on a women's health app a few years back that we thought was just an educational resource. Turned out that because it suggested when users should see a doctor based on symptom inputs, it crossed into diagnostic territory and needed MHRA approval.
Financial services apps need FCA authorisation if they handle actual transactions, provide investment advice, or facilitate lending. Apps that store sensitive personal data about children face special requirements under age-appropriate design codes. Apps that process health data, location data, or biometric information need to demonstrate GDPR compliance before launch, not after.
The question you need to ask isn't whether your app feels like it should need approval (it probably doesn't feel that way to you), but whether the specific functions and data handling within your app trigger any regulatory frameworks. This requires looking at what your app actually does from a legal perspective, which is often quite different from how you think about it as a founder or developer.
The Medical Device Classification Problem
Medical device regulation is where I see the most confusion, probably because the definitions are broad and keep expanding as technology evolves. In the UK, the MHRA classifies medical devices based on their intended purpose and risk level, and apps can fall anywhere from Class I (low risk) to Class III (high risk). The classification determines how much scrutiny your app faces and how long the approval process takes.
Here's what trips people up... the classification isn't based on what your app is, but what it's intended to do. An app that tracks blood pressure readings and displays them to the user might be Class I. An app that analyses those same readings and alerts users to potential cardiac events might be Class IIa or IIb. An app that controls insulin pump delivery based on glucose readings would be Class III. Same basic data input, completely different regulatory requirements based on the decision-making involved.
Start by checking the MHRA's guidance document "Medical Device Stand-alone Software Including Apps", which gives specific examples of what triggers medical device classification and what doesn't
| App Function | Likely Classification | Typical Approval Timeline |
|---|---|---|
| General wellness and fitness tracking | Not a medical device | No approval needed |
| Symptom checker providing information only | Borderline (often not regulated) | Legal review recommended |
| Apps that diagnose or recommend treatment | Class I or IIa | 3-6 months minimum |
| Apps controlling medical equipment or drug delivery | Class IIb or III | 12+ months |
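To make the table above easier to use in early planning conversations, here is a minimal triage sketch that encodes the same mapping. The category keys and function are hypothetical names invented for illustration; this is a planning aid only, and actual classification depends on intended purpose and needs confirmation with legal counsel and the MHRA.

```python
# Illustrative triage helper mirroring the classification table above.
# Category keys are hypothetical labels, NOT regulatory terms.
TRIAGE = {
    "wellness_tracking":      ("not a medical device", "no approval needed"),
    "symptom_info_only":      ("borderline", "legal review recommended"),
    "diagnosis_or_treatment": ("Class I or IIa", "3-6 months minimum"),
    "controls_equipment":     ("Class IIb or III", "12+ months"),
}

def triage(app_function: str) -> tuple[str, str]:
    """Return (likely classification, typical approval timeline).

    Unknown categories default to seeking legal review rather than
    guessing -- the grey-area cases are exactly the dangerous ones.
    """
    return TRIAGE.get(app_function, ("unknown", "seek legal review"))
```

The deliberate design choice is the default branch: anything you can't confidently place in a category should route to legal review, not to "probably fine".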
I've probably reviewed about forty different health apps for medical device classification over the years, and the grey area cases are always the hardest. There's a mental health app we built that provides CBT-based exercises. We spent weeks going back and forth with legal counsel about whether it constituted psychological treatment (regulated) or wellness support (not regulated). The answer came down to specific wording in the app and how we positioned the intended use in our documentation.
Financial Services Apps and FCA Requirements
If your app touches money or financial advice, the FCA is watching. The regulatory requirements here are pretty clear-cut compared to medical devices, but they're strict and the penalties for non-compliance can put you out of business quickly. I worked with a fintech startup that launched a lending marketplace without full FCA authorisation because they thought they were just a technology platform connecting borrowers and lenders. The FCA didn't see it that way, and the company had to suspend operations and refund all customers while they sorted out proper licensing.
You need FCA authorisation if your app accepts deposits, facilitates payments beyond simple payment processing, provides investment advice, enables crowdfunding, or offers any form of lending or credit. Being a technology company doesn't exempt you from these rules... if you're performing regulated activities through your app, you need permission regardless of how you describe your business model.
The FCA authorisation process takes at least six months for straightforward applications, but can stretch to a year or more if your business model is complex or novel. You'll need to demonstrate you have adequate capital reserves, appropriate systems and controls, qualified personnel, and robust risk management processes. For a small startup, meeting these requirements before you've even launched can feel like a huge burden. Our guide to fintech app development and compliance covers these requirements in detail.
- Payment processing apps using established providers like Stripe generally don't need separate FCA authorisation
- Robo-advisors and investment platforms need full FCA authorisation before accepting any customers
- Peer-to-peer lending apps need authorisation even if you're just facilitating connections
- Crypto and digital asset apps face additional requirements under money laundering regulations
- Open banking apps need to meet PSD2 requirements and register with the FCA
What catches people out is the timeline... you need to factor FCA authorisation into your fundraising and runway planning. I've seen companies run out of money waiting for approval because they assumed it would take three months and it took ten. Build in buffer time and expect the process to take longer than the minimum stated timeframes. Understanding what financial metrics investors check can help you plan for these extended timelines when raising funds.
GDPR and Data Protection Before Launch
Every app needs to comply with GDPR before launch if it processes data from EU or UK users, but certain types of apps face higher scrutiny and specific requirements that go beyond standard privacy policies. Apps that process health data, children's data, biometric data, or location tracking need to conduct Data Protection Impact Assessments before launch and, in some cases, consult with the ICO before going live.
The ICO can issue fines up to £17.5 million or four per cent of global annual turnover, whichever is higher, for serious GDPR violations
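The fine ceiling quoted above is a simple formula: the higher of £17.5 million or four per cent of global annual turnover. A quick sketch makes the consequence concrete for larger businesses:

```python
def gdpr_fine_ceiling(global_annual_turnover_gbp: float) -> float:
    """Maximum ICO fine for serious GDPR violations: £17.5m or
    4% of global annual turnover, whichever is higher."""
    return max(17_500_000, 0.04 * global_annual_turnover_gbp)

# A company turning over £1bn faces a ceiling of £40m, not £17.5m:
# gdpr_fine_ceiling(1_000_000_000) -> 40_000_000.0
```

For any business turning over less than £437.5 million, the flat £17.5 million figure is the binding number.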
The mistake I see most often is treating GDPR compliance as something you sort out with a privacy policy template from the internet. That might work for a basic app that collects minimal data, but if you're processing sensitive categories of personal data (health, ethnicity, political opinions, sexual orientation), you need proper legal review and documentation of your lawful basis for processing. You need to demonstrate that you've designed your data handling practices with privacy in mind from the start, not bolted on compliance after the fact. The real cost of ignoring user consent can be catastrophic for your business.
Apps targeting children under thirteen (or under sixteen in some EU countries) face the highest bar. You need to get verifiable parental consent before collecting any personal data, which is harder than it sounds. Age gates that just ask users to enter their birthdate don't count as verification. You need actual mechanisms to confirm a parent provided the consent, which typically means credit card verification, government ID checks, or similar processes that add friction and cost to your onboarding.
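The key point above, that a self-declared birthdate is not verification, can be sketched as gating logic. The method names and the thirteen-year threshold shown here are illustrative assumptions for a UK-style flow; the right threshold and acceptable verification mechanisms depend on your jurisdiction and must be confirmed with counsel.

```python
from datetime import date

# Hypothetical set of acceptable verification mechanisms -- an age gate
# where the user types a birthdate is deliberately NOT in this set.
VERIFIED_CONSENT_METHODS = {"credit_card_check", "government_id_check"}

def age_on(birthdate: date, today: date) -> int:
    """Whole years of age on a given date."""
    years = today.year - birthdate.year
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

def may_collect_personal_data(birthdate: date, today: date,
                              consent_method: str | None = None) -> bool:
    """Sketch: under-13s need verifiable parental consent before any
    personal data is collected; a bare birthdate entry is not enough."""
    if age_on(birthdate, today) >= 13:
        return True
    return consent_method in VERIFIED_CONSENT_METHODS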
Location tracking apps need to be really careful about consent and purpose limitation. You can't collect location data for one stated purpose and then use it for something else without explicit consent. I worked on a delivery app where we had to completely redesign the permissions flow because our initial implementation kept location tracking on continuously, even when users weren't actively using the app for deliveries. That level of tracking needs clear justification and granular user control.
Children's Apps and Age-Appropriate Design Codes
The UK's Age Appropriate Design Code, which came into force a couple of years ago, sets fifteen standards that apps likely to be accessed by children must meet. The catch is "likely to be accessed by children" covers far more apps than you might think. If your app has broad appeal or doesn't have effective age verification, you probably need to comply even if children aren't your target audience.
The standards cover things like default privacy settings (they must be high privacy by default for children), data collection minimisation, no tracking for advertising purposes, geolocation services off by default, and no nudge techniques that encourage children to provide unnecessary personal data or weaken their privacy protections. Some of these requirements conflict with standard growth and monetisation practices that work fine for adult audiences but aren't allowed for children.
- Conduct an Age Appropriate Design Code risk assessment before launch if children might use your app
- Implement effective age verification or age estimation at account creation
- Configure separate user journeys for child users with appropriate restrictions
- Turn off data collection, profiling, and personalised advertising for users identified as children
- Provide prominent, accessible privacy information that children can understand
- Give parents access to parental controls without compromising child privacy unnecessarily
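The checklist above implies a concrete default-settings policy in code. Here is a hedged sketch of what "high privacy by default for children" might look like as a configuration object; the field names and adult defaults are hypothetical, not a real API.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    profile_visibility: str       # "private" or "public"
    geolocation_enabled: bool
    ad_personalisation: bool
    behavioural_profiling: bool

def default_settings(is_child: bool) -> PrivacySettings:
    """Sketch of Age Appropriate Design Code defaults: child accounts
    get high privacy by default -- geolocation off, no ad tracking,
    no profiling. Adult defaults here are illustrative assumptions."""
    if is_child:
        return PrivacySettings(
            profile_visibility="private",
            geolocation_enabled=False,
            ad_personalisation=False,
            behavioural_profiling=False,
        )
    return PrivacySettings(
        profile_visibility="public",
        geolocation_enabled=False,   # still opt-in for adults
        ad_personalisation=True,
        behavioural_profiling=True,
    )
```

Note that the child branch is not user-configurable toward weaker settings at account creation; the code's nudge-technique standard means you shouldn't prompt children to loosen these defaults.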
I worked on an educational app where we thought we'd be fine because schools were our customers, not children directly. Turns out that didn't matter... children were the end users, so we needed to comply with the full code. We had to rebuild our analytics implementation, remove several engagement features that counted as nudge techniques, and create a completely separate privacy notice written in child-friendly language. That added about eight weeks to our timeline.
The ICO has enforcement powers and has made clear they'll use them for serious violations affecting children. This isn't a nice-to-have compliance checkbox... it's a legal requirement that you can't launch without meeting if children will use your app.
The Soft Launch Strategy for Compliance Testing
One approach that works well for apps in regulatory grey areas is a controlled soft launch that lets you test your compliance approach with real users before committing to a full public launch. You release the app to a limited audience (maybe 500 to 1,000 users) in a specific geographic region, operate it under close legal supervision, and use that period to identify any compliance issues you missed during development.
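In practice the limited-audience release described above is usually enforced with a rollout gate. A minimal sketch, assuming a hypothetical region code and user cap (both names invented for illustration):

```python
# Hypothetical soft-launch gate: admit new users only from the target
# regulatory market, and only until the test cohort cap is reached.
SOFT_LAUNCH_REGION = "GB"
SOFT_LAUNCH_CAP = 1000

def in_soft_launch(user_region: str, enrolled_count: int) -> bool:
    """True if a new signup should be admitted to the soft launch."""
    return (user_region == SOFT_LAUNCH_REGION
            and enrolled_count < SOFT_LAUNCH_CAP)
```

Gating by region matters for the reason in the callout below the code in this section's advice: you want the test cohort operating under the same regulatory regime as your planned full launch.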
Choose a soft launch market that matches your target regulatory environment... if you're planning a UK launch, soft launch in the UK under UK regulations rather than testing elsewhere and assuming the rules are the same
This strategy works particularly well for apps where the regulatory classification isn't entirely clear or where you're using novel features that haven't been tested in your category before. The soft launch gives you evidence of how the app actually gets used in practice, which can inform discussions with regulators about appropriate classification and requirements. I've used this approach probably a dozen times now for apps where we weren't completely certain how regulators would interpret specific features. It's also worth testing features thoroughly before adding them to your production app, particularly if they have compliance implications.
| Soft Launch Benefit | Why It Matters for Compliance |
|---|---|
| Real usage data | Shows regulators how users actually interact with features, not how you assume they will |
| Contained risk exposure | Limits your liability to a small user base if compliance issues emerge |
| Feedback loop | Users often find edge cases and uses you didn't anticipate during development |
| Evidence for classification | Supports your position in discussions with regulators about how features should be classified |
The key is being transparent about the soft launch nature of your release. Don't try to hide that you're operating in a testing phase... document everything properly, keep detailed records of issues and how you address them, and be prepared to pull the app completely if serious compliance problems emerge. A soft launch isn't a way to avoid regulation, it's a risk management tool that helps you get compliance right before scaling.
What Happens When You Launch Without Approval
The consequences of launching a regulated app without proper clearance range from annoying to catastrophic depending on what kind of app you've built and how long you operate before someone notices. At minimum, you'll be forced to suspend operations while you sort out proper approval, which means refunding customers, pausing marketing spend, and watching competitors gain ground while you're stuck in regulatory limbo.
Financial penalties are the next level up. The ICO can fine you for GDPR violations, the FCA can levy substantial penalties for unauthorised financial services activities, and the MHRA can prosecute for illegal medical device marketing. These fines aren't just slaps on the wrist... they're calculated to be painful enough to deter non-compliance and can easily reach six or seven figures even for small companies.
Personal liability is the bit that keeps founders awake. For certain types of regulated activities, directors can be held personally responsible for compliance failures. I know someone who ran a lending app without FCA authorisation and ended up with a personal ban from operating any regulated financial services business. That's a career-ending consequence that no amount of startup success can undo.
Beyond formal penalties, there's reputational damage that's hard to quantify but very real. If you launch a health app and it gets pulled for operating as an unauthorised medical device, that story follows you. Investors get nervous, users lose trust, and regulators remember you when you come back with proper applications. You've branded yourself as someone who either didn't understand the rules or chose to ignore them, neither of which is a good look.
The thing is, you can't really hide from regulators by staying small or flying under the radar. App stores report apps that appear to violate regulations, competitors file complaints about apps that have unfair advantages by avoiding compliance costs, and users report problems that trigger regulatory investigations. Operating without proper approval is a temporary state at best, and the longer you do it, the worse the consequences when you eventually get caught.
Building Compliance Into Your Development Timeline
The smart approach is treating regulatory compliance as a core part of your product roadmap, not something you think about after the app is built. That means involving legal counsel early in the design phase (not just before launch), building compliance requirements into your technical specifications, and planning your timeline around regulatory approval processes rather than hoping they won't apply to you. Understanding why development takes time helps stakeholders appreciate that compliance isn't just paperwork; it's integral to building a legally viable product.
Adding regulatory compliance after development is complete typically costs three to five times more than building it in from the start, both in direct costs and timeline delays
For medical device apps, plan on six to twelve months for regulatory work after your app is feature-complete. That includes time for documentation, quality management system setup, technical file preparation, submission, regulator review, and any modifications they require. For FCA authorisation, six to twelve months is also realistic. For GDPR compliance, you can move faster if you've designed with privacy in mind, but expect at least four to six weeks for proper legal review and documentation.
The biggest timeline mistake is treating these processes as sequential when they can often run in parallel. You can start your FCA application while finishing development. You can conduct Data Protection Impact Assessments while building features. You can engage with regulators for pre-submission guidance before your app is complete. Running these workstreams in parallel doesn't save you the total time required for regulatory work, but it prevents regulatory processes from extending your overall timeline by their full duration.
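The sequential-versus-parallel point is easy to show with back-of-envelope arithmetic using the rough figures from this section. The nine-month build time is an assumed example, and the fully overlapped "parallel" figure is an idealised best case, since some regulatory work genuinely has to wait for a near-complete product.

```python
# Back-of-envelope timeline comparison, in months.
development = 9         # assumed example build time
fca_authorisation = 9   # mid-range of "six to twelve months"
dpia_and_gdpr = 1.5     # "four to six weeks"

# Treating the workstreams as strictly sequential:
sequential = development + fca_authorisation + dpia_and_gdpr

# Fully overlapping them (idealised best case -- in reality some
# regulatory steps can only start once the product is nearly done):
parallel = max(development, fca_authorisation, dpia_and_gdpr)

# sequential -> 19.5 months; parallel -> 9 months
```

Even a partial overlap lands you somewhere between those two numbers, which is why starting the FCA application and DPIA work during development, not after, is the single biggest timeline lever you have.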
Budget for compliance from day one. Legal fees for regulatory work typically run £15k to £50k depending on complexity. Medical device consultants charge £500 to £1,000 per day. FCA applications need capital reserves that vary based on your business model but start at tens of thousands. If you're bootstrapped or raising a small seed round, these costs matter and need to be in your financial model before you commit to building a regulated app. Consider how much you should keep back for marketing alongside these compliance costs.
What I tell clients now is that if your app needs regulatory approval, that approval process is part of your product development, not something separate that happens afterward. You haven't finished building your product until you have the legal clearance to operate it. Thinking about it that way changes how you plan, how you budget, and how you communicate timelines to investors and stakeholders. Understanding what happens after app submission to the app stores is just one piece of a much larger compliance puzzle.
Conclusion
Most apps don't need special regulatory clearance and can go straight to launch once the app stores approve them. But for apps in healthcare, finance, children's services, or handling sensitive data, regulatory approval isn't optional and the penalties for launching without it are severe enough to destroy your business. The classification boundaries aren't always obvious, which means you need legal input early in your planning process, not after you've built everything.
The timeline and cost implications of regulatory work are substantial, and founders who don't factor them into their planning end up running out of runway or making desperate pivots to avoid regulations they should have embraced from the start. Building compliance into your development process from day one costs less and takes less time than trying to retrofit it later, and it means you can launch confidently knowing you won't face enforcement actions or forced shutdowns.
If you're planning an app launch and you're not sure whether you need regulatory clearance, get in touch with us and we can help you work through what applies to your specific situation before you invest in building something you can't legally operate.
Frequently Asked Questions
How do I know if my app needs regulatory approval?
The key is looking at what your app actually does, not how you think about it. If your app diagnoses conditions, handles financial transactions, processes children's data, or makes health recommendations, you likely need regulatory clearance. Get legal input during the planning phase rather than after development, as the specific functions and data handling determine regulatory requirements.
Do health apps that only provide information need medical device approval?
Even information-only health apps can trigger medical device regulations depending on how they present that information. Apps that suggest when users should see a doctor based on symptom inputs or provide personalised health recommendations often cross into regulated territory. The MHRA's "Medical Device Stand-alone Software Including Apps" guidance document provides specific examples of what triggers classification.
How long does regulatory approval take, and what does it cost?
Medical device and FCA approvals typically take 6-12 months minimum, while GDPR compliance review takes 4-6 weeks if built in from the start. Budget £15k-£50k for legal fees, plus consultant costs of £500-£1,000 per day for specialised regulatory work. These timelines and costs need to be factored into your fundraising and runway planning from day one.
What happens if I launch without the required clearance?
Consequences range from forced suspension of operations to substantial fines (up to £17.5 million for GDPR violations or several million for medical device violations). Directors can face personal liability and bans from operating regulated businesses. You'll also face reputational damage that follows you with future investors and regulatory applications.
Can a soft launch help me test compliance before a full launch?
Yes, a controlled soft launch with 500-1,000 users in your target regulatory market can help identify compliance issues before full launch. This works well for apps in regulatory grey areas, but you must be transparent about the testing nature, document everything properly, and be prepared to pull the app if serious compliance problems emerge.
Does being a technology company exempt my app from FCA authorisation?
No, being a technology company doesn't exempt you from FCA requirements if you're performing regulated activities through your app. If your app facilitates payments beyond simple processing, provides investment advice, enables lending, or accepts deposits, you need FCA authorisation regardless of how you describe your business model.
Does the Age Appropriate Design Code apply if children aren't my target audience?
Yes, the UK's Age Appropriate Design Code applies to apps "likely to be accessed by children", which covers apps with broad appeal or without effective age verification. You'll need to implement high privacy defaults, turn off tracking for advertising, disable geolocation by default, and avoid nudge techniques that encourage unnecessary data sharing.
Can regulatory processes run in parallel with development?
Yes, running regulatory processes parallel to development is much more efficient than treating them as sequential steps. You can start FCA applications while finishing development, conduct Data Protection Impact Assessments while building features, and engage with regulators for pre-submission guidance before your app is complete. This prevents regulatory work from extending your overall timeline by its full duration.