We’ve built production mental health platforms under both regulatory regimes: Mentalyc, an AI-powered therapy documentation tool operating under HIPAA and SOC 2 Type II in the US, and Selfapy, a CBT-based digital therapeutics platform certified under BSI IT-Grundschutz in Germany, with over 120 individual security requirements met before launch.
Here is what building under both frameworks actually requires from your engineering team.
What is HIPAA and what does it require from your engineering team?
HIPAA (the Health Insurance Portability and Accountability Act) is the primary US federal law governing Protected Health Information (PHI). For a mental health platform, PHI includes session recordings, therapy notes, and diagnoses — every data point that connects a patient to their health condition. HIPAA is not a certification you earn once. It is a continuous operational requirement with no expiry date.
Architecturally, it requires four things that cannot be added after the fact:
- Encryption of PHI at rest and in transit across every data store and API boundary
- Access control at the domain level, not just at the application layer
- Audit logging of every PHI access event with forensic-grade detail
- A Business Associate Agreement with every third-party vendor whose system touches PHI
A database schema, logging configuration, or API response structure built without PHI separation in mind cannot be patched into compliance; it has to be rebuilt.
In practice, a HIPAA auditor does not ask "Do you have encryption?" They ask, "Show me the access log for this patient record over the last six months." Teams that added logging after the fact, on an architecture that wasn’t designed for it, cannot answer that question without weeks of retroactive work.
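Answering that auditor question cheaply requires the log to be structured around the patient, not around the application. A minimal sketch of an append-only PHI access log, using SQLite for illustration; the table and field names are our assumptions, not taken from any specific product:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE phi_access_log (
        id          INTEGER PRIMARY KEY,
        accessed_at TEXT NOT NULL,      -- ISO 8601, UTC
        actor_id    TEXT NOT NULL,      -- who accessed the record
        patient_id  TEXT NOT NULL,      -- whose record was touched
        resource    TEXT NOT NULL,      -- e.g. 'therapy_note:123'
        action      TEXT NOT NULL       -- 'read' | 'write' | 'export'
    )
""")

def log_access(actor_id, patient_id, resource, action):
    """Record every PHI touch at the data-access layer, not in app code."""
    conn.execute(
        "INSERT INTO phi_access_log (accessed_at, actor_id, patient_id, resource, action) "
        "VALUES (?, ?, ?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), actor_id, patient_id, resource, action),
    )

def access_history(patient_id, months=6):
    """The auditor's question: every access to this patient over N months."""
    cutoff = (datetime.now(timezone.utc) - timedelta(days=30 * months)).isoformat()
    return conn.execute(
        "SELECT accessed_at, actor_id, resource, action FROM phi_access_log "
        "WHERE patient_id = ? AND accessed_at >= ? ORDER BY accessed_at",
        (patient_id, cutoff),
    ).fetchall()

log_access("therapist-7", "patient-42", "therapy_note:123", "read")
log_access("therapist-7", "patient-42", "therapy_note:123", "write")
history = access_history("patient-42")  # every row: (accessed_at, actor, resource, action)
```

The point is not the storage engine but the invariant: every read and write path funnels through one logging chokepoint keyed by patient, so the six-month question is a single query rather than a forensic project.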
What is BSI IT-Grundschutz and how is it different from HIPAA?
BSI IT-Grundschutz is a security standard published by Germany’s Federal Office for Information Security (Bundesamt für Sicherheit in der Informationstechnik). Unlike HIPAA, which is a legal regulation, BSI IT-Grundschutz is a formal technical standard with auditable requirements, accredited certifiers, and renewal cycles.
BSI certification is not a legal mandate in Germany in the same sense as HIPAA in the US. In practice, however, German healthcare investors, health insurers, and institutional partners treat its absence as a disqualifier.
For Selfapy, certification meant satisfying over 120 individual requirements covering application security, infrastructure, incident response, and documentation. Each requirement is specific and auditable.
The process follows a defined rhythm: gap analysis, remediation, audit, certification, renewal. For a PM planning a product roadmap, this has a concrete consequence: every release touching security-relevant components must be scheduled with enough lead time to go through the certification cycle. A team that discovers this mid-quarter, when a release slips by six weeks, has discovered it too late. On the Selfapy project, the certification rhythm was part of sprint planning from day one, not a separate process running alongside development.
"Security in healthcare projects needs to be built in from day one – not added before the audit." – Gosia Tańska, Mobile Developer, fireup.pro, Selfapy project lead
HIPAA, BSI, and GDPR: what each framework requires from your architecture
A platform operating in both the US and Germany must satisfy all three frameworks simultaneously. The security architecture overlaps, but the legal obligations, documentation standards, and data residency requirements do not.
| | HIPAA (US) | BSI IT-Grundschutz (DE) | GDPR (EU) |
|---|---|---|---|
| Type | Legal regulation | Technical standard | Data protection regulation |
| Certification | None — audit-based | Formal certification with renewal | None — ongoing compliance |
| Rhythm | Continuous operational requirement | Structured process with deadlines | Ongoing, with breach notification obligations |
| Vendor contracts | BAA with every PHI-touching vendor | Supply chain security controls | Data Processing Agreements (DPA) |
| Key architectural demands | Encryption, RBAC, audit logs, PHI isolation | 120+ technical controls, pen testing, documentation | Data minimisation, right to erasure, portability, data residency |
GDPR applies to all EU residents’ personal data regardless of where the platform is headquartered. A US-first platform expanding to Germany must layer GDPR and BSI requirements onto an existing HIPAA architecture. Compatibility between those layers requires deliberate design, not the assumption that it will sort itself out.
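One example of where those layers collide by design: GDPR's right to erasure (Article 17) pulls against the audit-trail retention the other frameworks demand. A common architectural answer is to key long-lived records by an opaque pseudonym, so erasing the identity mapping severs the link without destroying the trail. A minimal sketch with illustrative names; whether pseudonymisation alone satisfies Article 17 in a given case is a legal question for your DPO, not something this snippet settles:

```python
# Long-lived audit rows reference only an opaque token, never the
# person directly. Erasure removes the identity and the token mapping;
# the audit history becomes unlinkable instead of being deleted.
import secrets

identity_store = {}   # patient_id -> personal data (erasable)
pseudonyms = {}       # patient_id -> opaque token (the only key audit rows use)

def register(patient_id, personal_data):
    identity_store[patient_id] = personal_data
    pseudonyms[patient_id] = secrets.token_hex(8)
    return pseudonyms[patient_id]

def erase(patient_id):
    """GDPR erasure request: drop the identity and the pseudonym mapping."""
    identity_store.pop(patient_id, None)
    pseudonyms.pop(patient_id, None)

token = register("patient-42", {"name": "..."})
audit_rows = [(token, "read", "therapy_note:123")]  # retained, keyed by token only
erase("patient-42")
# Personal data is gone; the audit row survives but no longer identifies anyone.
```

The design decision that matters is made at schema time: if audit rows had stored names or direct patient IDs, erasure and retention would be irreconcilable.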
How does HIPAA compliance affect AI features in a mental health platform?
AI features in a mental health platform are inside the compliance perimeter, not outside it.
In Mentalyc, every recording, transcript, and AI-generated note is PHI. The AI pipeline is subject to the same encryption, access control, and audit logging requirements as the rest of the system.
The concrete challenge is test automation. In a typical CRUD application, isolating the test environment from production data is a standard step – mock data replaces real records. With audio-based AI features, that approach breaks down. The model needs to process a real recording to produce a realistic output — and every real therapy session recording is PHI. A test environment that touched production recordings would violate HIPAA. A test environment using only synthetic data would not test what matters.
The solution was a framework built on Playwright, Cucumber BDD, and Docker containerisation, with tests running in isolated environments with no access to production PHI, using a carefully prepared set of anonymised test recordings that preserve the acoustic characteristics of real sessions. The BDD layer gave Mentalyc’s non-technical stakeholders visibility into exactly what was being tested and under what conditions, which was directly relevant when SOC 2 Type II auditors reviewed testing procedures.
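The isolation guarantee is worth enforcing in code rather than by convention: the suite should refuse to start unless it is pointed at an approved, PHI-free environment. A minimal sketch of such a guard; the environment names and variables are our assumptions, not part of the Mentalyc framework:

```python
# Fail-closed environment guard for a PHI-sensitive test suite: run this
# before any test touches a database. Names are illustrative.
import os

ALLOWED_ENVS = {"local", "ci-isolated"}  # environments known to hold only anonymised fixtures

def assert_safe_test_environment(env=None, database_url=None):
    env = env if env is not None else os.environ.get("TEST_ENV", "")
    database_url = database_url if database_url is not None else os.environ.get("DATABASE_URL", "")
    if env not in ALLOWED_ENVS:
        raise RuntimeError(
            f"Refusing to run: TEST_ENV={env!r} is not an approved isolated environment"
        )
    if "prod" in database_url:
        raise RuntimeError("Refusing to run: DATABASE_URL points at production")
    return True

assert_safe_test_environment("ci-isolated", "postgres://ci-isolated/testdb")
```

Hooked into the test runner's global setup, this turns "tests never touch production PHI" from a policy document into a property the pipeline enforces on every run.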
The result: test execution time dropped from several hours to 24–25 minutes. Just as important in a regulated environment, the team gained a reproducible, auditable testing process.
What does the EU AI Act require from mental health platforms operating in Europe?
AI tools that influence clinical decisions must be designed for EU AI Act high-risk compliance from the first sprint, not classified and retrofitted later.
Annex III of the EU AI Act covers AI systems in healthcare that support or influence clinical decisions. High-risk classification requires conformity assessment, technical documentation, human oversight mechanisms, and registration in the EU database of high-risk AI systems.
„Human oversight mechanism” sounds abstract. In code, it means specific things. Every AI decision or recommendation must be blockable or overridable by a human before it affects a patient record. The system must log not just the model’s output, but whether and how the therapist modified it. It must be possible to reconstruct the full history: what the model proposed, what the human changed, what entered the documentation. These are requirements at the level of database schema and API design, not functionality that can be added as another endpoint after launch.
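What "reconstruct the full history" could look like at the data-model level: the model's proposal, the human's edit, and the approval are separate fields of one immutable record, and nothing enters the documentation without an approver. A sketch under our own assumptions about field names, not the AI Act's prescribed form:

```python
# Immutable revision record for an AI-drafted note: the model output and
# the human-approved text are stored separately, so the chain
# "what the model proposed -> what the human changed -> what entered the
# chart" is always reconstructable. Field names are illustrative.
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class AiNoteRevision:
    model_output: str                  # what the model proposed
    model_version: str                 # which model produced it
    human_final: Optional[str] = None  # what the therapist approved or edited
    approved_by: Optional[str] = None  # nothing reaches the chart without this

    @property
    def entered_documentation(self) -> Optional[str]:
        # The AI output never enters the record directly; only
        # human-approved text does (blockable/overridable by design).
        return self.human_final if self.approved_by else None

    @property
    def was_modified(self) -> bool:
        return self.human_final is not None and self.human_final != self.model_output

draft = AiNoteRevision(model_output="Patient reports improved sleep.", model_version="v3.1")
final = AiNoteRevision(
    model_output=draft.model_output,
    model_version=draft.model_version,
    human_final="Patient reports improved sleep; anxiety unchanged.",
    approved_by="therapist-7",
)
```

The same shape works as a database table: append-only revisions rather than in-place updates, which is what makes the oversight trail auditable after the fact.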
For teams building in Germany, there is also potential classification as SaMD (Software as a Medical Device) under MDR 2017/745 with CE marking requirements and a separate conformity assessment. The combination of MDR SaMD classification, GDPR Article 9, BSI IT-Grundschutz, and EU AI Act high-risk classification is the compliance stack that serious digital health products in Germany are navigating in 2025 and 2026.
How long does it take to build a HIPAA-compliant mental health platform from scratch?
With the right architecture decisions made upfront, 4 months to production.
That was the timeline for 9am.health – a virtual clinic platform our team built from the first line of code in 2021. The platform now serves employees of Amazon, Novo Nordisk, and the State of Georgia, and added 40,000 new users through employer benefit programs in 2026 alone.
Teams that defer PHI isolation, mock eligibility verification, or treat compliance as a separate workstream do not ship faster. They ship, discover the gaps, and spend the following quarter rebuilding.
Summary
Four compliance frameworks, two markets, one architecture: a decision you make in the first sprint or pay for with a rebuild several months later.
From our experience on Mentalyc and Selfapy, one pattern repeats: teams that treat HIPAA, BSI, and GDPR as separate workstreams always end up in the same place. Systems that work but cannot scale. AI features that work in a demo but cannot go to production without violating requirements. Architectures that pass an internal audit but will not pass an external one.
Mental health platforms handle data that is exceptionally sensitive and exceptionally regulated. An architectural mistake in e-commerce costs a refactor. In a therapy platform, it costs certification, user trust, and the ability to operate in a given market.
Fireup.pro has been building healthcare systems since 2018. From mySugr with 1.8M users acquired by Roche, to 9am.health serving employees of Amazon and the State of Georgia, to Mentalyc and Selfapy. The common thread across every project: architectural decisions made at the start determined everything that came after.
Building a mental health platform and planning to enter the European or US market? We’re happy to talk through what that concretely requires from your system.

