Don’t Be the Next Cautionary Tale in Healthcare Security

Researchers at mobile security firm Oversecured scanned ten Android mental health apps with a combined 14.7 million downloads on Google Play. What they found should make any HIT developer uncomfortable: 1,575 security vulnerabilities across those ten apps, including 54 high-severity issues. In one app alone—with more than a million downloads—researchers counted more than 85 medium- and high-severity flaws. These aren’t obscure apps. Several explicitly promise users that their therapy conversations and mood data are encrypted and private. Some are used in state-funded healthcare programs in Europe with six-figure patient counts.

The vulnerabilities weren’t exotic or hard to find. Oversecured identified unvalidated intent handling that could give an attacker access to authentication tokens and session data, local storage misconfigurations that exposed therapy notes and CBT session records to any app on the device, and hardcoded Firebase database URLs sitting in plaintext inside APK resources. The kind of flaws that show up when security testing isn’t built into the development cycle, or isn’t happening at all.
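One of those flaw classes, hardcoded secrets in APK resources, is cheap to catch before release. The sketch below is a minimal, hypothetical scanner (not the tooling Oversecured used) that greps decompiled APK resources for Firebase database URLs and similar patterns; the regexes are illustrative assumptions, not an exhaustive ruleset.

```python
import re
from pathlib import Path

# Illustrative patterns for common hardcoded secrets; a real program
# would use a maintained ruleset, not these three regexes.
PATTERNS = {
    "firebase_url": re.compile(r"https://[\w.-]+\.firebaseio\.com"),
    "google_api_key": re.compile(r"AIza[0-9A-Za-z\-_]{35}"),
    "bearer_token": re.compile(r"Bearer\s+[A-Za-z0-9\-_.]{20,}"),
}

def scan_text(text, source="<input>"):
    """Return (source, pattern_name, match) for every hit in a string."""
    hits = []
    for name, pattern in PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((source, name, match))
    return hits

def scan_apk_resources(root):
    """Walk decompiled APK output (e.g. from apktool) and flag matches
    in text-bearing resource files."""
    hits = []
    for path in Path(root).rglob("*"):
        if path.suffix in {".xml", ".json", ".properties", ".txt"}:
            hits.extend(scan_text(path.read_text(errors="ignore"), str(path)))
    return hits
```

A check like this in CI will not catch intent-handling or storage flaws, but it turns the plaintext-database-URL class of finding into a build failure instead of a researcher's disclosure.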

Therapy records sell for more than $1,000 each on dark web markets, according to Oversecured founder Sergey Toshin. That’s not a theoretical risk. It’s a pricing structure.

This Is Exactly What HHS Is Warning You About

On December 27, 2024, the Office for Civil Rights at HHS published a Notice of Proposed Rulemaking to substantially update the HIPAA Security Rule—the first meaningful revision since 2013. The proposed rule is a direct response to what OCR has observed through enforcement: healthcare organizations treating “addressable” implementation specifications as optional, skipping security testing, and misunderstanding the scope of what the Security Rule actually covers.

The proposed changes would codify specific, non-negotiable testing requirements: penetration testing at least once every 12 months, vulnerability scanning at least every six months, and annual compliance audits. These aren’t suggestions. OCR specifically called out its intent to eliminate the addressable vs. required distinction entirely, meaning every implementation specification becomes mandatory if finalized.

If you are building or maintaining software that creates, receives, maintains, or transmits electronic protected health information (whether that’s a therapy app, a patient portal, an EHR integration, or an mHealth tool) you are a regulated entity or a business associate. That means this proposed rule lands on you.

The rule is still being finalized as of early 2025. But OCR doesn’t wait for final rules to enforce against known bad practices. The agency has levied significant penalties against organizations that failed to conduct adequate risk analyses long before this proposed update was published. The direction of travel is clear.

Why Annual Penetration Testing Isn’t Enough by Itself

Annual penetration testing is valuable, but it’s a point-in-time assessment. You get a snapshot of your security posture on the day the test runs. If your team deploys a new feature two weeks after the test, introduces a third-party SDK with unreviewed permissions, or updates a backend configuration that opens an unintended API endpoint—none of that shows up in last year’s pen test.

The Oversecured findings illustrate this problem well. Several of the vulnerable apps had been updated as recently as November 2025; others had last shipped updates in September 2024. Active development was happening. Security testing, apparently, was not keeping pace.

This is why the proposed HIPAA rule pairs the annual penetration test with mandatory semiannual vulnerability scanning. These serve different purposes. Penetration testing simulates real-world attacker behavior—a skilled tester will chain low-severity findings together, pivot through your infrastructure, and demonstrate actual exploitability. It answers the question: could someone actually get to my ePHI? Vulnerability scanning is automated, runs frequently, and catches newly introduced weaknesses or unpatched components quickly. It answers: what changed, and does that change create risk?

For HIT developers specifically, the testing scope needs to go beyond the network perimeter. The mental health app findings were almost entirely within the application layer: how the app handled data internally, what it exposed to other processes on the device, what it hardcoded into its own codebase. Mobile app security testing, API security testing, and source code review need to be part of your program if you’re building anything that patients interact with directly.

The Compliance Audit Is a Third Leg, Not a Formality

HIT developers often treat compliance audits as a documentation exercise, a process of assembling evidence that policies exist. That’s how organizations end up with a security policy that says “we perform annual risk assessments” while the last actual assessment was three years ago.

The proposed HIPAA rule tightens this significantly. The NPRM would require annual compliance audits with documented validation that technical controls are actually implemented and functioning, not just described in a policy.

It also requires a technology asset inventory and network map showing the movement of ePHI—updated at least annually and in response to environmental changes.

For developers, that means you need to know every system that touches ePHI. Every API endpoint. Every third-party integration. Every mobile SDK that has access to session data. The mental health apps Oversecured analyzed had plaintext backend API endpoints and hardcoded database URLs sitting in their APK resources. That kind of configuration sprawl happens when there’s no systematic inventory of what exists and what it connects to.
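An inventory like that doesn't need heavyweight tooling to start; it needs a structured record of what exists and what it connects to. Below is a minimal sketch, with hypothetical asset names, of modeling assets and deriving the ePHI flow edges the proposed rule's network map would document.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """One system, endpoint, or SDK in the environment."""
    name: str
    kind: str                 # e.g. "client", "api", "database", "sdk"
    handles_ephi: bool
    connects_to: list = field(default_factory=list)  # downstream asset names

class Inventory:
    def __init__(self):
        self.assets = {}

    def add(self, asset):
        self.assets[asset.name] = asset

    def ephi_flows(self):
        """Return (source, destination) edges where ePHI moves between
        two assets that both handle it -- the map the audit asks for."""
        flows = []
        for asset in self.assets.values():
            if not asset.handles_ephi:
                continue
            for dest in asset.connects_to:
                target = self.assets.get(dest)
                if target and target.handles_ephi:
                    flows.append((asset.name, dest))
        return flows
```

The useful side effect: an SDK that appears in `connects_to` but was never registered in the inventory is exactly the kind of unreviewed third-party dependency the audit should surface.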

The compliance audit also creates the accountability loop between your penetration test results and your remediation activity. A pen test that produces a report that sits unread until next year’s test isn’t compliance—it’s a liability document waiting to be used against you if there’s a breach. The audit process should verify that findings were prioritized, remediated within defined timelines, and re-tested. Under the proposed rule, critical and high-severity vulnerabilities would need to be addressed within specific timeframes, with documentation.

Where to Start If You’re Behind

The gap between where most HIT development teams operate today and where the proposed rule expects them to be is significant. HHS acknowledged this explicitly. The proposed rule’s regulatory impact analysis estimates substantial implementation costs across the sector. But OCR also made clear that it would not adjust requirements based on cost, specifically to prevent organizations from deciding cybersecurity investment is optional.

The practical starting point is a current-state assessment. Before you can build a penetration testing cadence or a vulnerability scanning program, you need a current technology asset inventory: what systems exist, what they touch, and where ePHI flows. This is foundational work that many organizations haven’t done rigorously, and the Oversecured findings suggest mobile and web applications are particularly likely to have gaps here.

From there, the sequencing matters. A penetration test conducted before you’ve done basic vulnerability remediation will produce so much noise it’s hard to prioritize. Running automated vulnerability scans first, remediating what you find, and then bringing in a penetration testing team gives you a cleaner baseline and a more focused engagement. Post-test, the compliance audit validates that what you found got fixed.

For organizations that have been operating on informal practices (running a vulnerability scan occasionally, doing a pen test when a customer asks for it), the shift to a documented, scheduled, auditable program requires process change, not just technical investment. You need defined remediation SLAs by severity. You need a process owner who is responsible for tracking findings to closure. You need to be able to produce that documentation when OCR asks for it.
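Tracking findings to closure against SLAs is mostly bookkeeping, and even a simple model makes the audit evidence reproducible. The sketch below assumes hypothetical SLA values; the proposed rule does not prescribe these exact day counts, so set them in your own policy.

```python
from datetime import date, timedelta

# Hypothetical remediation SLAs by severity, in days. These are
# assumptions for illustration, not figures from the NPRM.
SLA_DAYS = {"critical": 15, "high": 30, "medium": 90, "low": 180}

def overdue_findings(findings, today=None):
    """Return the IDs of findings that are still open past their SLA
    deadline. Each finding is a dict with 'id', 'severity',
    'opened' (date), and 'closed' (date or None)."""
    today = today or date.today()
    overdue = []
    for f in findings:
        deadline = f["opened"] + timedelta(days=SLA_DAYS[f["severity"]])
        if f["closed"] is None and today > deadline:
            overdue.append(f["id"])
    return overdue
```

Run against your findings register on a schedule, this is the report that answers the auditor's question: which findings blew their remediation window, and who owns them.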

The Real Risk Is Thinking This Doesn’t Apply to You

The mental health apps Oversecured analyzed aren’t HIPAA-covered entities in the traditional sense. They’re consumer apps. But they handle sensitive clinical data, they promise users privacy and security, and they’ve been integrated into state healthcare programs. The regulatory landscape for health data security is expanding, not contracting.

For purpose-built HIT—EHR systems, patient portals, clinical decision support tools, interoperability platforms, health information exchanges—HIPAA coverage is clear. If you’re building software that a covered entity uses to manage ePHI, you are a business associate. The Security Rule applies to you. The proposed NPRM applies to you.

What the mental health app findings underscore is that software developers in health-adjacent spaces have often treated security as a secondary concern, or assumed that if they weren’t directly covered by HIPAA they could operate without a formal security program. That assumption is becoming untenable. The breach economics Oversecured cited—therapy records worth more than $1,000 each on dark markets—mean that health data is a target regardless of whether the app carrying it has a HIPAA compliance program.

HHS’s proposed Security Rule update arrives at a moment when healthcare data breaches are at their worst recorded levels. The HIPAA Journal reported 2024 as the worst year on record for breached healthcare records—276 million records compromised as of early 2025, a 64.1% increase year-over-year. The number of attacks dropped slightly. The damage from each one grew substantially. That’s what happens when attack surface expands faster than security programs do.

Annual penetration testing, semiannual vulnerability scanning, and documented compliance audits aren’t a burden imposed on HIT developers from outside. They’re the minimum viable security program for a sector that attackers have explicitly targeted, in an environment where patient data is worth more than credit cards. The proposed HIPAA rule is giving the industry a deadline to get there. The smart move is not to wait for it to be finalized.

Please Note: This article references the HHS OCR Notice of Proposed Rulemaking published December 27, 2024, and the Oversecured security research reported by BleepingComputer. The proposed HIPAA Security Rule updates were not finalized as of this writing. Organizations should consult qualified legal and compliance counsel regarding their specific obligations.
