Is Your Health App Selling Your Data? A Privacy Guide
You downloaded a meditation app to manage your anxiety. A fitness tracker to count your steps. A period tracker because your doctor recommended it. You entered your weight, your heart rate, your sleep patterns, your moods, and your medications. You trusted these apps with some of the most intimate details of your life. But do you know where that data went?
The answer, for the vast majority of health and wellness apps, is uncomfortable. Your data was shared with advertisers, data brokers, and third parties whose names you have never heard. Not because of a hack or a breach, but by design. The business model of most free health apps depends on monetizing your personal health information. And unlike your medical records at a hospital, most of this data has almost no legal protection.
Most health and fitness apps are not HIPAA-compliant and routinely share user data with third parties. A 2024 Mozilla Foundation study found that 80% of health apps fail basic privacy standards. To protect yourself, look for apps that process data on-device rather than uploading to cloud servers, provide granular consent controls, and clearly state they do not sell data to brokers.
The Hidden Cost of Free Health Apps
There is an old saying in Silicon Valley: "If the product is free, you are the product." Nowhere is this more true than in health and wellness technology.
Data as the Product
The global health app market generates an estimated $6.2 billion annually, but the majority of health apps are free to download. How do they make money? Through advertising revenue driven by targeted data, direct data sales to third parties, and partnerships with insurance companies, pharmaceutical firms, and data brokers who aggregate and resell personal health information.
A 2023 investigation by The Markup found that popular health apps including period trackers, mental health platforms, and diet apps were sending user data to Facebook (Meta), Google, and third-party analytics firms within seconds of being opened. In many cases, this data sharing began before the user even accepted the privacy policy.
The Data Broker Ecosystem
Health data does not stay with the app that collected it. It enters a sprawling ecosystem of data brokers, companies that buy, aggregate, and resell personal information. A 2024 report from Duke University's Sanford School of Public Policy identified over 4,000 data brokers operating in the United States, many of whom specifically trade in health and wellness data.
These brokers create detailed profiles that combine your app data with purchasing history, location data, social media activity, and public records. The resulting profiles can reveal sensitive health conditions, fertility status, mental health treatment, substance use patterns, and more. This information is sold to insurers, employers, landlords, and marketers, often without any disclosure to the individual whose data is being traded.
Real Consequences
This is not a theoretical concern. In 2021, a Catholic publication used commercially available location data from the dating app Grindr to track and publicly out a priest. In 2023, data broker Near Intelligence was caught selling location data that could identify visitors to abortion clinics, mental health facilities, and addiction treatment centers. That same year, the FTC reached a $7.8 million settlement with the mental health platform BetterHelp for sharing user therapy data with advertising platforms.
Health data is uniquely sensitive because it is uniquely permanent. You can change your email address, your phone number, or your home address. You cannot change your medical history.
What Health Apps Know About You
The scope of data that health and wellness apps collect is far broader than most users realize. Here is a taxonomy of what a typical health app may gather.
Biometric Data
Heart rate, blood pressure, blood oxygen levels, sleep patterns, menstrual cycles, weight, body measurements, and step counts. Some apps also collect voice data (for mental health assessment) and facial data (for emotion detection). Movement and pose data from fitness apps can reveal physical disabilities, injuries, and health conditions.
Location Data
Many health apps request location access for features like outdoor workout mapping or finding nearby gyms. But location data reveals far more than your running route. It reveals where you live, where you work, what doctors you visit, what pharmacies you use, and how often you leave your home (a potential indicator of depression or mobility issues).
Behavioral Patterns
Usage patterns within health apps reveal intimate behavioral details. When you log meals, moods, or symptoms, you are creating a longitudinal record of your daily habits. A 2023 study in npj Digital Medicine demonstrated that smartphone usage patterns alone (screen time, typing speed, app switching behavior) could predict depressive episodes with 85% accuracy.
Health Conditions and Medications
Apps that track medications, symptoms, or health conditions maintain explicit records of diagnoses and treatments. Even apps that do not directly ask about health conditions can infer them: searching for "low carb recipes" in a diet app, tracking glucose levels, or logging specific symptoms creates a data trail that reveals medical information.
Social Graph
Apps with social features, including shared workout groups, family accounts, or community forums, capture your social connections and their health data alongside yours. This creates a network map that can reveal family health histories and social determinants of health.
The Mozilla Study: 80% of Health Apps Fail Privacy Standards
In 2024, the Mozilla Foundation published the results of its most comprehensive health app privacy review, evaluating 72 of the most popular health and wellness apps against Mozilla's "Privacy Not Included" standards. The findings were alarming.
Key Findings
- 80% of apps failed basic privacy standards. This means they collected more data than necessary, shared data with third parties for advertising, lacked adequate data deletion mechanisms, or had privacy policies that were misleading or incomprehensible.
- 61% of apps shared data with third parties for advertising purposes. This includes data shared with Meta (Facebook), Google, and smaller ad tech companies. Many apps shared data even when users opted out of personalized advertising within the app's settings.
- 52% of apps had privacy policies that contradicted their actual data practices. Apps that claimed they did not sell data were found sharing data with brokers through technical mechanisms that their legal teams classified as "sharing" rather than "selling," a distinction without a meaningful difference.
- Only 12% of apps offered true data deletion. While many apps had a "delete account" button, most retained data in backups, analytics databases, or third-party systems for months or years after deletion was requested.
- Mental health apps were the worst offenders. Of the 32 mental health and therapy apps evaluated, 28 (87.5%) failed privacy standards. Several shared therapy session data, mood logs, and diagnostic screening results with advertising platforms.
The Privacy Policy Problem
A 2024 analysis by Carnegie Mellon University found that the average health app privacy policy is 4,200 words long and written at a college reading level. Researchers estimated that reading every privacy policy for the apps on an average smartphone would take approximately 76 hours. More importantly, 89% of users reported that they never read privacy policies, and those who did found them intentionally vague about data sharing practices.
The practical result is that informed consent, the foundation of ethical data collection, is largely a fiction in consumer health apps. Users agree to terms they have not read, cannot understand, and would likely reject if they were presented clearly.
HIPAA and Health Apps: What Most People Get Wrong
Perhaps the most dangerous misconception about health app privacy is that HIPAA protects you. For most consumer health apps, it does not.
When HIPAA Applies
The Health Insurance Portability and Accountability Act (HIPAA) protects health information held by "covered entities": hospitals, doctors, insurance companies, and their direct business associates. If your doctor uses an electronic health record system, that system must comply with HIPAA. If a hospital's patient portal stores your lab results, HIPAA applies.
When HIPAA Does Not Apply
HIPAA does not apply to consumer health apps that you download from the App Store or Google Play. If you voluntarily enter your health information into a fitness tracker, meditation app, period tracker, or wellness platform, that data is generally not protected by HIPAA. The app company is not a covered entity, so the data it holds does not count as "protected health information" under the law.
This means that the heart rate data your cardiologist records in your medical chart has robust legal protection, but the identical heart rate data recorded by your fitness app has almost none. The same data, two entirely different legal regimes.
The Gap
This gap between user expectation and legal reality is significant. A 2023 survey by Rock Health found that 72% of consumers believed their health app data was protected by HIPAA or similar regulations. Only 9% correctly understood that most consumer health apps operate outside HIPAA's scope.
Some states are beginning to address this gap. Washington State passed the My Health My Data Act in 2023, which extends privacy protections to consumer health data regardless of whether the entity collecting it is a HIPAA-covered entity. The California Consumer Privacy Act (CCPA) provides some protections, including the right to know what data is collected and the right to request deletion. But these state-level protections are patchwork, and most Americans have no specific legal protection for the health data they share with consumer apps.
How to Evaluate a Health App's Privacy Practices
Given the regulatory gaps and the prevalence of poor privacy practices, consumers need practical tools for evaluating health apps before trusting them with sensitive data. Here is a seven-point checklist.
1. Where Is Your Data Stored?
Look for explicit statements about data storage location. Apps that store data on their own cloud servers create a centralized target for breaches and a repository that can be shared with third parties. Apps that store data on your device (on-device storage) keep your information under your physical control. The privacy policy should clearly state where data resides.
2. Where Is Your Data Processed?
This is different from storage and equally important. Many apps claim they do not "store" your video or audio, but they upload it to cloud servers for processing before deleting it. During that processing window, your data is exposed. Apps that process data on-device (using on-device AI models) never transmit raw data to external servers, eliminating this exposure entirely.
3. What Is the Consent Model?
Good consent models are granular (you choose what to share, with whom, and for how long), revocable (you can withdraw consent at any time), and informed (the app explains in plain language what each consent grants). Bad consent models are binary (accept everything or do not use the app), permanent (no mechanism to withdraw), and buried (consent is granted through a privacy policy nobody reads).
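To make "granular and revocable" concrete, here is a minimal sketch of what such a consent model could look like as a data structure. The names (`ConsentGrant`, `ConsentLedger`) are hypothetical illustrations, not any real app's API: each grant names a specific data category, a specific recipient, and an expiry date, and can be revoked at any time.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical sketch: one grant per (data category, recipient) pair,
# time-limited and individually revocable, instead of a single
# all-or-nothing "accept" checkbox.
@dataclass
class ConsentGrant:
    data_category: str      # e.g. "heart_rate", "sleep", "location"
    recipient: str          # e.g. "my_cardiologist", "research_study"
    expires_at: datetime    # consent expires by default
    revoked: bool = False

@dataclass
class ConsentLedger:
    grants: list = field(default_factory=list)

    def grant(self, category: str, recipient: str, days: int) -> ConsentGrant:
        g = ConsentGrant(category, recipient,
                         datetime.now() + timedelta(days=days))
        self.grants.append(g)
        return g

    def revoke(self, grant: ConsentGrant) -> None:
        grant.revoked = True  # revocable at any time, no questions asked

    def is_allowed(self, category: str, recipient: str) -> bool:
        now = datetime.now()
        return any(g.data_category == category and g.recipient == recipient
                   and not g.revoked and g.expires_at > now
                   for g in self.grants)

ledger = ConsentLedger()
g = ledger.grant("heart_rate", "my_cardiologist", days=30)
print(ledger.is_allowed("heart_rate", "my_cardiologist"))  # True
ledger.revoke(g)
print(ledger.is_allowed("heart_rate", "my_cardiologist"))  # False
```

The key property is that every sharing decision is a separate, inspectable record rather than an implicit consequence of clicking "I agree" once.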
4. Does the App Share Data with Third Parties?
Look for specific statements about third-party data sharing in the privacy policy. Be wary of vague language like "we may share data with partners to improve our services." Look for concrete lists of categories of third parties and the specific types of data shared with each. The best apps will state explicitly: "We do not share your data with third parties for advertising purposes."
5. Can You Delete Your Data?
Test the data deletion process before you invest significant time in an app. Can you export your data? Can you delete your account? Does the app confirm that deletion removes data from backups and third-party systems? A 2024 Consumer Reports investigation found that 41% of apps that offered account deletion did not actually delete associated data from all systems.
6. Is Your Data Encrypted?
Look for statements about encryption both "in transit" (while data moves between your device and servers) and "at rest" (while data sits in storage). End-to-end encryption means that even the app company cannot read your data. This is the gold standard but remains rare in health apps.
7. Is There an Audit Trail?
Can you see a log of who has accessed your data and when? The best health apps provide transparency logs that show every data access event, every sharing action, and every third-party request. If you cannot see who has looked at your data, you have no way to verify the app's privacy claims.
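One way an app could make such a log trustworthy, rather than merely decorative, is to hash-chain it: each entry's hash covers the previous entry's hash, so silently editing or deleting a past access event breaks the chain. The sketch below is an illustrative toy, not a production design.

```python
import hashlib
import json

# Toy tamper-evident access log. Each entry commits to the previous
# entry's hash, so any after-the-fact alteration is detectable.
class TransparencyLog:
    GENESIS = "0" * 64

    def __init__(self):
        self.entries = []
        self._last_hash = self.GENESIS

    def record(self, accessor: str, category: str, action: str) -> None:
        entry = {"accessor": accessor, "category": category,
                 "action": action, "prev": self._last_hash}
        digest = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        entry["hash"] = digest
        self.entries.append(entry)
        self._last_hash = digest

    def verify(self) -> bool:
        prev = self.GENESIS
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            expected = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != expected:
                return False
            prev = e["hash"]
        return True

log = TransparencyLog()
log.record("mobile_app", "heart_rate", "read")
log.record("export_service", "sleep", "share")
print(log.verify())   # True
log.entries[0]["accessor"] = "someone_else"  # tamper with history...
print(log.verify())   # False: the chain no longer validates
```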
On-Device Processing: The Privacy Architecture That Actually Works
Among all the technical approaches to health data privacy, one stands out for its simplicity and effectiveness: on-device processing. Instead of sending your data to a cloud server for analysis, on-device processing runs the analysis directly on your phone or tablet. Your raw data never leaves your device.
How It Works
Modern smartphones contain powerful processors capable of running sophisticated AI models locally. Apple's Neural Engine and Qualcomm's AI Engine can perform billions of operations per second, enough to analyze movement patterns, process images, and generate health insights without any cloud connection.
In an on-device architecture, the AI model is downloaded to your phone once. When you use the app, your data (camera feed, sensor readings, user inputs) is processed by this local model. Only the results, not the raw data, are stored or optionally shared. The raw data (your video, your biometrics, your keystrokes) never touches an external server.
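The pattern can be illustrated with a deliberately simplified sketch: raw readings are summarized locally, and only the derived metrics ever leave the function. The function name and the summary fields here are hypothetical; a real app would run an on-device ML model rather than simple statistics.

```python
import statistics

# Illustrative on-device pattern: raw sensor samples are reduced to
# aggregate metrics locally. Only the summary dict is ever stored or
# (optionally, with consent) synced; the raw samples are discarded.
def process_locally(raw_heart_rate_samples: list) -> dict:
    summary = {
        "mean_bpm": round(statistics.mean(raw_heart_rate_samples), 1),
        "max_bpm": max(raw_heart_rate_samples),
        "sample_count": len(raw_heart_rate_samples),
    }
    # The raw samples go out of scope here and are never transmitted.
    return summary

print(process_locally([72, 75, 141, 88]))
# {'mean_bpm': 94.0, 'max_bpm': 141, 'sample_count': 4}
```

The privacy property comes from the architecture, not from policy: the server simply never receives anything it could leak, sell, or be compelled to hand over.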
The Privacy Advantage
On-device processing eliminates four major privacy risks simultaneously. First, there is no data in transit to intercept. Second, there is no cloud database to breach. Third, there is no server-side copy that can be subpoenaed, sold, or shared. Fourth, there is no processing window during which raw data exists on third-party infrastructure.
This is not a theoretical advantage. In 2024, a major fitness platform suffered a data breach that exposed the workout locations, heart rate data, and body measurements of 14 million users. In an on-device architecture, that breach would have been impossible because the data never existed on a central server.
The Trade-offs
On-device processing is not without limitations. AI models that run locally must be smaller and less complex than cloud-based models, which can affect accuracy. Processing on-device uses battery power and can generate heat. And some features, like comparing your progress against population-level benchmarks, require some data to be aggregated (though this can be done with privacy-preserving techniques like differential privacy or federated learning).
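Differential privacy, mentioned above, can be sketched in a few lines: before an aggregate statistic leaves the device, noise calibrated to a privacy budget (epsilon) is added, so no individual's inclusion can be inferred from the published number. The parameters below are illustrative, not a recommendation.

```python
import math
import random

# Minimal differential-privacy sketch: add Laplace noise (scale =
# sensitivity / epsilon) to a count before publishing it. Smaller
# epsilon means more noise and stronger privacy.
def noisy_count(true_count: int, epsilon: float = 0.5,
                sensitivity: float = 1.0) -> float:
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) by inverse transform of a uniform draw;
    # the max() guards against the rare boundary value u == -0.5.
    u = max(random.random(), 1e-12) - 0.5
    noise = -scale * (1 if u >= 0 else -1) * math.log(1 - 2 * abs(u))
    return true_count + noise

# Each published value differs, but the average stays near the truth.
print([round(noisy_count(100), 1) for _ in range(3)])
```

Federated learning applies a related idea to model training: devices share only noisy model updates, never raw data.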
Despite these trade-offs, the trajectory is clear. Apple, Google, and Samsung are all investing heavily in on-device AI capabilities, and the performance gap between on-device and cloud processing is closing rapidly. For health and wellness apps, on-device processing is increasingly the architecture that best balances functionality with privacy.
Who Does This Well?
Apple's Health app processes most health data on-device and uses end-to-end encryption for data that syncs to iCloud. Signal processes messages on-device with end-to-end encryption. In the wellness space, Kelo uses on-device AI to analyze Tai Chi movement through the phone camera, processing the video feed locally and storing only aggregate movement metrics (joint angles, form scores, timing data), never raw video. Users control exactly what metrics are shared and with whom through a granular consent system.
Taking Control of Your Health Data
The current state of health app privacy is discouraging but not hopeless. Here are practical steps you can take today to protect your health information.
Audit Your Current Apps
Open your phone's settings and review which apps have access to your health data, location, camera, and microphone. On iPhone, go to Settings, then Privacy and Security. On Android, go to Settings, then Privacy, then Permission Manager. Revoke permissions that are not essential to the app's core function. If a calorie-counting app has access to your location, that is a red flag.
Read the Privacy Nutrition Label
Both Apple's App Store and Google Play now require "privacy nutrition labels" that summarize what data an app collects and whether it is linked to your identity. These labels are imperfect (they rely on self-reporting by developers), but they provide a quick way to compare apps. Before downloading a health app, check its privacy label and compare it to alternatives.
Use the Mozilla Privacy Not Included Guide
Mozilla's "Privacy Not Included" buyer's guide (foundation.mozilla.org/privacynotincluded) evaluates popular apps and connected devices against privacy standards. It is free, regularly updated, and written in plain language. Check it before trusting a new app with health data.
Exercise Your Rights
If you live in California, Colorado, Connecticut, Virginia, Utah, or another state with consumer privacy laws, you have the right to request a copy of your data, request deletion, and opt out of data sales. Use these rights. Companies are legally required to respond to these requests within specific timeframes, typically 30-45 days.
Choose Privacy-First Apps
When selecting health and wellness apps, prioritize those that process data on-device, provide granular consent controls, clearly state they do not sell data, and have been independently evaluated for privacy. The trade-off may be fewer features or a subscription cost, but the alternative is paying with your most sensitive personal information.
Kelo was designed from the ground up with this philosophy. Movement data is processed on-device using AI pose tracking. No video is stored or transmitted. Consent is granular and revocable. Data sharing with family members or healthcare providers happens only with explicit, per-session authorization. And the business model is built on community memberships and provider partnerships, not data monetization.
Demand Better
Ultimately, the health app privacy problem requires systemic solutions: federal legislation that extends HIPAA-like protections to consumer health data, FTC enforcement against deceptive privacy practices, and industry standards that make privacy the default rather than the exception. As consumers, the most powerful thing we can do is choose apps that respect our data and refuse to use those that do not. Market pressure works. When users leave platforms over privacy concerns, as millions did with Facebook after the Cambridge Analytica scandal, companies change their practices.
Your health data is among the most intimate information that exists about you. It deserves the same protection as your medical records, regardless of which app collected it. Until the law catches up, the responsibility falls on each of us to protect it.
