Whoop collects an extraordinary depth of biometric and personal data — heart rate, HRV, sleep patterns, blood oxygen, skin temperature, reproductive health metrics, GPS location, and more — while operating in a regulatory gap that leaves users with far fewer protections than they likely assume.
Despite the company’s promise that it “never sells” personal data, a 2025 class-action lawsuit alleges Whoop secretly shared sensitive health information with a third-party tracker (Milberg, ClassAction.org), and a peer-reviewed academic study placed Whoop in the highest privacy-risk cluster among 17 leading wearable manufacturers (Nature). The gap between Whoop’s marketing language and its actual privacy policy terms deserves close scrutiny from the roughly one million users who wear the device daily.
What Whoop knows about your body and your life
Whoop’s data collection is among the most extensive in consumer wearables. Its privacy policy (last updated October 9, 2025) enumerates an expansive list of data categories. The biometric and wellness data alone includes resting heart rate, heart rate variability, respiratory rate, skin temperature, blood oxygen saturation, acceleration, workout metadata, sleep staging, strain scores, and recovery metrics (Mozilla Foundation).
Users who engage with Whoop’s newer features provide even more: the Advanced Labs service (launched September 2025 via Quest Diagnostics) (Sacra) collects biomarkers, lab samples, lab results, and clinical notes. Female health tracking captures menstrual and reproductive data (WHOOP). The WHOOP Coach AI feature retains full conversation histories.
Beyond physiological data, Whoop collects personal identifiers (name, email, phone, mailing address, photographs), payment and transaction data, device information (OS type, IP address, screen resolution, carrier), and detailed online activity logs — pages viewed, time spent, navigation paths, and access duration (WHOOP). Perhaps most notably, Whoop collects precise GPS geolocation data when users grant permission for certain exercise types, along with general location inferred from IP addresses.
The data arrives from multiple streams: the strap itself, the mobile app, the WHOOP Coach AI, cookies and tracking technologies, third-party lab partners, and even external sources such as employers, insurance companies, coaches, marketing partners, and data licensors. This breadth means Whoop builds a remarkably complete profile spanning physiology, behavior, location, and digital activity.
The “we never sell your data” claim and its fine print
Whoop’s privacy principles state unequivocally: “We never sell our members’ personal data. This is our promise.” The formal privacy policy reinforces this, noting that “for the 12-month period prior to the date of this Privacy Policy, WHOOP has not sold any Personal Data” (WHOOP). The company also states it does not use consumer health data for marketing purposes.
However, the reality is more nuanced. Whoop acknowledges using third-party cookies and advertising tracking technologies for interest-based advertising, and the company concedes that “because of how broadly the CCPA defines ‘sale,’ we want to be clear that we use third party cookies and other tracking technologies.” This means browsing behavior and usage data flow to advertising partners through automated technologies — a practice that may technically constitute “sharing” under California law even if Whoop avoids the word “sale.”
The most significant loophole involves de-identified and aggregated data. Whoop’s policy permits sharing “aggregated or de-identified information” with “any third party, including advertisers, promotional partners, and sponsors.” Privacy researchers consistently warn that de-anonymization of such datasets is feasible, particularly when the underlying data includes the kind of granular physiological patterns Whoop collects (Mozilla Foundation).
A 2025 analysis in the International Association of Privacy Professionals (IAPP) journal noted that “despite claims of anonymization, sensor data often contains unique and persistent fingerprints that make true anonymization difficult, if not impossible” (IAPP).
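The "fingerprint" problem is easy to demonstrate. The sketch below uses entirely invented numbers and record IDs (none of this is Whoop data) to show why a short run of granular physiological readings can re-link a "de-identified" record to a known person: strip the names, and a week of resting-heart-rate values someone shared elsewhere is often enough to pick their record back out.

```python
# Illustrative sketch with invented data: why granular physiological
# time series can act as a re-identification fingerprint even after
# names and other direct identifiers are stripped.

# "De-identified" dataset: record ID -> 7 daily resting-HR readings
deidentified = {
    "rec_001": [52, 51, 53, 50, 52, 54, 51],
    "rec_002": [61, 60, 62, 63, 61, 60, 62],
    "rec_003": [47, 48, 46, 47, 49, 48, 47],
}

# Auxiliary data: the same week of readings a known user posted publicly
known_user_week = [61, 60, 62, 63, 61, 60, 62]

def distance(a, b):
    """Sum of squared differences between two equal-length series."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

# Link the public series to the closest "anonymous" record
best = min(deidentified, key=lambda rid: distance(deidentified[rid], known_user_week))
print(best)  # rec_002 matches exactly: the record is re-identified
```

Real-world linkage attacks use the same idea at scale, with richer quasi-identifiers (sleep timing, workout patterns, location traces) making the match even more distinctive.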
Fitness tech reviewer Ray Maker (DCRainmaker) publicly confronted Whoop’s CEO on this gap in 2020, writing: “No, you do sell your users data. In fact, your privacy policy not only explicitly says you do, but even outlines exactly what has been sold in the last 12 months.” (X). When pressed, Maker said Whoop’s clarification amounted to: “they basically said they only give it away instead” (X).
Who gets access to your Whoop data
Whoop’s data-sharing ecosystem extends to a surprisingly wide range of entities. Service providers include payment processors, hosting providers, analytics services, security consultants, and the third-party LLM partner powering WHOOP Coach. Advertising partners collect data through cookies embedded on Whoop’s platforms (Mozilla Foundation). Third-party laboratory partners — specifically Quest Diagnostics and SteadyMD — receive data when users participate in Advanced Labs.
Professional advisors such as lawyers, auditors, bankers, and insurers also receive data “where necessary in the course of professional services.” Law enforcement and government authorities can obtain data when Whoop believes “in good faith” it is “necessary or appropriate to comply with the law or legal process.” Mozilla’s Privacy Not Included review flagged this language as weaker than industry best practices, noting the absence of any requirement for a subpoena or commitment to sharing only the minimum necessary data (Mozilla Foundation).
Corporate wellness programs represent another sharing channel. Whoop’s policy states it “may share your information” with employers or organizations subject to user consent, typically as aggregated data. Mozilla raised a troubling scenario: “It’s not too far fetched to think an employer could require you to wear one of these bands to monitor you for COVID symptoms. But they take that monitoring way beyond that and look to see which employees drink on the weekends.”
For its AI features, Whoop shares de-identified metrics with its LLM partner (believed to be OpenAI) under a “Zero-Retention/Zero Training Policy,” meaning the partner cannot store or use Whoop data for training. The standard business transfer clause allows all data to transfer to acquirers in mergers, acquisitions, or bankruptcy — a particularly relevant provision given CEO Will Ahmed’s November 2025 comments about considering an IPO “over a horizon of two years” (Access IPOs).
The Segment tracker lawsuit and what it revealed
The most significant privacy controversy emerged in August 2025, when Lomeli v. Whoop, Inc. was filed in the Northern District of California (Top Class Actions). The class-action lawsuit alleges that Whoop embedded a third-party tracker called Segment (acquired by Twilio for $3.2 billion in 2020) into its mobile app, which collected and transmitted sensitive personal data without user knowledge or consent (Milberg).
The alleged data shared via Segment included full names, email addresses, heights, weights, birthdays, genders, cities, usernames, mobile device details (ClassAction.org), heart rate data, blood oxygen levels, blood pressure insights, stress levels, sleep patterns, and titles of videos watched within the app (Nextpit). The lawsuit brings claims under the Video Privacy Protection Act (VPPA) — for sharing video watching history — and the California Confidentiality of Medical Information Act (CMIA) — for disclosing medical information without authorization (The5KRunner, Sourcepoint). Damages sought include $2,500 per VPPA violation and $1,000 per CMIA violation, plus punitive damages.
This is not Whoop’s only active legal challenge. A separate class action, Sanderson v. Whoop (filed October 2023), alleges violations of California’s Automatic Renewal Law through deceptive enrollment practices. That case was certified as a class action in March 2025 (The5KRunner) and is proceeding. A third lawsuit, Rowe v. Whoop (filed November 2025), targets allegedly false advertising of “medical-grade” blood pressure insights (ClassAction.org) following an FDA warning letter.
Security measures and the absence of known breaches
Whoop stores all data in Amazon Web Services (AWS) West region in the United States. Data is encrypted at rest using 256-bit encryption (AWS RDS and S3 services) and encrypted in transit (Arizona). The Whoop strap communicates via Bluetooth Low Energy 5.0 using security protocols from specifications 4.2 and above. The company maintains access controls with logging: employees cannot access personal data without a legitimate business need, and data access logs are actively monitored for anomalies (WHOOP).
Whoop has a formal vulnerability disclosure program through HackerOne with a 90-day disclosure deadline (WHOOP). The company enters into Data Protection Agreements with third-party vendors.
No publicly known data breaches have been reported specific to Whoop as of February 2026, though security experts caution that third-party dependencies introduce vulnerability — the 2025 academic study in NPJ Digital Medicine noted that “98% of organizations have a relationship with at least one vendor that has experienced a data breach” (PubMed Central).
The overall security posture appears reasonable by industry standards, though the IAPP has flagged general wearable vulnerabilities including “weak encryption, insecure Bluetooth protocols, and limited capacity for regular security updates” (IAPP). The fundamental concern is less about Whoop’s technical security and more about what happens to data once collected — the policy permissions rather than the cryptographic protections.
Why HIPAA doesn’t protect your Whoop data
Whoop states explicitly in its Consumer Health Data Privacy Notice: “WHOOP is not a ‘covered entity’ or ‘business associate’ under HIPAA. This means WHOOP is not subject to HIPAA’s privacy or security rules” (WHOOP). This is legally accurate. HIPAA applies only to healthcare providers conducting electronic transactions, health plans, and healthcare clearinghouses — not to consumer technology companies selling directly to users (Franklin County Law Library, Tech Policy Press).
The practical consequence is profound: the same biometric data that would receive stringent federal protection if collected by a doctor’s office has essentially no federal health-privacy protection when collected by Whoop. As an analysis in the UC Law Science & Technology Journal concluded, “Ultimately, with minor exceptions, HIPAA does not cover wearables” (Milberg, Uclawsf). Suzanne Bernstein of the Electronic Privacy Information Center (EPIC) has emphasized that health data from wearable devices simply “doesn’t fall under HIPAA’s umbrella” (Government Technology).
Several state laws partially fill this gap. Washington’s My Health My Data Act (effective March 2024) broadly defines “consumer health data” to encompass fitness and wellness metrics, requiring informed opt-in consent (California Lawyers Association, UpGuard) and a separate, published Consumer Health Data Privacy Policy — which Whoop now provides (Washington State Attorney Gen). California’s CCPA/CPRA classifies biometric information as sensitive personal information. Whoop’s privacy policy addresses residents of 15+ U.S. states with specific privacy rights.
A potentially transformative development arrived in November 2025 when Senator Bill Cassidy introduced the Health Information Privacy Reform Act (HIPRA), which would extend HIPAA-like privacy, security, and breach notification standards to wearables, health apps, and wellness platforms (Inside Privacy, Athletech News, Alstonprivacy). The bill remains in early stages before the Senate HELP Committee, but signals growing legislative attention to the regulatory gap.
How Whoop compares to Apple Watch, Oura, and Fitbit
A 2025 peer-reviewed study in NPJ Digital Medicine evaluated privacy policies across 17 wearable manufacturers using a 24-criteria rubric (Nature, PubMed Central). Whoop was placed in the highest-risk cluster (Cluster 3) alongside Xiaomi, Huawei, and Wyze, while Apple and Google (Fitbit) ranked in the lowest-risk clusters based on written policy analysis (Health Platform News).
Apple Watch represents the privacy gold standard in wearables. Its architecture maximizes on-device processing — health metrics are calculated entirely on the iPhone or Watch, never touching Apple’s servers unless the user explicitly opts into iCloud sync with end-to-end encryption (9to5Mac). Apple’s HealthKit imposes strict requirements: third-party apps that access health data cannot use it for advertising or sell it to data brokers. Apple has no ad-based business model monetizing health data.
Oura runs many algorithms locally on the ring and phone (“edge inference”), explicitly refuses to share sensitive data without consent, and does not use health data for advertising (Oura). After Roe v. Wade, Oura publicly committed to opposing law enforcement requests for reproductive health data (Belle) — a stance notably absent from Whoop’s policies.
Fitbit (Google) presents a complex picture. Its written policies scored well academically (Nature), but Google’s fundamental business model as the world’s largest advertising company creates inherent tension. The EU imposed a 10-year ban on using Fitbit health data for Google Ads as a condition of the 2020 acquisition (Android Authority) — a regulatory constraint, not a design philosophy. New Fitbit users must now create Google Accounts, deepening data integration with an advertising ecosystem (Mozilla Foundation).
The critical structural difference is business model. Apple sells hardware. Oura sells hardware plus subscriptions. Whoop sells subscriptions (Milberg). Google sells advertising. Only Whoop and Google have significant economic incentives to extract additional value from user data beyond the direct service, though Whoop’s subscription model is arguably better aligned with user interests than Google’s ad model. The decisive advantage Apple holds is architectural: health data processed on-device cannot be shared, subpoenaed, or breached from a server that never held it.
User control: export, deletion, and the data that stays behind
Whoop provides several mechanisms for user control. Users can access and manage their data through the WHOOP Privacy Center (privacy.whoop.com) or by emailing WHOOP (WHOOP). The company states it “will delete your personal data if you ask us to, including if asked when you cancel your membership.” Data export is available upon membership termination.
Users can opt out of specific data collection: disabling location services stops GPS collection, unpairing the strap stops wellness data transmission, and standard browser tools manage cookies and advertising tracking (WHOOP). For California residents, CCPA rights include access, deletion, correction, and opt-out of sale/sharing. European residents receive full GDPR rights including portability and the right to lodge complaints with supervisory authorities (WHOOP).
However, a critical limitation applies: even after deletion, Whoop retains “aggregated data or de-identified data derived from or incorporating your Personal Data.” The company provides no specific retention period, stating only that data is kept “as long as reasonably necessary” (WHOOP). The broad content license users grant — “perpetual, worldwide, non-exclusive, royalty-free, fully paid-up, sublicensable and transferable” — survives account termination and explicitly permits using aggregated, de-identified content to train AI models (WHOOP). Users who don’t opt out of the mandatory arbitration clause within 30 days of signing up permanently waive their right to class-action lawsuits and jury trials (Terms).
Conclusion
Whoop occupies an uncomfortable position in the privacy landscape: it collects clinical-grade physiological data with consumer-grade legal protections. The company’s stated commitment to not selling personal data is genuine in the narrow, literal sense (WHOOP) — but the fine print permits extensive sharing of de-identified data with “any third party,” vague law enforcement cooperation, broad content licensing, and the kind of third-party tracking that a 2025 lawsuit alleges amounted to covert health data sharing (ClassAction.org, Milberg). The absence of HIPAA coverage means users’ most intimate biological signals — their heart rhythms during sleep, their stress responses, their reproductive patterns — receive no more federal protection than their browsing history (Mozilla Foundation, Inside Privacy).
The proposed HIPRA legislation and Washington’s My Health My Data Act (Alstonprivacy, California Lawyers Association) represent early efforts to close this gap, but the regulatory patchwork remains thin. Users considering Whoop should weigh the genuine performance insights the device offers against the reality that once biometric data leaves their wrist, their control over it diminishes significantly — and a perpetual content license means some derivative of that data may persist indefinitely, regardless of whether they ever cancel their membership.
Last Updated on 23 February 2026
