If you were looking for online therapy between 2017 and 2021, it’s likely that you came across BetterHelp. BetterHelp claims to be the largest provider of online therapy, boasting over 2 million users. Once you landed on their website, you would have filled out an intake questionnaire, similar to what you would do at a traditional therapist’s office. The questionnaire asked about your therapy experience, medication usage, issues with intimacy, feelings of sadness, and even your religious beliefs, LGBTQ status, or age if you were a teenager. The purpose of these questions was to match you with the most suitable counselor, and BetterHelp assured users that their information would remain private.
However, BetterHelp isn’t just a regular therapist’s office, and your information may not have been as private as advertised. According to a complaint filed by federal regulators, BetterHelp shared user data, such as email addresses, IP addresses, and questionnaire answers, with third parties like Facebook and Snapchat for targeted advertising purposes. The Federal Trade Commission (FTC) further criticized BetterHelp for not properly regulating what those third parties did with the user data. As a result, BetterHelp reached a settlement with the FTC and agreed to refund $7.8 million to consumers whose privacy had allegedly been compromised. BetterHelp, in a statement, denied any wrongdoing and justified the sharing of user information as an “industry-standard practice.”
Our health information is scattered across various digital platforms, whether we’re completing forms, seeking prescription refills, or searching for medical information online. This information is valuable to advertisers and tech companies selling ad space because our health greatly influences our behavior. The more these companies know about us, the more they can influence us. Recent reports have revealed instances of Meta tracking patient information from hospital websites, and apps like Drugs.com and WebMD sharing search terms and user data with advertisers. The FTC settled with Flo, a popular period and ovulation app with over 100 million users, after alleging that it disclosed users’ reproductive health information to third-party marketing and analytics services despite promising otherwise in its privacy policies. Flo, like BetterHelp, denied any wrongdoing and clarified that it didn’t share users’ personal details.
When our health information falls into the wrong hands, the consequences can be severe. Advertisers and social media algorithms inferring specific medical conditions or disabilities can lead to exclusion from crucial resources like housing or employment opportunities. Moreover, our sensitive personal data can be exploited for fraud or identity theft, resulting in financial and medical complications, damaged credit ratings, canceled insurance policies, denial of care, and even harassment or discrimination if made public.
Many people believe that their health information is protected under the federal Health Insurance Portability and Accountability Act (HIPAA), which safeguards medical records and personal health information. However, HIPAA applies only to “covered entities” such as health insurance companies, doctors, hospitals, and some associated businesses. Tech companies, social media platforms, advertisers, and most health tools marketed directly to consumers fall outside HIPAA’s reach. As a result, when we download health apps and input data, we have no specific protections beyond whatever the app promises, and consumers have no way to verify whether apps actually follow their stated policies. BetterHelp, for example, displayed HIPAA seals on its website even though, according to the FTC, no official review had confirmed its compliance.
Companies that sell ads often argue that they aggregate information and don’t target individuals. However, these aggregated categories can be quite specific, allowing for hyper-targeted pharmaceutical ads or the dissemination of unscientific “cures” and medical misinformation. Discrimination can also occur based on inferred interests or characteristics. For instance, Meta’s data-collection machinery enabled advertisers to exclude users interested in topics like “service animal” and “accessibility,” a practice that violated the Fair Housing Act.
Although the FTC has shown a growing interest in regulating health privacy through settlements, its actions mostly take the form of consent orders without admissions of wrongdoing. If a company violates a consent order, the FTC can pursue enforcement in federal court, but the agency’s enforcement resources are limited. Privacy and consumer advocates have urged congressional committees to increase funding for the FTC due to rising consumer complaints and reported fraud cases. Meanwhile, the FTC has created tools to help app developers comply with the law, and HHS’s Office for Civil Rights has issued guidance for HIPAA-covered entities on the use of online tracking technologies. These efforts aim to prevent privacy violations before they cause harm.
The Center for Democracy & Technology has proposed its own consumer privacy framework to address the lack of HIPAA obligations for entities holding substantial amounts of mental and physical health data. The framework emphasizes setting appropriate limits on the collection, disclosure, and use of health data, as well as information that can be used to draw conclusions about a person’s well-being.