Are Mental Health Apps Spying on You? A Deep Dive into Data Collection

An estimated 20% of Americans live with a mental health issue or condition, yet only about 60% of them have adequate access to the care and treatment they need. To help people with mental illness, numerous mental health apps have emerged in recent years and flooded the market, offering features that range from stress management to mood tracking. These apps have become hugely popular, with a reported market value of $5.2 billion in 2022. However, several of them have been found to raise very serious privacy concerns.

Recent research has revealed that many mental health applications betray users' trust and are likely to exploit their privacy, compromising deeply personal and sensitive data. Popular therapy apps such as Youper, Talkspace, and BetterHelp are among the biggest data-harvesting offenders, with privacy policies that are bare-bones, vague, or outright deceptive.

In 2024, the privacy policies of several well-known mental therapy applications were examined in depth, and the results show how these apps put users' sensitive data at risk. Below, we discuss the relevant terminology around this issue and offer practical tips for users to safeguard their personal information.

Privacy Concerns & Pitfalls of Mental Health Apps 

According to the research, more than 80% of the tested mental therapy applications collect users' personal data, including symptoms of their mental health conditions. Worse still, several of these apps pass that sensitive information on to third parties without users' knowledge or permission.

Therapy apps are often recommended by health and medical websites, and their standing is frequently judged by download counts in app stores. Despite this popularity, their privacy policies in 2024 leave much to be desired: the apps collect users' data extensively, and that data can be misused in many ways.

Mass data collection and distribution is possible because these mental health applications are not licensed, authorized medical platforms. The companies that operate them therefore fall outside the scope of HIPAA (the Health Insurance Portability and Accountability Act), and no US federal law regulates how the data collected by mental health applications may be used. As a result, the owners of these apps are free to use the data at their discretion. Even though the FDA has the right to review and approve such platforms, that approval does not certify data security.

Which Data Do Mental Health Applications Usually Collect?

Research found that Talkspace, BetterHelp, Cerebral, Youper, and several other mental health apps actively monitor and store users' details, including mood-tracker entries, journal entries, medical symptoms, medications, and even complete chat transcripts. Collecting this information without adequate protection or consent can lead to malicious or unintentional misuse.

How intrusive the collection is depends largely on the app in question. Meditation apps and mood trackers may gather less data than platforms such as Talkspace and BetterHelp, which require users to complete a detailed questionnaire at sign-up. These apps often ask for personal details such as address, date of birth, medical history, and sexual orientation, forcing users to hand over a great deal of sensitive information.

Many mental health applications are also found to collect metadata: information that describes how users interact with the platform, such as login times, session lengths, and the features they use most often. On its own this may seem harmless, but when paired with the data users share directly, metadata lets app companies quickly build complete profiles of their users.
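To make this concrete, here is a purely hypothetical sketch in Python of the kind of session metadata an app might log and how it could be combined with self-reported information. The field names and values are illustrative assumptions, not taken from any specific app's data model.

```python
# Hypothetical example: field names and values are illustrative assumptions,
# not drawn from any real mental health app.

# Metadata the app can record automatically, without the user typing anything.
session_metadata = {
    "user_id": "u-1042",
    "login_time": "2024-03-18T23:47:00Z",   # late-night usage pattern
    "session_length_minutes": 42,
    "most_used_feature": "anxiety journal",
}

# Data the user shares directly during sign-up or in-app activity.
self_reported = {
    "user_id": "u-1042",
    "date_of_birth": "1991-06-02",
    "reported_symptoms": ["anxiety", "insomnia"],
}

# Joining the two records on user_id yields a far more detailed profile
# than either source alone.
profile = {**self_reported, **session_metadata}
print(profile)
```

The point of the sketch is simply that behavioural logs and self-reported details become far more revealing once they are linked to the same identifier.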

Potential Risks of Data Collection for Mental Health App Users

Because mental health applications collect large amounts of users' personal data, their users are exposed to several privacy risks, including:

  • Data breaches – Using personal information harvested from mental health apps, hackers and other malicious actors can assemble more complete profiles of users and carry out elaborate identity scams.
  • Advertising and third-party sharing – Several mental health apps share users' sensitive data with advertisers and partners to generate revenue and target users with customized ads.
  • Data inaccuracy – Mental health applications analyze all the data available to them, but misrepresented or incorrect data can lead to inaccurate interventions or recommendations.
  • Long-term storage – Many apps do not disclose how long they retain users' data or how securely it is stored, which increases the chance of personal data being compromised in a breach or leak.
  • Emotional manipulation – Detailed insight into users' mental state puts them at risk of emotional manipulation, for example through political campaigns, advertising, or other targeted messaging.
  • Integration with other services – Some mental health applications link a user's account with smart home devices and other connected devices, making it easier for others to build detailed profiles of users and exploit them.
  • Stigma in users' personal lives – Many mental health issues carry social prejudice, so leaked data can lead to misuse by employers or insurers, stigmatization, and discrimination.

Conclusion 

Without HIPAA coverage or federal regulation, mental health applications can use your data largely as they please. Around 80% of mental health apps fail to meet industry standards for privacy protection: they collect sensitive, identifiable details about users, retain them even after users delete their accounts, and share users' information with third parties.

For users of mental health applications, staying vigilant and informed is undoubtedly the first line of defence. As more people embrace digital mental health solutions, they must ensure that caring for their well-being does not put their real-world security at risk.
