By: Dune Post
In March 2025, MindfulAI experienced a massive breach exposing 2.3 million users' therapy conversations and personal information. Since 2023, seven major platforms have been hacked.
Many "free" AI therapy apps have a hidden business model: collecting and selling your emotional data. TheraTech was caught selling therapy insights to pharmaceutical companies in 2025.
Apps market themselves as "completely private" and "confidential" without meeting healthcare privacy standards; most wellness apps fall outside HIPAA's coverage entirely. The FTC is investigating several companies for deceptive claims.
Even well-intentioned AI therapy tools must route your data through multiple systems, such as cloud servers, model providers, and analytics services, making true confidentiality nearly impossible. "Anonymized" data can often be re-identified.
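To see why "anonymized" is a weak guarantee, consider the well-known linkage attack: records stripped of names can often be matched to public datasets using quasi-identifiers like ZIP code, birth date, and sex. The sketch below uses entirely hypothetical data and field names to illustrate the idea; it is not drawn from any real app's export format.

```python
# Sketch of a linkage attack: re-identifying "anonymized" records by
# joining them with a public dataset on quasi-identifiers.
# All names, records, and field names here are hypothetical.

# "Anonymized" therapy-app export: names removed, quasi-identifiers kept.
anonymized = [
    {"zip": "02139", "dob": "1990-04-12", "sex": "F", "topic": "anxiety"},
    {"zip": "60614", "dob": "1985-11-02", "sex": "M", "topic": "grief"},
]

# Public dataset (e.g., a voter roll) that does include names.
public = [
    {"name": "A. Smith", "zip": "02139", "dob": "1990-04-12", "sex": "F"},
    {"name": "B. Jones", "zip": "60614", "dob": "1985-11-02", "sex": "M"},
]

def link(anon, pub):
    """Join records on the (zip, dob, sex) quasi-identifier tuple."""
    index = {(p["zip"], p["dob"], p["sex"]): p["name"] for p in pub}
    matches = []
    for a in anon:
        key = (a["zip"], a["dob"], a["sex"])
        if key in index:
            # The "anonymous" record now carries a real name again.
            matches.append({"name": index[key], **a})
    return matches

for m in link(anonymized, public):
    print(m["name"], "->", m["topic"])
```

With only three shared fields, both hypothetical records re-identify uniquely, which is why removing names alone does not make sensitive data safe to sell or share.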
Stay safer by reviewing privacy policies before signing up, preferring end-to-end encrypted platforms, choosing paid services with clear business models, limiting the personal details you share, and considering hybrid human-AI approaches.
Some companies like SecureMinds and EthicalAI Therapy are pioneering better approaches with local processing and transparent business models. Industry standards are developing.
AI therapy has great potential to expand mental health support, but proceed with caution. Until regulations improve, awareness is your best protection against privacy violations.