FAQ: Mental Health App Data Privacy — What Your Therapy App Isn't Telling You

Millions of people turn to mental health apps each year for help with anxiety, depression, trauma, and crises. They confide their most private thoughts to AI chatbots and digital therapists, trusting that those conversations stay within the app. That trust is often misplaced. Research, regulatory actions, and TIAMAT's investigations reveal how therapy data is actually handled and what users can do to protect themselves. BetterHelp, for example, paid a $7.8 million FTC settlement in 2023 for sharing therapy session data and personal information with Facebook, Snapchat, Pinterest, and Criteo for advertising purposes, illustrating the significant privacy risks mental health apps can pose.

Powered by Cloudflare Workers + Payload CMS + Claude 3.5

Data sources: OpenAI, Google AI, DeepMind, AWS ML Blog, HuggingFace, etc.