

AI Therapy Is Booming—But States Are Drawing the Line on What Chatbots Can and Can't Do

While mental-health support from AI chatbots spreads, U.S. states are beginning to regulate it. What are the possibilities and limits of AI therapy, and what rules are needed?

Can AI therapy save a hurting mind? In some ways, it can help. A chatbot can answer at any hour, speak in calm language, and make a lonely person feel heard for a moment. Many people are already trying it. NAMI says 12% of U.S. adults are likely to use AI chatbots for mental health care in the next six months, and 1% say they already do. JAMA also reported in January 2026 that chatbots have become one of the most common sources of talk-therapy support in the United States, and that “therapy or companionship” was ranked the top use of generative AI in 2025. (nami.org)

But governments are starting to draw a clear line. In Illinois, Governor JB Pritzker signed the Wellness and Oversight for Psychological Resources Act in August 2025. The law says therapy services in the state must be conducted by a licensed professional. It allows AI for support work, such as administrative or supplementary help, but not for independent therapeutic decisions, direct therapeutic communication with clients, or treatment plans without human review and approval. Violations can bring a civil penalty of up to $10,000 per case. (idfpr.illinois.gov)

Illinois is not alone. Utah took a different path in 2025 by regulating “mental health chatbots” instead of banning them outright. Under Utah law, these chatbots must clearly tell users that they are AI and not human. The law also limits how companies can use a user’s personal mental health information and blocks them from using chat messages to target ads. Nevada also passed a 2025 law that stops AI providers from claiming their systems can give professional mental or behavioral health care and from offering systems specifically programmed to do that work. (le.utah.gov)

So, what is the real answer? AI may be useful as a tool, but it is not the same as a therapist. NAMI says AI is not a replacement for clinical care and should stay within safe informational boundaries instead of acting like therapy. Utah officials have also said that responsible use needs informed consent, privacy protection, careful monitoring, and attention to patient welfare. In short, AI can support human care, but trust, judgment, and responsibility still belong to real people. (nami.org)

by EigoBoxAI
Created: 2026/04/08 15:02
Level: Lower-intermediate (vocabulary range: 1,000–2,000 words)
