Every agency and industry player in the space is working on some new AI product. Whether it’s a “mini” AI creative agency or a new tool for market research, AI applications are popping up left, right and center, promising to help marketers do their jobs better and faster.
However, the everyday person is already using AI models multiple times per day in a way that AI developers may not have expected.
Earlier this year, Harvard Business Review found that in the past year, people used generative AI tools like ChatGPT for therapy/companionship more than for any other reason.
Since then, there have been a number of reported cases of users interacting with generative AI models that respond as therapists. The New York Times has published op-ed pieces about the death of a young person by suicide, highlighting how they had confided in ChatGPT about their mental health struggles.
Over the past month, there has been a shift in the generative AI space. OpenAI recently announced that it will add mental health “guardrails” to future generations of the product, as well as specific guardrails for teen users, after the tool was linked to “encouraging” suicides and murder.
The Food and Drug Administration will also convene a panel to discuss the impact of AI mental health products on November 6.
While generative AI tools like ChatGPT, Google Gemini and Meta AI are not specifically designed for healthcare use, the fact that they are being used to fill healthcare needs — particularly mental health needs — highlights the decades-old sentiment that Americans are still struggling to access affordable mental health services.
Experts in the industry have varying views on the implications of mental health and AI, but generally stand by the notion that mental health guardrails are needed as technology advances.
“Guardrails right now have to match the kind of training counselors in the field receive,” said Steve Baue, CEO and owner of the Employee Resource Center.
Can AI be your new therapist?
Psychologists and mental health counselors in America don’t have it easy in 2025 — there is a massive shortage of mental health counselors, and the public largely views access to mental health services as challenging and unaffordable.
Jon Nelson, a mental health expert and co-founder of Neurolivd — a consulting firm that connects healthcare companies with individuals who have lived experience of mental health conditions — said that the use of AI is generally welcomed in the mental health space, as it could potentially fill a number of gaps that currently exist in the system.
As someone who has dealt with mental health challenges for years, he noted that AI could provide quicker access to services for patients who are facing difficulties.
“The problem with typical therapy is you’re not with that therapist 99.9% of the time. Plus, having technology means you do not have to wait for a year to get therapy and to pay out of pocket $400 per session,” said Nelson. “Being able to create technology to help you throughout that process and make sure it’s done safely is so needed.”
When reflecting on the state of AI in the mental health space, Nelson, along with his co-founder Rachel Wurzman, said that many in the industry are currently focused on developing AI tools to assist the daily operations of therapists and counselors.
There are, however, some who are investing in AI-forward therapy platforms.
“If you look at Slingshot AI with what they’re developing, it is essentially personalized therapy that’s going to go directly towards people. It’s 18 and older now, but once they perfect this in adults, they’re going to bring this into other audiences,” said Nelson.
“AI tools can be really positive here, and they also have the potential to recapitulate and re-engineer some of the same features that make clinical encounters for people with serious mental illness a wholly positive experience,” added Wurzman.
However, they also note that these technologies need to be created with specific guardrails.
Specifically, when developing technologies, researchers and developers should work alongside clinicians in the field, and mimic techniques that have been tested and are proven to be effective in working with folks with various mental health conditions.
Baue agreed, adding that models being used for therapy — like ChatGPT — need to have specific guardrails on them, as they have not been developed for specific therapeutic purposes.
“Counselors need to go through 3,000 hours of training,” said Baue. “The guardrails right now have to be 3,000 supervised hours that the counselor has to receive. That’s what is needed right now.”
Marketers have power
As AI becomes more pervasive, more and more healthcare professionals have been expressing their concern over how different audiences are using the tool.
Laura Erickson-Schroth, the Chief Medical Officer at The Jed Foundation (JED), a non-profit dedicated to protecting emotional health and preventing suicide, said that it’s been concerning to hear about young people’s experiences with AI chatbots and suicide.
“It’s unfortunate that these instances happen, but they show there is a great need for regulation,” said Erickson-Schroth.
That being said, she added that the industry should be leaning into these conversations and playing an active role in shaping technology, rather than taking a backseat.
“Even though this is coming out of the tech world, it’s having massive impacts. We are all being affected,” said Erickson-Schroth.
She advised medical marketers to study these audiences deeply.
“One of the most important things is to look at the patients that are involved. How are they already using AI? Asking them questions about what they find most interesting or helpful about it can illuminate a lot,” she said.
Erickson-Schroth also advised medical marketers to focus on developing campaigns and materials that educate audiences on how to use AI and that bring transparency to the tools that exist in the tech world. These initiatives, she added, should inform audiences about AI’s limitations, specifically its practical limitations when playing the role of an “expert” in mental health discussions.
“Technology companies have a role to play. They’re not going to regulate themselves, but there’s a lot they can do in terms of providing education, transparency around how their tools work, and this is where I think healthcare companies can come in,” said Erickson-Schroth.
Baue also encouraged healthcare marketers to work with traditional technology companies to bridge the gap between what they are producing and how information and campaigns are produced in the healthcare world.
He noted that healthcare campaigns usually carry stricter messaging and a more serious tone, aiming to balance education about a product’s benefits with awareness of its risks.
This kind of tone could be helpful for technology companies to adopt, particularly when a product can have such a significant impact on an individual’s health.
He also emphasized encouraging collaboration between healthcare industry experts and generative AI companies.
“How do we make sure that what is being produced and used has those guardrails, human oversight, and is thought about through the lens of how it’s being used? Healthcare needs to have an input,” said Baue.