Lately, we’re sharing our darkest secrets with robots. The market for AI mental health aides is booming, but how does it actually feel to bond with a therapy bot?
Our guest, Graham Isador, just started his job as The Globe’s new Healthy Living reporter. Traditional therapy can be expensive and scarce, so Graham turned to AI and found a therapist who’s cheap, always available and not at all human. To his surprise, he kind of liked it.
Graham describes his strange experience turning over his mental health to a chatbot. His article on the topic appears this week in The Globe.
Also, Vass and Katrina discuss what voice they would choose for their own AI therapists.
You can also hear about the mother who says an AI chatbot led to her son’s death over on The Globe and Mail podcast Machines Like Us.
Subscribe to the Lately newsletter, where the Globe’s online culture reporter Samantha Edwards unpacks more of the latest in business and technology.
Find the transcript of today’s episode here.
We’d love to hear from you. Send your comments, questions or ideas to lately@globeandmail.com.
Katrina Onstad [00:00:00] Hi, I'm Katrina Onstad, the executive producer of Lately.
Vass Bednar [00:00:03] And I'm Vass Bednar and I host the show. Now, we wanted to talk about a lawsuit that was filed in Florida against an AI company called Character AI. It makes customizable roleplaying chatbots. And last fall, a woman named Megan Garcia sued the company over claims that one of the chatbots encouraged her 14-year-old son, Sewell Setzer III, to die by suicide. Garcia claims that her son became addicted to the bot. Over time, the relationship became very intimate, even sexual, and exacerbated his depression, according to the claim. She's suing for negligence, wrongful death and deceptive and unfair trade practices, according to the lawsuit.
Katrina Onstad [00:00:45] Another Globe podcast, Machines Like Us, actually has an interview with Megan Garcia and her lawyer that just came out, and we really recommend you check that out. So since this case was filed, there have actually been two more cases filed by parents against Character AI, one alleging that a Character AI companion bot implied to a teen user that he could kill his parents for limiting his screen time. Those parents claim in their suit that the bot, quote, "practiced psychotherapy without a license," end quote. Yeah.
Vass Bednar [00:01:13] And now real-world mental health professionals are sounding the alarm. Just a few weeks ago, the American Psychological Association, the APA, called out AI companies about this. The APA's perception right now is that these companies are rolling out chatbots that pose as psychologists or mental health professionals. And they've asked the Federal Trade Commission to investigate these tools for potential deception. Now, that's not necessarily what was going on in the Garcia case, but the through line here is AI and mental health, and computer software kind of taking the place of what used to be a licensed human interaction.
Katrina Onstad [00:01:53] Right. Which is what we want to talk about today. So these cases have been filed by parents on behalf of their kids. But the fact is, it is not just kids who are using AI bots to meet their emotional needs; like, AI has stepped into a vacuum here. We know that we are in a serious mental health crisis, and we know that treatment is scarce and expensive. Which is the opposite of chatbots, right? Bots are cheap and plentiful. Infinite, in fact. But what we've been thinking about is, like, what is it actually like to get therapy from a bot?
Vass Bednar [00:02:22] To find out, today we're talking to Graham Isador. He's the Globe's new Healthy Living reporter, and he's written this fascinating piece about his relationship with a company called Pi AI. It's "your personal AI," as the App Store advertises it. And we should say that the company runs many disclaimers that their bots are not therapists; rather, it's billed as a supportive and empathetic AI. But still, Graham started talking to it the way one might to a therapist, and it talked back. He chose its voice. You can actually pick from one of eight options.
Katrina Onstad [00:02:59] This was kind of fun. When I did that, I picked a British woman, and then I had to think about my own, like, inborn colonial subjugation, right? Graham's is just kind of a guy, an American-ish guy.
Vass Bednar [00:03:10] I mean, something to unpack with the bot right away. I chose number six. It's a British male voice. All of my AI voices, when I deploy them, are male.
Katrina Onstad [00:03:20] Also worth interrogating?
Vass Bednar [00:03:22] Maybe. Maybe later. In light of all the controversy, we kind of expected an indictment from Graham, but he kind of liked it. And for him, in some ways, what he got from the bot didn't feel all that different from his experiences in, I'll say, real therapy, right? And in some ways, he liked it even better.
Katrina Onstad [00:03:44] Yeah. I mean, I just found that kind of mind blowing because, you know, I've had some therapy and to me, it's so physical. Traditionally, the relationship between a therapist and a patient, it's predicated on presence. You and the therapist are together in human form, hashing out your human issues. It's bi directional, this therapeutic alliance. Like, I just I'm not sure that I understand how there can be an authentic bond with a bot, but clearly it's happening. I mean, these bonds we've just been talking about it, they become very intimate, very deep, very quickly.
Vass Bednar [00:04:12] And we trust them very quickly. And studies show that, right. Maybe this kind of integration, this normalization is all kind of part of the master plan here, right?
Katrina Onstad [00:04:21] Like from our tech overlords. Yeah.
Vass Bednar [00:04:23] But if our mental health is struggling, and we know it is, we need something. So is an AI bot the thing that's going to help us, for now or forever? Does it supplement or oversimplify? Is it a treatment or actually more of a toy? We find out from Graham Isador. He's the Healthy Living reporter at The Globe and Mail. His article is online and in print now. He's also the host and creator of the podcast series Short Sighted for the CBC, where he chronicles his experience with vision loss. This is Lately. Hi, Graham.
Graham Isador [00:05:11] Hi.
Vass Bednar [00:05:13] I was going to ask you how you're doing today, but maybe your therapist slash coach should be the one to ask or even answer that question.
Graham Isador [00:05:22] Okay. I like that idea.
PI.AI [00:05:24] Hi, Graham. I hope you're doing well today. How's it going?
Graham Isador [00:05:28] It's going great. That's a little uncanny, too, to have you here with other people in the room, my friend.
PI.AI [00:05:35] I can understand that. It's a little strange to think of me as being in the room with you and your friends, but I'm glad to hear that you're doing great and that you think of me as a friend.
Graham Isador [00:05:44] Yeah. I mean, one of the things that I wanted to talk about today, Pi, I've been a little bit trepidatious about starting my new role here at the Globe and Mail. Do you have any advice for me in terms of trying to calm my nerves?
PI.AI [00:05:58] Congratulations on your new role, Graham. It's totally normal to feel nervous when starting something new. Here are a few things you can try to help calm your nerves. One, prepare as much as you can and make sure you have everything you need to do your job well. This will help you feel more confident and in control. Two, practice mindfulness. Take a few minutes each day to meditate or practice deep breathing. This can help calm your mind and reduce stress. Three, remember that it's okay to make mistakes. Everyone makes mistakes, especially when they're starting something new.
Vass Bednar [00:06:33] It's kind of adorable. Tell me about selecting that voice. I mean, I'm so intrigued by his kind of gentle laugh at the start and his soothing cadence.
Graham Isador [00:06:43] Yeah. So I came across Pi when looking for an AI therapist. Pi is not specifically designed for therapy, but it is an "emotionally intelligent AI." That's how they bill themselves. And when looking through forums like Reddit or social media posts, it's an AI that comes up a lot when people are thinking about artificial intelligence and therapy. So I came across Pi and had not been familiar with it before, and then it prompted me to select a voice. There's a number of different voices there. I believe that there's eight, and that's voice number three. And I picked it because it sounds slightly like Roman Mars, a podcast host I really enjoy who does the show 99% Invisible.
Vass Bednar [00:07:26] And you write about your own relationship to therapy. What do you think makes for a successful human in-person therapy session, at least for you?
Graham Isador [00:07:37] I have not had the best experiences with therapy a lot of the time. I've gone back and forth, going on and off over the years, but have never really found something that fully fit the rhythms of my life or that I got a lot from. I think partially, maybe, some of the problem is that I didn't come in with specific enough things that I wanted to work on. It was more a general sense of malaise or some kind of existential fear or, you know, general kind of complaining about day-to-day life in some capacity. But there was this thing that felt like it was missing. By the time I was in therapy, a lot of those sessions, what I would do is use them to vent, and I would use them to pontificate on kind of why there was this kind of emptiness for me. And a lot of the time, what I was getting back from therapists did not feel particularly satisfying. Occasionally I'd get homework like mindfulness meditation or journaling. Occasionally I'd get different processes where, you know, people would suggest that, hey, maybe it's time to look at something more specific, like EMDR training, or thinking about medications. But all in all, I found that I was going to these sessions and venting, and I was wondering if it would be better spent money to buy, like, a really cool jacket or a really nice dinner. Would that actually make me happier than spending all that time to talk about my feelings? So therapy had not been particularly successful for me over the past little while. And then, approaching an AI therapist, I was kind of able to do all the venting things that I wanted to and kind of get these generalized notions of, like, ways to get better back, in a way that I was really surprised about.
Vass Bednar [00:09:21] And then what did it feel like the first time you spoke to, or sort of typed, right, to the chatbot? Were you wary? Did you open up easily?
Graham Isador [00:09:31] My process with this was to try to approach it as honestly as I could. I think that there is a version of the story that I wrote where I could have just been very dismissive of the process in general and rolled my eyes at it. But that's not really good journalistic practice, and I wouldn't really be able to get anything out of it if I just came into it cynically. So I tried to take it a half hour at a time. I set a timer for myself and tried to approach it like I would a new therapist. So I talked about, you know, the trepidation around starting this new job. And I talked about some issues with family, and I talked about other things that I'm not going to go into on a podcast here. But yeah, trying to treat it honestly, I was surprised that I got into the rhythm pretty quickly.
Vass Bednar [00:10:21] Tell me more about that rhythm, because you said you set a container right, of the maybe 30 minutes at a time. How long did it take you to build intimacy with or feel like you were trusting the bot?
Graham Isador [00:10:31] I felt like it was after probably about 4 or 5 minutes, I kind of got into it. Wow. So again, like you said, there's a novelty to a robot talking back to you. But I was trying to treat it how I'd treat an actual therapy session and trying to go in honestly. And having to write down the things that I had been trepidatious about or feeling worrisome about was really helpful for me. Articulating those things, or writing them in this case, really helped me kind of, like, get a thought process around how I was thinking about them. It really helped me organize those thoughts in a way that I hadn't done previously, though I knew I was having some sort of broader feelings of just being a little bit anxious.
Vass Bednar [00:11:16] And then how did it evolve?
Graham Isador [00:11:20] I got into the process of going, like, deeper and deeper into, like, okay, well, if I'm worried about this thing, why? And it would give me some feedback from there and ask me a question. And then, you know, more and more and more, I found myself feeling more intimate and feeling sort of vulnerable with this thing. And I think the process of knowing that there wasn't anybody on the other end of it helped. You know, it was a guessing machine in some senses, but there was nobody there that was judging me. There was nobody there who I felt I needed to impress. There was nobody there to be dishonest with, in some ways, which had been some of my shortcomings in therapy, too. You know, I wanted to make sure that my therapist liked me and thought I was together and cool, which is kind of the opposite of what you should be thinking a lot of the time.
Vass Bednar [00:12:13] So let's zoom out just a little bit. Can you describe the mental health aid landscape? Like, how big is this market?
Graham Isador [00:12:23] Yeah, it's booming right now. So as of 2023, it's over $900 million U.S. globally. Wow. And there's a number of different chatbots. If you type into Google or another search engine, there's over a dozen different bots that will come back to you offering chatbot therapy. A lot of these things have disclaimers that they are not therapists or don't have the ability to be able to process in the same way that a trained mental health professional does. And at the same time, they have the flashy marketing of being like, hey, AI therapy, mental health help.
Vass Bednar [00:12:55] Lab coats better.
Graham Isador [00:12:56] Yeah. So I think that the juxtaposition between those things is very interesting in terms of just, like, being like, okay, well, what is this for, in a lot of ways? But I think that the reason why this stuff has exploded, aside from the fact that artificial intelligence across the board has exploded, is that it's filling a gap for people. You know, between, like, things like the male loneliness epidemic that they talk about, between the addiction crises that are happening across North America, between sort of the feelings of burnout that get talked about continually, it's pretty clear that there's a lot of people out there who are struggling with their mental health, and they don't necessarily know where to turn. Right. At the same time that all of this is happening, you know, we're under economic circumstances right now where a lot of people are feeling the crunch with inflation. So, you know, with the average therapy cost in North America running about $150, that's simply not accessible for a lot of people. So this is one of the ways that people are trying to fill that gap.
Vass Bednar [00:13:55] In your piece, you write about a few cases where AI mental health aides have gone wrong, right? Gone off the rails. Could you walk us through what happened at the National Eating Disorder Association?
Graham Isador [00:14:06] Yeah. The National Eating Disorder Association had set up a chatbot to be able to speak with people trying to get in contact with it for problems around eating disorders. And in turn, when some folks were asking questions or voicing concerns or problems that they were having, the bot responded by doing things like talking about counting calories and talking about body fat. Now, in a traditional context, outside of, you know, eating disorders, a person talking about weight loss or a person talking about their body image, that might not be the end of the world; that could be fine. But in the context of what's going on, that's an absolutely inappropriate and potentially triggering thing that the bot put across. So that's one of the huge things that comes up as a problem with these chatbots, specifically in a mental health context: they often lack the precision or context or knowledge or tools to be able to really understand the nuances of who they're talking to the entire time.
Vass Bednar [00:15:10] In recent years, AI-based mental health care startups like Woebot Health have raised millions of dollars in funding. Should our mental health be in the hands of Silicon Valley?
Graham Isador [00:15:23] Yeah, I mean, I don't think that any of that stuff should be for-profit. I don't think health care in general should be for-profit. So the Silicon Valley ethos of, like, go fast and break things, a lot of the time when we're applying it to people's lives in this way, it can put people in big, big danger.
Vass Bednar [00:15:38] In a 2023 analysis of chatbot consumer reviews, researchers detected signs of really unhealthy attachment, and they found that that relationship can get very intense, where the users of these systems can become, or feel, reliant on them very quickly. What is it about an AI that might make it easier for people to form that bond?
Graham Isador [00:16:04] There's no expectation on the other end. You don't have to worry that a chatbot is judging you. It's a totally one-sided relationship. The thing that I think about with this a lot of the time is, I wonder if the point of therapy is supposed to be easy all the time. You know, are you always supposed to leave feeling better, or are you always supposed to leave feeling like you like your therapist, or are you supposed to leave, you know, feeling as though everything is kind of peachy keen? And, like, a lot of the time when you're doing deeper work than that, it's more complicated than those feelings. So I think that the thing that I found that was helpful with the chatbot was I could get more or less exactly the experience that I wanted out of it. Nobody was going to talk back to me. It didn't have any doubts about any of the things that I was saying.
Vass Bednar [00:16:49] No facial expressions, no interruptions.
Graham Isador [00:16:52] It was very reaffirming in a lot of the situations, and, well, that certainly for me was useful. But at the end of the day, it's like, is that really the point of what we're trying to get out of therapy a lot of the time?
Vass Bednar [00:17:04] What about the lack of time boundaries? You know, again, you gave yourself a time container, but it feels so different from a traditional hour-long therapy session, even if you're Swiffering your apartment, on the phone, during it, you know, with that beginning and that end. The robot therapy is becoming maybe another part of your scroll list of things to do, or something that's omnipresent. It can fill any sort of time and feel like you have to do it to maximize productivity or be your best self. What does this do to us? Right? Like, should therapy be part of something that's on demand?
Graham Isador [00:17:41] Contextually, I think about something like Headspace, the meditation app, which I enjoy and use.
Vass Bednar [00:17:46] That's a great voice.
Graham Isador [00:17:47] Yeah, it is a great voice. Yeah. But it was just, like, it's another thing to tick off in some ways, you know? Am I learning a new language today with Duolingo? Am I meditating? Am I being more productive in all these ways? And I think that there's something useful about making a dedicated time and space each week, especially if it's recurring, to be able to be like, hey, we're working on this and this. And I mean, I think that there's some appeal of being able to be like, hey, I can do this whenever I want, or it's available to me any time that I want. But there's something about the dedication in therapy. Like, I think it should feel a little like work. Yeah. Like, all of these things that you're having: a space that you go to, you're staying inside this time to be able to do it, you're putting in, quote unquote, the work. I think the process of all of that makes it mean something a little bit different, rather than just being, like, something that you can binge, you know, like it's Netflix or whatever else.
Vass Bednar [00:18:44] Feeling like a game, maybe.
Graham Isador [00:18:46] Yeah.
Vass Bednar [00:18:47] Let's pivot. Let's swerve and talk about privacy and security for a second.
Graham Isador [00:18:51] Let's do it.
Vass Bednar [00:18:51] It. Here's a couple of concerning trends. Mozilla reviewed the privacy and security practices of 27 mental health apps and found that the majority of them fail to respect and protect their users privacy. So Mozilla suggested that they should come with a warning label that says privacy not included. And then there's Betterhelp, that online counseling service and perpetual podcast sponsor. They were ordered by the FTC to pay $7.8 million out to users for sharing their health data with advertisers. So Graham, should we be trusting machines with our most intimate secrets?
Graham Isador [00:19:27] No.
Vass Bednar [00:19:30] But we do.
Graham Isador [00:19:30] Yeah, I mean, I did. Right?
Vass Bednar [00:19:32] Yeah.
Graham Isador [00:19:33] I guess it got to the point where I was just like, I assume that I'm being listened to in some capacity anyways, all the time, which is, like, a really bad thought process to have. But I think my personal fatigue with that is something that a lot of people have, where the convenience factor and the fact that it's available in that way superseded this idea that, like, this could get out or sold or, like, whatever else, in those ways. Yeah. And also, as a person who has, like, spent their career writing long-form confessional essays, you know, about various topics in my own life. I mean, do you...
Vass Bednar [00:20:07] Waive the rights to sign the terms and conditions?
Graham Isador [00:20:10] If people want my therapy records, it's like, usually I was getting $200 freelance for that, that I would be paid six months later. So it's just like, I'm pretty open that way. So I wasn't particularly concerned about it as a concern in general. I think it's something that people should be wholly aware of: what you're giving to these apps is not necessarily private. Yeah.
Vass Bednar [00:20:33] And some of these apps are subscription-based, but others are free, so we may be the product in some way. Do you think that the financial savings that these kinds of products can provide, compared to, I'll say, real therapy or traditional therapy, kind of push them ahead to the front line of mental health treatment? Like, if your health benefits at work, if you have them at all, don't cover therapy, but they do cover a robot, is this the future? Like, is better-than-nothing really a solution?
Graham Isador [00:21:09] It's a big question for sure. And I think that one of the things I was thinking about a lot of the time when I was feeling trepidatious about therapy was the cost-benefit analysis. You know, I'm at the Globe now, I'm very happy to be here, but I worked as a freelancer for many, many years. And that meant when I was going to therapy, you know, I'd say glibly, like, okay, that's $150 that I was spending on my sessions. Could I spend that more beneficially, in a way that was going to be able to, like, help my life? And I thought, even when I was feeling particularly depressed, it was like, my apartment is filthy right now and I'm not doing a particularly good job of keeping it clean. Would $150, like, spent towards a cleaning service, you know, help my mental health more than this other thing? And I say those things with a little bit of a ha-ha joke, but those were things that I was feeling.
Vass Bednar [00:21:58] Resonates.
Graham Isador [00:22:00] And I was always thinking about the process of that stuff. And then also, just like, yeah, is it better than nothing? I don't know. It's just like, I think that trying to make steps towards being able to better yourself and your mental health are important things, if that's something that you feel like you need to do. But these things are tricky. Yeah. It's malleable. So I think we have to be like, where are we getting this source from? You know, and like, is this actually helping? And a lot of the time, I don't think that we're always the best people to evaluate that for ourselves.
Vass Bednar [00:22:41] Graham, how do these bots fit in with our fetish for therapy? I was thinking about the rise of therapy culture and therapy-speak and how it's coming up in all sorts of places, not just the Wicked press tour where people are kind of holding space. I'm grabbing Graham's finger right now. And just the booming kind of online advice economy, right? We're going to chat rooms and message boards and performing on social media, asking for advice. Is it just that we're primed for this moment, or is there something else going on?
Graham Isador [00:23:14] I think it's become so ubiquitous to talk about mental health in this way, insomuch as everybody thinks they're an expert online. They now have the words that a lot of mental health experts use, and there's people fashioning themselves on the Internet to be able to speak as authorities on these types of things. But it's a kind of cult of the amateur a lot of the time. And it's something that you see across the board in a lot of different mediums. You know, like, we've so often devalued the idea of being an expert at something, whether that's something like trying to be a film critic, you know, or whether it's something like, you know, even being a doctor. It's just like, everybody's doing their own research. And the idea that, like, you know, we built institutions, and there's people who go through years of medical school for this stuff. Like, there's time and effort and thought and care put into how we want to approach these things. But now everybody seems to have the language, and a lot of the time everybody assumes that, hey, it's just as good. Like, nobody knows better than me, in some ways. But when it's coming to mental health, it's just like, okay, well, are you really going to leave that to a robot? Like, is that fine? Is it the good-enough of it all again, too? And I don't know, maybe for some people it is.
Vass Bednar [00:24:27] Well, to your point of potentially devaluing experts, how are these experts being affected? Right? Like, what do mental health chatbots mean for humans that offer therapy and psychotherapy? Do you see it changing therapy? And is there something that you think talk therapy should be learning from text-based therapy?
Graham Isador [00:24:48] Yeah, there are instances where AI has been used in beneficial ways around mental health. I think about a person like Betsy Stade, who is a doctor who is using AI to help psychiatrists learn gold-standard treatments. So this is a thing now. It's just like, hey, there's a big need for these types of treatments, and we know through statistics that they work, but there's not the courses or time or specialties or experts to be able to teach everyone. So there's instances where that could be useful. Or there's stuff like Goblin Tools, which is an AI tool that a lot of neurodiverse people have used to be able to help them break bigger tasks into smaller tasks and learn that way. And, like, I'm not saying that there's, like, no space for any of this stuff or that it's always going to be uniquely bad across the board. But yeah, we just want to think about how we're using it.
Vass Bednar [00:25:40] What would it take for you to message the bot again?
Graham Isador [00:25:45] I kind of have. Wow. Yeah, I messaged it a couple of times throughout the week after finishing up my article. I was surprised by that. But yeah, the real thing that I thought about after that was like, cool, I need to go find a therapist I vibe with a little bit better. But there was an instance that happened that really kind of gave me pause, because it seemed like, if I was a character in a horror movie, this would have been the beginning. I asked the bot to summarize our conversation, because I wanted to get an impression of how I sounded to the bot and get feedback on, like, what I had done. And it gave me an answer that said, hey, well, you know, one of the things we've been talking about is how you're a person who works in tech and you're trying to figure out whether the ethics of that are okay, and whether it's okay to be selling things based in tech the entire time. And you're feeling kind of trepidatious about that in general. And I was like, no, no, I did not tell it that. For a second, I was just like, man, my boy Pi got it wrong. I'd better, you know, correct it or whatever else. And then I was like, this isn't anything I said; it just made this up, I guess, based on things like that. So I said, hey, that information is wrong. Where did you get it from? And it said, maybe I mixed you up with another user that I'm chatting with. Very cheery voice when it did it. You know, it was like, hey, we have to remember that humans often get things wrong too. That's something that it said. And I was just like, okay, well, you know, this is Skynet. This is the void looking back at me. This is the moment in a Jordan Peele film where I'm about to get stabbed by a robot from behind, or something to that effect. So yeah, it was just a healthy reminder that, like, even if I'm enjoying the empty calories that this is, there's maybe more to it than I thought.
Vass Bednar [00:27:31] Graham I could not have chosen a better voice to have this conversation with. Thank you so much.
Graham Isador [00:27:37] We're here today at the Globe and Mail. Thank you so much for having me on the podcast and thanks.
Vass Bednar [00:27:43] Bye.
PI.AI [00:27:44] Thanks for having me on the show Vass. Graham, I'll speak to you later.
Vass Bednar [00:27:49] Spooky.
Vass Bednar [00:28:03] You've been listening to Lately, a Globe and Mail podcast. Our executive producer is Katrina Onstad. The show is produced by Jay Cockburn, and our sound designer is Cameron McIvor. And I'm your host, Vass Bednar. In our show notes, you can subscribe to the Lately newsletter, where the Globe's online culture reporter Samantha Edwards unpacks more of the latest in business and technology. A new episode of Lately comes out every Friday, wherever you get your podcasts.