Ever heard of AI counselors? Unless you have been living under a rock for the last few years, you are probably well aware of the growing mental health problem in the Western world today. The causes are many: lack of purpose, despair over the state of the world, and, playing a bigger role than ever over the last year, a lack of human interaction.
Of course, regardless of the state of lockdowns in your area and the increased suicide rates that accompany them, some groups struggle with suicide more than others.
The Trevor Project recognized that one such group is people who identify as LGBTQ. This group has a disproportionately high rate of suicide; whether that stems from rejection by others, their own confusion, or a combination of factors varies from case to case. The important thing is that the people behind the Trevor Project realized that when these young people have an adult who lets them know they are cared about and treats them as important and worthy of respect, they are 40% less likely to attempt suicide.
While the Trevor Project is well intentioned, it is also woefully understaffed for the task. With 1.8 million people contemplating suicide annually, Trevor has only 600 employees to handle the demand. This has set it on a path to finding new ways to serve those in distress. One of the ways being explored is the use of AI as a counselor.
How can that work? How would someone respond to this? Being put in contact with a machine when what you really want is a person would seem upsetting, and honestly, that intuition is fair. However, there could still be a role for AI in handling some of the basics. Sometimes a person just needs a little encouragement when they call, not a full psychological evaluation. A properly trained AI can help sift through the callers in those first few minutes. If it turns out that the caller has a more pressing issue than even the best trained AI can hope to deal with, it can then put the caller in touch with a person.
Speaking of training those AIs, the Trevor Project is feeding years of collected conversations into its programs in order to teach them how to interact with people. By looking at the flow of the conversations and how certain people respond to various phrases and tones of voice, an AI can be trained to at least handle the more basic issues that might come up.
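To make the triage idea concrete, here is a minimal sketch of the routing step described above. Everything in it is hypothetical: the keyword list, the function name, and the two routing labels are invented for illustration, and a real system like Trevor's would rely on a model trained on counseling transcripts rather than hand-written rules.

```python
# Hypothetical triage step: decide whether an AI can handle a message
# or whether it should be escalated to a human counselor immediately.
# The phrases below are illustrative only, not a real clinical screen.
HIGH_RISK_PHRASES = {"suicide", "hurt myself", "kill myself", "end it all"}

def triage(message: str) -> str:
    """Return 'human' to escalate to a trained counselor, else 'ai'."""
    text = message.lower()
    if any(phrase in text for phrase in HIGH_RISK_PHRASES):
        return "human"  # pressing issue: hand off to a person right away
    return "ai"         # basic encouragement the AI can try to provide

print(triage("I just need someone to talk to"))    # ai
print(triage("I've been thinking about suicide"))  # human
```

The design choice mirrors the paragraph above: the automated layer only handles the basics, and anything that looks pressing is routed to a person within the first few minutes of contact.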
There are also those who might have been putting off opening up precisely because they are afraid to talk to a person. Issues of judgment and anxiety may be at play that would actually make the prospect of talking to an AI more enticing. Sometimes, people just need to vent, and an AI presents an opportunity to do exactly that.
These AI counselors of course have far wider applications. One will naturally think again of the separation caused by people adhering to lockdown orders around the world. Some have literally not seen their loved ones for over a year. That, in addition to the complete disruption of normal life for many, has sent the suicide rate through the roof. We're talking numbers far greater than any hotline can deal with. Or think of Japan. Young people there commit suicide at an alarming rate under the best of conditions, often in response to social pressure to be the best at everything. Given the pressure against being open about how one feels and the country's general acceptance of technology, an AI counselor might actually be preferable for many there.
TARTLE is eager to help in any way to develop these kinds of projects so that more people can be helped. That’s why we are asking you to sign up and share your data and experiences with just these kinds of endeavors. It’s one small way we can contribute to getting people the help they need.
What’s your data worth? Sign up and join the TARTLE Marketplace with this link here.