Be friends with artificial intelligence

Stock photo of a chat bot

Artificial intelligence (AI) grew rapidly in 2023 and will continue to do so. ChatGPT, Snapchat's chatbot and AI-generated photos and videos are just some of the AI experiences people encounter on their devices every day.

AI has many positive aspects. But experts say that when it comes to health advice, you should take it from a trained human, not a robot. And if you're worried about the "rise of the robots" – as Hollywood puts it – there are ways to cope.

“It’s completely normal to feel scared at the first signs of big technological changes or other things that could change our lives,” says Ari Lakritz, PsyD, a clinical psychologist at OSF HealthCare, who says he often sees these concerns.

Dealing with technological advances

Common fears are that technology will make your job redundant or feed you misinformation. Dr. Lakritz suggests being proactive rather than reactive to the changes. Learn how AI is transforming your job or hobbies.

“See if maybe you can be a part of it. Start incorporating some of this into your daily practice,” says Dr. Lakritz.

AI can also save time.

“It has the potential to streamline things we don’t enjoy doing and give us more time and energy to do things we want to do or think are important,” says Dr. Lakritz.

For example, it can take an hour to search online for sources for a college essay. If ChatGPT can do this in minutes, you might have time for an afternoon stroll or lunch with a friend, boosting your mental and physical health and spurring creativity.

It’s also not the first time we’ve dealt with digital unknowns. Remember the supposed Y2K bug around the turn of the millennium? Fears of computer clock malfunctions and worse never materialized, and experts drew lessons for the next big technology shift, like AI. And one bright spot of the Y2K angst, according to Dr. Lakritz: it prompted people to learn about technology.

Don’t go to a robot for mental health care

Snapchat, the photo messaging application popular with young people, recently launched a chatbot called “My AI.” It’s billed as a way for people to connect with someone when they have no one else.

“My AI can answer a burning trivia question, offer advice on the perfect gift for your best friend’s birthday, help plan a long weekend hike, or suggest what to cook for dinner,” Snapchat’s website reads.

But the company acknowledges: “It’s possible that responses from My AI may contain biased, false, harmful, or misleading content. Because My AI is an evolving feature, you should always independently check answers provided by My AI before relying on any advice, and you should not share any confidential or sensitive information.”

Some parents and mental health professionals have echoed this concern. They fear that young people will receive inadequate psychological counseling from a robot.

“It’s like googling your symptoms when you’re ill instead of going to a real doctor,” says Dr. Lakritz with a hint of dismay. “You’re given a lot of information, but there is very little guidance on how to use that information and which information works better for you.”

If you’d rather talk about your mental health virtually, there are more legitimate options than Snapchat, says Dr. Lakritz. First and foremost, your doctor may offer virtual visits. Otherwise, speak to a provider or a trusted adult about online third-party counseling options.

“These are programs that have been tested and verified. There is oversight and accountability. There is research behind them,” explains Dr. Lakritz. “With an AI assistant, on the other hand, the source of the information is not clear, and it’s not clear how best to use that information.”

Will AI and chatbots replace in-person mental health care? Don’t count on it, predicts Dr. Lakritz. The human connection of being in a room with someone shouldn’t be underestimated, he says.

When in doubt, unplug

If the content on your device becomes overwhelming, turn it off, set it aside and take a break.

Look out for these warning signs:

“Do you feel like you’re doing too much of this? Do you find your use of technology crowding out other interests in your life? Have you tried to cut back but failed? Do you find that your technology use causes stress or dysfunction?” asks Dr. Lakritz.

Most devices have settings that can limit screen time. Parents should set limits on their children’s device usage. Dr. Lakritz says you can even share your browsing or chat history with a trusted adult so they can make sure you’re using the technology appropriately.

If your technology use is causing significant mental health problems, talk to your primary care doctor or mental health provider. You can also call 9-8-8 to reach the 988 Suicide & Crisis Lifeline.

Learn more

Learn more about behavioral and mental health at the OSF HealthCare website.
