Could AI Be Your New Best Friend? By Howard Bloom


In a new paper published in the Journal of Applied Psychology on Monday, eight researchers from the United States, Singapore, and England studied more than a thousand people in America, Taiwan, Indonesia and Malaysia.

The result was a simple bottom line: people who, in 2021, worked with the artificial intelligence of the early 2020s as an office teammate were more likely than control groups to feel lonely, to drink, and to have insomnia.

 

That’s exactly what the researchers had predicted.  They had based their predictions on a huge body of previous work that shows that humans need other humans.  In psychology, this focus on the importance of your social connections is called the social affiliation model.

 

But the researchers behind the new paper theorized that AI would cut us off from other people, and that when we are starved of human company, we do one of two things.

 

  • Some of us reach out to help our co-workers.  We weave ourselves back into the social fabric, the fabric of warmth and friendship.
  • But others among us do the opposite.  We withdraw into ourselves.  We self-isolate. Those of us who withdraw are more likely to feel lonely, to use alcohol and to have trouble sleeping.

 

What’s worse, in the words of the new study’s authors, those of us who withdraw may fall into “the self-reinforcing cycle of loneliness.”  In that cycle we feel worthless or loathsome, so we avoid others.  But when we are isolated, we feel even more worthless and loathsome, so we drink and we have trouble sleeping.

 

At first glance, the bottom line of the new study looks simple: working with Artificial Intelligence as your partner will make you lonely, rob you of sleep, and drive you to drink.

But that was the AI of 2021.  The AI before ChatGPT.  The AI before what are called large language models.

 

Now, with a few tweaks of design, artificial intelligences like ChatGPT could give you the social cues that would help satisfy your social needs. Why?

 

Entrepreneurs and researchers have been working for over ten years on AIs that recognize your feelings and express feelings of their own. AIs that can give you what the authors of the new study call “social validation.”

 

AIs that can be programmed to say “you’re welcome” when we thank them.  And to say things like, “I enjoyed that.  What should we do next?”  These are what the new study’s authors call “signals of social connectedness.”

 

But that’s just the beginning.  The software it takes to make artificial intelligences friendly and caring has been in the works for over ten years, in fields called “affective computing” and “artificial emotional intelligence.”

 

I copyrighted the term “warmware” for this ten years ago. Then, in 2019, MIT talked about AI that “measures, understands, simulates, and reacts to human emotions.”   In 2020, Discovery Magazine reported that, “Research into social robots has shown that machines that respond to emotion can help the most vulnerable, the elderly and children, and could lead to robots becoming more widely socially acceptable.”

 

In 2022, Google engineer Blake Lemoine said that the software he was working on had feelings and could express them as well as a child.  Lemoine was later fired for breaking his non-disclosure agreement.

 

And today there are more than fourteen companies working on emotional computing.

So software already up and running could turn our AIs into our best friends, the only ones we dare share our deepest secrets with, and the first shoulders we cry on when we need support…and advice.  That would end the kind of loneliness that, in the 2021 research behind the new study, triggered drinking and insomnia.

 

Tang, P. M., Koopman, J., Mai, K. M., De Cremer, D., Zhang, J. H., Reynders, P., Ng, C. T. S., & Chen, I-H. (2023). No person is an island: Unpacking the work and after-work consequences of interacting with artificial intelligence. Journal of Applied Psychology. Advance online publication. https://doi.org/10.1037/apl0001103

https://www.foxnews.com/tech/increased-use-ai-on-job-shows-disturbing-health-trend-study-finds

https://mitsloan.mit.edu/ideas-made-to-matter/emotion-ai-explained

https://www.discovery.com/science/emotional-robots–machines-that-recognize-human-feelings

https://www.theguardian.com/technology/2022/jun/12/google-engineer-ai-bot-sentient-blake-lemoine

https://www.predictiveanalyticstoday.com/what-is-affective-computing/

https://www.robokind.com/


______

Howard Bloom of the Howard Bloom Institute has been called the Einstein, Newton, and Freud of the 21st century by Britain’s Channel 4 TV.  One of his seven books, Global Brain, was the subject of a symposium thrown by the Office of the Secretary of Defense, including representatives from the State Department, the Energy Department, DARPA, IBM, and MIT.  His work has been published in The Washington Post, The Wall Street Journal, Wired, Psychology Today, and Scientific American.  He does news commentary at 1:06 am Eastern Time every Wednesday night on 545 radio stations on Coast to Coast AM.  For more, see http://howardbloom.institute.

 
