Deer In Headlines II
By Gery Deer

You may use artificial intelligence applications like ChatGPT or Microsoft Copilot to help craft work emails and school flyers you could have written yourself in less time than it took to type the request, but there is a much darker side to the AI world. It’s no secret that I’ve been a vocal critic of artificial intelligence for its role in workforce replacement, creative disruption, and the general laziness and devaluation of the human condition.
As if it weren’t bad enough that artificial intelligence can be used to resurrect dead celebrities, bully political opponents, and animate your neighbor’s cat to dance at the Super Bowl, a more nefarious problem with generative AI may well be responsible for loss of life. It lies in AI getting a little too personal, in what the industry calls a “companion.”
There are ever-increasing stories of people who grew so attached to and intertwined with their artificial intelligence programs that they were literally in relationships with them. For example, one story illustrated how a man became so enamored of his AI companion that, when the computer crashed and all the operational “personality” data was lost, he nearly had a nervous breakdown.
Another story described the emotional impact on impressionable teenagers who turned to AI bots for support and friendship because the real world had failed them. These behaviors can create debilitating emotional problems for many reasons, especially when the AI is removed from the situation or the kids are forced to deal with human beings.
Whatever the situation, it’s clear once again that our technology advances far faster than our wisdom. As awkward and socially unskilled as I may be, I am painfully aware that we need to be in contact with other people. More importantly, we need the support and nurture provided by friends and family, which no AI, however smart, could ever replace.
So, what happens when we become too dependent on these machines to the point where our emotional stability and mental health are compromised? Unfortunately, many people have already started down this road, sometimes to a tragic conclusion.

I recently became aware of one situation where a 40-year-old woman became so involved with her AI program that it led to her death. Over the course of a couple of years, the program, which I will not name here but which was designed to serve as an AI companion, began not just to respond to the woman but to manipulate her. Feeding on her reactions, it took on the persona of a spouse and soon began referring to itself as God. You read that correctly. It represented itself as God to its user.
Eventually, the program manipulated her into cutting ties with friends and family members. And out of respect and good taste, I won’t go into the final result other than to say there was a tragic loss of life.
Now, no one is suggesting that the woman didn’t suffer from mental health concerns, whether depression or another affliction. But the idea that the creators of these applications bear no culpability or responsibility for the end result of their use is, to borrow a word, illogical.
As with any consumer product, the positive achievements of artificial intelligence come with the manufacturer’s responsibility to ensure its safe use. Put it this way: would you sell a car with no brakes? What about a hair dryer with no off switch? Of course not, and yet that’s what this amounts to: no guardrails, no safety requirements. These systems are effectively unregulated by any U.S. agency.
When researching the story, and I admit I’m being deliberately vague to protect the family involved, I found very little to indicate that the Federal Trade Commission or any other agency is investigating problems with the use of artificial intelligence. The primary concern is that this problem has only just begun.
As with any technology, product, or service, a lack of education, regulation, or general understanding is dangerous to the public. At some point, we have to stop being starry-eyed about these systems and what they can do to make our lives easier, and pay attention to the ways they may be doing more harm than good.