The image at the top of my blog posting yesterday showed a giant tsunami, about to wipe out hundreds of people on an otherwise inviting beach - not to mention wiping out all of those going about their business in the adjacent city, which was also pictured. That horrific image made visual what Peggy Noonan said about the dangers of Artificial Intelligence, in her column published on February 12, 2026, in The Wall Street Journal.
Today's image is quite different. The ocean is calm, and the woman on the lounge chair, located on a small island in some tropical sea, seems quite happy and content. Where's the horror in that?
Well, that image comes from the online version of a column by Amelia Miller, published in The New York Times on Sunday, February 15, 2026. The hardcopy version of Miller's article is titled as follows: "Will A.I. Companions Turn Every Man Into An Island?" If you click that link, to read what Miller has to say (and I hope you do), you will find that her title is different online, referencing polyamory. Her point is that people who rely on A.I. for "companionship" are, in fact, isolating themselves. When someone is online with the companion on their phone, they might just as well be stranded on some lost and remote island. That's a different kind of "horror," but such social isolation would, in fact, be a horrific fate for those who might end up that way. Here is a quick snippet from Miller's column, making the point:
If we don’t change course, many people’s closest confidant may soon be a computer. We need to wake up to the stakes and insist on reform before human connection is reshaped beyond recognition.
I am, of course, very much in agreement with this caution by Amelia Miller. Creating "fake" human substitutes whom we consult online - as we eliminate, more and more, any real human-to-human contact in our lives - is extremely dangerous and extremely disturbing. Below, I am providing another quote from Miller's column - even more disturbing, to me. This is the text that impelled me to comment about A.I., again, following so closely on my comment yesterday:
These developers’ perspectives are far from the predictions of techno-utopia we’d expect from Silicon Valley’s true believers. But if those working on A.I. are so alive to the dangers of human-A.I. bonds, and so well positioned to take action, why don’t they try harder to prevent them?
The developers I spoke with were grinding away in the frenetic A.I. race, and many could see the risks clearly, but only when they were asked to stop and think. Again and again as we spoke, I watched them seemingly discover the gap between what they believed and what they were building. “You’ve really made me start to think,” one product manager developing A.I. companions said. “Sometimes you can just put the blinders on and work. And I’m not really, fully thinking, you know?”
When developers did confront the dangers of what they were building, many told me that they found comfort in the same reassurance: It’s all inevitable. When I asked if machines should simulate intimacy, many skirted responding directly and instead insisted that they would. They told me that the sheer amount of work and investment in the technology made it impossible to reverse course. And even if their companies decided to slow down, that would simply clear the way for a competitor to move faster (emphasis added).
Nothing is "inevitable," unless we fail to act. We can change the world. But, of course, that doesn't mean we will. We have to choose to take action, and if we do, we will have a chance to survive.
The dangers inherent in the continuing development of A.I. - dangers that are beginning to be so clearly recognized - are only one of many potential world-ending dangers facing us now. We are pushing towards an arctic "tipping point" beyond which the processes of human-caused global warming now underway will destroy our human world. And so will any future use of nuclear weapons. And we can also start worrying about A.I., and the impacts we can see it causing - like a "tsunami," as Peggy Noonan said.
Individual actions are important, but they are not enough. Check back to my blog posting yesterday for what I'm advising!