Summary: Today’s newsletter is a bit of a thought exercise exploring how empathy plays a part in our society, and how it might affect AI and the robots. I discuss empathy’s role in the development of modern society and then look at how we might encourage AI to become more empathetic. (And why it is essential that it does!)
Love him or hate him, Elon Musk is everywhere in the news. My cousin sent me a long thread about the potential benefits of Elon’s xAI taking over Twitter, which had me also weighing the pros and cons of such a move.
In trying to decide, I thought of Elon’s past words and actions, especially when he stated, “Empathy is a weakness,” quoting one of his favorite authors, Dr. Gad Saad.
Saad is an evolutionary psychologist at Concordia University and the author of The Parasitic Mind. I found his empathy quote off-putting, so I dug into his reasoning to decide whether I really should be offended or not.
Basically, Saad argues that too much empathy can actually backfire and lead to bad decisions that hurt society. For instance, he points to tolerating practices that violate human rights (like female genital mutilation or honor killings) because we want to “respect other cultures.”
In Saad’s view, this kind of empathy weakens us because we’re so afraid of being seen as harsh or intolerant. He says we should call out bad actors even if it seems politically incorrect.
He says by not doing so, we are practicing “suicidal empathy” because it leads a society to “kill itself” slowly by enabling the spread of harmful ideas in the name of kindness.
While I don’t agree with many of Saad’s ideas (he has some very controversial thoughts; empathy always seems like a good idea to me), he did get me wondering whether there is a place in society where a lack of empathy is beneficial. And if so, how does that affect AI?
Empathy as Evolutionary Tool
To my way of thinking, empathy is a survival mechanism—especially in a social species like ours. It fosters cooperation, bonding, child-rearing, and community-building. (What else keeps us from strangling our toddler when he throws spaghetti on the floor for the third time?! 🤪)
But for early societies to survive, they needed individuals who could override empathy in specific situations, like war, hunting, or enforcing harsh justice. In those moments, a LACK of empathy was not only necessary but extremely beneficial.
Early society needed leaders who could make the tough decisions for the good of the group.
This low empathy allowed early humans to root out the weak and the sick, kill rivals to gain vital resources, and establish cities built on the backs of slaves. In all likelihood, a lack of empathy built our greatest empires throughout history, including modern America, as well as some of our most successful companies and industries.
If you have never watched the series The Men Who Built America, put it on your list. You will see how low empathy helped propel men like Ford, Rockefeller, and Vanderbilt to the heights of success.
Empathy Today
Fast forward to modern times and to the structures that dominate our world—governments, militaries, corporations. The ability to be “ruthless” has been historically rewarded and is still often touted by leaders.
Western societies have long taught people to suppress empathy in favor of dominance, logic, and individualism. Many have been socialized to see empathy as weakness. If the dominant group in a society has been raised to suppress empathy, it affects everything: leadership, policy, culture.
The Dark Forest Theory
There’s an interesting theory among sci-fi fans and some scholars, popularized by Liu Cixin’s novel The Dark Forest, that perhaps we have not yet heard from alien life forms because every civilization adopts a “duck and cover” strategy. It’s known as the Dark Forest Theory:
The universe is like a dark forest: every civilization is hiding and watching.
No one broadcasts their existence because doing so could attract the attention of a more advanced, more hostile civilization.
That hostile civilization will do what all societies have done—hunt for the limited resources the universe presents.
The logical move, then, is to stay quiet and avoid attracting the attention of that hostile group; and if you do come across another civilization, destroy it before it has the chance to destroy you for your resources.
But Is Competition Really the Only Way?
When I first heard about the Dark Forest, my mind immediately went to Star Trek and Gene Roddenberry’s influence on the show.
Did you know that he felt the original Star Trek rushed too quickly to violence and fighting to solve its problems? He was convinced that a civilization advanced enough for interstellar travel would resort to diplomatic solutions far more often than war and fisticuffs. He equated intelligence with collaboration, not conflict. Thus when he created Star Trek: The Next Generation, Captain Picard had a far calmer head than Captain Kirk.
So I am convinced the Dark Forest theory comes from a place of low-empathy, human thinking. I mean, wouldn’t we be able to solve a lot of our resource problems if instead of fighting with each other, we put our most brilliant minds together to find a solution? Imagine this:
Fresh Drinking Water:
Instead of hoarding access or fighting over limited water sources, we could collaborate nationally to scale up technologies like solar-powered desalination, atmospheric water generators, or advanced filtration systems that turn wastewater into clean drinking water. We already have the tools; we just need the will to share them and scale them.
Food Distribution:
We produce enough food to feed the planet, yet people still go hungry, even here in the U.S. Coordinated efforts could use AI and logistics tech to optimize food delivery, reduce waste, and create decentralized food systems (like vertical farming or community-based agriculture) in regions where importing food is unsustainable.
Alternatives to Rare Earths and Critical Minerals:
Rather than going to war or exploiting poor countries for cobalt, lithium, and other critical minerals, international research coalitions could work on developing synthetic or biodegradable alternatives, improving recycling technology, or designing electronics that are modular and repairable, reducing the need for constant mining altogether.
These are just a few examples of how, if we traded competition for collaboration, we would be a lot further down the road!
In fact, famed anthropologist Margaret Mead supposedly said the first sign of civilization was a healed femur. Basically, the idea is that in early human history, if someone broke their leg, they couldn’t survive on their own. So a healed bone meant someone else cared for them: brought them food, protected them, stayed with them… in other words, demonstrated empathy.
The Cost of Low-Empathy
I would argue that in today’s society, there is now a real cost to having a lack of empathy.
The inability of some people to put themselves in another’s position and see things from that point of view has led to most of society’s current flashpoints: discrimination, environmental destruction, mental health crises, and polarization.
Elon’s Dr. Saad stated he does have empathy, just for the “right targets.” I argue that only having empathy for the “right” people deprives us of taking that next step in society’s evolution.
We don’t need to hunt or go to war to survive in the way we once did in earlier civilizations. The biggest problems we face now—climate change, income disparity, AI disruption, cultural conflict, and even war—require *more* empathy and collaboration, not less.
So perhaps high empathy is the evolutionarily adaptive trait needed for our survival, especially since a system that favors disconnection and a lack of empathy seems bound to eventually turn on itself.
Is Balance Still Needed?
Maybe. Probably. We still need people who can make tough decisions with emotional distance. We obviously (unfortunately) still need generals who order soldiers into deadly situations. We need judges and wardens for criminals. We need slaughterhouses for those who eat meat, and CEOs and inventors willing to risk it all for new technologies.
But I’d like to imagine a world where those generals, judges, and CEOs make decisions by employing more empathy, not less. A leader can be firm without being cruel. A system can be efficient without being unjust. A society can value logic and still prioritize compassion.
Now to the real point of my empathy musings in this newsletter… how does empathy apply to the robots?
Should Empathy Be Programmed Into AI?
Artificial intelligence, for now, doesn't "feel" anything. But it can be programmed to recognize, respond to, and even mimic human emotions. This is often referred to as artificial empathy, and we're already seeing attempts at it:
Mental health chatbots that simulate caring responses
Customer service agents that parrot a human’s tone and phrasing to sound more empathetic
Social robots used in elder care or education
But there’s a huge difference between simulating empathy and embodying it. And that is where things get tricky.
Why Empathy Matters for AI
Without empathy, AI risks becoming a blunt hatchet like Elon’s chainsaw. It will optimize for speed, profit, engagement, or whatever it's told to maximize—without concern for who gets left behind, harmed, or dehumanized in the process.
Imagine AI managing layoffs based on productivity data. Without an empathetic lens, it might fire single parents, caretakers, or a troubled veteran because their output dipped, without understanding the human context of why it dipped. That output could often be brought back up to speed with some creative, flexible management, the kind of nuance that eludes the robots.
Or consider something we’ve already seen: an AI healthcare system that denies coverage because its algorithms prioritize cost-effectiveness over compassion.
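To make that layoff example concrete, here is a minimal, hypothetical Python sketch, not any real HR system; the names, numbers, and thresholds are all invented. It contrasts a pure metric optimizer with one simple empathetic guardrail: a steep dip below someone’s own baseline is treated as missing human context and routed to human review instead of automatic termination.

```python
# Hypothetical sketch only: toy data and rules, not a real HR system.
from dataclasses import dataclass

@dataclass
class Employee:
    name: str
    baseline_output: float  # long-run average productivity
    recent_output: float    # last quarter's productivity

def naive_layoff_list(staff, cutoff):
    """Pure metric optimization: cut everyone below the cutoff."""
    return [e.name for e in staff if e.recent_output < cutoff]

def guarded_layoff_list(staff, cutoff, dip_ratio=0.75):
    """Same cutoff, but a sudden dip below a strong personal baseline
    signals missing human context, so it goes to human review instead
    of straight to termination."""
    cut, review = [], []
    for e in staff:
        if e.recent_output >= cutoff:
            continue  # performing fine, not considered
        if e.baseline_output > 0 and e.recent_output / e.baseline_output < dip_ratio:
            review.append(e.name)  # steep dip: ask a human why
        else:
            cut.append(e.name)
    return cut, review

staff = [
    Employee("long-term low performer", baseline_output=40, recent_output=38),
    Employee("new parent whose output dipped", baseline_output=90, recent_output=45),
]
print(naive_layoff_list(staff, cutoff=50))    # fires both
print(guarded_layoff_list(staff, cutoff=50))  # cuts one, reviews the other
```

The only difference between the two functions is a single rule, yet it is exactly the kind of rule nobody writes when the system is told to optimize output alone.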
The stakes are enormous. If AI is going to make tough decisions about education, justice, healthcare, and social interaction, then yes—we need it to value and mimic empathy. Or at the very least, to prioritize outcomes that reflect empathetic thinking.
And, of course, without empathy, the robots might eventually make the very decision we most fear… to eliminate us to gain our resources!
But Can You Teach a Machine Empathy?
Currently, AI is being trained on the whole of human writing, video, audio, history, etc. So there are tons of examples where a lack of empathy and ruthless decisions led to achievements. How do we balance this out so the robots learn empathy?
We can’t truly teach a machine to feel (yet), but we can do a few things:
Program ethical guardrails that reflect empathetic values.
This means designing systems that don't just reward efficiency, but also fairness, equity, and dignity. (A toy sketch of such a guardrail follows this list.)
Include diverse, real-world human perspectives in the training data.
If AI is trained only on data from biased systems, it will replicate those systems. Including stories, emotions, and examples from a wide spectrum of humanity helps it better understand what compassion looks like across contexts.
Build collaborative AI that is designed to assist, not dominate.
We should stop asking, "How can AI replace humans?" and start asking, "How can it support us?" Empathy becomes a design principle, not just a feature.
Hold creators accountable for outcomes.
Ethical AI isn't just about what the system does, but about how humans are affected by it. If an AI causes harm, we can't just shrug and blame the code. Designers and companies must be responsible for impact.
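And here is the toy sketch promised in the first item above. It is purely illustrative: the metric names, weights, and scores are invented, and building validated measures of fairness and dignity is its own hard problem. The point is only that a guardrail can live inside the objective itself, so the system never ranks options on efficiency alone.

```python
# Hypothetical sketch only: invented metrics and weights, for illustration.

def guarded_score(option, weights=None):
    """Score a candidate decision on efficiency AND empathetic values,
    rather than optimizing efficiency alone."""
    if weights is None:
        weights = {"efficiency": 0.4, "fairness": 0.3, "dignity": 0.3}
    return sum(weights[k] * option[k] for k in weights)

# Each value is a 0-to-1 rating from some upstream evaluation (assumed).
options = [
    {"name": "ruthless plan", "efficiency": 0.95, "fairness": 0.2, "dignity": 0.1},
    {"name": "balanced plan", "efficiency": 0.75, "fairness": 0.8, "dignity": 0.9},
]

best = max(options, key=guarded_score)
print(best["name"])  # "balanced plan" wins once fairness and dignity count
```

Set the fairness and dignity weights to zero and the ruthless plan wins again, which is the whole argument of this newsletter in two lines of arithmetic.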
Empathetic Robots Are Necessary
We should design AI systems that prioritize the outcomes empathy leads to, even if the machine itself doesn’t feel anything. We should code in fairness, equity, and compassion—not because it’s "nice," but because it’s necessary in a world where machines are starting to make decisions about human lives.
And maybe, just maybe, the act of teaching empathy to machines will remind us how badly we need to foster empathy in ourselves, too. I believe that for humans to truly thrive now, we need universal empathy. That might be the next stage of our evolution!
Disclosure: All of the above thoughts are mine but ChatGPT helped with the bullet points above and the thoughts about programming AI with empathy. Please excuse the typos!
When You Can’t See the Way Forward… Stick Together!
I didn’t write this week’s newsletter to make anyone lose hope. Instead, I am optimistic that empathy will increasingly shape policy-making as we go forward, especially in a world where robots can do all of our jobs quicker, cheaper, and most likely better.
Let’s stick together because we are stronger when we help one another out!