Palantir is a company that cultivates an image of intellectual seriousness, as well as coolness and lethality. Its CEO, Alex Karp, holds a PhD in philosophy and says things like: “Palantir is here to disrupt and make the institutions we partner with the very best in the world. … And when it’s necessary, to scare enemies, and on occasion, kill them.” The company’s partnerships with the US government are extensive.
What does Palantir actually believe in? What ideas drive its work? In a world where conflicts grow every day, should we criticize a company that claims to defend the West?
To help answer these questions, I reached out to Connor Echols, a reporter for the online magazine Responsible Statecraft. Last year, Connor wrote a profile of Karp, titled “Anatomy of a Silicon Valley Hawk.” Here, we discuss the Palantir ideology, as Karp articulates it in his writings and public statements. I start our dialogue by setting the scene and asking some questions.
— Santiago Ramos
Santiago Ramos: We are in the middle of a Palantir charm offensive. A new biography of Palantir’s CEO, Alex Karp, has just been published, and Karp is granting interviews left and right. I’m sure you’ve seen the video clip of Karp wielding a sword before a reporter. The company is selling merch and opened up a new educational fellowship for high school graduates. It’s the perfect time, then, to ask: What is the Palantir ideology?
This is an important question for two reasons. First, Palantir is a billion-dollar data company with great influence inside the US military and law enforcement. By making government surveillance more effective, Palantir touches the lives of every American. Palantir insists that it is not a data company and is not directly involved in surveillance, while also being deeply embedded in state bureaucracies and declaring a mission to “scare the fuck out of our enemies,” i.e., the enemies of the West.
Second, Karp has already started a public debate about Palantir, and wants people to argue with him. The charm offensive is only part of his strategy. A few years ago, Karp — who earned a PhD in philosophy in Germany — entered the cultural fray and began arguing against his company’s critics. He’s even published a book outlining his worldview: The Technological Republic: Hard Power, Soft Belief, and the Future of the West.
Karp’s book is a combination of 1.) interventions in recent culture war battles (e.g., the fallout from the congressional hearings on campus anti-Semitism), 2.) commentary on business management theory (e.g., a critique of status and hierarchy within corporations, along with a defense of the role of the “founder”) and 3.) a defense of a few foundational philosophical claims. I want to focus on number 3.
Three claims stood out to me from Karp’s book:
Relativism is Bad. Karp believes that we are living in a relativistic era, when people lack strong convictions: “Our collective and contemporary fear of making claims about truth, beauty, the good life, and indeed justice have led us to the embrace of a thin version of collective identity, one that is incapable of providing meaningful direction to the human experience. All cultures are now equal. Criticism and value judgments are forbidden.” The result is a culture which lacks purpose and self-confidence.
The West is the Best. Karp defines the West as “a set of cultural and political values rooted in antiquity and extending through history to the modern era,” which “began to take shape in the late nineteenth century.” The West, according to Karp, is a civilization which “made possible, and indeed bearable, collective existence on a grand scale.” The West’s values, as he puts it elsewhere, “are obviously superior” to those of its competitors. Still, in the book at least, Karp is savvy enough to acknowledge that the West has its own sins and faults — while also suggesting that part of the strength of the West comes from its willingness to engage in self-criticism.
The United States is the Sword of the West. The United States has a special role in the world today: to defend and preserve the values of the West. It’s irresponsible to deny that the US has this role. We are in a crisis today because Silicon Valley’s best and brightest have shirked the “necessary task of building the nation, of constructing a collective identity and shared mythology.” Palantir bucks this trend and contributes to this common project, Karp says, by strengthening American military might.
These ideas don’t seem uniquely dangerous to me — in fact, they aren’t original. They are standard center-right talking points. I might disagree with some of them, but I’ve heard much more disturbing stuff coming out of Silicon Valley. For example, Karp doesn’t, as far as I can tell, believe that the human race is just a “biological bootloader” for a future superior race of AIs, as both Elon Musk and Sam Altman have suggested. Of course, I’ve also heard some wild ideas from Peter Thiel, co-founder of Palantir — but it’s unclear to me whether his feverish speculations about a posthuman future are more relevant within Palantir today than Karp’s more earthbound ideas.
What am I missing here? Is my summary of Karp’s philosophy missing an important detail? Am I failing to grasp the subtext in The Technological Republic? Is the Palantir ideology more dangerous than my account of it suggests?
Connor Echols: As you note, Elon Musk and Sam Altman seem indifferent to the possibility of a future ruled by robots, and they are doing everything they can to bring it about. But that possibility remains purely hypothetical. Alex Karp, by contrast, is already using his ideology to make the world a more dangerous place.
I should note that, unlike many Karp critics, I don’t view him as insane. In fact, I consider him quite thoughtful. Palantir’s success is downstream of his creative approach to running a business, in which he devolves enormous responsibility to individual engineers — an approach that allows the company to recruit and retain top-tier talent. And The Technological Republic shows that he’s admirably willing to engage with work from outside his ideological tribe. In one section, Karp discusses the work of Edward Said, the late Palestinian-American theorist whose writing on Orientalism is a bête noire of the anti-woke set. Instead of joining the pile-on, Karp defends Said, arguing that his oeuvre has been “frequently misinterpreted” by fans and critics alike.
Unfortunately, Karp has decided to use his evident brilliance in service of a civilizational mission that increases the risk of conflict between world powers. For more than two decades, Karp has worked to meld the Western chauvinism of Washington’s foreign policy establishment with the techno-optimism of Silicon Valley. And these efforts have helped drive a dangerous culture shift in both places.
Karp’s desire to defend the West stems from a deep feeling of vulnerability. Growing up with a Jewish father and a Black mother, Karp came to believe that “if fascism comes, I’d be the first or second person on the wall.” In his telling, the West is the only culture capable of protecting someone like him. But this produces in Karp, and in Palantir, a fundamentally Manichaean worldview. As Palantir CTO Shyam Sankar recently put it, “there is still evil in the world, and that evil is not us.” Technical parity with these evildoers is “insufficient,” Karp wrote in his book. “A weapons system in the hands of an ethical society, and one rightly wary of its use, will act as an effective deterrent only if it is far more powerful than the capability of an adversary who would not hesitate to kill the innocent.”
Karp isn’t worried that this will launch an arms race in which both sides are incentivized to, say, rapidly field autonomous weapons. He believes that this arms race is inevitable, and that we must not lose. And he’s working hard to operationalize that view. Palantir doesn’t just try to win Pentagon contracts; it also looks to grow the defense tech industry as a whole using programs like the First Breakfast, which helps weapons startups get contracts of their own. Sankar, his deputy, has even been commissioned as an officer in the U.S. Army Reserves, helping to usher in what Sankar has called “voluntary civil-military fusion.”
This is where Palantir’s true danger lies. Karp and his crew have helped to bring about a world in which Silicon Valley is pivoting to hawkish patriotism and Washington is pivoting to open-eyed techno-optimism. If Karp’s ideas seem unoriginal, it’s because American elites are already embracing them with open arms.
Santiago: By “Manichaean worldview,” I assume you mean that Karp reduces the world to two sides: one good, the other evil. If so, I think you’re right: there’s definitely a Manichaean tendency in Karp’s thought. I’m also sure that, if asked, Karp would say that, of course, there are many good things about China and Russia, despite their being illiberal, revisionist powers. But all of the positives Karp sees in the “other side” — as well as the nuances and ambiguity that necessarily attend global affairs — become irrelevant for him once it’s time to defend the West against its enemies.
The topic of Manichaeism is important for one reason: it forces us to ask who exactly Karp thinks he is defending. Who is the “good” side, and how does he conceive of it? On this, Karp equivocates. Throughout the book, he is adamant that human beings need a “collective mission” in order to give their lives meaning. The collective mission he has in mind is the United States as a national project. But he also often writes about the West as a whole providing a collective mission.
As I quoted in my first message, the “West” for Karp is decisively modern — it was born in the 19th century — and secular. Its crowning achievements are science and technology, made possible by commerce and industry. Within the West, the United States is the modern state par excellence, a nation that need not be defined by “a shallow appeal to either ethnic or religious identity,” but instead by a strongly held “belief.” But belief in what, exactly? Well, belief in the superiority of the West … and in the goodness of capitalism, science, and technology … but only as long as all of these are restrained and redirected by a collective national purpose ... that purpose being … ??? The definition is never complete.
Ironically, Karp’s idea of the West is an example of the “soft belief” that he attacks. The most we can say is that Karp believes the West is a powerful civilization, and that it should use its power to do good. He argues that the West is better than other civilizations, but he doesn’t really say why. (He suggests that Western superiority can be most easily perceived when one takes an “aesthetic view.” An aesthetic view is, by definition, not a moral one.) For Karp, the West is a mighty empire without a clear goal beyond self-preservation. If Immanuel Kant dreamed of a Kingdom of Ends, for Karp, the West is the Kingdom of Means. It’s more like a tool than an actual culture.
And yet, even this hollowed-out conception of the West has its virtues. Some sort of conflict with revisionist powers does seem inevitable, and I want the United States — and the West — to survive it. (The most compelling parts of Karp’s book deal with how Palantir helped US soldiers avoid being blown up by IEDs in Afghanistan.) Of course, I would prefer that the US fight this conflict without firing a bullet; with humility rather than hubris; always mindful that the West, too, can commit war crimes, or might become totalitarian, if it doesn’t watch itself. And while it’s sad that the head of … “vibes” at Palantir dismisses any concerns about the surveillance state as “Low IQ behavior” (to give just one example of the company’s aggressive media strategy), this doesn’t make Palantir wrong about the threat of great power conflict.
Connor: The threat of great power war is indeed a serious cause for concern. That is precisely why Karp’s influence keeps me up at night.
Karp’s worldview isn’t just Manichaean; it’s also Hobbesian, at least in the colloquial sense. For Karp, conflict and aggression are deeply embedded in human nature, as he explained in his dissertation. Karp extends this idea more or less directly to the realm of geopolitics. “You have an inability for these cultures — West, non-West — to interact effectively,” he said back in 2022, adding that he therefore expects “an acceleration of conflict with China.” He concludes that the only way to avoid war is to achieve complete military dominance over the enemy.
But the vigorous pursuit of dominance is a powerful accelerant of conflict. In international relations theory, there’s a concept known as the “security dilemma.” The basic idea is that any step taken to improve one’s own military strength, even if done with defensive intent, will inevitably be seen by the enemy as threatening, since it shifts the overall balance of power to their detriment. This tends to launch spirals of escalation in which both sides repeatedly attempt to achieve an upper hand. This can lead to arms races, as we saw during the Cold War, or even open conflict, as we saw in World War I.
This doesn’t mean that we should roll over and stop building any weapons. But it does mean we should be discerning when deciding which developments to pursue. American policymakers understood this when they sought a series of nuclear arms agreements with the Soviets, through which both sides halted weapons programs and reduced their overall power in order to decrease the risk of calamity. A similar case could be made about the development of autonomous weapons, whose risks are arguably on par with those of nuclear weapons. Yet Karp appears determined to go all-in on advancing these technologies.
What do we get in exchange for all this risk-taking? In Karp’s view, we get a chance to save the West and preserve its virtues. But, as you note, he is unwilling to spell out those virtues beyond vague references to freedom, capitalism and liberal democracy. This frees him to pursue projects that, in my opinion, undermine the very values that distinguish the West. Was the National Security Agency’s dragnet approach to surveillance a threat to Western values of freedom and privacy? Karp certainly didn’t think so when he sold Palantir’s tech to the NSA. Does Israel’s killing of Palestinian civilians in Gaza represent a threat to the laws of war and human rights? Not if you ask Dr. Karp, who still proudly equips the Israeli military with his tech.
These contracts speak far louder than any vague pronouncements about Western values. This is why I describe part of Palantir’s ideology as Western chauvinism rather than Western pride or some other, less pejorative phrase. Simply put, Karp’s professed love for the West boils down to a feeling that his tribe is better than the enemy’s. If Karp really wanted to preserve the West as an ideological project, then Palantir would be a very different company.
Santiago: I was hoping you’d notice my (poor) attempt at subtlety when I wrote that I want the West to “survive” great power conflict; I didn’t say “win.” I’m not sure what winning would look like in the type of war that we might be fighting in the 21st century. The US military predicts that future conflicts with “near-peer adversaries” will be more destructive than any war Americans have seen since World War II. (Gaza is the “dress rehearsal,” one reporter suggests.) The goal should be to avoid such wars; failing that, to make it through them with civilization intact.
A necessary but not sufficient condition for survival, we both agree, is enough military strength to deter or withstand attack from outside. But another necessary condition — and this is something that Karp does not understand — is a persistent countervailing force that pushes back against the domestic forces structurally biased in favor of war (a.k.a. the military-industrial complex). By countervailing force I mean all the interests in society that critique the voices counseling war. I mean voices (like yours) that discuss ideas like the security dilemma.
What scares me about Palantir is that, through its public relations arm (its CTO even wants to make Hollywood films), the company works to delegitimize those countervailing forces. Palantir makes the tools of war but also wants to influence the debate about war.
So I want to leave you with this question, which I have no idea how to answer: to what extent can the tech that Palantir makes be separated from its ideology, which we both agree is dangerous? Is it possible to separate the tool from the worldview? Does their Machiavellian/Hobbesian attitude toward geopolitics condition the types of products that Palantir makes? Take, for example, the autonomous weapons which you mentioned above. If you had a magic wand and could create a new Palantir, one free of Karp’s ideology, would that Palantir still create autonomous weapons?
Connor: I’m not sure whether a brand new, Karp-free Palantir would still work to develop autonomous weapons. But I’m inclined to say that it would. My logic is not about the relationship between the technology and the ideology per se. Rather, it’s about the market incentives that Palantir has already created.
Before Palantir, Silicon Valley types were highly skeptical of working with the military. They often expressed this skepticism in moral terms, but the fundamental reason was practical. Venture capitalists, as a rule, invest widely in the hopes that one successful investment will cover the losses for all their failures. This means that every startup they invest in must have a path to wild success. Government contracting can be lucrative, but, to these high-stakes investors, it always seemed too bureaucratic to allow for the rapid growth that keeps the VC ecosystem alive.
Palantir’s wild success convinced VCs that they were missing out on a big opportunity. Now, they’re desperately trying to catch up and get their piece of the $1 trillion U.S. defense budget. To make this bet worth it, they need to create a seemingly game-changing technology. AI-powered weapons are that technology.
This really began to sink in for me in October, when I attended a glitzy product launch put on by ShieldAI, a military tech startup valued at a cool $5.3 billion. In a room full of politicians, defense lobbyists and Pentagon officials, ShieldAI unveiled what it called the world’s “first autonomous fighter jet.” The company boasted that the plane, called the X-Bat, could fly faster and farther than most fighter jets for a lot less money.
The remarkable thing about this wasn’t the plane itself; after all, ShieldAI still doesn’t have a fully functional prototype, let alone a production-quality model. More notable was the fact that ShieldAI was investing so aggressively in the plane despite not having a contract with the Pentagon. This means that the only way to recoup that investment is to convince the Defense Department and Congress that this technology is so good that it needs a major new line item in the budget — a herculean feat for an unproven company.
To attract the Pentagon’s attention, ShieldAI has to push the boundaries. And it’s done so with gusto. According to the company, the X-Bat could be programmed to “automatically engage” (read: fire on) an enemy in a designated area. When I asked their executives to clarify their stance on autonomous weapons, they said that they would always have a human “on the loop.” But this clever phrase only means that a human would monitor the machine’s decisions and intervene if the X-Bat made an obvious mistake. And it’s not even clear whether this level of oversight will prevail given that, according to ShieldAI, their plane can “engage” targets “even if the comms link is severed.”
Like Palantir, ShieldAI expresses the need for this sort of technology in ideological, West-is-best terms. It’s hard to say whether this ideology derives from the market incentives or vice versa. But what I know for sure is that, barring some unforeseeable shock, Palantir-esque companies will continue to push the limits of military tech. We’re living in Dr. Karp’s world now.