Opinion | Alex Karp and Nicholas Zamiska: U.S. tech companies should help build AI weapons


Alexander C. Karp is the co-founder and CEO of Palantir Technologies. Nicholas W. Zamiska is the company's head of corporate affairs and legal counsel in the Office of the CEO. Their book, "The Technological Republic: Hard Power, Soft Belief, and the Future of the West," will be published in February.

On July 16, 1945, shortly after dawn, a group of scientists and government officials gathered on a secluded stretch of sand in the New Mexico desert to witness humanity's first test of an atomic weapon. An onlooker described the explosion as a "magnificent purple." The thunder of the blast echoed across the desert.

J. Robert Oppenheimer, who led the project that culminated in the test, pondered that morning the possibility that this destructive power might somehow contribute to lasting peace. He recalled Swedish industrialist and philanthropist Alfred Nobel's hope that dynamite, which Nobel invented, would end wars.

After seeing how dynamite was used to make bombs, Nobel told a friend that more capable weapons, not fewer, would be the best guarantors of peace. The only thing that would stop nations from going to war, he wrote, was terror.

Our temptation may be to retreat from such grim calculations, into the hope that a peaceful instinct will prevail in our species if only those with weapons would put them down. Nearly 80 years have passed since that first nuclear test in New Mexico, yet nuclear weapons have been used in war only twice, at Hiroshima and Nagasaki. For many, the power and horror of the bomb have become distant and faint, almost abstract.

Humanity's record of managing these weapons — imperfect, and indeed dozens of times nearly disastrous — has been remarkable. Nearly a century of some version of world peace has prevailed, without military conflict between the major powers. At least three generations — billions of people, their children and grandchildren — have never known a world war. John Lewis Gaddis, a professor of military and naval history at Yale, has described the absence of major conflict in the postwar period as a "long peace."

The nuclear age and the Cold War essentially forced a reckoning between the great powers that, for decades, made real escalation — as opposed to skirmishes and tests of power at the margins of regional disputes — highly unattractive and potentially costly. Steven Pinker makes a broader argument: "The decline of violence may be the most significant and least appreciated development in the history of our species."

It would be unreasonable to give all or even most of the credit to any one weapon. Any number of other developments since the end of World War II, including the spread of democratic forms of government across the planet and a level of interconnected economic activity never before imagined, are part of the story.

The calculus of great powers that has helped prevent a third world war could also change rapidly. But the predominance of American military power has undoubtedly helped preserve the peace, fragile as it may be. The determination to maintain such hegemony, however, has become increasingly unfashionable in the West. And deterrence, as an ideology, is in danger of losing its moral appeal.

The nuclear age is drawing to a close. This is the century of software. The wars of the future will be driven by artificial intelligence, which is developing far faster than conventional weapons. The F-35 fighter jet was conceived in the mid-1990s, and the aircraft — the flagship attack plane of U.S. and coalition forces — is expected to remain in service for decades more, with the U.S. government projected to spend more than $2 trillion on the program. But as retired Gen. Mark A. Milley, former chairman of the Joint Chiefs of Staff, recently asked, "Do we really think a manned aircraft is going to be winning the skies in 2088?"

In the 20th century, software was built to serve the needs of hardware, from flight control to missile avionics. But with the rise of artificial intelligence and the use of large language models to make battlefield targeting recommendations, the relationship is inverting. Now hardware — the drones deployed in Ukraine and elsewhere — increasingly serves as the means by which the recommendations of AI software are carried out.

And for a nation that holds itself to a higher moral standard than its adversaries when it comes to the use of force, technical parity with the enemy is insufficient. A weapons system in the hands of a moral society — one reasonably cautious about its use — will serve as an effective deterrent only when it is more powerful than that of an adversary who would not hesitate to kill innocents.

The trouble is that the young Americans most capable of building AI systems are often the most reluctant to work with the military. In Silicon Valley, many engineers have turned away, unwilling to engage with the messiness and moral entanglements of geopolitics. While pockets of support for defense work have emerged, most of the funding and talent remains on the consumer side.

Our country's engineering elite rushes to raise capital for video-sharing apps and social media platforms, advertising algorithms and shopping websites. They do not hesitate to track and monetize people's every move online, burrowing their way into our lives. But when it comes to working with the military, many balk. The rush is simply to build. Few ask what should be built, and why.

In 2018, about 4,000 Google employees wrote a letter to Chief Executive Sundar Pichai, asking him to abandon a software effort, called Project Maven, that was being used for surveillance and mission planning at sites where U.S. special forces operated in Afghanistan and other countries. The employees demanded that Google never "build warfare technology," arguing that helping troops plan targeting operations, with "potentially lethal outcomes," was "not acceptable."

Google tried to defend its involvement in Project Maven by saying the company's work was merely for "non-offensive purposes." It was a subtle and lawyerly distinction, particularly from the perspective of the front-line soldiers and intelligence analysts who needed better software systems to survive. Soon after, the head of Google Cloud, Diane Greene, held a meeting with employees to announce that the company had decided to end its work on the project. An article in Jacobin called it "an impressive victory against US militarism," noting that Google employees had successfully stood up against what they believed to be a misdirection of their talents.

Yet the peace enjoyed by those in Silicon Valley who oppose working with the military is made possible by the credible threat of that same military's power. At Palantir, we build software for U.S. and allied defense and intelligence agencies, including systems that will enable the deployment of this century's AI weapons. We, as a society, should be able to debate the merits of using military force overseas without hesitating to provide those on the front lines with the software they need to do their jobs.

Most alarmingly, a generation's disillusionment with and disinterest in our nation's collective defense has led to a massive redirection of resources — intellectual and financial — toward serving the needs of a consumer culture. The diminished demands we place on the technology sector to develop products of lasting and collective value have left the market's aspirations to fill the void. As David Graeber, who taught anthropology at Yale and the London School of Economics, observed in a 2012 essay in the Baffler, "The Internet is a remarkable innovation, but all we are talking about is a super-fast and globally accessible combination of library, post office, and mail-order catalogue."

The technology world's shift toward consumer concerns has helped fuel a certain escapism — Silicon Valley's instinct to ignore the pressing issues facing us as a society in favor of the trivial and ephemeral. Challenges ranging from national defense and violent crime to education reform and medical research have seemed too complex, too thorny and too politically fraught for many in the technology industry to tackle.

A year after the revolt at Google, a rebellion by Microsoft employees threatened to halt the company's work on a $480 million project to build an augmented reality platform for soldiers in the U.S. Army. The activists wrote a letter to Satya Nadella, the chief executive, and Brad Smith, the company's president, arguing that they "did not sign up to develop weapons" and calling on the company to cancel the contract.

In November 2022, when OpenAI released its AI interface ChatGPT to the public, it prohibited its use for "military and warfare" purposes. After the company lifted that ban on military applications this year, protesters gathered outside the San Francisco office of OpenAI CEO Sam Altman to demand that the company "end its relationship with the Pentagon and not take any military clients."

Such outbursts have trained leaders and investors in the technology industry to avoid any hint of controversy or disapproval. But their caution comes at a significant cost. Silicon Valley's many investors and legions of exceptionally talented engineers too readily set aside hard problems. A rising generation of founders claims to actively seek risk, yet caution often prevails when it comes to engaging deeply with societal challenges. Why wade into geopolitics when you can build another app?

And build apps they have. The proliferation of social media empires has systematically monetized and channeled the human desire for status and recognition.

For its part, the foreign policy establishment has repeatedly misjudged its dealings with China, Russia and others, believing that economic integration alone might be enough to erode their leaders' support at home and temper their interest in military escalation abroad. The failure of the Davos consensus was simply that it abandoned the stick in favor of the carrot. Meanwhile, China's Xi Jinping and other authoritarian leaders have consolidated power in ways that political leaders in the West may never fully understand.

On a visit to the United States in 2015, Xi fondly recalled reading "The Old Man and the Sea" while speaking to a group of business and political leaders at a gathering hosted by the Seattle Chamber of Commerce. He said that on a trip to Cuba he had traveled to Cojímar, on the island's north coast, the town that inspired Ernest Hemingway's story of a fisherman and his 18-foot marlin. Xi said he ordered the author's favorite mojito, "with mint leaves and ice," explaining that he "just wanted to feel for myself" what was on Hemingway's mind when he wrote the story. "It is important to make an effort to get a deep understanding of the cultures and civilizations that are different from our own," added the leader of a nation home to roughly a fifth of the world's population. We would be well advised to do the same.

Our broader reluctance to proceed with the development of effective autonomous weapons systems for military use may stem from a legitimate skepticism of power itself. Pacifism flatters our natural sympathy for the powerless. It also spares us from having to navigate the difficult trade-offs the world presents.

Chloé Morin, a French writer and former adviser to the country's prime minister, suggested in a recent interview that we should resist the easy urge to "divide the world into the dominant and the dominated, the oppressors and the oppressed." It would be a mistake, and indeed a form of moral backwardness, to reflexively equate powerlessness with virtue. The dominant and the dominated alike are capable of committing great sins.

We do not advocate a thin and shallow patriotism — a substitute for thought and for genuine reflection on our nation's flaws as well as its strengths. We simply want America's technology industry to keep one essential question in mind. That question is not whether a new generation of autonomous weapons incorporating AI will be built. It is who will build them, and for what purpose.
