AI is 'a nuclear bomb and the entire world has already got it,' Palantir ethics expert warns

Palantir civil liberties engineer says he is not sure society is ready for AI

An expert on artificial intelligence ethics for a company that builds software used in war zones likened the emergence of widely available advanced AI to the atomic bomb this week, voicing concerns over how the powerful technology could be used.

"AI is a nuclear bomb and the entire world has already got it," John Grant, civil liberties engineer at AI software firm Palantir, said during a panel discussion for the Reagan Institute Summit on Education (RISE) put on by the Ronald Reagan Presidential Foundation and Institute on Thursday. 

A woman walks under a sign of big data analytics software company Palantir at their stand ahead of the World Economic Forum annual meeting in Davos, Switzerland, on May 22, 2022. (Fabrice Coffrini/AFP via Getty Images)


"I'm not sure as a society we're ready to handle that very well," he explained. "We need people to take their own responsibility for how they use it — and that's a challenge."

Grant said the situation reminds him of the Twitter engineer credited with inventing the retweet button, who famously said, "We might have just handed a 4-year-old a loaded weapon."


The civil liberties engineer is responsible for educating Palantir employees on AI ethics, and says he teaches his colleagues how to identify the ethical issues with a technology and where to draw boundaries so they take responsibility for the effects their work could have on society. Grant said the same things should be taught to kids in school, so those who develop new technologies understand they have the power to control how those technologies are used.

A Palantir Technologies TITAN (Tactical Intelligence Targeting Access Node) for military defense field intelligence deployment is displayed at the company's booth during the Consumer Electronics Show in Las Vegas on Jan. 5, 2023. (Patrick T. Fallon/AFP via Getty Images)

"You have to imbue in these students responsibility for what they're building and the effect on the world, and there's a dangerous tunnel vision sometimes with engineers and computer scientists where they say, ‘Hey, I’m going to build this thing and it's going to work, it's going to be cool,'" he said.

Grant told the RISE audience that one of the main impetuses for his starting the education program at Palantir was a quote he read from Frank Oppenheimer, the brother of Robert Oppenheimer, who is credited as the father of the atomic bomb. Both men worked on the Manhattan Project.


Grant said Frank Oppenheimer was asked later in life how he felt about having worked on a genocidal weapon, and he replied something to the effect of, "Somehow we never thought it would be used on people."

Palantir's AI ethics expert warns the widely available advanced technology is akin to the atomic bomb. (iStock)

"His response chilled me," Grant said, pointing out that the bomb was developed "in the middle of the most violent, bloody war in history."

"I'm not disparaging Oppenheimer," Grant explained, saying the physicist "was solving a mechanical problem, he was solving an engineering problem," and the broader implications weren't considered.


"We've got to show all these engineers, all these students, it's not just the project in front of you," he reiterated. "That you have to figure out, how is it going to affect the world? And you have to say, ‘Am I happy with that or am I not happy with that, and how are we going to mitigate those negative effects?'"