Oscar-winning filmmaker James Cameron says he believes the future “weaponization” of artificial intelligence is the “biggest danger.”
“I think the weaponization of AI is the biggest danger,” the “Titanic” director told Canadian CTV on Tuesday.
“I think that we will get into the equivalent of a nuclear arms race with AI, and if we don’t build it, the other guys are for sure going to build it, and so then it’ll escalate,” Cameron explained.
“You could imagine an AI in a combat theater, the whole thing just being fought by the computers at a speed humans can no longer intercede, and you have no ability to deescalate,” he continued.
Cameron, who directed and co-wrote the 1984 action film “The Terminator,” was asked about recent concerns raised by AI experts regarding the technology’s capabilities.
Leaders in the field have supported regulation, highlighting the need to ensure general artificial intelligence benefits humanity in the long run.
“I absolutely share their concern,” Cameron told the station.
“I warned you guys in 1984, and you didn’t listen,” he said.
The Hollywood giant also said it is important to consider who is developing the technology and what their goals are in the field.
In terms of AI replacing writers and creators, Cameron said he doesn’t believe that will soon be an issue because “it’s never an issue of who wrote it, it’s a question of, is it a good story?”
“I just don’t personally believe that a disembodied mind that’s just regurgitating what other embodied minds have said – about the life that they’ve had, about love, about lying, about fear, about mortality – and just put it all together into a word salad and then regurgitate it… I don’t believe that’s something that’s going to move an audience,” he said.
Could it happen in the future?
“Let’s wait 20 years, and if an AI wins an Oscar for Best Screenplay, I think we’ve got to take them seriously,” he said when asked whether he’s open to the possibility of accepting an AI-produced script.