Despite the fact that Hollywood’s actors and writers went on strike in part over the encroachment of AI just this past summer, the technology has only popped up more in the months since. In January, disappointed True Detective: Night Country viewers spotted AI-generated art in the show’s set dressing; Late Night With The Devil drew backlash for its AI-made interstitials; and Pink Floyd made headlines when they selected an AI-generated video as one of the winners of their Dark Side Of The Moon anniversary contest earlier this month.
Unfortunately, despite outraged fans’ and creators’ attempts at boycotts and other direct action, it looks like Reese Witherspoon was right when she said that AI was here to stay, so we should all “just get used to it.” Case in point: two separate, major studios have made headlines for their unwelcome use of the technology in just the past 48 hours.
The first of these studios is A24, which lost a great deal of goodwill amongst its normally fervent fanbase yesterday when it published a series of apparently AI-generated posters for Alex Garland’s Civil War. While all of the posters depict war-torn visions of recognizable American cities and landmarks—none of which actually appear in the film—perhaps the most egregious is a version of Chicago where the Marina City towers simply aren’t in the right place. (Did Wilco teach us nothing?) If you zoom in on the tan building on the left side of the poster, you can also see some tell-tale signs of AI blurring and incorrect doubling.
Other easy-to-spot mistakes include a wreckage-filled Miami street where one car seems to have three doors, and some abomination that’s either a swan boat with no seat or the largest swan to ever live. It’s also especially ironic that this is happening to a film by Alex Garland, whose excellent 2015 movie Ex Machina is itself a cautionary tale about artificial intelligence. A24 did not respond to The A.V. Club’s request for comment on this piece, but it’s pretty safe to assume this was a marketing decision made by higher-ups without consultation from Garland himself.
But it’s not just A24. A recent Futurism report found that in Netflix’s new true crime documentary What Jennifer Did, which chronicles a young woman named Jennifer Pan’s alleged plot to kill her parents, a number of pictures depicting Pan as a “bubbly, happy, confident, and very genuine” person before her arrest appear to have been artificially generated, or at least manipulated. The photos (shown in more detail in the linked article) bear the same tell-tale signs: Pan’s hands and ears aren’t quite right, objects in the background are muddled nonsense, and in one image, actually used on the documentary’s poster, she has one impossibly long tooth. The fact that AI was used was not disclosed anywhere in the film’s credits. (Netflix did not immediately respond to The A.V. Club’s request for comment on this story.)
While the presence of AI in movie posters is frustrating (and perhaps even actionable as false advertising), its use in a documentary is downright terrifying. As 404 Media reported, filmmakers and Archival Producers Alliance co-founders Jennifer Petrucelli, Stephanie Jenkins, and Rachel Antell presented draft guidelines for the ethical use of AI in documentary work just this week. While their suggestions revolve more around practicing complete transparency than curtailing the use of AI altogether (it isn’t going away any time soon, after all), the filmmakers’ chief fear is that artificial photos and video deepfakes could soon produce a warped archive that not even historians will be able to trust.
“One of the things we’ve realized is once a piece of media exists, even if it is disclosed [that it’s AI generated], it can then be lifted out of any documentary, make its way onto the internet and into other films, and then it’s forever part of the historic record,” Antell said (via 404). “Archival moves at a human pace and GenAI does not move at a human pace, and so for humans to keep up with it, that’s a very unlikely thing to be able to happen.” Increasingly, it seems the race may already be lost.