Dear Gossips,   

It’s time to talk about AI again! This is going to be an ongoing and ever-evolving conversation, but this week a couple things happened that clearly illustrate the pitfalls of engaging with AI in the entertainment space. 

First, beloved indie studio A24 used AI to generate some posters for Civil War, currently the #1 movie in America. 

One poster shows a post-apocalyptic Chicago with refugees on tour boats on the Chicago River. It also puts the Marina Towers, two of the most famous buildings in a city full of famous buildings, on opposite sides of the river, rather than next to one another as they really are. Similarly, the Echo Park swan boat in the Los Angeles poster is just a giant ass swan. So it was immediately obvious these were AI images, because human artists don’t f-ck up like that. A24 also admitted they used AI to generate these “what if” scenarios for the movie.

But A24’s whole reputation is about being the home of true artists, so why do this with AI? As with Late Night with the Devil, why not use human artists to create these images? The answer is obvious: money. They’d have to pay artists to make posters. With AI, they can scrape everything from the public domain to protected intellectual property, the algorithm treats it all the same, and no one is compensated. 

The second incident is even more troubling, though, because beyond the ethical problem of depriving people of paid work, it involves the ethics of journalism and documentary filmmaking in the already murky true crime space. Netflix is accused of using AI-generated and/or manipulated images in the true crime docuseries What Jennifer Did, which tells the story of Jennifer Pan, who was convicted of planning her parents’ murders (her father survived, her mother did not). 

It seems that images of Jennifer Pan were at least manipulated, if not outright created, to show her as a “bubbly” person, as a high school friend describes her. Netflix has not yet commented, but a true crime documentary is the LAST place we should be using artificial images. True crime already occupies an uncomfortable place as entertainment; adding manipulated images to it just makes it even dodgier. Like where does that stop? When does the scale tip from recounting events in a factual manner to inventing scenarios to match the narrative you think will get the highest ratings?

This technology is evolving fast, and everyone is just playing in the sandbox without regard for genuine ethical concerns about the role of AI in journalism and media, or for its potential to decimate entire sectors of the industry just so corporations can save a buck. These are the questions that need asking, not “can you make posters of post-apocalyptic Chicago?” 

Live long and gossip,

Sarah