Opinion: When unregulated AI re-creates the past, we can’t trust that the ‘historical’ is real
1. An angry political figure delivering a hateful speech to a cheering crowd.
2. A child weeping over the brutal slaughter of her loved ones.
3. Emaciated men in prison uniforms, starved nearly to skeletons because of who they were.
As you read each line, a specific image likely springs to mind, indelibly etched in your memory and in our collective memory through documentaries, textbooks, news articles and museum exhibits.
We understand the power of historical images like these, images that have helped push us toward progress, because they captured something true about the world before we could witness it for ourselves.
As archival producers for documentaries and co-directors of the Archival Producers Alliance, we are deeply concerned about what happens when we can no longer trust that such images are real. We are not alone in that worry: Ahead of this year’s Oscars, Variety reported that the Motion Picture Academy is considering requiring contenders to disclose their use of AI.
Such disclosure may matter for fiction films, but it is vital for documentaries. Last spring, artificial images and sound began making their way into the historical documentaries we work on. With no guidelines in place for disclosure, we risk blurring the line between fact and fiction and undermining the nonfiction genre and the essential role it plays in our shared history.
In February 2024, OpenAI previewed its text-to-video tool, Sora, with a clip titled “Historical footage of California during the Gold Rush.” The video looked convincing: a stream flush with the promise of riches, blue sky and rolling hills, a bustling settlement, men on horseback. It looked like a western in which the good guy wins and rides off into the sunset. It looked authentic, but it was fake.
OpenAI presented “Historical footage of California during the Gold Rush” to demonstrate how Sora, released to the public in December 2024, creates videos that look like the real thing but aren’t. They are generated from a blend of real and imagined imagery, shaped by Hollywood’s versions of history and by the biases embedded in the archive. Like other generative platforms such as Runway and Luma Dream Machine, Sora is trained on content scraped from the internet, so these tools can reproduce the flaws and biases of online media. Even so, it’s easy to imagine how audiences could be taken in by such convincing, cinematic images.
As filmmakers, we greet the arrival of generative AI tools with mixed emotions: excitement at the possibilities they open up, and unease about what lies ahead. If we lose trust in the authenticity of what we see, even powerful films untouched by AI-generated material risk losing their impact and their claim to truth.

A measure of transparency, something like the nutrition labels on food that tell people what they’re consuming, would be a modest first step. But no regulation requiring that kind of disclosure from AI companies appears to be coming anytime soon.
The companies building generative AI imagine a future in which anyone can create audio-visual content with ease. Applied to depictions of history, that promise carries serious risks: The flood of synthetic images makes the work of historians and archival researchers more critical than ever. Their labor of combing through archives, presenting history accurately and preserving the integrity of primary sources is what keeps the record trustworthy. This year’s Oscar-nominated documentary “Sugarcane” shows the power of careful research, authentic archival imagery and well-documented personal testimony to uncover hidden history, in this case the abuse of First Nations children at Canadian residential schools. That work is uniquely human, and it cannot be replicated or replaced.

The speed at which new AI models are emerging, and the sheer volume of content they produce, make the technology impossible to ignore. It can be fun to play with these tools for creative projects, but what they generate is not authentic human documentation; it is a remix of existing material.
In response, we need strong AI media literacy in our industry and among the general public. At the Archival Producers Alliance, we have published a set of guidelines for the ethical use of AI in documentary filmmaking, endorsed by more than 50 industry associations, and our colleagues are beginning to adopt them in their work. We are also collecting real-world case studies of how AI is being used in documentaries. The goal is to protect the integrity of the documentary form and the historical record it safeguards.
This is not a western in which a hero rides in to save us from the dangers of unregulated AI. It falls to each of us, individually and collectively, to preserve the authenticity and texture of our shared history. Genuine visual records don’t just depict the past; they illuminate it, reveal its complexity and, perhaps most important in this era, give us reason to trust that it is real.
When the rich texture of the past blurs into something indistinct, our shared future starts to feel like an accidental mashup, stripped of its original meaning.
Rachel Antell, Stephanie Jenkins and Jennifer Petrucelli are co-directors of the Archival Producers Alliance.