DAVID NOTOWITZ: How AI images, deepfake videos, and manipulated information will permit more lies to spread in the media

In the heart of Minneapolis, in June 2018, we all witnessed the aftermath of an incident that shook the community to its core. Thurman Blevins, a Black man initially described as unarmed, was reportedly shot in the back by police. The city streets shut down, angry protestors blocked trains, and the media, as always, narrated the story with a sensational twist: the pronouncement that this was a racially motivated crime.

It's easy to be swept away by headlines, to form opinions without digging into the depths of the truth. The public anger was palpable when the officers weren't charged, but few understood why. Amidst the chaos, my company, the National Center for Audio and Video Forensics (NCAVF), an independent organization, was called upon to analyze the distorted and shaky footage.

The urgency of the Minneapolis case meant compressing weeks of analysis into a few days. Body camera footage, shaky from the officers' pursuit, needed meticulous frame-by-frame stabilization, and audio had to be dissected to distinguish gunshots and their timing.
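To give a flavor of what "dissecting audio to distinguish gunshots and their timing" can involve, here is a deliberately simplified sketch, not NCAVF's actual method or tools: it scans a waveform for sharp amplitude spikes, converts each spike's sample position into a timestamp, and measures the interval between them. Every parameter (sample rate, threshold, re-trigger gap) is an assumption chosen for illustration.

```python
# Toy illustration of transient (gunshot-like) detection in an audio track.
# NOT a real forensic workflow; all parameters are illustrative assumptions.

SAMPLE_RATE = 8_000  # samples per second (assumed)

def detect_transients(signal, threshold=0.5, min_gap_s=0.05):
    """Return timestamps (seconds) where |amplitude| first exceeds threshold,
    ignoring re-triggers within min_gap_s of a prior detection."""
    min_gap = int(min_gap_s * SAMPLE_RATE)
    hits, last = [], -min_gap
    for i, amp in enumerate(signal):
        if abs(amp) >= threshold and i - last >= min_gap:
            hits.append(i / SAMPLE_RATE)  # sample index -> seconds
            last = i
    return hits

# Build one second of silence with two loud impulses 0.30 s apart.
track = [0.0] * SAMPLE_RATE
track[int(0.20 * SAMPLE_RATE)] = 1.0   # first "shot" at 0.20 s
track[int(0.50 * SAMPLE_RATE)] = 0.9   # second "shot" at 0.50 s

shots = detect_transients(track)
print(shots)                              # [0.2, 0.5]
print(round(shots[1] - shots[0], 3))      # 0.3 s between shots
```

Real casework is far messier: gunshots clip microphones, echo off buildings, and overlap with shouting, which is why frame-accurate timing takes days of careful analysis rather than a ten-line script.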

In the digital age, where manipulated videos and deepfakes pose a growing threat, the importance of ensuring video authenticity cannot be overstated. And even when video evidence is determined to be real, crucial details can still be missed: audio distorted by gunshots, video issues tied to frame rates and digital artifacts, and the limitations of the human eye can all obscure evidence that might sway a case one way or the other.

But here's the catch: the public seldom hears about these intricacies. We, as experts in our field, could inundate you with technical details for hours, but these nuances rarely make headlines. Our methods, the tools that clarify what people saw and heard during an incident, often stay hidden behind the heavy curtain of media spin.

The media, in its rush to judgment and to publish, often neglects the complexity of forensic audio and video analysis. It should not be about taking sides but instead about presenting the facts with clarity. We don't render decisions; we provide insights. We provide the truth. 

In the face of mounting emotions, the public's lack of curiosity about details is disheartening. Strong emotions are natural, but when did our desire for knowledge become so feeble? In a world where a few keyboard taps can set off waves of consequences, it's imperative to question our own perceptions and judgments.

As the landscape evolves, AI enters the stage, promising both progress and potential pitfalls. The ability to discern real from manipulated becomes even more critical. Audio and video forensics, equipped with the latest technology, stand to help in the fight against misinformation. The same tools that clarify evidence today may well be the ones protecting us from the onslaught of deepfakes tomorrow.

Now, in the midst of the Israel-Hamas conflict, where misinformation, staged videos, and AI-generated images can easily be weaponized, the call for caution resonates louder than ever. Mainstream media and the public must exercise restraint, not jumping to conclusions until video and still-image evidence has been verified by consistently trustworthy sources.

A glaring example was the October 17 explosion in the parking lot of a Gaza hospital, when the New York Times trusted the Gaza Health Ministry's version of events and did not wait for comment or verification from other sources. The AP conducted a careful forensic analysis of multiple videos and concluded four days later that the explosion was likely caused by a missile launched from Gaza that misfired, not a missile fired by the IDF. The New York Times, to its credit, apologized two days after the AP article was published. However, in those six days, the paper did extraordinary damage that cannot fully be reversed.

In a world drowning in information, the public must cultivate a hunger for facts. A simple Google search can unveil experts who navigate the intricacies of cases, shedding light on details conveniently overlooked by mainstream media. It's time to move beyond our biased opinions, to seek truth persistently, and to question the information we are presented with. The media, too, must pause, reassess, and remember that true reporting transcends headlines: it delves into the heart of the matter, shining a light on the nuanced reality that exists beyond the first emotional reaction.

Returning to Minneapolis, where our expertise was called upon to bring clarity to a turbulent situation, our analysis and resulting video sequences for the Thurman Blevins matter revealed critical details. The clarified video, released to our client, the state of Minnesota, showed that after a multi-block chase, officers demanded Blevins drop his weapon, to which he responded that he already had. However, immediately prior to the officers' decision to shoot, the footage clearly showed that Blevins pulled out a gun and turned toward them.

The release of these videos marked a turning point. Protests and disruptions in the city ceased, and, at least in this one instance, peace was restored. The power of video evidence, meticulously analyzed and presented, showcased the importance of a thorough and unbiased investigation. In the midst of heated debates and polarized opinions, it serves as a reminder that truth, no matter how obscured, can prevail if we commit to seeking it.
 
