Film Video Digital


Tag: January 2019

CNET: Deepfakes, disinformation among global threats cited at Senate hearing

At this year's Worldwide Threats hearing before the US Senate's Select Committee on Intelligence, the leaders of the country's top intelligence agencies, including the National Security Agency, the CIA, and the FBI, again pointed to technology issues as their biggest worry.

The Tuesday hearing covered issues like weapons of mass destruction, terrorism, and organized crime, but technology's problems took center stage. That echoes last year's hearing, when officials flagged cybersecurity as their greatest concern, after major hacks like the NotPetya attack, which cost billions in damages. But concerns over technology aren't limited to cyberattacks: Lawmakers also brought up deepfakes, artificial intelligence, disinformation campaigns on social media, and the vulnerability of internet of things devices.

Read More

Carnegie Endowment for International Peace: How Should Countries Tackle Deepfakes?

WHAT KINDS OF DAMAGE COULD DEEPFAKES CAUSE IN GLOBAL MARKETS OR INTERNATIONAL AFFAIRS?

Deepfakes could incite political violence, sabotage elections, and unsettle diplomatic relations. Earlier this year, for instance, a Belgian political party published a deepfake on Facebook that appeared to show U.S. President Donald Trump criticizing Belgium’s stance on climate change. The unsophisticated video was relatively easy to dismiss, but it still provoked hundreds of online comments expressing outrage that the U.S. president would interfere in Belgium’s internal affairs.

Read More

CNN: When seeing is no longer believing

Inside the Pentagon’s race against deepfake videos

Advances in artificial intelligence could soon make creating convincing fake audio and video – known as “deepfakes” – relatively easy. Making a person appear to say or do something they did not has the potential to take the war of disinformation to a whole new level.

Read More

Buzzfeed: It's Not Fake Video We Should Be Worried About — It's Real Video

The big mood these days is waiting on the tech apocalypse. All it takes is a video of a humanoid robot displaying the motor skills of a 6-year-old to have people preparing for Skynet to kill us all. The same goes, perhaps even more so, for fears of “deepfakes”: software getting good enough that anybody with an iPhone can doctor videos convincing enough to spark a riot. Seeing computers convincingly putting words in the mouths of presidents is scary, and once a Macedonian teenager can do it in minutes it’s game over, so the thinking goes.

But if the last few years — and yes, the particularly hellish last few days — have taught us anything, it’s that fake video isn’t going to destroy our ability to see the truth. It’s the real video we need to worry about, and our true problem is that we can all see the very same thing and disagree on what it was.

Read More

Interview with Hany Farid on "Everything in Your Archive is Now Fake"

John Tariot interviews Hany Farid, professor at the UC Berkeley School of Information, where he focuses on digital forensics, image analysis, and human perception. He is one of the subjects in the New Yorker article “In the Age of A.I., Is Seeing Still Believing?” Farid and Tariot have been having an ongoing conversation about the impact deepfakes will have on archives, and they continue that discussion here, taking up some of the issues raised at the Association of Moving Image Archivists conference session “Everything in Your Archive is Now Fake.”

Read More

Fortune: Fake Porn Videos Are Terrorizing Women. Do We Need a Law to Stop Them?

In the darker corners of the Internet, you can now find celebrities like Emma Watson and Salma Hayek performing in pornographic videos. The clips are fake, of course—but it’s distressingly hard to tell. Recent improvements in artificial intelligence software have made it surprisingly easy to graft the heads of stars, and ordinary women, onto the bodies of X-rated actresses to create realistic videos.

These explicit movies are just one strain of so-called “deepfakes,” which are clips that have been doctored so well they look real. Their arrival poses a threat to democracy; mischief-makers can use them, and already have, to spread fake news. But another great danger of deepfakes is their use as a tool to harass and humiliate women.

Read More