AI + FinTech. Could Blockchain be used to combat Deepfake Videos?

The world of cryptocurrency is exciting, fast moving, and still a bit risky – in terms of valuation, hacking of digital wallets and exchanges, or losing your password (and with it, access to your crypto account). In spite of these risks, early products such as Bitcoin, Ethereum and many other digital currencies are currently being used or trialled around the world.

But sitting underneath their most well-known application – digital currency – is a technology called blockchain that has the potential to be a game-changer for many industries.

In simple terms, a blockchain is a method of securing data using complex mathematics, combined with a shared, distributed ledger in which all transactions are instantly visible to everyone. The identities of the parties to any given transaction are not in the public domain, but the value and record of the transaction are. And, in order for a transaction to be valid, all copies of the ledger must agree.

Although this sounds complex, a blockchain works in much the same way as Google Docs, where multiple parties have access to the same document at the same time and a single, identical version of that document is always visible to everyone. The difference is that a blockchain is a shared ledger rather than a shared document.
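To make the idea concrete, here is a minimal Python sketch of a hash-linked ledger – an illustration of the principle, not the code of any real blockchain platform. Each block carries a cryptographic fingerprint of the block before it, so any copy of the ledger can detect tampering on its own:

```python
import hashlib
import json

def block_hash(transaction: dict, prev_hash: str) -> str:
    """Fingerprint a block's contents, including the previous block's hash."""
    payload = json.dumps({"transaction": transaction, "prev_hash": prev_hash},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def add_block(chain: list, transaction: dict) -> None:
    """Append a block that is cryptographically linked to the one before it."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"transaction": transaction,
                  "prev_hash": prev_hash,
                  "hash": block_hash(transaction, prev_hash)})

def chain_is_valid(chain: list) -> bool:
    """Any copy of the ledger can independently re-check every link."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block["transaction"], block["prev_hash"]):
            return False                    # a block's contents were altered
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False                    # the link to the previous block is broken
    return True

ledger = []
add_block(ledger, {"from": "Alice", "to": "Bob", "amount": 5})
add_block(ledger, {"from": "Bob", "to": "Carol", "amount": 2})
print(chain_is_valid(ledger))               # True

ledger[0]["transaction"]["amount"] = 500    # try to rewrite history
print(chain_is_valid(ledger))               # False – the tampering shows up immediately
```

In a real network, many independent copies of this chain run the same check and reject any version of the ledger that fails it – which is why all copies must agree for a transaction to stand.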

Blockchain technology is being used or trialled for many different applications, including securities exchanges, e-voting, sports betting, and digital identity.

It may also be the solution to one of the latest problems facing society – fake news. The fake news phenomenon has been around for a while now, but with artificial intelligence it is now possible to use text-based editors to literally change the words that someone says, and to superimpose one person's image onto a video of another, in a way that is indistinguishable to the human eye.

With this kind of technology, anyone's face can now be superimposed on anyone else's, creating what seem to be authentic videos of just about anything. Hollywood filmmakers have been doing this for years – even recreating deceased actors for roles in blockbuster sequels – but with recent advances in facial recognition and AI, this capability is now accessible to the masses.

A new term, “Deepfake” – a blend of “deep learning” (advanced AI-based learning capabilities) and “fake” – has even been coined to describe this AI-based human image synthesis technique.

Deepfakes first surfaced on the Internet in 2017 in the form of fake celebrity porn videos, and have since been used to make fake videos of President Trump, Hillary Clinton and Vladimir Putin based on performances by the cast of “Saturday Night Live.”

The implications of this are staggering. Imagine a realistic fake video of a world leader announcing a nuclear strike, a stock market crash, a terrorist threat, bio-weapon contamination or anything else that could cause mass panic or retaliatory action.

In the corporate world, a Deepfake of a company CEO could be used to authenticate fraudulent funds transfers, “show” participation in criminal activities, announce fake company positions, or implicate the CEO in a wide array of embarrassing situations.

And, the underlying AI, image and voice modification technologies are getting better all the time.

So, how can you detect a Deepfake? One method is to use anomaly detection to pick up on the little things that are unique to an individual, such as posture, gait, hand gestures, phraseology, or the way someone smiles. Anomaly detection techniques are already used by ARIC, an AI platform from Featurespace that combats fraud by flagging activity that looks different from what it has seen from that individual in the past.
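As a toy illustration of the principle – not how ARIC or any production system actually works – the sketch below flags a measurement (say, the length of pauses between a speaker's words) that falls well outside that person's own historical pattern:

```python
import statistics

def is_anomalous(history: list, new_value: float, threshold: float = 3.0) -> bool:
    """Flag a new observation that sits far outside an individual's own history.

    Toy z-score check only – real behavioural analytics use far richer models.
    """
    mean = statistics.mean(history)
    stdev = statistics.stdev(history) or 1e-9   # avoid division by zero
    z_score = abs(new_value - mean) / stdev
    return z_score > threshold

# Hypothetical data: one speaker's typical pause lengths (in seconds) between words
pause_history = [0.31, 0.28, 0.35, 0.30, 0.33, 0.29, 0.32]
print(is_anomalous(pause_history, 0.31))   # False – consistent with past behaviour
print(is_anomalous(pause_history, 0.90))   # True  – flagged for closer inspection
```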

Another method that has recently been proposed is to use blockchain to authenticate the provenance of a video. Because a blockchain can securely store records – such as a cryptographic fingerprint of a file – that cannot be altered at a later stage, it can be used to determine whether a video matches the footage from an original recording.
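Here is a minimal sketch of that provenance check, assuming the original publisher registers a cryptographic fingerprint of the footage at recording time (the video bytes and the `registry` dictionary below are stand-ins for real footage and a real on-chain lookup):

```python
import hashlib

def fingerprint(video_bytes: bytes) -> str:
    """A SHA-256 fingerprint of the raw video data."""
    return hashlib.sha256(video_bytes).hexdigest()

# At recording time, the publisher registers the fingerprint of the original
# footage. Here a plain dictionary stands in for the on-chain registry.
original = b"\x00\x01\x02..."          # stand-in for the original video file's bytes
registry = {"press-conference-clip": fingerprint(original)}

def is_authentic(video_id: str, video_bytes: bytes) -> bool:
    """Later, anyone can check a circulating copy against the registered original."""
    return registry.get(video_id) == fingerprint(video_bytes)

print(is_authentic("press-conference-clip", original))                 # True
print(is_authentic("press-conference-clip", original + b"tampered"))   # False
```

One caveat: even a legitimate re-encoding of a video changes a byte-level fingerprint, so a practical scheme would need to fingerprint or sign footage at the point of capture and treat any unregistered copy as unverified.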

Of course, all of this has to happen in real-time (or close to it), which is an added complexity when one considers the fast distribution of fake news across multiple social media channels.

The advent of Deepfakes is opening up new cyber security threat vectors, and we need to quickly adapt AI, blockchain and other technologies to combat these threats and help us know whether what we're seeing is indeed an accurate portrayal of real events.
