Millions of dollars will be made on digital forgeries that can’t be detected

The dynamics of our economy, the caliber of our relationships, and the future of our own lives depend on influence. It’s why a cosign from Mark Cuban can launch an entrepreneur onto a million-dollar path. It’s why up-and-coming rappers want a feature from Drake – knowing that will be enough to take them to the next level.

A decade into modern social media, influence has become a quantified and commoditized game. And we’re beginning to create technology that can be used to counterfeit one’s influence and alter mass perceptions.

This goes way beyond any scheme of buying fake followers or getting bots to retweet someone’s garbage. This is high-tech digital forgery.

How to Fake Influence.

Deepfakes took the Internet by storm early last year, showing us a new way AI could be used to superimpose one person’s face onto another’s in a video. It didn’t take long for deepfakes of Trump to start coming out, and people were quickly terrified. Since then, the narratives around deepfakes have tended to focus on one question: what if someone deepfakes a video of Trump calling for nuclear war with North Korea?

Yes, propaganda and general malice can and will be a huge issue. But there are literally a million and one ways to use this tool.

For instance, I could create a montage of celebrity endorsements for my product and run an ad campaign built on those fake endorsements. Likewise, I could digitally plaster myself into videos alongside all the businessmen, influencers, and connections I want to have, creating the illusion of a network in order to build an actual one. I could even go so far as to steal a YouTube creator’s entire video archive, swap their head with mine, and hijack every piece of material they’ve ever made.

This doesn’t even begin to cover all the ways this tool can be used to convincingly fake one’s influence. If deepfakes aren’t reeled in, they could literally demolish any sliver of trust we put into our social media timelines.

What happens when this hierarchy of influence collapses and we’re left skeptical of anyone’s true connections? I’ll tell you what happens: social media will be a wasteland at that point.

DARPA is pouring resources into its Media Forensics program to identify when a deepfake is at play. But this project will probably be kept in-house, used to track down content that stands to harm the government.

We need something that could be applied everywhere else.


How To Catch Digital Forgeries.

Every time you take a photograph or shoot a video, that piece of content is encoded with metadata – including the date, location, time of day, creator, and many other fields. Your smartphone embeds this information (EXIF data, in the case of photos) into the file at the moment of capture.
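To make this concrete, here’s a minimal sketch of checking whether a JPEG file even carries that embedded metadata. This is an illustrative simplification of my own (a real parser would walk every marker segment and decode the individual EXIF tags), but it shows where the metadata physically lives inside the file:

```python
def has_exif(jpeg_bytes: bytes) -> bool:
    """Rough check for an EXIF metadata segment in a JPEG.

    A JPEG begins with the SOI marker (0xFFD8); EXIF metadata lives in
    an APP1 segment (marker 0xFFE1) whose payload starts with the
    ASCII tag "Exif" followed by two null bytes.
    """
    if not jpeg_bytes.startswith(b"\xff\xd8"):
        return False  # not a JPEG at all
    return b"\xff\xe1" in jpeg_bytes and b"Exif\x00\x00" in jpeg_bytes


# A fabricated JPEG header with an APP1/EXIF segment, for illustration:
with_metadata = b"\xff\xd8\xff\xe1\x00\x20Exif\x00\x00" + b"\x00" * 16
stripped = b"\xff\xd8\xff\xdb" + b"\x00" * 16  # EXIF segment removed
```

Notably, many platforms strip this metadata on upload, which is part of why metadata alone can’t anchor a verification scheme.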

Apple, Samsung, and other device manufacturers could agree on a hardware standard that writes additional metadata to the photo, attesting that it’s a real picture or video shot on that device. Then, before Facebook or YouTube accepted the upload, they would verify that it still holds true to the standard originally written to the file.
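Here’s a rough sketch of how that attest-then-verify handshake could work. Everything here is my own illustrative assumption – the key, the metadata format, and the use of a shared-secret HMAC. A real standard would keep the key in the phone’s secure hardware and use public-key signatures so platforms never hold the secret:

```python
import hashlib
import hmac

# Hypothetical key burned into the device at manufacture. In practice
# this would live in a secure enclave and never leave the hardware.
DEVICE_KEY = b"illustrative-device-secret"


def sign_capture(image_bytes: bytes, metadata: str) -> str:
    """Device side: sign the image plus its metadata at capture time."""
    payload = image_bytes + metadata.encode()
    return hmac.new(DEVICE_KEY, payload, hashlib.sha256).hexdigest()


def verify_capture(image_bytes: bytes, metadata: str, signature: str) -> bool:
    """Platform side: recompute the signature and compare.

    Any change to the pixels or the metadata breaks the match.
    """
    expected = sign_capture(image_bytes, metadata)
    return hmac.compare_digest(expected, signature)


photo = b"\xff\xd8...raw jpeg bytes..."
meta = "2019-02-27T12:00:00Z;device=PhoneX"
sig = sign_capture(photo, meta)  # written into the file on capture
```

An untouched upload verifies cleanly; a file with even one altered byte – a deepfaked frame, say – fails the check.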

In this sense, it’s very similar to baseball card forgeries, which might lack the holographic sticker or texture of a real card and therefore wouldn’t pass a trained eye.

Standards like this already exist for devices’ Wi-Fi and Bluetooth. Adding one that helps eradicate the tampering of content would be very beneficial.

Interestingly, there’s a company taking this approach, but in a very niche way.

Called Amber Authenticate, the tool is meant to run in the background on a device as it captures video. At regular, user-determined intervals, the platform generates “hashes”—cryptographically scrambled representations of the data—that then get indelibly recorded on a public blockchain. If you run that same snippet of video footage through the algorithm again, the hashes will be different if anything has changed in the file’s audio or video data—tipping you off to possible manipulation.

Lily Hay Newman, Wired

Amber Authenticate has applied this video verification technology to police body cams, so that courts can verify the authenticity of law enforcement evidence. Ideally, this makes any tampering with evidence detectable.
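The hashing scheme the Wired piece describes can be sketched in a few lines. This is my own simplified illustration (hashing fixed-size byte chunks rather than time intervals, and skipping the blockchain write itself), but the detection logic is the same: re-hash the suspect file and see which recorded hashes no longer match.

```python
import hashlib


def hash_chunks(video_bytes: bytes, chunk_size: int = 1024) -> list:
    """Hash the recording in fixed-size chunks.

    In a scheme like Amber's, each of these hashes would be written
    to a public blockchain at record time as a tamper-evident log.
    """
    return [
        hashlib.sha256(video_bytes[i:i + chunk_size]).hexdigest()
        for i in range(0, len(video_bytes), chunk_size)
    ]


def find_tampered_chunks(recorded_hashes, suspect_bytes, chunk_size=1024):
    """Re-hash a suspect copy and report which chunks changed."""
    suspect_hashes = hash_chunks(suspect_bytes, chunk_size)
    return [
        i
        for i, (old, new) in enumerate(zip(recorded_hashes, suspect_hashes))
        if old != new
    ]
```

An unmodified copy produces an empty list; flipping a single byte anywhere in the footage flags exactly the chunk that was touched, pointing investigators at the manipulated segment.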

This raises the question of whether a similar solution could be applied at the grander scale of all user-generated content. I do believe social media is a great use case for the blockchain, whether Facebook updates its service or we see a mass migration to Mastodon and other decentralized social networks.

However, we must realize that while blockchain would do wonders in helping us separate legitimate, original content from digital forgeries and deepfakes, it comes with a catch: the blockchain never forgets.

If you’ve ever had to delete embarrassing photos from Facebook or apologize for something regrettable you said on Twitter, then you know the Internet has a way of archiving your past whether you want it to or not. Just look at the debacle Kevin Hart recently endured: apologizing for a tweet he made a decade earlier, which he’d already publicly apologized for.

If we were to implement the blockchain, it would tie us down even further to the digital archives of ourselves. And I think we should all want some leniency in our digital interactions, or at least the opportunity to pick up and walk away from the digital ecosystem, knowing we’ve erased all of our online data.

In an age when the Internet never forgets, it’s important we fight for our Right To Be Forgotten.

Join me (digitally) for lunch on March 1st to hear me discuss the Right To Be Forgotten