On the morning of Sunday, November 8, the day after most news outlets had called the US presidential election for the Democratic candidate, Joe Biden, an image began circulating widely on social media. It showed an old front page of The Washington Times newspaper.
“PRESIDENT GORE,” the banner headline read. The accompanying messages were often just as emphatic: don’t trust the media to decide an election. The implication was that many outlets had prematurely declared Al Gore president two decades earlier, only for George W. Bush to ultimately occupy the White House.
Yet there was a different kind of media manipulation going on with the image of that front page: it was a fake.
As it spread virally online, the bogus broadsheet became just the latest example of digital disinformation in an age that has come to be defined by it.
“Trump campaign spokesman Tim Murtaugh appears to have deleted this tweet after the Washington Times pointed out that it never ran such a headline,” reporter Felicia Sonmez (@feliciasonmez) noted on Twitter on November 8, 2020.
It’s not too late for a more reliable and trustworthy experience online, though.
It’ll take fighting one disruptive technology with another, using a solution that’s immutable, irrefutable and wholly transparent: blockchain. Groups as varied as newsrooms, nonprofits, major corporations and start-ups are all eagerly pursuing blockchain to create distributed, transparent networks for reliable media and digital information.
This new technology won’t necessarily stop people from posting false information. What blockchain-based news and media projects could do, at minimum, is foster a new sense of trust by making it easier to track and verify what people see online. Such efforts could also encourage the public to exercise a healthier skepticism of online media overall.
“It’s getting hard to tell what stories are true and what stories are fake because we don’t know where they’re coming from,” Benjamin Gievis, co-founder of Block Expert, a Parisian start-up focused on blockchain, told Industrious. “What if we could create an ID and an ecosystem that could authenticate a news source and follow it wherever it’s cited or shared?”
This was the idea behind Safe.press, Block Expert’s first product.
Blockchain key to trust
Launched last year, the Safe.press consortium is open to any newsmakers, be they corporations, public agencies, NGOs, even individuals, as well as news outlets. Each time a member publishes a press release or article, a green Safe.press stamp is added to that page. The stamp functions like a digital seal of approval that’s linked to an associated blockchain key.
That key is instantaneously registered on a blockchain ledger, which Block Expert built using IBM’s open-source Hyperledger Fabric. Whenever one of these registered news items is cited in or appended to future stories and references, its key gets tracked.
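As a rough illustration of that flow, the core idea can be thought of as hashing a published item, recording the fingerprint with some provenance metadata, and later re-hashing any copy to confirm it still matches. The sketch below is a minimal stand-in, not Block Expert’s actual Safe.press code; the function names are hypothetical and a simple dictionary takes the place of a real Hyperledger Fabric ledger.

```python
# Minimal sketch of a Safe.press-style flow: hash a published item,
# record the fingerprint on a ledger, then verify later copies against it.
# The "ledger" here is an in-memory dict standing in for a real
# Hyperledger Fabric channel; all names are illustrative.
import hashlib
import time

ledger = {}  # stand-in for a blockchain ledger, keyed by content fingerprint

def register_item(publisher: str, content: str) -> str:
    """Compute a content key and record it with basic provenance metadata."""
    key = hashlib.sha256(content.encode("utf-8")).hexdigest()
    ledger[key] = {
        "publisher": publisher,
        "registered_at": time.time(),
        "citations": [],
    }
    return key

def verify_item(key: str, content: str) -> bool:
    """Check that a copy of the content still matches its registered key."""
    return key in ledger and hashlib.sha256(content.encode("utf-8")).hexdigest() == key

def track_citation(key: str, citing_url: str) -> None:
    """Append a citing page to the item's provenance trail."""
    if key in ledger:
        ledger[key]["citations"].append(citing_url)

release = "Example press release text."
key = register_item("Example Newsroom", release)
print(verify_item(key, release))                # True: untouched copy
print(verify_item(key, release + " (edited)"))  # False: content was altered
```

In the real system, the green Safe.press stamp on a page would resolve to a key like this one, recorded on the shared ledger rather than in memory.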
“Blockchain, I feel, is the new generation of the Internet,” Gievis said. “We had the web, then e-commerce, then social media. With blockchain, we’re building a new type of network, and a new way of communicating, with transparency and security from the start.”
With its immutable, traceable keys, blockchain can become the digital equivalent of factory seals. Validators like Safe.press can help reveal whether an item—be it news, information, currency or physical object—has been tampered with or faked.
And Block Expert isn’t alone. IDC, the tech research firm, notes more than a dozen blockchain start-ups are focused on media trust.
“We’re not here to say if this is good news or fake news,” Gievis points out. “We’re here to say this is authentic news. It’s been recorded, it’s being tracked, here’s where it came from, here’s its value. It’s a new way of seeing the story.”
Blocking bad press
The hope is that, as Safe.press and similar verification tools proliferate, it will become difficult for outsiders to fake a press release or spoof a news site. Trusting an unverified article then becomes like eating at a restaurant without a health department grade: consumer beware.
And corporation beware, given the reputational risks fake news can present.
That’s been a concern for Orange, the Paris-based multinational provider of phone, Internet and media services.
It’s partly why all the company’s press releases now carry a Safe.press tag, starting with ID 5ccd05dd2410eef0d36e36ea744584c64562ae3d. That was the March 2019 announcement that Orange was Block Expert’s premier partner on Safe.press.
Orange joined the consortium because of its commitment to social impact and innovation, as well as to protect its brand reputation. Another French company saw its stock fall more than 20% in a single day following the publication of a false press release.
Going deep on fakes
Trust is also being challenged beyond written information.
Audio and video software is becoming so powerful that troublemakers and criminals can create what are known as deep fakes. Using a cache of existing recordings and automated machine learning tools, voices and bodies can be swapped or even entirely fabricated within a matter of hours or minutes. One of the most famous examples features the comedian Jordan Peele impersonating former US President Barack Obama.
Kathryn Harrison was so concerned about these faked videos that after five years leading blockchain projects at IBM, she departed to found the DeepTrust Alliance. The nonprofit is assembling partners and developing programs to help validate when an image or video has been manipulated, and they recently released a framework outlining this ecosystem.
“Every time we upload and share content, it gets compressed, it gets cropped, it gets changed, perhaps it gets filtered,” Harrison explained. “All you need to do is change one pixel and the original hash, or identifying marks, are no longer valid. To the human eye, it looks the same. But an automated system won’t necessarily register it as semantically the same because the 1s and 0s are different.”
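Harrison’s point about single-pixel changes is easy to demonstrate with a cryptographic hash: altering one byte of a file produces a completely different digest, even though a viewer would see the same picture. The toy example below uses only Python’s standard library and placeholder bytes standing in for a real image; it is not DeepTrust’s tooling.

```python
# Toy demonstration: a one-byte change (roughly "one pixel") invalidates a
# cryptographic hash even though the image would look identical to a person.
# Standard library only; not DeepTrust Alliance code.
import hashlib

original = bytearray(b"\x89PNG..." + bytes(1000))  # stand-in for real image bytes
modified = bytearray(original)
modified[500] ^= 0x01  # flip a single bit

print(hashlib.sha256(bytes(original)).hexdigest())
print(hashlib.sha256(bytes(modified)).hexdigest())
# The two digests share essentially nothing, so an exact-match system no
# longer recognizes the file as the same piece of content.
```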
It’s an especially common problem on social media, where photos and video are regularly taken out of context. Images of a graphic war scene, a protest or even a celebrity can be cropped, filtered or otherwise altered to evade the automated visual scanners many websites run to catch fakes, allowing the altered images to be posted as something they’re not.
That’s how, for example, a scene from a 2014 Lebanese music video gets passed off as footage of a lost girl in war-torn Aleppo, Syria.
New trust in digital news
One idea the DeepTrust Alliance is proposing to address such fakery is to register, on a blockchain ledger, the machine learning algorithms that enable deep fakes and other image manipulations. In this way, researchers can track which algorithms are being used to create synthetic media. Such algorithmic tracking helps not only with social media but also with messaging services, where deep-fake pornography and non-consensual imagery run rampant.
DeepTrust would like to see this algorithmic information embedded into the common metadata many media files already carry (creator, location, camera type, filters, etc.). That bundle of information could then be given a secure key that’s uploaded to the blockchain ledger. DeepTrust also wants to create a tighter link between the research and development community and real-world applications.
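To make the idea concrete, here is a rough sketch of what such a metadata bundle and its secure key might look like, assuming a simple JSON structure and a SHA-256 key. The field names are hypothetical, not a published DeepTrust specification.

```python
# Hypothetical sketch of the proposal: bundle existing media metadata with
# information about any generative algorithm used, then derive a secure key
# from the bundle for registration on a blockchain ledger.
# Field names and structure are illustrative, not a published standard.
import hashlib
import json

metadata_bundle = {
    "creator": "Example Newsroom",
    "location": "48.8566,2.3522",
    "camera": "ExampleCam X100",
    "filters": ["auto-contrast"],
    "generative_algorithm": None,  # or the ID of a registered deep-fake model
}

# Canonical serialization so the same bundle always yields the same key.
serialized = json.dumps(metadata_bundle, sort_keys=True).encode("utf-8")
secure_key = hashlib.sha256(serialized).hexdigest()

print(secure_key)  # this key is what would be written to the ledger
```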
As with Safe.press and other disinformation countermeasures, though, DeepTrust’s tools still place significant responsibility on the public to verify the authenticity of what they consume. That’s why the DeepTrust Alliance is also urging stronger coordination across technology, policy and education to teach people to be appropriately skeptical of anything online that lacks reliable credentials.
“We’re definitely coming from behind in this fight, but COVID’s actually provided a lot of awareness of the problem, with just the amount of misinformation, and even good information, out there,” Harrison said. “We’ve got to invest in the tools that are going to get us on the same footing. It’s an arms race, so we have to keep up.”
Look no further than similar efforts at The New York Times. With help from IBM Garage and Hyperledger, the so-called paper of record has launched The News Provenance Project, which is aimed at teaching news consumers how to be more vigilant about what they see, especially on social media.
“The technology itself builds trust, but it’s also the jumpstart to seeing the world with more trust,” Gievis said. “And that’s what we need right now.”