Self-learning systems
Despite Robert’s comment that non-state-sponsored campaigns are a little more rudimentary, he’s also clear that it’s in the more obscure corners of the internet – including “the chans”, as he calls them – where factions of people gather to fabricate the most damaging narratives and collaborate on ever stronger brand disinformation campaigns. If brands don’t yet know how to make sense of those factions, those corners, those campaigns, they should be worried.
“There’s not quite a formal Lynda.com course on it, but if you go into the right online forums you can see a discussion about how to best construct disinformation, how to weaponize efforts to inflict maximum damage against brand trust. It’s interesting to watch the coordination – it becomes very crowd-sourced.”
He gives the example of someone sharing an asset they plan to use to disparage a person or organization, then asking for feedback.
“They will rapidly iterate to a maximally effective meme or campaign to launch out onto the internet at large, so this training and self-learning system reinforces itself.”
“Repeat a lie often enough and it becomes the truth,” Robert says, using a famous quote often attributed to Joseph Goebbels, who held the title Reich Minister of Propaganda in Nazi Germany. “And it turns out that on the internet, enough is about seven times. That’s why proactivity is essential for brands.”
The strongest way repetition can boost trust in disinformation is to have an individual encounter the message from multiple, seemingly unconnected sources. As plans for coordinated disinformation attacks continue to be iterated, it’s not hard to imagine how much havoc could be wrought by a small group of hyperactive individuals (whether state-sponsored or not).
A sense of responsibility
Robert hasn’t always worked on disinformation, but his motivation for getting into the field is pretty refreshing.
He tells me that as an early adopter of the web and social media, and as someone who’s worked as a developer and technologist, he feels a strong sense of obligation to find solutions for the problems he sees.
“At the dawn of the social web, technologists failed to foresee the dangers of the systems we were creating.
“As the platforms evolved and privileged virality, and as our civil discourse increasingly came to take place in systems optimized for marketing, not for civil discourse, we came to find ourselves in a pickle, where we remain.”
How we, as a society, find our way out of such a pickle is not yet clear.
Many thanks to Robert for taking time to speak with us. And, if you’re interested in misinformation, make sure to catch his presentation at our Now You Know conference in Chicago this May.