This version is outdated: please see “What is a Wiki War?” for the latest full version.
Wiki war elementals.
In a wiki war, a number of online forces appear to converge, sometimes erupting in a ‘perfect storm’ of miscommunication, mistrust, and misinformation. War, like the man said, is hell. A wiki war is no different.
The elementals of a wiki war are easily distinguished: digital wildfires, wiki noise, and social propaganda.
I’ll do the best I can to define each one.
What are Digital Wildfires?
According to the Macmillan online dictionary, a digital wildfire is ‘false or suspicious information’ that spreads virally online: a ‘dark meme’ collaboratively constructed by a mob-type mindset, a false rumor that uses online social networks to spread at ‘breakneck’ speed.
In its 2013 Global Risks report, the World Economic Forum declared ‘digital wildfires’ a leading threat to global stability.
And that was published before the 2016 election. ‘Digital wildfire’ is a broader term for fake news and online misinformation, now a dominant tactic of war between nations, states, shareholders, and citizens.
Digital wildfires represent bad distribution with questionable attribution: a medium through which misinformation can quickly spread and become adopted as true by millions.
Digital wildfires, just like wildfires in nature, can be started intentionally or unintentionally.
An example of an unintentional digital wildfire is the father in Australia who was playing with a Darth Vader mask in a children’s playground and, within 24 hours, was being accused across social media networks of being a pedophile.
That digital wildfire began as a rumor, one the woman who started it later apologized for profusely.
In a brief moment she genuinely believed the man was a pedophile and snapped his picture. She shared it over social networks, and both she and the man were devastated by what followed.
“Digital” is the key word in digital wildfires. It names the technical medium that distributes the information: online environments, reached via mobile phones or desktop computers, that enable the exponential spread of information, or misinformation, at breakneck speed to hundreds of millions.
Other digital wildfires are started intentionally, with at least one online user (which is all it takes, just as with a forest fire) fully aware of the outcome and deliberately taking steps toward it.
I believe the majority of digital wildfires, however, are unintentional. It may be easier to understand them, and to build solutions for them, by distinguishing that set from the more nefarious, intentional ones.
Misunderstanding on the web is also noise. ‘Wiki noise’ is meant to describe a very natural pitfall of online collaboration: multiple points of distorted information emerging in one or more online communities and merging into a larger distorted narrative, which is then shared via Facebook and Twitter and ranked in Google search for discovery.
Once this ‘wiki noise’ is adopted, picked up, and further analyzed on blogs, niche online communities, and discussion forums like Reddit, quicker adoption among online users should be expected.
These might be easier to mute or resolve.
What is Wiki Noise? Digital misunderstanding.
Pizzagate is an example of a digital wildfire: false information emerging as a human reaction to what is believed to be true. While I believe there is plenty of evidence that events like Pizzagate are intentional digital wildfires, manipulated by a small set of deceptive actors, what really made it spread were people who were, in their own minds, well intentioned. They really believed they were white knights, defending children against a great evil in the world.
Believing we are doing the right thing, we can easily participate in a digital wildfire without realizing the harm we are causing or the repercussions that can follow.
Let’s refer to wiki noise as simple ‘semantic confusion’: something we have to work through in any online collaboration to avoid misunderstanding, a breakdown in ‘seeing what each other means.’
That is really all ‘wiki noise’ is: common misunderstanding. Often it is a feature of the language or medium itself, not anyone’s intention.
Sci-fi author Robert Anton Wilson wrote about this as ‘semantic noise’ back in the 1980s, seeing it as an inherent flaw in any communication system, even among well intentioned people. I always loved his stories about it. Say or write the words “I love fish,” and one group interprets it as a preference for dining, another as a fondness for a home aquarium.
I’m Watzlawick! No you are!
The infamous ‘I’m Watzlawick!’ story, told by Wilson and Dr. Paul Watzlawick, a communication theorist and psychologist, highlights how a ‘communication jam’, two different contexts colliding in one conversation, can even lead to perceptions of insanity or paranoia about others.
Dr. Watzlawick told the story as a personal account from his time as a new staff member at a hospital. On his first day on the job, he reported to the office, where he found a woman sitting at the desk. He assumed she was the director’s secretary and approached her.
“I’m Watzlawick,” he said, assuming the ‘secretary’ would know he had an appointment.
She responded with “I didn’t say you were.”
A bit taken aback, Dr. Watzlawick exclaimed, “But I am!”
To which he heard her reply, “Then why do you deny it?”
Dr. Watzlawick at this point classified her as a schizophrenic patient and was concerned she had wandered into the staff offices. Naturally, he became very careful in ‘dealing with’ her.
However, from the woman’s point of view, it was Dr. Watzlawick who had appeared to be a schizophrenic patient.
To her, a strange man had approached and said, ‘I’m not Slavic.’ Her own experience with paranoids had taught her that they often begin conversations with such assertions, vitally important to them, but sounding a bit strange to the rest of us.
When she replied – “I didn’t say you were,” she was trying to soothe him.
When she heard him reply, “But I am!” she added schizophrenia to paranoia in her assessment of him.
She then replied, “Then why do you deny it?”
She then became very careful in ‘dealing with’ him in return.
Dealing with these natural and unintentional communication jams is serious business in online consensus building. It is easy for anyone to filter semantic noise, the confusion inherent in text and meaning, through their own personal psychology.
This suggests that even if all participants in an online collaboration are well intentioned individuals, the inherent flaws of the communication medium itself can sow the seeds of mistrust, even paranoia, in the consensus.
So a digital wildfire can be fed by nothing more than the inherent flaws of our communication mediums and language, even without intention.
“Misunderstanding” is of course not the only thing that can ignite a digital wildfire. So can social propaganda: intentional misinformation.
What is Social Propaganda?
Social Propaganda, unlike ‘wiki noise’, is intentional.
It is also a somewhat ‘anti-social’ methodology for raising or lowering the status of different voices and perspectives in any form of online consensus, using information or misinformation timed to target, embarrass, threaten, or compete with someone.
Social propaganda is similar to plain old propaganda, with the exception that it is used to influence the attitudes of small or closed social groups online, leading them to accept or discredit new voices coming into the group while boosting the propagandist’s own.
In online consensus building, an activist group or individual, or any disgruntled commenter, will use a tactic I refer to in this study as “flag waving”. Flag-waving strategies are methods used in social propaganda campaigns to misdirect or mislead an online community: a mixture of persuasion and deception.
There are various subtle, even petty misinformation strategies that are easy to perform and surprisingly effective in scope.
In wiki and consensus-building groups, social propaganda tags such as ‘sock puppet’ and ‘troll’ are used to favor dissent against an opponent, and on Wikipedia, as my direct experience proved, they can be used as a tactic for editor suppression.
I’ve witnessed my own ‘propaganda’ campaign, instigated by the notorious Oliver D. Smith, spread across multiple platforms, attempting to redefine the narrative of who I am and, of course, to discredit this website.
Combine social propaganda with wiki noise and we have the ingredients of a digital wildfire and a wiki war that continues to play out on any medium where it can find a home.
I’ve been very fortunate as a developer and researcher of online consensus-building processes: I’ve gotten to personally witness wiki noise and social propaganda fuel my own digital wildfire, now lasting more than two years. ‘Wikipedia, we have a problem‘ is gonzo research and live blogging as my own digital wildfire continues to spread around the web.
Starting with Wikipedia editor Manul in September of 2013, continuing through my request for a new consensus, concluding with my AE indefinite block and the publication of my name as a Wikipedia editor in an article about me on Rational Wiki, then passing to a handful of Wikipedia Review and Wikipediocracy forum members and on to Reddit and WikiInAction, my digital wildfire continues to spread more than three years later.
How can we stop digital wildfires?
Macmillan’s online dictionary page for ‘digital wildfire’ reports:
Paradoxically, one of the most effective ways to deal with a digital wildfire turns out to be use of the same social media avenues to set the record straight.
To defuse a digital wildfire, we have to broadcast.
Wikipedia We Have a Problem is a study of online harassment and editor-suppression tactics introduced in consensus building, focusing on Wikipedia, wiki wars, and the personal events that can come from participating in them.
It’s also a fair attempt to defuse my own digital wildfire and to report the results of each event as it occurs.