What is a Wiki War?

Wiki wars, defined somewhat.

War, like the man said, is hell. A wiki war is no different.

“Battles to the death for insanely low stakes,” as veteran Wikipedia editor and RationalWiki brain trustee David Gerard put it.

Wiki wars are conflicts between individuals or groups fighting over the control of narratives published on the internet, primarily on Wikipedia. Even bots are used in wiki wars, with counter-bots created to battle them in return.
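To make this concrete: the full history of every article is public, so revert fights are detectable by anyone. Below is a minimal sketch, my own illustration rather than any actual bot's code, using the public MediaWiki API. An edit that restores an earlier revision reproduces that revision's SHA-1 checksum, so a repeated checksum in an article's recent history is the fingerprint of a revert.

    import requests

    API = "https://en.wikipedia.org/w/api.php"  # any MediaWiki site exposes this

    def find_reverts(title, limit=50):
        # Pull the most recent revisions, including each one's SHA-1 checksum.
        params = {
            "action": "query", "format": "json", "prop": "revisions",
            "titles": title, "rvlimit": limit,
            "rvprop": "ids|user|timestamp|sha1",
        }
        data = requests.get(API, params=params).json()
        page = next(iter(data["query"]["pages"].values()))
        seen = {}       # sha1 -> user who last produced that exact text
        reverts = []
        for rev in reversed(page.get("revisions", [])):  # oldest first
            sha1 = rev.get("sha1")
            if not sha1:
                continue          # revision hidden or suppressed
            if sha1 in seen:      # identical text seen before: a revert
                reverts.append((rev["timestamp"], rev.get("user", "?"), seen[sha1]))
            seen[sha1] = rev.get("user", "?")
        return reverts

    # Hypothetical usage; the article title is only an illustration.
    for ts, reverter, restored in find_reverts("Gamergate (harassment campaign)"):
        print(f"{ts}: {reverter} restored text last written by {restored}")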

A wiki war occurs across a digital, contextual battlefield: a landscape of MediaWiki-powered sites, Google search results, subreddits, and blogs that become weaponized to suppress edits by groups of online users or to intimidate and harass other users on the web.

Wiki wars are very complex, and at their most heated they can demand 8- to 16-hour, full-time days of heated argument, consensus building, research, and three-dimensional chess between admins and editors gaming the process.

Wiki wars often begin with a single event: an ‘edit war’ that breaks out on Wikipedia between the editors of an article.

The edit war turns into a wiki war when one side stops working through disagreements openly and starts working on the sly, employing various tactics of editor suppression to remove the dissenting editors from the article.

One aggressive move to remove an editor from an article can trigger hundreds of defensive, and then retaliatory, responses, igniting communities around the web into a much larger event that can play out for years (see GamerGate).

On an encyclopedia that ‘anyone can edit’, this can mean just about anyone can find themselves involved in one.

Wiki wars are far more nefarious than just a bunch of nerds arguing over Oxford commas.

Wiki wars are dark collaborations, evolved around the design sensibilities of MediaWiki software and playing off Google’s PageRank algorithm.

Therefore, while Wikipedia is often the center, wiki wars spread beyond Wikipedia to other MediaWiki-powered sites such as RationalWiki, Encyclopedia Dramatica, Metapedia, and Conservapedia, to adjacent forums like Kiwi Farms, and to a host of other wikis.

Wiki wars can easily be initiated by any interest-based group or individual, for any reason: political, polemical, persuasive, or, in some instances, no reason at all.

The winner of a wiki war controls the broadcast signal around a specific narrative.

The encyclopedic voice of Wikipedia carries credibility, notability, and reliability for searchable information. That is the grand prize for controlling a narrative: walking into global credibility and influence on the most valuable real estate on the web, Google search.
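That prize is mechanical, not mystical. Google’s original PageRank formula, PR(p) = (1 − d)/N + d · Σ PR(q)/L(q), lets a page inherit authority from the pages linking to it. Here is a toy power-iteration sketch, with a link graph invented purely for illustration, showing how a site endorsed by a heavily cited Wikipedia article outranks an otherwise identical site without that endorsement:

    # Toy power iteration of the original PageRank formula:
    #   PR(p) = (1 - d)/N + d * sum(PR(q) / outlinks(q)) over pages q linking to p
    # The link graph below is invented purely for illustration.
    links = {
        "wikipedia_article": ["target_site"],
        "fan_blog": ["wikipedia_article", "rival_site"],
        "target_site": ["wikipedia_article"],
        "rival_site": ["fan_blog"],
    }
    d = 0.85                          # damping factor from the original paper
    n = len(links)
    pr = {p: 1.0 / n for p in links}  # start uniform
    for _ in range(50):               # iterate until roughly stable
        pr = {p: (1 - d) / n + d * sum(pr[q] / len(links[q])
                                       for q in links if p in links[q])
              for p in links}
    for page, score in sorted(pr.items(), key=lambda kv: -kv[1]):
        print(f"{page}: {score:.3f}")
    # target_site and rival_site each have exactly one inbound link, but the
    # link from the heavily cited wikipedia_article node lifts target_site
    # far above rival_site. Controlling the article controls the endorsement.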

Wikipedia’s own guidelines encourage us to trust the narrative voice on Wikipedia, with their emphasis on ‘neutral point of view’ editing: a paradise where only disinterested editors, free of any bias, are tirelessly improving the encyclopedia.

The Wikimedia Foundation broadcasts this optimistic, utilitarian message through PR and TED talks.

The value of Wikipedia as a powerful online publisher gives any shrewd agenda editor a touch point to disseminate references, context, and in some cases misinformation and misdirection to influence everyone interested in a search topic.

The losers in a wiki war are, at the least, banned from editing an article, and can face harsher repercussions, such as doxxing, harassment, reputation destruction, and stalking, as detailed in this study.

Note: many “losers” of a wiki war on Wikipedia simply rush off to another MediaWiki-powered site and restore their lost narrative on a host of other wikis, each one catering to a specific worldview, such as RationalWiki, Rightpedia, Infogalactic, or Conservapedia.

Many other wikis on the web actually emerged from wiki wars happening on Wikipedia, RationalWiki and Conservapedia among them, and now the emerging alt-right Infogalactic.

Other than GamerGate, we really don’t hear much about wiki wars.

The Wikimedia Foundation is, as one would naturally expect, the most influential voice broadcasting about all things “Wikipedia”.

Because of this, ‘wiki wars’ are sometimes reported as ‘cutesy’ little fun things happening on the world’s greatest thing since sliced bread, or as something reserved for the dozen or so highly significant topics Wikipedia always seems able to account for.

GamerGate did get some mainstream attention, but the complexity of the problem overshadowed the mediums that gave it life.

So one reason we don’t hear much about wiki wars is that we hear so much more from Wikimedia about all the good things on Wikipedia.

Wiki Wars are “buried out in the open”.

The other reason I believe they are difficult to bring attention to is the complexity of the software itself. Why? Because a wiki war is mind-numbingly complex to follow, highlight, and detail.

This complexity of the software cloaks what is actually occurring in a wiki war, and unless someone has been directly involved, wiki wars are almost impossible to bring to wider attention.

MediaWiki software functions as the foundation for wiki war activity.

One of the hurdles in producing this site was detailing the full arc of events in a wiki war while working through the labyrinth of the platform’s software.

Wikis in general run on MediaWiki, Wikimedia’s software, which is the underlying platform housing the majority of online wikis beyond Wikipedia.

Any flaw in that software on Wikipedia is therefore repeated across all wikis running MediaWiki.

Elements in a wiki war.

In a wiki war, a number of online forces appear to converge, sometimes erupting in a ‘perfect storm’ of miscommunication, mistrust, and misinformation. In this sense, a wiki war is a ‘whole system’, with varying dynamics that find their balance inside of it.

Some of these “elementals” are easy to outline; each has distinctive markers that highlight its activity.

The three distinctions I see, I refer to as digital wildfires, wiki noise, and social propaganda.

I’ll do the best I can to define each one.

What are Digital Wildfires?

Digital wildfires represent bad distribution with questionable attribution: a medium through which misinformation can quickly spread and become adopted as true by millions. Pizzagate is a recent example of a digital wildfire.

According to the Macmillan online dictionary, a digital wildfire is ‘false or suspicious information’ that spreads virally on the internet: a ‘dark meme’ collaboratively constructed by a mob-type mindset, a false rumor that uses online social networks to spread at ‘breakneck’ speed.

In 2013, the World Economic Forum declared ‘digital wildfires’ a leading global threat to stability. And that was published years before the 2016 election.

‘Digital wildfire’ is a broader term for fake news and online misinformation, now a dominant tactic of war between nations, states, shareholders, and citizens.

False information erupting into a human reaction to what is believed to be true.

While I believe there is plenty of evidence to show that things like Pizzagate were also intentional digital wildfires, manipulated by a small set of deceptive actors, what really gave them adoption were well-intentioned people. They really believed they were white knights, defending children against a great evil in the world.

In the belief that we are doing the right thing, we can easily participate in a “digital wildfire” without realizing the harm we are causing or the repercussions that can follow.

“Digital” is the key word in digital wildfires.

This medium is pure technology, and “digital wildfires” are therefore an epiphenomenon of the internet. Digital environments, via mobile phones or desktop computers, enable the exponential spread of information, or misinformation, at breakneck speed to hundreds of millions around the globe.

These mediums also influence the behavior of the users participating in them, not just the speed at which bad “memes” can spread.
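The arithmetic behind “breakneck” is just a geometric series. A back-of-the-envelope sketch, with an assumed (not measured) sharing rate:

    # Back-of-the-envelope arithmetic for "exponential": assume, purely for
    # illustration, that each person who sees a rumor passes it to r new
    # people per sharing step. Total reach is then a geometric series.
    r, steps = 3, 17
    reach, generation = 0, 1
    for step in range(1, steps + 1):
        generation *= r               # newly reached this step: r ** step
        reach += generation
        if step in (5, 10, steps):
            print(f"step {step:2d}: ~{reach:,} people reached")
    # With r = 3, seventeen hops already reach ~194 million people,
    # which is why a correction posted a day later never catches up.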

Digital wildfires, just like wildfires in nature, can be started intentionally or unintentionally. What accelerates their spread, I believe, is rather human: deception, misunderstanding, and confusion, all of which become amplified by digital technology and act like dried tinder, waiting to be ignited.

Just like forest fires, digital wildfires require only one online user intent on causing disruption to ignite.

I believe the majority of digital wildfires, however, are unintentional. It may be easier to understand them and build solutions by distinguishing that set from the more nefarious, intentional ones.

Unintentional wildfire: Wiki Noise.

Let’s refer to wiki noise as simple ‘semantic confusion’ that arises uniquely in a digital environment: something we just have to work through in an online collaborative to avoid misunderstanding, a breakdown in “seeing what each other means”.

That is really all ‘wiki noise’ is: common misunderstanding. Often it is a feature of the language or medium itself, not anyone’s intention.

Left unattended, wiki noise is like dried tinder, waiting to be ignited.

Digitally, and especially within the limitations of MediaWiki software, this noise gets amplified.

Sci-fi author Robert Anton Wilson popularized this as ‘semantic noise’ back in the 1980s. I always loved the stories Bob told about it. Say or write the words “I love fish”, and one group interprets it as a preference for dining, another as a fondness for a home aquarium, without either being aware there was any disagreement at all.

I’m Watzlawick! No you are!

The infamous ‘I’m Watzlawick!’ story, told by Wilson and Dr. Paul Watzlawick, a communications theorist and psychologist, highlights how a ‘communication jam’ between differently perceived contexts in a conversation can even lead to perceptions of insanity or paranoia about others.

Dr. Watzlawick told the story as his own personal account of being a new staff member at a hospital. His first day on the job, he reported to the office, where he found a woman sitting at the desk. He assumed she was the director’s secretary and approached her.

“I’m Watzlawick,” he said, assuming the ‘secretary’ would know he had an appointment.

She responded with “I didn’t say you were.”

A bit taken aback, Dr. Watzlawick exclaimed, “But I am!”

To which he heard her reply, “Then why did you deny it?”

Dr. Watzlawick at this point classified her as a schizophrenic patient and was concerned she had wandered into the staff offices. Naturally, he became very careful in ‘dealing with’ her.

However, from the woman’s point of view, Dr. Watzlawick himself had appeared to be a schizophrenic patient.

To her, a strange man had approached and said, ‘I’m not Slavic.’ Her own experience with paranoids taught her they often begin conversations with such assertions, vitally important to them, but sounding a bit strange to the rest of us.

When she replied – “I didn’t say you were,” she was trying to soothe him.

When she heard him reply, “But I am!”, she added schizophrenia to paranoia in her assessment of him.

She then replied, “Then why did you deny it?”

She then became very careful in ‘dealing with’ him in return.

Dealing with these natural, unintentional communication jams can be serious business in online consensus building. It is easy for anyone to filter semantic noise, and the confusion inherent in text and meaning, through their own personal psychology.

This suggests that even if all participants in an online collaborative are well-intentioned individuals, the inherent flaws of the communication medium itself can sow the seeds of mistrust, even paranoia, in the consensus.

So a digital wildfire can be fueled by nothing more than the inherent flaws of our communication mediums and language, without any intention at all.

“Misunderstanding” is of course not the only thing that can ignite a digital wildfire. The other igniter is social propaganda: intentional misinformation.

Intentional wildfire: Social Propaganda.


Social Propaganda, unlike ‘wiki noise’, is intentional.

Social propaganda is similar to plain old propaganda, with the exception that it is used to influence the attitudes of small or closed social groups online, to accept or discredit new voices coming into the group while boosting one’s own.

It is also a somewhat ‘anti-social’ methodology for raising or lowering the status of different voices and perspectives in any form of online consensus, using information or misinformation timed to target, embarrass, threaten, or compete with someone.

In online consensus building, any type of activist group or individual, any disgruntled commenter, can use a tactic I refer to in this study as “flag waving”.

Flag waving strategies are methods used in social propaganda campaigns to misdirect or mislead an online community, a mixture of persuasion and deception.

There are various subtle, even petty misinformation strategies that are easy to perform and surprisingly effective in scope.

In wiki and consensus-building groups, social propaganda tags such as ‘sock puppet’ and ‘troll’ are used to foment dissent against an opponent, and, as my direct experience on Wikipedia proved, can be used as a tactic for editor suppression.

I’ve witnessed my own ‘propaganda’ campaign, instigated by the notorious Oliver D. Smith, spread across multiple platforms, attempting to redefine the narrative of who I am and, of course, to discredit this website.


Combine social propaganda with wiki noise and you have the ingredients of a digital wildfire, one that flows out of a wiki war and continues to play out on any medium where it can find a home.

So just how common are ‘wiki wars’ on Wikipedia?

Currently, there is no way to tell other than by visiting the various noticeboards on Wikipedia and trying to work it out from reams of discussion and accusations flying back and forth. There is no platform-wide accounting of the problem because, ultimately, no one is really accountable on Wikipedia.
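The closest thing to an accounting available today is to scrape those noticeboards yourself. A rough sketch using the MediaWiki API to count the reports currently filed on English Wikipedia’s edit-warring noticeboard (each top-level section on that board is one report):

    import requests

    # A rough sketch: count the reports currently sitting on the English
    # Wikipedia's edit-warring noticeboard via the MediaWiki API. Each
    # top-level section on that board is one filed report.
    API = "https://en.wikipedia.org/w/api.php"
    params = {
        "action": "parse", "format": "json",
        "page": "Wikipedia:Administrators' noticeboard/Edit warring",
        "prop": "sections",
    }
    sections = requests.get(API, params=params).json()["parse"]["sections"]
    reports = [s["line"] for s in sections if s["toclevel"] == 1]
    print(f"{len(reports)} edit-warring reports currently on the board")
    for title in reports[:5]:
        print(" -", title)
    # This counts one board on one wiki, and only the disputes someone
    # bothered to report; nothing aggregates the problem platform-wide.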

However, as time carries on, the faulty architecture of Wikipedia becomes more exposed, and what it exposes is that Wikipedia has no real solution to the problem. So if the problem happens a lot, then the leading source for global public education can only head down a path of discredit if no solution emerges.

The crisis of wiki idealism.

The idealist message of what the internet can offer often distracts from the genuine real-world problems occurring online, such as fraud, harassment, fake news, propaganda, manipulation, slander, libel, and tracking.

It’s no different on Wikipedia.

If we are to attain the utilitarian ideal of what knowledge-building, collaborative platforms offer, these darker problems will need to find a resolution.

The conflict I see, however, is that historically Wikimedia has promoted the idealistic message while somewhat sweeping under the rug the very real-world problems occurring because of it.

Wikimedia conveniently passes responsibility to a community that will not arrive at a consensus.

Currently, I believe no solution to the problem is likely to be adopted on Wikipedia. MediaWiki software itself creates too competitive an environment, while amplifying opportunities to exploit anonymity and twist it into deception and manipulation.

Wikimedia takes a hands-off approach, touting idealism. Thumbing its nose at revenue models and at paying editors or admins, it hands responsibility to a community that has no tools to carry it.

Therefore, any change to Wikipedia’s structure, or the adoption of a new one, would need to find community consensus, which is prevented by the very software that is supposed to facilitate it.

The Foundation tends to stick its head in the sand, removing itself legally from liability for any abuse that occurs on the platform it created, and passing responsibility off to an unmanaged, anonymous online community with plenty at stake and little to no proper oversight.

Free of having to worry about the core problems of community editorializing, as any responsible publisher normally would, all Wikimedia has to worry about is fundraising and spreading the utopian message at TED talks, receiving the adulation of the world and a massively PR-friendly message in every web search.

It makes one question who is really benefiting from giving all the power and responsibility to a community that is growing toxic.
