Alex Jones banned from Twitter, but lunatic fringe lives on


No more yelling "fire!" in a crowded theater:

Twitter has permanently banned the accounts of right-wing conspiracy theorist and radio host Alex Jones and InfoWars for violating the company's abusive behavior policies, the company said Thursday. The ban appears to be related to a heated exchange between Jones and a CNN reporter Wednesday, which Jones live-streamed on the Twitter-owned video service Periscope.

"We took this action based on new reports of Tweets and videos posted yesterday that violate our abusive behavior policy, in addition to the accounts' past violations," the company said in a series of tweets. "We wanted to be open about this action given the broad interest in this case."

You can place me firmly in the "good job" column on this. I'm usually not a big fan of censorship, but Jones has proven himself a danger to the public, especially after the crazy "Pizzagate" incident, where an (of course) NC citizen grabbed his guns and stormed a DC restaurant looking for an imaginary child sex operation. Unfortunately, Jones is not a lone nut-job. A lot of the more dangerous elements in the White Supremacy crowd are incubated on virtually unregulated platforms like 4chan, and I was pleased my local newspaper reprinted this deep dive on the subject originally published by WaPo:

The researchers found the use of the word “Jew” and a derogatory term for Jewish people nearly doubled on the “Politically Incorrect” discussion group on 4chan, an anonymous online messaging board, between July 2016 and January 2018. The use of a racial slur referring to African Americans grew by more than 30 percent in the same period. Gab, a social media site modeled on Twitter but with fewer restrictions on speech, saw even more dramatic increases since the site began operating in August 2016. The use of the term “white,” which often occurred in connection with white-supremacist themes, also surged on both platforms.

These two forums, although small relative to leading social media platforms, exerted an outsize influence on overall conversation online by transmitting hateful content to such mainstream sites as Reddit and Twitter, the researchers said. Typically this content came in the form of readily shareable “memes” that cloaked hateful ideas in crass or humorous words and imagery. (Facebook, the largest social media platform, with more than 2 billion users, is harder to study because of the closed nature of its platform and was not included in the research.)

And that answers the question of "where" some of these really bad memes originated, but in order to understand the "why," Facebook and Twitter users really just need to look in the mirror. The more "clever" and artful these memes appear, the more likely we are to share them. Especially if they touch on some sort of "common sense" idea like support for veterans or children. But often enough those things are merely teasers, to get you angry at one group or another. Sometimes it's imagery, sometimes it's context, but if you are incapable of seeing the larger picture, you become a conduit for hate.

And with the election of Trump, the efforts of these extremists have been kicked into overdrive:

Efforts to portray the Parkland, Florida, school shooting as a hoax and its survivors as professional actors initially coalesced on fringe forums on Reddit, 4chan and 8chan, an offshoot for those who consider 4chan too restrictive, before shooting to the top of YouTube’s “Trending” list.

The QAnon conspiracy theory began circulating on the same platforms last fall before exploding into public view in August, after months of refining its central allegations, purportedly from a top-secret government agent, that President Trump is secretly battling a shadowy cabal of sex rings, death squads and deep-state elites.

The 4chan and Gab forums showed similar surges of terms referring to racial identity and white supremacy, with racially and ethnically charged terms increasing steeply on both sites after data collection began.

They also hit dramatic peaks in late January 2017, when Trump’s inauguration was celebrated by members of the “alt-right,” a movement that espouses racist, anti-Semitic and sexist views. A second, higher peak, with posts containing the terms amounting to about 10 percent of all comments on the forums, came in the days surrounding the Charlottesville alt-right rally, in August 2017, which ended in violence and the death of a counterprotester.

When asked for comment on the findings, Gab responded via its Twitter account: “Hate is a normal part of the human experience. It is a strong dislike of another. Humans are tribal. This is natural. Get over it.”

You're always going to have fringe elements like this, but the real challenge is keeping them contained on their own bigoted platforms. Both Facebook and Twitter need to take a more aggressive stance on not allowing this stuff to spill over into their much more populated forums. But most of all, we need to (individually) work to stop hate whenever it pops up. If you see somebody sharing something hateful, speak out before you un-friend or un-follow them. It may not change their mind, but it might stop others who are reading it from sharing the same nonsense.