Division, Inc: Part III

Opinion: The Alt-Right Has Been Propped Up by Hostile Influence Operations

Preamble: Prior to reading the rest of this piece, I have a favor to ask. Find your ego, grab hold tightly, and temporarily put it in a locked box far away from you while you read this. There are several tools that make these influence operations so successful. One of those tools exploits our ego. Our pride is a powerful thing. It is being used against us. That’s part of the reason hostile operations have been far more successful than we realize: our egos refuse to believe that we could be tricked by these hostile actors.

Note: A few pieces of evidence I’ve found via Google Trends, which partially inform this hypothesis, are included at the end.


How did the ‘alt-right’ become popular? We make assumptions about how this happened, but we don’t really know. Since we’re making assumptions, how about assuming it didn’t become popular naturally? Keep in mind, none of this is meant to downplay the alt-right or its effects on society.

Related: Division, Inc: Part I and Division, Inc: Part II

We know the term was coined by Richard Spencer in 2010. We know it became popular on the internet. This is where it spread. But with whom was it most popular? Real Americans? Or hostile actors seeking to influence Americans? What we do know is that most of the entities online who follow this trend are anonymous. The accounts that ‘watch,’ comment, like, and subscribe on YouTube are anonymous. We don’t know who is behind the façade those accounts are wearing.

If we honestly evaluate it, the apparent growth of the alt-right online has been at least moderately influential in our society. It has influenced real people to become alt-right. It has influenced media to report on the alt-right. It has influenced people who don’t support it to become concerned about it.

We’ve even seen marches in the streets, such as the tiki torch march in Charlottesville. Speaking of which, have you noticed that these American alt-right videos on YouTube get hundreds of thousands of views, yet public gatherings reflect perhaps 0.1% of that? That’s a drastic difference: 0.1% of several hundred thousand views is only a few hundred people. Now, perhaps you are tempted to attribute that to folks not wishing to out themselves. That might explain part of it. But does it explain all of that massive difference?

Can we rule out the possibility that the online alt-right is partially backed by hostile influence operations? Before you answer that, I ask that you consider this. There is a mysterious pipeline that pushes people toward the alt-right on YouTube. People have been investigating this phenomenon for some time. My hypothesis might explain it.

The YouTube algorithms that power this pipeline are gameable. All algorithms are, once you figure out how they work. For instance, Twitter’s trending algorithm is gameable, and I have observed it being repeatedly gamed by hostile actors. Twitter knows this, too, because it has had to write rules designed to penalize those who game the trends.

Twitter’s rules on hashtags, for example, prohibit:

  • using a trending or popular hashtag with an intent to subvert or manipulate a conversation or to drive traffic or attention to accounts, websites, products, services, or initiatives; and
  • Tweeting with excessive, unrelated hashtags in a single Tweet or across multiple Tweets.

On YouTube, the recommendation algorithm is largely based on observing where accounts go next after viewing a video. It is not too dissimilar from the Amazon recommendation that says “Customers who viewed this item also viewed” [this assortment of products].
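To make that concrete, here is a minimal sketch of a “viewers who watched this also watched” recommender. This is not YouTube’s or Amazon’s actual system; it simply tallies where hypothetical watch sessions go next after each video and surfaces the most common destinations. The video names and data are invented.

```python
from collections import Counter, defaultdict

def build_transition_counts(sessions):
    """Tally how often viewers move from one video to the next.

    `sessions` is a list of watch histories, each an ordered list of
    video IDs. All data here is hypothetical.
    """
    transitions = defaultdict(Counter)
    for session in sessions:
        for current, nxt in zip(session, session[1:]):
            transitions[current][nxt] += 1
    return transitions

def recommend(transitions, video_id, k=3):
    """Return up to k videos most often watched right after `video_id`."""
    return [vid for vid, _ in transitions[video_id].most_common(k)]

# Hypothetical organic watch sessions.
organic_sessions = [
    ["cooking_tips", "knife_skills", "meal_prep"],
    ["cooking_tips", "knife_skills"],
    ["cooking_tips", "meal_prep"],
]

counts = build_transition_counts(organic_sessions)
print(recommend(counts, "cooking_tips"))  # ['knife_skills', 'meal_prep']
```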

This is relatively easy to game if you have a number of anonymous accounts at your disposal. You can simply carve out a path by having those accounts first click a rather innocuous video, then navigate to a more controversial video, then to an even more controversial one. Employing enough accounts to do this over time, across a variety of innocuous videos, creates a well-worn path through the field.
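To see why a relatively modest number of coordinated accounts can carve that path, consider a hypothetical extension of the sketch above: once enough scripted sessions walk the same route, the next step on that route becomes the top recommendation from the innocuous starting point. The video names and session counts below are invented for illustration.

```python
from collections import Counter

def top_next(sessions, video_id):
    """Return the video most often watched immediately after `video_id`."""
    nexts = Counter()
    for session in sessions:
        for current, nxt in zip(session, session[1:]):
            if current == video_id:
                nexts[nxt] += 1
    return nexts.most_common(1)[0][0] if nexts else None

# Hypothetical organic traffic: viewers of an innocuous video mostly
# move on to other innocuous videos.
organic = ([["innocuous_vlog", "gaming_clips"]] * 40
           + [["innocuous_vlog", "music_video"]] * 30)

print(top_next(organic, "innocuous_vlog"))   # 'gaming_clips'

# A coordinated operation scripts a few dozen anonymous accounts to walk
# the same path: innocuous -> more controversial -> even more controversial.
scripted_path = ["innocuous_vlog", "edgy_commentary", "extreme_content"]
gamed = organic + [scripted_path] * 60

print(top_next(gamed, "innocuous_vlog"))     # 'edgy_commentary'
```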

Hostile actors just so happen to have anonymous accounts at their disposal. They can beat that path through the grass and create that pipeline for unsuspecting users to follow. They can even influence and incentivize YouTubers to create content that capitalizes on these ‘alt-right’ viewers.

They can also use anonymous accounts across the internet to talk about and promote the alt-right. And thus, they can popularize a movement around it and influence society.

But that’s not all that there is to say about this rise. In 2015, when this term really started becoming known, there was one major entity that was targeting America with hostile influence operations. Since then, others have copied those tactics, as discussed by Robert Mueller in his congressional testimony. But at the time, it was mainly Russia.

Let’s break down exactly what alt-right entities online generally support and see if there’s any alignment with what Russian hostile actors have been shown to support. Here’s a very partial list:

  • Support for Donald Trump
  • Isolationism
  • Non-interventionism
  • Anti-immigration
  • Focus on gender roles (LGBTQ issues)
  • Agreement with Putin’s talking points

Are we sure that the alt-right is not backed by a hostile influence campaign? We should set our egos aside and examine it. I’ll be the first to admit that I’ve been influenced by the alt-right. I’ve been influenced to be concerned about it, well before this thought came into my mind. I’ve been concerned about that pipeline, I’ve been concerned about the effect on society, and I’ve especially been concerned by the extremism we have seen coming from people who have been influenced by the alt-right.

What if this is part of the influence operation? Radicalizing our young men to terrorize our country? We should really think about this. One final note: deciding whether it is or is not an influence operation doesn’t change the impact it is having on society.

But informing people of this might allow them to become less influenced by it, both ‘for’ and ‘against.’ It might make people a little more empathetic toward those who have been heavily influenced by it. It doesn’t mean we condone it. But influence operations are designed to influence. We have to identify and reverse the effects of influence operations. And I think it is reasonable to suspect that the ‘alt-right’ is one of those hostile influence operations.


Regarding the data below, I’m fairly certain most readers are familiar with what’s in Moscow. And undoubtedly, most are aware of the hostile influence operation that is run out of St. Petersburg. Perhaps not so many are aware that there is also a suspected hostile influence operation in the subregion of Krasnodar Krai. Just food for thought as you glance at the specific subregions that these searches are coming from.