
We’re all still trying to process what happened in Charlottesville, Virginia. You probably know white nationalists clashed with counterprotesters there, resulting in several deaths. You’d think we’d be past this kind of open racial animosity by now. We were wrong.

What was striking was the appearance of the average marcher. All looked young. All looked like they had some means, as if they could fit into any crowd in America without being noticed. And yet they harbored this hate and spread this misinformation. Such is the nature of the internet communities that fan these flames.

There’s a hidden culprit, though. One that fanned the flames faster than any person could. Predictive algorithms took what might have started as a simple curiosity and turned it into outright hatred through a series of escalating and insulating options. And that doesn’t bode well as we move into an even more digital future.

A Refresher on How Algorithms Think

We’ve already expounded on what an algorithm is and how to beat it, yet it bears repeating after this real-world example of machine learning in practice. Algorithms are simply complex formulas used to power options and choices on the modern internet. They power everything from your Facebook News Feed to Amazon product recommendations. Once built, well-run algorithms will continue to collect data on you and refine their formulas, offering better outputs over time. The more you engage, the better they work, supposedly.
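To see the mechanism, here’s a minimal sketch in Python. The additive update rule, the topics, and the numbers are all invented for illustration; real systems are vastly more complex, but the loop is the same:

```python
# A toy feedback loop, not any platform's real system: engagement bumps a
# topic weight, and the weights decide what gets ranked highest next time.
from collections import defaultdict

class ToyRecommender:
    def __init__(self):
        # No knowledge of the user yet: every topic starts at zero.
        self.weights = defaultdict(float)

    def record_engagement(self, topic, strength=1.0):
        # Each click, like, or share nudges that topic's weight upward.
        self.weights[topic] += strength

    def rank(self, items):
        # items are (title, topic) pairs; past engagement decides the order.
        return sorted(items, key=lambda it: self.weights[it[1]], reverse=True)

rec = ToyRecommender()
rec.record_engagement("baseball")       # user clicks a baseball story
rec.record_engagement("baseball")       # ...and another one
rec.record_engagement("politics", 0.5)  # a half-hearted political click

feed = rec.rank([("Trade deadline rumors", "baseball"),
                 ("Local election recap", "politics"),
                 ("New restaurant opens", "food")])
print(feed)  # baseball first; food, never engaged with, sinks to the bottom
```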

Yet an algorithm can’t work on you without collecting at least some information about you. Hence the need to compile data points from your browsing history, social media interactions and other activity. Drawing entirely from your past, algorithms, like those found on Google, Twitter, Facebook, Amazon and everywhere on the modern internet, will feed you suggestions that are similar to what previously interested you.
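The “similar to your past” part can be sketched just as simply. Here, similarity is a plain tag-overlap (Jaccard) score between a candidate item and everything the user engaged with before; the items and tags are made up:

```python
# A hedged sketch of "suggest things similar to your past": rank candidates
# by how much their tags overlap with the user's engagement history.

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

history_tags = {"baseball", "stats", "tickets"}

candidates = {
    "Pennant race preview": {"baseball", "stats"},
    "Concert tickets on sale": {"music", "tickets"},
    "Thai cooking basics": {"cooking", "recipes"},
}

ranked = sorted(candidates.items(),
                key=lambda kv: jaccard(history_tags, kv[1]),
                reverse=True)
for title, tags in ranked:
    print(f"{jaccard(history_tags, tags):.2f}  {title}")
# The baseball piece wins; the cooking piece, with zero overlap with the
# past, never surfaces, no matter how much the user might have enjoyed it.
```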

Why base your future decisions on your past? Because there is some predictive power in what you’ve done before. If you’ve purchased tickets to a baseball game, for example, you’re more likely to do that again than someone who hasn’t. That makes sense on its face, doesn’t it? And it’s a great way to maximize revenue from people who are most likely to buy from you, in theory.
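Here’s that reasoning with invented numbers. Suppose past ticket buyers repurchase at 8% and everyone else buys at 1% (both rates are assumptions for this example):

```python
# Made-up rates, purely to illustrate why advertisers pay for prediction.
p_buy_given_past = 0.08   # assumed repeat-purchase rate for past buyers
p_buy_given_none = 0.01   # assumed baseline rate for everyone else
ads_shown = 1_000

print(f"Sales per {ads_shown} ads to past buyers:   {ads_shown * p_buy_given_past:.0f}")
print(f"Sales per {ads_shown} ads to everyone else: {ads_shown * p_buy_given_none:.0f}")
# 80 vs. 10: the same ad budget returns 8x more when aimed at people whose
# past behavior already predicted the purchase.
```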

There’s an inherent flaw in this line of artificial intelligence, though. Algorithms can’t synthesize anything genuinely new about you. They won’t change your perspectives, only tell you what you want to hear and calcify you as you are. There’s only room for growth in one direction with an algorithm. No room for expanding your horizons and taking a different perspective. You are the same as you always were. Because if you’re predictable, you’re easier to sell to.

Think about that. Everything on the internet is invested in keeping you exactly the same forever. Sounds like a nightmare.

How That Led to Charlottesville

That brings us to the nightmare scenario in Charlottesville. This was an example of what happens when people are characterized by and marketed to based on their most identifying and profitable characteristics. You breed insular groups that resist the outside world, confined to their algorithmically chosen news. Here’s a hypothetical step-by-step guide (with a toy simulation of the whole loop sketched after the list):

  1. Guy sees a mildly conservative article that makes some good points, which he then actively interacts with in some way online (clicks a link, likes a post, etc.)
  2. Algorithm pounces on this action, offering more media that it believes Guy will interact with
  3. Guy continues to see more conservative news, some of which he interacts with and some he doesn’t
  4. Algorithm shifts with this feedback, refining outputs to serve more material Guy will like
  5. Guy sees more messages from people paying to reach the “conservative market,” of which he is now a member
  6. Algorithm serves ads to Guy based on his interaction history, which has grown increasingly conservative mostly due to the algorithm’s insistence
  7. Guy is now inundated with conservative talking points, some of which are of interest and some are not
  8. One of the things Guy sees is a conservative rally, which does not interest him
  9. The algorithm continues to show different types of conservative news and messages
  10. Guy sees this rally multiple times before it’s finally presented in a way that makes him want to interact with it
  11. Now investigating the rally, Guy sees other similar Guys who share some views with which he agrees
  12. Guy reaches out to the people in this community to bond with them over shared views and interests
  13. The algorithm takes these new bonds and cross-references them to offer new people, products and messages with which Guy can interact
  14. Guy grows his network with help from the algorithm’s suggestions, thereby creating an echo chamber
  15. Leading up to the rally, Guy has his views reinforced by the community members and news he sees, further calcifying his positions and making him impervious to outside reason
  16. Guy shows up to the rally and runs over people with his car
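That whole list is one feedback loop, and you can watch it run. The sketch below simulates a feed that reranks by click history; the categories, click probabilities, and update rule are all hypothetical:

```python
# A hypothetical echo-chamber loop: the feed shows topics in proportion to
# learned weights, and every click reinforces the topic that was clicked.
import random

random.seed(42)

CATEGORIES = ["conservative", "sports", "food", "travel"]
# The user starts with only a mild lean: slightly likelier to click one topic.
CLICK_PROB = {"conservative": 0.4, "sports": 0.3, "food": 0.2, "travel": 0.1}

def simulate_feed(rounds=50, feed_size=10):
    weights = {c: 1.0 for c in CATEGORIES}  # the algorithm starts neutral
    for _ in range(rounds):
        # Feed composition is proportional to what's been clicked so far.
        feed = random.choices(CATEGORIES,
                              weights=[weights[c] for c in CATEGORIES],
                              k=feed_size)
        for item in feed:
            if random.random() < CLICK_PROB[item]:
                weights[item] += 1.0  # each click buys more of the same topic
    total = sum(weights.values())
    return {c: round(w / total, 2) for c, w in weights.items()}

print(simulate_feed())
# A mild initial preference snowballs: the more of a topic the feed shows,
# the more chances there are to click it, and every click shows more of it.
```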

This isn’t the only example. We’ve seen the red feed versus blue feed comparisons. ISIS is well known for recruiting this way on Twitter, drawing recruits toward radicalism with the same algorithmic nudges. It’s a real danger, both invisible and poorly understood. And it shapes daily life for better and worse. That’s just the beginning.

The Future of Internet Marketing Is Dark

So that’s how a series of complex formulas can reduce people to their easiest-to-understand traits, Flanderizing entire populations and polarizing the public in order to make more money. Nuance gets destroyed. You become the sum of your internet actions, nothing more. At least, that’s where we’ll end up if this continues.

In reality, we as people are far more complex than algorithms can understand. We live with nuance, shifting our personalities as situations present themselves. What we have done does not always predict what we will do next. It can’t; life is too dynamic.

I’m terrified of the day my newborn child starts using the internet. A person is not the same at 13 as they are at 23 or 33. People grow. People change. Algorithms work by keeping you from doing that. Because if you change, they no longer know anything about you and offer no value.

Tragedy fueled by hate is never easy to understand. Yet by understanding the mechanics that can fuel said hate, we can adapt and not let the invisible inner workings of the internet push us toward a future in which we don’t want to live.
