Face It. Hate is Profitable.

Of all the places I’ve traveled to around the world, my favorite by far is New Zealand.

Not only is its natural beauty absolutely spectacular, but its people are incredibly warm and welcoming. And as a Native Hawaiian, I was especially struck by how the indigenous Māori people had forged and maintained respectful, equal relationships with the Western world. They weren’t marginalized or caricatured cast-offs lost in the military-industrial-slash-tourism complex. They were respected and celebrated first peoples who proudly walked down the street and into bank lobbies in full regalia.

It was an idyllic vision of a Hawaii in the multiverse, a different version of our future where things didn’t go so horribly wrong in so many different ways.

So it was especially devastating to learn of the New Zealand shootings. The beautiful country and its beautiful people were targeted specifically because they were so peaceful, so welcoming of diversity. The message the attacker meant to send was that nowhere in the world is safe.

This hateful act has, of course, sparked dozens of different debates around the world. Did this Australian adopt the rhetoric of a particularly American brand of white nationalism? (Yes.) Can you draw a line from this so-called “lone wolf” to the president of the United States? (Yes.) Are our hyperconnected, instantly live platforms ill-equipped to deal with terrorists using them to broadcast their atrocities around the world? (Absolutely.)

But the question that persisted in my mind, especially after a day of mulling over the central role Facebook plays in the lives of 2.3 billion people, is whether social media platforms can do something to stop terrorists from finding recruits, finding each other, and finding supporters so easily.

The answer is “yes.” The answer is also, “but they will never do it.”

Hate is profitable.

Terrorists are responsible for their own heinous acts. But terrorists are created, not born. Terrorists get their information, find their tribes, and spread their messages the same ways you and I do. And as fringe and extreme as they may be, they are getting a lot of help.

Massive platforms like Facebook, Twitter, and YouTube “optimize for engagement.” They make automatic, algorithmic suggestions for every bit of content and every action, from “you might also like” to “recommended just for you,” prioritizing anything that will get you to click, comment, or share.

How pervasive is this “engagement at any cost” mindset? Consider the newsfeed.

Every user of Facebook, Twitter, and Instagram wants their news feed shown in plain, chronological order. It’s a top, if not the top, feature request. It’s not controversial. It is trivially, stupidly, exasperatingly easy to implement… because that’s how these platforms all worked at first.
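(And when I say easy, I mean it. A chronological feed is one sort by timestamp. The sketch below is purely illustrative, with made-up post data rather than any platform’s actual code, but it shows how small the gap is between “newest first” and “Top Posts.”)

    from datetime import datetime, timezone

    # Hypothetical posts for illustration only; not any platform's real data model.
    posts = [
        {"author": "aunty", "text": "Beach day photos",
         "posted_at": datetime(2019, 3, 15, 8, 0, tzinfo=timezone.utc),
         "engagement_score": 12},
        {"author": "stranger", "text": "Outrage bait",
         "posted_at": datetime(2019, 3, 14, 22, 0, tzinfo=timezone.utc),
         "engagement_score": 9001},
    ]

    # Chronological feed: newest first. One sort, no ranking model, no "optimization."
    chronological = sorted(posts, key=lambda p: p["posted_at"], reverse=True)

    # "Top Posts" feed: the same sort, except the key is whatever proxy for
    # engagement the platform has decided to maximize.
    top_posts = sorted(posts, key=lambda p: p["engagement_score"], reverse=True)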

So why do you suppose they won’t do it? Why is “Top Posts,” or “Top Tweets,” the default? Why do they bury the setting to turn it off, if they let you turn it off at all? (Twitter trumpeted the chronological timeline’s return via its new “sparkle button,” without mentioning that it removed the previous ability to make the switch permanent.) Is it because they know what you like, can find it faster, and just want to make you happy?

Please.

They know what will catch your attention. They know what will get you “engaged.” (Or enraged.) They know what will be more likely to lead you deeper into a rabbit hole, and what will make it harder to climb back out. Have they built a literal, iron-clad trap? No. But the slippery, spiral path that quickly leads people to the darkest corners of the internet is not an accident.

Sure, cat photos are compelling. But there’s nothing like a good knock-down, drag-out clash among friends to get those fingers tapping.

Hate is profitable. Conflict is profitable. Schadenfreude and shame are profitable. While we smugly point fingers, tsk-tsk, and think we’re being clever as we strategically dole out likes and shares, we forget that we are all just gruel-fed hamsters running on wheels deep inside giant, hyper-engineered, artificially intelligent, fully gamified, corporate-controlled virtual worlds that we absurdly think belong to us.

They won’t fix it. It’s working just fine for them. So what can we do?

We can get off the hamster wheel, for starters. It won’t take long for the lights to go out. And then, if we must, we can build better ones.

Illustration by JBCharis for “Know Your Hate Groups” via The Nib.

Responses

  1. Ruston Hill says:

    It seems that “hate” is profitable, but who profits from it? Who stands the most to gain from a profoundly divided society?

