Face It. Hate is Profitable.
Of all the places I’ve traveled to around the world, my favorite by far is New Zealand.
Not only is its natural beauty absolutely spectacular, but its people are incredibly warm and welcoming. And as a Native Hawaiian, I was especially struck by how the indigenous Māori people had forged and maintained respectful and equal relationships with the western world. They weren’t marginalized or caricatured cast-offs lost in the military industrial-slash-tourism complex. They were respected and celebrated first peoples who proudly walked down the street and into bank lobbies in full regalia.
It was an idyllic vision of a Hawaii in the multiverse, a different version of our future where things didn’t go so horribly wrong in so many different ways.
So it was especially devastating to learn of the New Zealand shootings. The beautiful country and its beautiful people were targeted specifically because they were so peaceful, so welcoming of diversity. The message the attacker meant to send was, “nowhere in the world was safe.”
This hateful act has, of course, sparked dozens of different debates around the world. Did this Australian adopt the rhetoric of a particularly American brand of white nationalism? (Yes.) Can you draw a line from this so-called “lone wolf” to the president of the United States? (Yes.) Are our hyperconnected, instantly-live platforms ill-equipped to deal with terrorists using them to broadcast their atrocities around the world? (Absolutely.)
But the question that persisted in my mind, especially a day after mulling the central role Facebook plays in the lives of 2.3 billion people, is whether social media platforms can do something to stop terrorists from finding recruits, finding each other, and finding supporters so easily.
The answer is “yes.” The answer is also, “but they will never do it.”
Terrorists are responsible for their own heinous acts. But terrorists are created, not born. Terrorists get their information, find their tribes, and spread their messages the same ways you and I do. And fringe and extreme as they may be, they are getting a lot of help.
Massive platforms like Facebook, Twitter, and YouTube “optimize for engagement.” They make automatic, algorithmic suggestions for every bit of content or action. From “you might also like” to “recommended just for you” to prioritizing things — anything — that will get you to click, comment, or share.
How pervasive is this “engagement at any cost” mindset? Consider the newsfeed.
Every user of Facebook, Twitter, and Instagram wants their news feed shown in plain, chronological order. It’s a top, if not the top, feature request. It’s not controversial. It is trivially, stupidly, exasperatingly easy to implement… because that’s how these platforms all worked at first.
So why do you suppose they won’t do it? Why is “Top Posts,” or “Top Tweets,” the default? Why do they hide the settings to turn this off, if they let you turn it off at all? (Twitter trumpeted the introduction of this feature, without mentioning that the new “sparkle button” removed the previous ability to make the switch permanent.) Is it because they know what you like, can find it faster, and just want to make you happy?
They know what will catch your attention. They know what will get you “engaged.” (Or enraged.) They know what will be more likely to lead you deeper into a rabbit hole, and what will make it harder to climb back out. Have they built a literal, iron-clad trap? No. But the slippery, spiral path that quickly leads people to the darkest corners of the internet is not an accident.
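The gap between the feed people ask for and the feed they get comes down to a single line of ranking logic. Here is a minimal, purely illustrative sketch of the contrast — the `Post` fields, scores, and data are invented for this example, not any platform's actual system:

```python
# Illustrative sketch: chronological vs. engagement-ranked feeds.
# All fields and scores below are hypothetical, not a real platform API.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int               # seconds since epoch
    predicted_engagement: float  # model's guess at clicks/comments/shares

posts = [
    Post("alice", 100, 0.2),
    Post("bob",   200, 0.9),  # older post, but predicted to provoke reactions
    Post("carol", 300, 0.1),
]

# The feed users keep asking for: newest first, nothing reshuffled.
chronological = sorted(posts, key=lambda p: p.timestamp, reverse=True)

# The "Top Posts" default: whatever the model predicts will hook you,
# regardless of when it was posted.
engagement_ranked = sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

print([p.author for p in chronological])      # ['carol', 'bob', 'alice']
print([p.author for p in engagement_ranked])  # ['bob', 'alice', 'carol']
```

Both orderings are one `sorted` call — which is the point. The hard part was never the engineering; it's that only one of these sort keys maximizes time on site.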
Sure, cat photos are compelling. But there’s nothing like a good knock-down, drag-out clash among friends to get those fingers tapping.
Hate is profitable. Conflict is profitable. Schadenfreude and shame are profitable. While we smugly point fingers, tsk-tsk, and think we’re being clever as we strategically dole out likes and shares, we forget that we are all just gruel-fed hamsters running on wheels deep inside giant, hyper-engineered, artificially intelligent, fully gamified, corporate-controlled virtual worlds that we absurdly think belong to us.
They won’t fix it. It’s working just fine for them. So what can we do?
Illustration by JBCharis for “Know Your Hate Groups” via The Nib.