how the algorithm plays us

The politics of the fray: the Algorithm and the New Crusades ~ I

Friday 23 August 2024 6:43 PM

The Olympic wars have left me frankly exhausted as well as troubled. Troubled as a member of the human community of planet Earth, as a member of the social media subculture, and also as a member of the Body of Christ. In case my phrase “the Olympic wars” is at all obscure, I have in mind the intense public reactions around the Paris Olympics Opening Ceremony, the legitimacy of two competitors in women’s boxing, and in a slightly different sense the performance of Australian Rachael Gunn in break dancing.


A few weeks back I posted a transcribed English translation of a significant interview with the French Olympic Committee’s artistic director, responding to initial complaints about a segment within the previous evening’s opening ceremony; complaints levelled mainly by Christian leaders and viewers. My intention then was to follow up with my own reflections on the Christian controversy.


I’ve since revised that plan. For one thing the controversy has already received saturation analysis, reanalysis and counter-analysis from many quarters, such that I’d now be very late to the party indeed; and arguably all has been said. But there’s another reason too, and I think a weightier one. What has struck me with increasing force is the much wider context within which these vexed Olympic conversations sit. One which has received little attention, at least in the volumes of commentary I’ve heard or read. I refer to the social media world within which the controversies have played out, not only among Christians but across the general population on social media. More specifically I’m thinking of the largely hidden ways algorithms drive or shape public engagement. My own social media engagement is almost entirely on Facebook, so that algorithm is the one I “know” best.


I will comment on specific “battlefronts” within the “war” in part two, my next post. But as a foundation for those thoughts, I offer here a potted analysis of how these mysterious “algorithms” work with us, in us, and between us humans, as we engage with the content on our feeds and with one another in reacting and commenting. What follows is a blend of my own observations and at least a cursory look at more expert commentary, which has helped me confirm or refine my own perceptions. My observations are partly of myself, that is to say of how I’ve behaved online at times, and partly of what I’ve noted in other individuals or groups.


The politics of the fray:

At their most “innocent”, social media algorithms serve an important and useful function as curators of the voluminous content on the platform. They direct content to wherever, or more precisely to whomever, it might be of most interest and value. Yet increasingly over the past decade or more this curatorship has come to serve the commercial interests of the platforms and their advertising clients. The algorithms’ quest is to keep us users engaged as much as possible for as long as possible. Simply put, the more we scroll the more content we encounter, including advertised products and services. Where in the earliest days of social media the purpose of the platforms was connecting us to one another, now in the age of the algorithm it’s more about steering our attention to wherever and whatever will best serve the advertisers. And among the proven strategies for that mission of attentive engagement is, in the frankest and bluntest terms, keeping us outraged at something. Hence my subheading: the politics of the fray. The algorithm keeps us in the fray by feeding our outrage. In my analysis, it does so (i) by immediacy and virality, (ii) by herding us into packs, and (iii) by entrenching us in a position.
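For readers who like a concrete picture, the engagement logic I’ve just described can be sketched as a toy model. To be clear, everything in it (the reaction types, the weights, the scoring formula) is my own illustrative assumption, not any platform’s actual code; it simply shows how a ranking that rewards engagement ends up rewarding outrage.

```python
# Toy sketch of an engagement-maximising feed ranker.
# All weights are hypothetical, chosen only to illustrate the idea that
# anger-style reactions, comments and shares keep users scrolling longer
# than quiet approval does, and so are rewarded by the ranking.
from dataclasses import dataclass, field

REACTION_WEIGHTS = {
    "like": 1.0,     # mild approval: low predicted engagement
    "love": 1.5,
    "comment": 4.0,  # arguing in the comments keeps people on the page
    "angry": 5.0,    # outrage is the strongest engagement signal here
    "share": 6.0,    # spreading the fray to new audiences
}

@dataclass
class Post:
    text: str
    reactions: dict = field(default_factory=dict)

    def engagement_score(self) -> float:
        # Sum each reaction count times its (assumed) weight.
        return sum(REACTION_WEIGHTS.get(kind, 0.0) * count
                   for kind, count in self.reactions.items())

def rank_feed(posts):
    # Highest predicted engagement first: outrage rises to the top.
    return sorted(posts, key=lambda p: p.engagement_score(), reverse=True)
```

Under these assumed weights, a calm post with a hundred likes (score 100) sits below a provocative post with only twenty angry reactions and ten comments (score 140). Nothing in the ranking asks whether a post is true or kind; it asks only what will hold our attention.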


Immediacy and virality: War fronts open up quickly on Facebook. This is because they’re meant to. By its diligence the algorithm distributes information almost instantly and everywhere. In the blink of an eye, the thoughts of a person or outlet the algorithm knows you respect, on a subject the algorithm knows you care about, pop up in your notification feed. Your attention is immediate, your engagement likely.


Because this has all happened in seconds, your consequent reactions take barely longer. Immediacy engenders instant reactions. Since your outrage is the currency, the best immediate reaction the algorithm can elicit is rage. This is, as one writer puts it, the Algorithm of Outrage. Thanks to the immediacy of the rage-inducing content, the algorithm would have us rage first, and ask questions only much later, if at all.


One way I’ve observed this instant rage at work is in the quest for a political ‘gotcha’. The rage-triggered social media user will be watching for ‘evidence’ of foul play on the part of the perceived enemy. A ‘gotcha’ might be a rumoured action or plan the ‘suspect’ has allegedly sought to conceal. (Just one of the places a conspiracy theory may sprout.) Or it may be something they’ve openly said or written somewhere that can be interpreted as an ‘admission’. As anyone practised in gotcha politics knows, it would take an exceptional counterweight to loosen the mind’s grip on a ‘gotcha’.


The pack: I’ve reflected a few times before, particularly in addressing misinformation and conspiracy theories, on the development and functioning of what I’ve come to call “closed information circles”. Social researchers use various terms for this phenomenon and associated processes. Among them - echo chambers, confirmation bias, and Social Media-Induced Polarisation (SMIP). What I like about my own term, the "closed information circle", aside from the modest ego fix, is its emphasis on where members of the group get validation and confirmation of what they believe to be true. The circle provides all the confirmation of “facts” and all the validation of belief the adherents could ever want. External fact checkers or professional media, be damned.


The closed circle phenomenon as I’ve considered it before typically evolves over an extended period. But as our focus here is the algorithm-directed social media fray, a different term might be helpful. So I’m calling it “the pack”. A fray is a hunt, if you will. Having captured our attention with content that feeds our outrage, the algorithm herds us into packs. The fray continues, and sustains itself, through the safety, encouragement and validation of the pack. We hunt together. We’re a part of something bigger, big enough to keep us engaged. What’s more energising than our own rage? The pooled rage of the pack.


So the pack provides our validation, confirming the target and the cause. And the collective passion of the fray (or hunt) takes precedence in our attentions. Or in other words, so engaged are we in the raging mission of the pack and the rightness of the cause, that nothing external matters. And the “facts" of course are clear. To the pack. 


Now thinking back over my social media life, my own engagement, I can see myself thus caught up (engaged!) many times. But this Olympic war seemed to take it to a new level and with a wider reach. Such was the immediacy of our attention, the hold of the pack and the pooled outrage, that the common disciplines of academic research, particularly around the sourcing of claims, seemed barely to matter.


What replaces methodical external verification within the circle or pack is a crumb trail of enticements. Random snippets of information, each offering possible clues to what the pack supposes lies hidden. A kind of random, unplanned brainstorming session. Some crumbs fall by the wayside, as they seem to lead nowhere. Others however rise to the surface, arousing curiosity and even excitement. Those are the ones that get shared and reshared within the circle, perhaps acquiring greater assumed authority along the way.


In my observation of myself and others, such crumbs take common forms, marked often by leading phrases …

    •    I wonder if …

    •    I heard someone say …

    •    It’s interesting [/funny/curious] how …

    •    I saw a comment somewhere …

    •    I’m pretty sure …


By means such as these, what takes shape within the pack is a rapidly expanding corpus of trusted information. A fast-moving notification feed, containing consistent content, shared by rising numbers, building an evolving story, pieces apparently falling into place. All of this and more without fact checking anything. The algorithm tells you it’s true, and the pack confirms it.


Entrenching us: The algorithm feeds our rage with nearly instant and universal content, all nurtured through the pack. And thus the algorithm entrenches us in our rapidly formed certainty. Through our content and notification feeds it continually reinforces our assumptions and beliefs, quelling any doubts. It perhaps also validates the rage we feel on our own behalf and/or on behalf of others whose rights we perceive as threatened. Our feed keeps us and our rage at the centre. It’s about us.


If I could only say one thing about these Olympic wars that saddened and alarmed me above all else, it would be the widespread over-confident entrenchment in the collective rage. It’s not that I haven’t encountered such entrenchment before. As one who has read and engaged significantly with the phenomenon of science denial, notably around climate change and Covid, I’ve encountered it more times than I could possibly recall. What has made this information war distinctive is its sheer pace and volume. It didn’t require months or years of exposure to misinformation, gradually reshaping the sense of reality. On each battlefront it was nearly instant, certainly within hours or at most a single day, that the social media mass seemingly became entrenched in a common rage and a common narrative. That barely rational kind of entrenchment is, I’ve little doubt, the work of social media algorithms.


Notable among the signs for me was a seeming imperviousness to new information. Rapidly engendered certainties - cemented by the pack, continually validated by a torrent of confirming information - render any contra-information impotent. What perplexed me was an apparent inability or unwillingness to modify belief, or even suspend judgement, in light of new information, even from quite early in the conversation. The adopted certainties were immovable.


An online behaviour I’ve noted in myself and others many times is doubling down in the face of counter-evidence. If I’ve nailed myself to an argument or position, contending for it very publicly and solidly, then being confronted with countering evidence threatens the humiliation of surrender, a humiliation too much to bear. And so I double down to avoid the ultimate public loss: being wrong. I’ve done that in the fire of online battle, and I’ve witnessed it in others also. I’d suggest its occurrence is just one more sign of the entrenchment created by the Algorithm of Outrage. That too was abundantly in evidence in the Olympic wars.