Censemaking

What Do We Behold With Our Tools?

Algorithms influence our online experience, and as that experience becomes a bigger part of life, the consequences extend beyond the digital realm.

Visit a website and you’re probably no longer surprised to see advertisements for the very thing you were looking at on another site while online shopping. By now, any social media user should have an inkling that what pops up in our feeds is not random, sequential, or timely, but predicated on what we’ve clicked on and looked at before, and what advertising might fit that pattern.

This is not new. But it is becoming an ever-bigger part of our digital world, which, accelerated by the many stay-at-home, shelter-in-place, work-from-home trends of 2020-21, is becoming a far greater part of our social-emotional-cognitive world. This matters for reasons that Marshall McLuhan articulated more than 50 years ago, before the Internet existed.

We become what we behold. We shape our tools and then our tools shape us.

Father John Culkin (commonly attributed to Marshall McLuhan)

The idea that our tools, environments, or contexts shape us is also not new; it pre-dates McLuhan in the modern era. Buildings shape us, our cities shape us, and the landscape around us shapes us. That landscape is now increasingly digital.

What’s distinct about the digital media landscape is how dynamic, responsive, and pervasive it is. Unlike buildings, it follows us everywhere. It is also far easier for many people to manipulate simultaneously. That has implications not only for what we think about, but when and how.

Ice Driving a Ship of Fools

The film (and novel) Ship of Fools is set aboard a ship bound for Nazi Germany in 1933 and portrays a journey, both literal and metaphorical, from one land to another, and the anticipation of what the years ahead might hold after landing. When you are skidding on an icy road, you are advised to look where you want to go, not where you are headed. That means, against your instinct to focus on the threat, focusing on the opportunity of safety.

The parallels to the current context of 2020-21 are many. Our digital media feeds are being shaped by algorithms that drive us where we are going, not where we want to go. Even something as deliberate, important, and high-profile as determining who gets a vaccine is now being shaped (in some places) by algorithms, and if the experience of Stanford Medical School is anything to go by, it’s not going well.

Algorithms seek to predict your present behaviour based on past patterns. Yet as the systems surrounding those patterns change, as we are seeing with the COVID-19 pandemic, the algorithms are more likely to miss the mark. Add in the many layers of social (human) bias that underpin the programming of these algorithms, and what we’ll see going forward is people looking the wrong way as they slide along the ice.
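The mismatch described above can be sketched with a toy frequency-based predictor. This is a deliberately simplified illustration, not how any real recommender works; all names and data here are hypothetical.

```python
from collections import Counter

def predict_next(history, k=1):
    """Toy predictor: assume the user's next interest is whatever
    topic appeared most often in their past behaviour."""
    return [topic for topic, _ in Counter(history).most_common(k)]

# Past pattern: browsing dominated by travel (hypothetical data).
past_clicks = ["travel", "travel", "restaurants", "travel", "concerts"]
print(predict_next(past_clicks))  # -> ['travel']

# The surrounding system changes (say, a lockdown): actual interests
# shift, but the model, trained only on the past, keeps predicting
# the old pattern. It looks where we were going, not where we want to go.
actual_interest = "home-office gear"
print(predict_next(past_clicks)[0] == actual_interest)  # -> False
```

The point is structural: a model built purely on past frequencies has no way to register that the world generating those frequencies has changed.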

There are few easy answers to this. What it means is that any choice to use data gathered from the public internet, which increasingly accounts for the majority of what we see (and even what traditional media outlets see), is one that must be made mindfully. Researchers at the Brookings Institution refer to this as algorithmic hygiene. Add it to the list of practices, alongside wearing a mask and washing our hands.

Stay aware. If we are looking to make real innovations and changes that make a difference, we need to base them on something more than what Facebook or Google thinks we want to see.


You may not think about it this way, but your support of independent writing, through reading sites like this and others that are not algorithm-driven, contributes to the proliferation of new thinking and a diversity of ideas. Thank you for reading.
