Born Clippy
The final part of the "Born Clippy" series, trying to look at what the AI Clippy tsunami might tell us about the tech moment and movement we currently find ourselves swamped in.
Part 4 (of 4): Everything's Clippy
By Darragh Coakley
“We become what we behold. We shape our tools, and thereafter our tools shape us” – Marshall McLuhan
There is a pattern of thought behind all this.
How tech is marketed is ostensibly always about the future. Part of this is the standard marketing pitch and tone. But scratch that a bit and the deeper beliefs tend to reveal themselves.
I think that the veneration of the Steve Jobses, the Zuckerbergs, the Bill Gateses has convinced many in the tech world that they are ultimately infallible. That they are seeing the ones and zeros before anybody else.
If you convince yourself that you are the soothsayer, it creates a mindset fully unmoored from consequence or accountability.
This mindset inevitably bleeds into how one conducts oneself. How could it not?
In Ted Chiang’s brilliant piece, he notes the fear among the Silicon Valley illuminati (a fear long since discarded in the rush to jump on the AI $$$ train) about AI bringing about human extinction.
They argue that AI would do so through pursuit of a singular goal without thought of consequence and the impact on people or the world.
This scenario sounds absurd to most people, yet there are a surprising number of technologists who think it illustrates a real danger. Why? Perhaps it’s because they’re already accustomed to entities that operate this way: Silicon Valley tech companies.
They analyse these tools only in the way in which they themselves would use them.
You can see this mindset even in the shibboleths these monolithic, edgelord businesses bandy about.
“Move fast and break things”
“Be impatient - it will create the progress the world needs”.
To me, that sounds like a recipe only for bad, rushed decisions and broken things.
What we see now, in the era of enshittification, is that this worldview has bled into the products, the systems, the processes. Everything.
What also becomes clearer and clearer is how these mantras, this mindset, forever forward-looking, can learn nothing at all from past experiences. The same issues that beset Clippy beset these new “helpers.” In genuine human terms, the alleged glory of new tech does not - and will not - compensate for less usable, less effective software.
The accusation thrown about for anyone cautious of “AI everywhere for everything” is that they are a Luddite.
That's maybe not inaccurate. The Luddites were not opposed to technology itself. They just didn't want technological unemployment on a mass scale. They didn't want their lives and the lives of their families to be cast aside in the name of profit for a select few.
And for that they were ultimately subjected to brutal oppression and mass execution.
To quote my man Byron, speaking of the treatment of the Luddites:
"but never, under the most despotic of infidel governments, did I behold such squalid wretchedness as I have seen since my return, in the very heart of a Christian country".
The plight of the Luddites was not a tech issue - it was a human issue.
Our glorious techlord aristocracy have a common idiom, which I’ve seen AI evangelists in the wild echo - that if you don’t know the tech, your take doesn’t matter.
This is a crappy strawman argument, dressed as logic. Tech issues - in the real world - cannot exist absent human issues.
Perhaps I’m writing myself out of a gig, but for all the writing, reading, presenting, etc. that I do on AI, I know comparatively very little about it compared to many others. And the deeper I go, the more aware I am of how little I know.
What I do feel that I can bring to the table is some critical analysis, some consideration for people, some awareness of how tech fits within a larger process, etc.
A background in edtech serves you well for this.
So when people tell you that “you don’t know or understand AI”, that may be true from a purely technical perspective.
But that doesn’t mean that you don’t have critical thinking, common sense, consideration for actual people.
I mean, Jesus, given the times we find ourselves in, somebody somewhere most definitely should have these things.
There’s not enough of them these days.
As with the Luddites, what should be obvious in all of this is just how little interest there is in people. In you.
Not even in a managerial or a humanitarian sense, but interest in people as customers. As where the profit comes from. As whom the product or service is ostensibly meant for. As who gives meaning to it.
It don’t matter if you - the user, the customer - don’t like it. This should be clear in just how few qualms there seem to be in throwing people aside. Users are no longer a factor in the processes. That may, however, be more deliberate than accidental.
With more AI everywhere and more databases needing data and more models needing training, your data is a resource. You’re less of a person to be engaged with, more of a crop to be harvested.
When the next AI helper asks you for your info, remember that, historically, harvesting rarely works out well for whatever or whoever is getting harvested.
Maybe some of that lack of concern about what users do or don't - or care about or don’t care about - is based on pure vibes. The vibes that we're all already locked in.
Because the explanation from those selling to you as to why this is necessary, why this is good, why this is better, is and forever will be "Because it is. Because it's the future."
Don't believe what your human eyes and ears tell you. Don't believe your subjective experience. This is better because it's the latest tech. And the tech (however it feels to you, whatever it does to you) is inevitable.
If you want a vision of the future, behold the Tesla sat on the front lawn of the White House. Behold the leader of the free world elucidate:
"Everything's computer! That's beautiful. Wow"
Entirely befuddled by what he's looking at, what he's encased in. He can only sell it on the basis that he's been told it's good. It's the future. It's inevitable. There's just no other choice.
Everything's computer. Everything's Clippy.
Underworld - Born Slippy
About the author
Darragh Coakley is a Technical Officer with the Department of Technology Enhanced Learning (TEL) at Munster Technological University (MTU), who has been working in the digital education and higher education space for over 17 years, across a range of roles, contexts and environments. He is an Advance HE Senior Fellow with a BA (Hons) in creative digital media, an MA in e-learning design and development, a level 8 certificate in designing innovative services, and a level 8 certificate in cyberpsychology.