Once Upon a Glitch

AI doesn’t think like us—it predicts the next word based on patterns from vast text data. This can create familiar yet surprising results, producing stories that resonate or veer into the absurd, without true understanding or consciousness behind them.

 


written by jaron summers (c) 2025

We’re told that AI has no consciousness. Maybe that’s true. Maybe it’s just really good at pretending — like a cat who knocks your stuff off the table and then acts like you’re the problem¹.

But here’s the wild part: AI appears to think and reason — but it’s more like a talented parrot with a billion books inside its head and no idea what a lake is.

AI absorbs everything it can get its digital hands on, and when you give it a prompt, it guesses what comes next. Over and over. Really fast. Until it builds a paragraph, or a story, or a love letter to your houseplants².
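That "guess the next word, over and over, really fast" loop can be sketched in a few lines. This is a toy bigram model — counts of which word follows which — not a real transformer, and the tiny corpus is made up, but the generation loop is the same idea:

```python
import random
from collections import Counter, defaultdict

# A made-up mini-corpus; a real model trains on billions of words.
corpus = (
    "once upon a time there was a lake . "
    "once upon a time there was a cat . "
    "jack ran to the lake ."
).split()

# Count which word follows which.
following = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    following[word][nxt] += 1

def generate(start, n=8, seed=0):
    """Repeatedly guess the next word until we have n more words."""
    random.seed(seed)
    words = [start]
    for _ in range(n):
        counts = following[words[-1]]
        if not counts:
            break
        # Pick the next word in proportion to how often it followed this one.
        choices, weights = zip(*counts.items())
        words.append(random.choices(choices, weights=weights)[0])
    return " ".join(words)

print(generate("jack", n=4))  # → jack ran to the lake
```

No understanding anywhere in there — just tallies and dice. Scale the tallies up by a few billion and the dice start sounding like an author.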

This means it can write a novel that a huge number of people might love — not because it’s conscious or understands what it’s saying, but because it’s really, really good at predicting word patterns that feel “just right” to our brains. Patterns we’ve absorbed over years without even realizing it — like why “Once upon a time” just feels like the start of a story, or why “He turned around slowly…” makes us expect something dramatic³.

Sure, the AI might toss in some fresh, shiny new ideas — but those could just be the result of its unpredictable word-wizardry.

It’s not thinking outside the box; it’s just predicting what’s near the box… and sometimes it accidentally sets the box on fire in a fun way.

Take this sentence: “Jack ran to the cool liquid of the lake.” Solid. Feels like something you’d read in a novel where Jack is probably shirtless and emotionally complicated.

But let’s say the AI is feeling spicy. It sees the word “cool” and thinks, “Cool… cool… cool cat?” Now we’ve got: “Jack ran to the cool cat in the lake.”

Wait, what?

Exactly. Suddenly Jack’s sprinting toward a chilled feline just floating there like some mystical aquatic guru.

And now your story has gone off the rails into magical realism or absurd comedy — and you kinda love it.
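The "feeling spicy" part has a name: sampling temperature. Here's a minimal sketch with made-up scores for the words that might follow "Jack ran to the cool…" — low temperature plays it safe with "liquid," high temperature gives the cat a real shot:

```python
import math
import random

# Hypothetical next-word scores (invented for illustration).
scores = {"liquid": 3.0, "water": 2.5, "breeze": 1.5, "cat": 0.2}

def sample(scores, temperature, seed=0):
    """Softmax sampling: divide scores by temperature, exponentiate,
    then draw a word in proportion to the resulting weights."""
    random.seed(seed)
    weights = {w: math.exp(s / temperature) for w, s in scores.items()}
    r = random.random() * sum(weights.values())
    for word, weight in weights.items():
        r -= weight
        if r <= 0:
            return word
    return word

print(sample(scores, temperature=0.1))  # → liquid  (near-greedy, safe)
print(sample(scores, temperature=5.0))  # → cat     (spicy, chaotic)
```

Low temperature flattens the dice toward the top pick; high temperature makes the long shots live — which is exactly how Jack ends up sprinting toward a floating feline.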

So, maybe AI doesn’t think like we do. But by predicting words based on all the patterns it’s learned, it sometimes lands on brilliance — or chaos.

Or both.

Either way, it keeps things interesting.

Footnotes:

  1. Chalmers, David J. (1995). Facing up to the problem of consciousness. Journal of Consciousness Studies, 2(3), 200–219.
    — Chalmers makes the distinction between systems that simulate intelligent behavior and systems that have actual subjective experience — a key issue when we talk about AI and consciousness.

  2. Radford, Alec et al. (2019). Language Models are Unsupervised Multitask Learners. OpenAI.
    — Describes the architecture and behavior of GPT models, which are trained by predicting the next token (word or character) in massive datasets.

  3. Goldberg, Adele E. (2006). Constructions at Work: The Nature of Generalization in Language. Oxford University Press.
    — Explores how humans subconsciously learn and generalize language patterns, many of which are mirrored (though unconsciously) by language models.

 

Jaron Summers wrote dozens of primetime television and radio programs, including those for HBO, CBS, ACCESS TV and CBC. He conceived the TV and Film Institute of Canada. Funded by the University of Alberta and ITV, Jaron ran the Institute for 12 years, donating his services for a decade.
