Models in Motion

June 2, 2025
The line was thin, but their voice was bright. “We found a paying gig in a diner on the Pacific coast.” They were in British Columbia, hiking through forests and working on farms for room and board as they made their way south. They were unsure but determined, and excited. I was worried, though I tried not to show it. I found myself scrambling a little, reaching for the lessons I had learned in my work as an advisor – listening first, framing choices, guiding gently – hoping I could apply those same patterns to parenting my adult children.
Until recently, they had always lived half in the digital world. As a teenager, they explored video games and social media, following bright threads of knowledge and story wherever they led. They spent hours learning history and philosophy from YouTube channels like Crash Course and Extra Credits. They knew more about the intricacies of Persian imperial politics in late antiquity than most college students. They seemed restless, yet quiet and content.
After high school, something shifted. They moved in and out of college programs, each step a recalibration: construction, then information technology, then game design. I missed some of these shifts. I wondered why they chose construction before returning to technology, or if it even mattered.
A few months ago, they quit college altogether and dreamed up a plan to stake a claim out West; not long after, they moved out. They were twenty-three.
In cybersecurity, we talk about concept drift. Our detection systems, trained on yesterday’s patterns, start to slip when the world changes. An attacker finds a new trick, and our AI models, built on old data, fail to see it. We retrain them to see the new patterns because we must. The threat doesn’t stand still.
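If you want to see the idea rather than take it on faith, here is a minimal sketch in Python. Everything in it is invented for illustration – synthetic data, a single “traffic” feature standing in for real telemetry, an off-the-shelf scikit-learn classifier – but the shape is the real thing: a model trained on yesterday’s attack pattern degrades when that pattern drifts, and recovers only after retraining.

```python
# A minimal, purely illustrative sketch of concept drift: a classifier
# trained on yesterday's traffic slips when the pattern shifts, and
# recovers only after retraining. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

def traffic(n, malicious_mean):
    """Synthetic 'network traffic': one feature, label 1 = malicious."""
    benign = rng.normal(0.0, 1.0, n)
    malicious = rng.normal(malicious_mean, 1.0, n)
    X = np.concatenate([benign, malicious]).reshape(-1, 1)
    y = np.concatenate([np.zeros(n), np.ones(n)])
    return X, y

# Yesterday: attacks look one way; the model learns that pattern.
X_old, y_old = traffic(500, malicious_mean=3.0)
model = LogisticRegression().fit(X_old, y_old)

# Today: the attacker changes tactics -- the distribution drifts,
# and accuracy collapses to roughly a coin flip.
X_new, y_new = traffic(500, malicious_mean=-3.0)
print("after drift:     ", accuracy_score(y_new, model.predict(X_new)))

# Retrain on recent data; the model sees the new pattern again.
model.fit(X_new, y_new)
print("after retraining:", accuracy_score(y_new, model.predict(X_new)))
```

In production the loop is continuous and monitored rather than a one-off script, but the lesson is the same: the model only sees what its training data taught it to see.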
But concept drift isn’t just a problem for algorithms – it’s a human problem, too. We build mental models of our children, our partners, our colleagues, and ourselves. If we don’t keep updating those models, if we don’t stay open to new data, we lose sight of what’s really there.
AI is good at analyzing data, at finding patterns we can define and label. But for all the hype around the capabilities of AI, humans see more. We pull information from countless sources an algorithm would miss – hints, silences, the way someone hesitates on a word. We read the data of our own feelings, too, though it can be messy. Antonio Damasio, a neuroscientist who studied the deep ties between emotion and reason, argued that emotions are not separate from intelligence, but part of how we make sense of the world. They’re another way of seeing.
So we, too, must retrain. Not once, but always. My youngest is not who they were two years ago. I’m not the same father I was then. In a way, that’s the real lesson of concept drift: nothing stays still. Models grow stale. So do we, if we’re not careful.
When the call ended, I sat quietly for a moment. I thought of them hiking through forests, testing their own models of what the world could be. I thought of the work ahead of me as a parent, as an advisor, as a leader. In the office, as at home, the world keeps shifting: new challenges, new processes, new ways of seeing. I thought of the patterns I had missed and the new ones I still needed to see.
In our work, in our lives, the world keeps moving. Let’s keep updating our models, too.