The Inheritance 2040

Her First Breath, My Last Someday
When my first daughter took her first breath, my immediate thought was: I'm going to die one day.
Not from fear. From recognition.
This person would have to navigate life without me eventually. The yin and yang of it struck me in that moment: I watched her inhale and acknowledged that I would one day exhale for the last time. The same epiphany struck when my second daughter was born.
In both instances, the question that followed wasn't morbid. It was practical: What do I need to teach them so they can stand on their own when I'm gone?
Six years later, as AI reshapes what education means, that question has evolved into something deeper: What does it mean to be "educated" in a world that AI will fundamentally change?
Promises and Prerequisites
Google's vision is compelling. Their recent paper on AI and the future of learning argues that AI can unlock the power of learning science—sparking active participation, encouraging deep practice, and enabling personalized tutoring at scale. The technology can adapt to a student's needs, provide tailored feedback, and serve as an always-available, non-judgmental tutor.
I've experienced this firsthand. When building my personal website, I used Claude as an "à la carte" educational tool. I asked it to create a custom frontend development learning plan with the end goal of building in Next.js. It generated interactive tutorials on HTML, JavaScript, and React—ensuring I understood the foundations before moving forward. The system worked.
But here's what Google's framework assumes: that people want to learn.
Andrej Karpathy made an observation that cuts to the heart of this: going to school is like going to the gym. The equipment is there. The trainers are available. The programs are designed. But watch what happens every January—people flood in with New Year's resolutions, then fade by spring.
The promise of AI education is real. But the prerequisite isn't access to the tool—it's the internal drive to use it properly. Google's paper acknowledges this, noting that "the students who most productively engage with AI will be those that are already highly motivated students." They call it the 5% problem.
The technology can personalize the curriculum, adapt to your pace, identify your gaps. But it cannot give you the discipline to show up when the learning gets hard. That happens when no one is watching.
What Remains Human
Mustafa Suleyman, in his vision for Humanist Superintelligence, argues for AI that is "problem-oriented and tends towards the domain specific." Not unbounded intelligence pursuing capability for its own sake, but bounded systems designed to serve human needs within defined limits.
This resonates with my work as a systems engineer. The nature of large language models is that they're not deterministic—outputs vary, context matters, edge cases emerge. So we bound them: quality gates, human-in-the-loop validation, explicit constraints. Not because the AI can't do more, but because the constraints make it useful.
But some domains resist automation entirely. Not because AI lacks capability, but because the human interaction is the point.
In my work, requirements elicitation requires people-to-people interaction. You can't validate that you're solving a real problem without sitting across from the customer—reading their body language, hearing their tone, understanding what they're not saying, and identifying their pain points. AI can help decompose and analyze requirements, but the act of understanding what someone actually needs—that's relational.
The same principle applies to education. Body language, tone of voice, facial expressions—these augment learning in ways AI cannot replicate. Conflict resolution, soft skills, navigating ambiguity—these are learned through experience, not tutorials. The encouraging nod I give when my oldest struggles through a math problem. The look I give when she can read a book but doesn't want to. These moments shape learners in ways that transcend information transfer.
I think about this constantly with my daughters. I'm introducing my oldest to AI tools now, in controlled ways, building her awareness of what these systems can do. My bet is that when they come of age and AI is infused into everything, they'll still look to me for guidance—precisely because I was the one who taught them how to think about these tools in the first place.
But more than that, they'll have seen me learning—watched me struggle with challenges and work out when I didn't feel like it.
AI tutors can be infinitely patient and always available. But they can't model what discipline looks like. That's still my job.
The Transition We're Already Living
Karpathy draws an instructive distinction between pre-AGI and post-AGI education. Pre-AGI education is "useful"—instrumental, career-oriented, credentialing. Post-AGI education is "fun"—intrinsic, curiosity-driven, pursued for its own sake.
The transition from pre-AGI to post-AGI education isn't technological—it's psychological. And it raises questions about equity.
The risk isn't that AI education becomes too expensive for some to access. The risk is that it becomes too easy to ignore. Some people will thrive with AI tutors—the intrinsically motivated, the curious, the disciplined. Others will sign up and fade out, just like those New Year's gym resolutions.
I don't know how to solve that. Part of me says that people need the latitude to fail or prosper by their own volition. The human experience includes both the thrill of victory and the agony of defeat, and some of the best lessons come through failure. But I also believe access matters—that everyone deserves the opportunity to engage with these tools meaningfully and safely.
What I do know is that you can provide access, but you cannot provide motivation. That has to come from within.
But this internal drive isn't just about the discipline to learn new skills; it's also about the wisdom to recognize our limits. In an era where AI offers infinite speed and information, I’ve found myself drawn not toward acceleration, but toward the one thing technology cannot optimize: the scarcity of my own time.
The Urgency to Slow Down
Neil deGrasse Tyson makes a case that death gives meaning to life. If we lived forever, he argues, we'd never feel urgency about anything. Mortality creates scarcity, and scarcity creates value.
I've felt this. But the urgency I experience isn't to learn more or achieve more. It's to slow down.
In my twenties, I wanted to maximize every hour. Now, entering middle age, I want to be present for the hours I have. My kids are growing fast. My parents are getting older. I see them a handful of times a year—each visit feels weighted with the knowledge that the number remaining is finite.
AI promises infinite learning. Personalized tutors available around the clock. Any skill accessible on demand. But I've chosen to constrain my learning—to play to my strengths, follow genuine interests, and let the rest go. Not because I couldn't learn more. Because I want to be present for what matters. Wisdom isn't knowing everything—it's knowing what to ignore so you can focus on what counts.
AI amplifies some aspects of growth while leaving others unchanged. Skills can be learned faster. But health still requires showing up—you can't outsource the workout. Relationships remain human-to-human and habits stay foundational. Progressive overload applies to the mind just as it does to the body: small, consistent effort over time builds capacity. The reps still matter.
The Inheritance
Six years came and went. My girls won't remember the moment they took their first breath, but I'll never forget it.
I think about the world they’ll inherit when they come of age—a world where AI tutors are ubiquitous, where personalized learning is assumed, where the question isn't "Can I access this knowledge?" but "Do I have the discipline to pursue it?"
The inheritance I'm building isn't knowledge—they’ll have access to more than I could ever accumulate. It's not credentials—those will be transformed in ways we can't predict. It's not even specialized skills—those are becoming obsolete faster every year.
What I'm leaving them is the discipline to keep learning when it's hard, the wisdom to choose what matters, and the presence to be fully there for the people they love. That's the inheritance.
Champions are made when no one is watching.
In a world where AI can teach anyone anything, the real question isn't what you can learn. It's what you'll choose to become.

What will you choose to become? Reach out at mike@mikescorner.io