I am betting on education
How will I change edtech?
I'm betting on edtech right now.
While the cowards run away from it because of its graveyard of startups, the opportunity has never been bigger.
AI fundamentally changes our way of doing any type of intellectual task, and few people are prepared to do understanding-work rather than knowledge-work.
Up until recently, most cognitive work done by humans consisted of applying existing mental models to familiar problems over and over again. Engineers applied physics models to machines. Lawyers applied legal frameworks to cases. Accountants applied financial rules to transactions. The work wasn’t necessarily simple, but structurally it was the repeated application of known models.
This also meant the system wasn’t very scalable. If you wanted more output, you needed more humans trained in roughly the same knowledge space to apply those models. The bottleneck was human cognitive throughput.
Now something similar to what happened with manual labour is happening to knowledge work. Machines once automated away and scaled physical effort. AI systems are beginning to automate and scale the application of mental models.
So the natural question becomes: what remains?
Historically, when a layer of work becomes automated, humans move one level of abstraction higher. We’ve seen this pattern before. In programming, we moved from binary to assembly to higher-level paradigms like object-oriented programming. Each step abstracted away the lower layer and let humans operate at a higher level of reasoning.
The same thing happened with labour more broadly. We transitioned from doing physical tasks to designing and coordinating the application of those tasks—what eventually became knowledge work.
Understanding-work is simply the next level of abstraction.
Instead of repeatedly applying existing mental models, the role shifts toward building, refining, and discovering the models themselves. The work is no longer primarily about using frameworks, but about figuring out why those frameworks work, where they break, and what new ones might replace them.
You may argue that AI will automate that away too, and you'd be right. But what it cannot automate afterwards (for the next few years at least) is the decision of what to pursue. That decision requires an optimising function—a source of meaning—and that higher-level meaning will be set by humans.
The question will shift to “What do we focus on and why?”.
Even if only a small fraction of people are doing that work, look at Florence. Look at Ancient Greece. People move toward art, creativity, experiences. But to experience those things fully (at the highest level) requires understanding the problem space. Understanding your surroundings at a core level. Either way you cut it, understanding sits at the pinnacle of human experience.
Feynman put it best: the joy of seeing things. A scientist can see the art and beauty in the stars better than most artists today, simply because he understands them better. Not every scientist. But someone at the intersection.
So if "total understanding of the world" is what humans will ultimately strive for - and since it's unattainable - I'm not worried we'll lose our meaning to machines. The real question: how do we teach understanding?
You don't. Not in a classroom. School is actually the opposite of preparation for a post-AGI era.
Understanding comes from interacting with the environment. From curiosity. From digging deep and connecting datapoints. It's grasping a concept so well it becomes intuition, where you barely need to rationalise it anymore. How do you get there? By getting rid of the teacher/student dynamic in the first place.
An AI system is the closest thing to a perpetual learner we've ever had. It isn't just a student; it's a teacher too. And as a human, to build real understanding, you have to play both roles. Evaluating where you are. Testing yourself against reality. Developing intuition for what you don't know you don't know. That's the playbook.
Edtech is doing the exact opposite. It's dumping more content onto students as if we didn't already have too much. What's actually needed are tools that give students more agency. Tools that reduce the latency between action and feedback, where the feedback comes from nature or is derived from its laws. Tools to build with. Tools that help students create mental models, test them, and iterate.
That's what I'm building.