ALEKSI PARTANEN: The Unicorn Paradox
Why the 'polished response' is a trap, and how Estonia can avoid Finland’s slide into mediocre innovation.
Last weekend, while having breakfast at a hotel in London, I did something unusual: I read a physical newspaper. It was the Financial Times Weekend edition, the kind that leaves black ink on your fingers and makes you feel like you’re participating in a dying ritual.
Inside was a brilliant essay by Brendan McCord called “The Answer Machines.” I’ve been turning it over in my head ever since. His argument is that we are entering an era where the “polished response” — the perfect essay, the correct answer, the well-structured argument — is becoming a commodity. AI produces it now. The very skill we have spent centuries teaching humans to master, machines now perform faster and arguably better — at least if you’re measuring by “polish.”
I got so excited that I started reading excerpts aloud to my wife. She listened patiently before reminding me that she had recommended the piece to me in the first place. Noniin.
On the flight back to Tallinn, I couldn’t stop thinking about this. In my world — venture capital, tech, and startups — the atmosphere around AI has an intensity that occasionally slips into the religious. You’re either on the train or you’re roadkill. To ask, “What if we’re getting this wrong?” is to mark yourself as a laggard who doesn’t grasp the future. It is exhausting.
To be clear: I am not arguing against AI in schools. I am no romantic who thinks we should return to pen and paper. One in four Estonians already has an AI account; the tools are already in the students’ pockets. That ship has sailed — and it should. My argument is narrower: we are focusing on the wrong thing. Everyone is talking about the tools — how to build them, how to scale them, and who can do it fastest. Almost nobody is talking about the teaching.
I have seen what happens when you get that balance wrong. I’ve seen it because I’m Finnish. I went through my academic path in Finland just before the floor fell out.
The Finnish Disaster
For context, between 2006 and 2012, Finland ranked first in the European PISA tables. We dominated the ranking. Delegations arrived from around the world to study our “miracle.” Education became an export product. Then we decided to modernize and “fix” a system that was not broken.
By 2015, we had slipped to Rank 2 — Estonia passed us. By 2022, we crashed to Rank 10. This wasn’t a decline; it was a collapse. We went from the best in Europe to mediocre in sixteen years.
What happened? We got intoxicated by “Phenomenon-Based Learning,” “self-directed education,” and “21st-century skills.” We restructured everything around ideas that sounded brilliant in policy documents and academic conferences. The theory was that students, given freedom and digital tools, would naturally direct their own learning toward deeper understanding. It was a very Nordic, very optimistic view of human nature. We repeated the exact changes that had previously decimated Swedish education, ignoring their warnings to stop.
Strong students did fine. Strong students usually do — you could teach them in a barn, and they’d figure it out. But the results of weaker students — those who required structure, clear expectations, and the friction of a challenge — fell through the floor. The achievement gap exploded. We built a system optimised for the top 20% and called ourselves progressive.
I bring this up because Estonia is now exactly where Finland was in 2012: Rank 1 in Europe. And Estonia has just launched TI-Hüpe, the “AI Leap,” to put AI tools in every classroom. Coming from Finland, watching this is like seeing someone walk toward a cliff I’ve already fallen off.
Tiger Leap (The One That Worked)
Estonia has done this before, and they did it well.
In 1996, five years after independence from the Soviet Union, half the country barely had telephone access. Toomas Hendrik Ilves convinced the government to bet everything on technology. They called it Tiigrihüpe — Tiger Leap.
They built it on three pillars that people often forget. Yes, they put computers and the internet in every school by 2001. But they also trained 4,000 teachers in 40-hour programs in the first year alone. They created native-language courseware and a platform for teachers to share materials.
The technology was the visible part, but the investment in teaching was what actually mattered.
Tiger Leap worked because computers are tools. When a student uses a search engine, they are still doing the hard work: evaluating credibility, synthesising sources, and forming conclusions. As Steve Jobs said, the computer is a “bicycle for the mind.” It gets you there faster, but you still have to pedal.
AI is not a bicycle. AI is a self-driving Tesla. You can keep your hands on the wheel for the “feel” of it, but the car doesn’t actually need you. Eventually, you stop looking at the road.
The Trade-off
Ivo Visak, who runs TI-Hüpe, recently noted that 95% of Estonian secondary school students already use AI heavily. In his words, “they’ve delegated their thinking almost entirely to the machine.”
He asked the defining question of our era: “Why would a student choose a tool that helps them learn, when there’s another tool that just gives them the answer instantly?”
The answer is: they won’t. If someone had offered fifteen-year-old me a machine that did my homework perfectly, I wouldn’t have said, “No thanks, I prefer the learning process.” I would have pressed the button.
The data support this pessimism. A recent MIT Media Lab study comparing students’ writing with AI versus writing alone found a troubling divergence: the AI group produced better essays but displayed lower brain connectivity and worse retention. We are trading neural pathways for paragraphs.
The Investor Angle: Why I Care
I am a venture capitalist. I look at founders and decide whether to back their vision from the idea stage onward. Why do I care about PISA scores?
Because PISA scores in 2022 are founder quality in 2026.
Estonia produces a strange density of successful startups — Skype, Wise, Bolt, Pipedrive. More unicorns per capita than anywhere else in Europe. This is why our venture fund is here, and why I moved my family here. These companies weren’t built by people who looked for shortcuts. Taavet Hinrikus didn’t disrupt international banking because he found a “polished response.” He disrupted it by sitting with a hard problem — why is this slow, why is this expensive? — and spending years building a solution.
In education, there is a concept called “desirable difficulty.” The struggle of being stuck and frustrated is not a bug; it’s a feature. It builds the psychological capacity to tolerate uncertainty — the exact capacity required to start a company.
If we remove the difficulty, we remove the growth. A hint of our current direction is hidden in the PISA 2022 report. Finland, the former champion, reported that only 37.5% of pupils are ready to put in additional effort when work becomes challenging — the lowest among all participating OECD countries.
Teachers Are the Gold Reserve
Estonia’s Education Minister, Kristina Kallas, calls teachers the “absolute epicenter” and the “gold reserve.” In Estonia, teachers helped build the state after independence; they were founders, not just employees. That cultural weight is our greatest advantage.
The original Tiger Leap invested in teachers. The AI Leap must do the same, but the task is harder. In 1997, a computer couldn’t answer questions about the French Revolution; it just helped you find the books. Today, AI can write a better essay on the Bastille than most researchers.
The teacher’s job is no longer to transmit knowledge — machines have commoditised that. The job now is to teach students to think critically about what the machine spits out. It’s about evaluating whether a student actually understands a concept or has simply learned to prompt well.
This requires a complete pedagogical shift. We need to move toward “expensive” assessment: oral examinations, defending ideas in person, and watching a student’s face while they think. The cheap way of measuring learning is broken forever.
Normaalne
Estonians have the word normaalne — “normal.” It carries a baseline expectation that things should simply work properly and that competence can be assumed. I’ve said before that if other countries implemented even 5% of the Estonian “normaalne” approach, they’d see 10X improvements.
The question is whether we can apply normaalne to AI. Can we do the boring, difficult work of training teachers and redesigning assessments rather than just rushing to look innovative?
The risk isn’t that AI will destroy education overnight. The risk is more subtle: we put the technology in, congratulate ourselves on being forward-thinking, and discover a decade later that we’ve produced a generation with shiny output but no independent capacity to think.
Finland’s slide was the result of a thousand “reasonable” decisions that prioritized student autonomy over structure. By 2022, the “extra mile” had become a bridge too far for nearly two-thirds of Finnish students.
Estonia can learn from this. But learning means accepting that AI isn’t the point. The teachers are the point. The difficulty is the point. AI is like the weather — it’s here whether we like it or not. But we must decide if we are teaching our children to build shelters or just giving them an umbrella and telling them it’ll never stop raining.
Estonia has surprised the world before. For the sake of the next generation of founders, I hope we do it again.
Eesti 2030: A New Self-Determination
By Sten Tamkivi & Daniel Vaarik
Startups are growing ten times faster than the rest of the economy, but what does the super-efficiency of artificial intelligence mean for the Estonian state?
KAREN K BURNS: Estonia Deserves Better Than This AI Advisory Committee
Last week, the Estonian Prime Minister announced the formation of an AI advisory committee to guide government policy on artificial intelligence.
The composition? Five men: three entrepreneurs, one venture capitalist, and one IT figure whose relevance peaked years ago. No member has deep experience in building or researching AI systems. Not one woman. Not one ethicist, sociologist, legal scholar, or representative from the vulnerable communities this technology will profoundly affect.