What Do You Do While You Measure?
Building got fast. Learning didn’t.
I started Saturday morning with 18 API tools for a product I’m building. By the end of the day, it was 32. All meaningful additions, all covering important questions like how you track web analytics when there isn’t really a website, and what “active usage” means for something that lives entirely inside other people’s AI tools. A full product sprint compressed into hours.
I wrote about it, shared it, optimised for SEO and AEO, started the distribution work. But whether any of it matters is a question that takes weeks to answer, not hours. The building is done. The learning hasn’t started.
MVP was a cost discipline
The Lean Startup loop solved for two things at once. It saved you from wasting money on building the wrong thing, and it helped you isolate what was working by keeping the product small enough to generate clean signal. Build the minimum, measure, learn, iterate. Both the cost discipline and the signal discipline were real.
The cost discipline is dead. The code, the hosting, the marketing content are all more or less free now. When a landing page takes two hours and the full product takes a day, why test with the landing page? Why would someone sign up for something that doesn’t work yet when you could just show them the real thing?
The signal discipline is harder to dismiss. A smaller product is easier for users to understand and easier for you to measure. Build everything and when someone bounces, you don’t know which part failed. But in practice, I think the balance has shifted. If you have taste and a clear idea, a complete product demonstrates value in a way that a landing page or a half-built prototype can’t.
But the code getting cheap doesn’t mean the product got easy. The thinking, what to build, who it’s for, why it matters, is still the hard part. Taste still matters. You just express it in hours instead of months.
The time wall
Here’s what the Lean Startup model never had to deal with: a world where building takes days and measuring takes weeks.
The loop was designed for a world where all three phases ran at roughly the same pace. The build phase naturally throttled the whole cycle. By the time you’d finished building, you’d usually absorbed some learning from the previous round. The phases overlapped, and the rhythm felt balanced.
Now one leg takes a day and the other two take weeks. A blog post takes six weeks to rank. A feature takes weeks to generate usage data. Word of mouth takes longer. You finish building on Tuesday and you’re standing there, complete product in hand, waiting for signal that arrives on the market’s schedule, not yours. The loop is lopsided, and the forcing function that used to pace it is gone.
The people building fastest can measure least
There’s a cruel dimension to this. At 150 visits a week, even perfect instrumentation doesn’t help. The data is too thin to learn from quickly. You need weeks or months to see any meaningful pattern. At scale, tens of thousands of visits, you can see statistically significant differences within days. You make a change and you know whether it worked.
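The arithmetic behind that gap is worth making concrete. A rough sketch, using the standard two-proportion sample-size formula with illustrative numbers I’ve chosen (a 3% baseline conversion rate and a hoped-for lift to 4%, 95% confidence, 80% power — none of these figures are from the post itself):

```python
import math

def sample_size_per_arm(p1, p2):
    """Approximate visitors needed per variant to detect a shift
    from conversion rate p1 to p2, via the normal-approximation
    two-proportion formula (alpha=0.05 two-sided, power=0.80)."""
    z_a, z_b = 1.96, 0.84          # z-scores for alpha and power
    p_bar = (p1 + p2) / 2
    num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p2 - p1) ** 2)

n = sample_size_per_arm(0.03, 0.04)  # detect a 3% -> 4% lift
total = 2 * n                        # both variants need traffic

print(f"visitors needed: {total}")
print(f"at   150 visits/week: {total / 150:.0f} weeks")
print(f"at 20000 visits/week: {total / 20000 * 7:.1f} days")
```

With these assumed numbers you need on the order of ten thousand visitors: more than a year of waiting at 150 visits a week, under a week at 20,000. The exact figures move with the rates you pick, but the shape of the conclusion doesn’t.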
The measurement gap is widest for the people building fastest. Solo founders, small teams, greenfield products. The ones most excited about the compressed build cycle are the ones least able to measure anything, because they don’t have the traffic, the users, or the sample size to generate signal quickly. The people who need to learn most urgently can learn least quickly.
The only risk left isn’t technical. Technical risk is near zero for most products. What remains is market resonance risk, and you can’t test for that without a market showing up.
What you do in the gap
So what do you do while you wait? Every instinct you have leads to a version of the same problem.
Build more features. You have no signal telling you which ones matter. You’re stacking decisions on top of unvalidated assumptions, but each individual bet costs so little that it feels rational. It’s the old hiding-in-the-workshop problem, just at 10x speed.
Switch to marketing and distribution. Aggressive outbound is faster than SEO. You can record demos, DM founders, post in communities, and force a response within days instead of waiting six weeks for a blog post to rank. But even aggressive distribution compresses into a couple of days of work, and then you’re waiting again, this time for people to check their DMs, try the tool, and form an opinion. The gap shrinks but it doesn’t close. You get qualitative signal faster (“it broke on Windows,” “does this work in Cursor?”), which is valuable, but it’s still not the quantitative signal that tells you whether the product has legs.
Build a different product. I try to keep my focus on marketing, but I can’t stop myself from building new tools. A founder I know gave the same answer when I asked what he does with the freed-up time: “building brand new products that may never be released.” Now you have two products waiting for signal instead of one.
Idea creep
There’s a subtler trap than building too many features. When building is cheap, you don’t just add more knobs to the radio. You start building a radio that’s also a toaster and a fitness tracker, because it only took an extra hour.
Feature creep adds more to the same product. Idea creep changes what the product actually is, because the marginal cost of another concept is near zero. When building was expensive, you were forced to be an essentialist. When building is cheap, you have to be a disciplined curator, and that’s harder because nobody’s stopping you.
Idea creep does three things that feature creep doesn’t. It destroys your marketing clarity, because if the product covers three ideas, you don’t know what the headline should be, and a confused market gives you silence that tells you nothing. It explodes the mental context you have to hold, because three ideas means three personas, three feedback loops, three timelines running simultaneously, even if AI writes the code. And it muddies measurement permanently. A clean idea gives you clean signal, even if that signal takes weeks. A blended product gives you ambiguous signal forever. You never know if the idea failed or the packaging failed.
My 32 API tools were all deepening the same premise: analytics for products where there’s no website to analyse. Traffic breakdowns, conversion attribution, actor retention, event tracking, all answering the same core question from different angles. A user never sees 32 buttons on a dashboard, an AI agent just picks the right tool for the question being asked. That’s feature depth, not idea creep. The distinction matters, because one sharpens the signal you’re waiting for and the other destroys it.
Talk to 2,000 customers every year?
The old customer discovery rhythm was find five customers, have the conversations, take the feedback, build for two months, find five more. The build phase naturally paced the discovery. You had time between conversations to absorb what you’d learned and act on it before the next round.
Now the build phase is a day. To keep the loop balanced, you’d need to find five new customers every day. That’s the textbook answer — do things that don’t scale, talk to people, stay close to the problem.
It’s probably the right answer. But most people aren’t going to do it, and most people who’ve built successful products historically didn’t do it at that intensity either. They had an audience, or they got lucky with timing, or they just kept building until something stuck.
The new discipline
The old discipline was about restraining what you build. Now building is cheap, so it’s rational to just build the thing.
The new discipline is about what you do after you build. You don’t build to learn. You distribute to learn. And the speed of that learning has a ceiling the market sets, not you. No amount of AI-assisted building changes how fast a customer decides to try your product, or how long it takes for word of mouth to travel.
You can build a product in a day. Learning whether it matters still takes as long as it always did. And the only thing that actually shortens the gap is direct, unscalable contact with the people you’re building for. Concierge onboarding, personal demos, jumping on calls to debug someone’s specific setup. The time you saved on building has to go somewhere, and the place it has the most impact is the most human part of the work.
Turns out the human touch still has a place in the AI age.
