I get the question at least twice a week. Sometimes in a WeWork elevator, sometimes over $80 cocktails in Lan Kwai Fong, sometimes in a desperate 2 AM WhatsApp voice note. "James, I have this idea. Can you look at it and tell me if it's viable?"
I used to try to answer directly. "The market seems crowded." "The unit economics look thin." But those are opinions, not diagnostics. After enough burned fingers—mine and other people's—I've realized that viability isn't a feeling. It's a math problem with three variables.
I found the cleanest articulation of this in an unlikely place: Stanford's 423-page 2026 AI Index Report. Everyone else was obsessing over model benchmarks and data center power consumption. I was reading the methodology appendix, because I'm broken that way. Their analytical framework for judging the AI industry turned out to be a universal truth for judging any industry.
Here are the three filters. Run your idea through them. Most don't make it past the second.
Filter One: Penetration Rate (How Much Friction Is Left?)
Penetration rate isn't just "how many people use this?" It's how much friction is required to make someone use this daily?
Here's a counterintuitive fact from the Stanford report: the US—home of OpenAI, Google, Anthropic, and basically the entire AI industry—ranks 24th globally in generative AI penetration. Twenty-fourth. At 28.3%.
Who's first? Singapore at 61%. The UAE at 54%. India and Indonesia close behind.
How does the country that invented the technology barely crack the top quarter of adoption?
Friction. Legacy infrastructure. Entrenched corporate interests. HIPAA, SOX, state-level privacy laws, procurement committees, compliance review boards. The US has a hundred years of institutional sediment that AI has to push through. Every enterprise sale requires six months of security review. Every hospital deployment needs legal sign-off. The technology is ready; the market isn't.
Meanwhile, the UAE appointed a Minister of State for AI in 2017—five years before ChatGPT existed. By 2025, AI was mandatory in primary schools. The government subsidized the friction of market education. They didn't just build the tech; they greased the rails.
The question for your idea: Are your potential customers already desperate for this? Or do they think "sounds interesting, but not right now"? If you're entering a low-penetration market, your primary expense isn't product development. It's market education cost. And market education is the most expensive thing in business. It's a bonfire of cash with no guaranteed ignition.
Most founders assume that because the technology is superior, adoption will follow. The US AI market proves that's a lie. Superiority without penetration is just a beautiful product in an empty room.
Filter Two: Value Capture (Who Keeps the Money?)
This is the cruel one.
Value capture asks: Of all the wealth your industry creates, how much actually flows to the people selling it?
Look at ChatGPT. Stanford estimates US users extract $172 billion in consumer surplus annually. That's the value of time saved, work automated, efficiency gained. But OpenAI's revenue? A fraction of that. Users capture $7,500 of annual value and pay $240 a year. The other $7,260 stays in their pocket.
Nobel laureate William Nordhaus found this pattern across the last fifty years of technology: innovators capture about 3% of the total value they create. The other 97% leaks to consumers.
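The arithmetic above is worth checking for yourself. A quick sketch using the article's own figures (the numbers below come straight from the text, not from any dataset):

```python
# Value-capture arithmetic, using the figures quoted above.
annual_value_to_user = 7_500   # estimated consumer surplus per user, USD/year
annual_price_paid = 240        # roughly $20/month, USD/year

surplus_kept_by_user = annual_value_to_user - annual_price_paid
capture_ratio = annual_price_paid / annual_value_to_user

print(f"Surplus kept by the user: ${surplus_kept_by_user:,}")
print(f"Innovator's capture ratio: {capture_ratio:.1%}")
```

The capture ratio works out to 3.2% of the value created, which is why the per-user numbers and Nordhaus's ~3% figure tell the same story.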
Why? Because people don't pay premiums for "better." They don't pay for "faster." They don't even pay for "saves me an hour a day"—because that hour is abstract. Maybe they just spent it scrolling Instagram. The value is blurry, so the price stays low.
But there's one exception. One type of value people pay absurd premiums for: liability and reliability.
If you sell an AI compliance tool that guarantees a bank won't get fined by regulators, you're not selling efficiency. You're selling insurance against disaster. The value isn't abstract—it maps directly to a concrete fear. A $10 million fine avoided is worth $500,000 a year in software fees. The math is obvious and urgent.
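One way to see why the math is "obvious and urgent" is to price the fee against the expected loss it removes. A minimal sketch; the fine and fee are from the example above, but the 5% annual probability is my illustrative assumption, not a figure from the report:

```python
# Pricing "insurance against disaster": the fee is cheap whenever it is
# below the expected annual loss it eliminates.
fine_if_caught = 10_000_000   # regulatory fine, USD (from the example above)
annual_fee = 500_000          # software fee, USD/year (from the example above)
p_fine_per_year = 0.05        # ASSUMED annual probability of a violation

expected_annual_loss = p_fine_per_year * fine_if_caught
print(f"Expected loss avoided: ${expected_annual_loss:,.0f} vs fee ${annual_fee:,}")

# Break-even probability: any perceived risk above this makes the fee rational.
break_even_p = annual_fee / fine_if_caught
print(f"Break-even probability: {break_even_p:.0%}")
```

At any perceived risk above 5% a year, the fee pays for itself. Buyers don't run this math on a whiteboard, but their fear does it for them.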
The question for your idea: Are you selling hope or fear? Hope is optional. People defer hope. They'll "think about it" and never return. Fear is mandatory. If your product prevents a specific, measurable disaster, you have pricing power. If it just makes life slightly better, prepare for a lifetime of churn and discounting.
Filter Three: Capital Density (Which War Are You Actually Fighting?)
Capital density measures where the money is flowing, and more importantly, what tier of the industry it's flowing to.
In 2025, global corporate AI investment doubled. Private investment grew 127%. Half of all private funding went to generative AI. The numbers look like a gold rush.
But 95% of that money went to infrastructure—compute, data centers, foundational models. The application layer is fighting for scraps.
Every gold rush has two battlefields, and most founders pick the wrong one:
The Infrastructure War: Requires infinite capital and infinite patience. Data centers, chip fabrication, foundational LLMs, EV battery plants. You need billions to play, and you need to survive years of losses before scale dilutes your costs. This is where governments and mega-corps fight.
The Application War: Requires cash flow and retention. You don't need heavy assets, but you need to answer two questions on day one: How do I acquire a customer profitably? And why do they buy from me a second time?
Amateur founders look at the billions flowing into OpenAI and think: "The money is flowing! I need to jump in!" Then they try to build a "platform" that serves every industry with a thin AI wrapper, essentially fighting an Infrastructure War with Application War capital. They die quickly and quietly.
The question for your idea: Which battlefield does your bank account allow you to fight on? If you have $50,000 and six months of runway, you cannot build infrastructure. You must build an application that generates revenue before the cash runs out. Know your tier, or you'll be the first casualty in someone else's war.
The Practical Test: AI Training Courses
Let's run a real 2026 idea through all three filters. I hear this constantly: "I'll sell AI training courses to corporate professionals. Everyone needs to learn AI. It's a no-brainer."
Penetration Rate: Has adoption become mandatory for this audience? No. Most professionals think: "AI is important, but my boss hasn't fired me yet. I'll wait." You're trying to educate a market that isn't in pain. Your customer acquisition cost will be brutal because you're selling vitamins, not medicine.
Value Capture: You're selling hope—staying relevant, future-proofing a career. But after two classes, the student thinks: "I learned some tools, but I don't know how to apply them to my actual job." The value is blurry. There's no immediate penalty for dropping out. Churn will eat you alive.
Capital Density: The founder thinks: "I'll build a platform to teach every role in every industry!" They're trying to fight an Infrastructure War (platform, content library, enterprise sales) with Application War capital (a course, a landing page, some ads). They'll run out of money before they prove unit economics.
Three filters, three failures. Not because the founder is stupid. Because the idea was fighting the wrong war, with the wrong value proposition, in a market that wasn't ready to pay for education.
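The walkthrough above is mechanical enough to write down. A toy sketch; the three questions are the article's, but the boolean encoding and the `Idea` structure are my illustration:

```python
from dataclasses import dataclass

@dataclass
class Idea:
    """Answers to the three filters, encoded as yes/no."""
    customers_already_in_pain: bool    # Filter 1: penetration / friction
    sells_fear_not_hope: bool          # Filter 2: value capture
    capital_matches_battlefield: bool  # Filter 3: capital density

def run_filters(idea: Idea) -> list[str]:
    """Return the filters the idea fails; an empty list means it survives."""
    failures = []
    if not idea.customers_already_in_pain:
        failures.append("Penetration: you are paying market-education cost")
    if not idea.sells_fear_not_hope:
        failures.append("Value capture: hope is deferrable, expect churn")
    if not idea.capital_matches_battlefield:
        failures.append("Capital density: fighting the wrong war")
    return failures

# The AI-training-course idea from the walkthrough fails all three:
course = Idea(False, False, False)
print(run_filters(course))
```

The point isn't to automate judgment; it's that each filter is a binary question you can answer honestly before spending a dollar.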
The Honest Ending
Will these three metrics guarantee your startup succeeds? Absolutely not. Anyone promising a guaranteed success formula is selling you a course (probably about AI training).
But what this framework will do—every single time—is prevent you from fighting the wrong war, with the wrong capital, against a market that isn't ready for you. It won't tell you which idea will win. It will tell you which ideas will die before they deserve a chance.
Run the filters. Be honest about the answers. Most ideas don't survive contact with reality because the founder never let reality contact the idea in the first place.
— James, Mercury Technology Solutions, Hong Kong, May 2026