

Open Source AI Is Smaller Than You Think. I Did the Math.

George Pu



The Week That Changed My Mind

I spent last week inside open source AI.

Reading repos. Testing bundles. Following the usual suspects - Open WebUI, Ollama, LM Studio, Jan, Goose.

By Friday I was convinced every serious founder was self-hosting something.

GitHub stars everywhere. New releases every week. The timeline on my feed was wall-to-wall "ditch ChatGPT, run it local, own your stack."

Then I did the math.

Here's what I actually found.

Stars Lie. Forks Tell the Truth.

GitHub stars are a vanity signal. A star costs one click and commits you to nothing.

People star things they saw in a tweet, things they want to read later, things their friend made.

The star count of an open source AI project has almost nothing to do with how many people actually installed it and used it.

Forks are closer to the truth. You fork when you want to run, modify, or deploy. It's still not a perfect signal — some forks are dead, some are scraping, some are just contributors — but a fork means somebody clicked past read-only.

So I pulled the ratios on the projects everyone says are winning.

Open WebUI: 132K stars, 19K forks. Ratio ~14%.

Ollama: 169K stars, 16K forks. Ratio ~9%.

These two are the self-hosting stack. Ollama is the engine — the thing that actually runs models on your hardware.

Open WebUI is the interface — the thing that makes it feel like ChatGPT. Together they're what anyone installs when they want to run AI locally.

Ollama has more stars than Open WebUI but a worse fork ratio. 169,000 people clicked a button. 16,000 got curious enough to fork it and attempt to run it.
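The ratio math is trivial, but making it explicit shows how differently the two projects convert attention into attempted use. A quick sketch, using the snapshot counts quoted above (the live GitHub numbers will have drifted):

```python
# Star and fork counts as quoted in this essay — point-in-time snapshots,
# not live figures.
projects = {
    "Open WebUI": {"stars": 132_000, "forks": 19_000},
    "Ollama": {"stars": 169_000, "forks": 16_000},
}

for name, counts in projects.items():
    ratio = counts["forks"] / counts["stars"]
    print(f"{name}: fork/star ratio ~{ratio:.0%}")
# Open WebUI: ~14%, Ollama: ~9%
```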

The Funnel Nobody Talks About

Now here's where I have to be honest: what follows are assumptions, not data.

Nobody publishes install-to-active-user conversion rates for open source AI projects. But the assumptions are illustrative, and I think they're directionally right.

Most forks never successfully install.

Self-hosting anything with GPUs, CUDA, Docker, and a model registry is not a one-click experience in 2026, no matter what the README claims.

Take Open WebUI's 19,000 forks — the generous number. Say half get to "it works on my machine." ~9,500 successful installs.

Most installs are tire-kickers. People spin it up on a Saturday, realize their laptop doesn't have enough VRAM, close the tab, move on. Say 20% become recurring users. ~1,900 people.

Open WebUI's own website claims 290 million downloads. That number includes every Docker pull, every pip install, every CI/CD pipeline that fetched the image automatically.

The gap between 290 million downloads and maybe a few thousand recurring users tells you everything about how vanity metrics work in open source.

The Ratio

ChatGPT has 900 million weekly active users. Over a billion monthly.

Even if my funnel assumptions are off by 10x — even if there are 20,000 daily active Open WebUI users globally — the ratio is still roughly 50,000:1.
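The whole funnel, from forks down to the final ratio, fits in a few lines. To be clear: every rate below is an assumption from this essay, not measured data, and the point is the order of magnitude, not the exact figure.

```python
# Funnel assumptions from the essay — illustrative guesses, not measurements.
forks = 19_000           # Open WebUI forks (the generous starting number)
install_rate = 0.50      # share of forks that reach "it works on my machine"
recurring_rate = 0.20    # share of installs that become recurring users

installs = forks * install_rate               # ~9,500
recurring_users = installs * recurring_rate   # ~1,900

chatgpt_weekly = 900_000_000
generous_users = 20_000  # the "off by 10x" case

print(f"recurring users: ~{recurring_users:,.0f}")
print(f"generous-case ratio: {chatgpt_weekly / generous_users:,.0f}:1")
# → 45,000:1, i.e. the 50,000:1 order of magnitude
```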

That's the size of the bubble we're in.

I Am One of These People

I'm not mocking this community. I'm in it.

I've been planning to buy a Mac Studio when M5 Ultra drops for this exact reason. I read these repos. I know the contributors. I think this stuff matters.

But being inside the bubble means everyone I talk to is inside the bubble, and the feed reflects the bubble back at me.

Inside the bubble it feels like everyone is self-hosting. Outside the bubble, almost nobody is.

The Free-User Problem

Here's the uncomfortable part for anyone building in this space: the individual self-hoster does not convert to a paying customer.


Yes, open source produces real businesses — but they sell to enterprises, not to the guy running Ollama on a Mac Mini.

The person who installed it because it was free will install the next thing because it's free too. If your business plan starts with "get open source AI users to pay," you've already lost.

The Actual Opportunity

Not selling open source AI to open source AI users. The numerator is too small and the willingness to pay is too low.

The opportunity is translation. Taking what works in this tiny-but-real nerd bubble and making it legible to the 999 million people in the ChatGPT bucket who would benefit from sovereign AI if it were actually usable.

Duolingo did this for language learning. Notion did this for developer tools. Apple has been doing it for half a century — taking nerd hardware and making it normie-friendly.

The OSS AI stack already works. The quality is there.

Qwen is fine. Gemma is fine. Llama is fine. A Mac Mini M4 at $599 can run a useful model. That's not the bottleneck anymore.

The Bottleneck Is People, Not Software

The bottleneck is a 50-year-old accountant in Saskatoon who doesn't want Sam Altman reading her clients' tax returns but can't currently figure out how to install Open WebUI + Ollama + her data + a reverse proxy with SSL.

The bottleneck is a family doctor in Brampton who knows her patient notes shouldn't be on American servers but has no idea that a $599 Mac Mini could run a local model good enough for clinical documentation.

The bottleneck is a litigation partner at a Bay Street firm who heard "AI" at a conference, went back and asked IT about it, and IT said "we can't, privacy policy," and that was the end of the conversation.

Nobody in the developer community is motivated to fix any of this, because developers think it's already easy.

These Three People Would Pay

Not for the software. The software is free and it's going to stay free. You can't monetize the stack.

They'd pay for someone who says "here's the box, here's what it does, here's how your data stays yours, and here's my number if it breaks."

The Nespresso model for AI. Collapse 14 decisions into one. Pre-configured hardware. One-click deployment. A phone number for when something goes wrong.

The 10,000 people in the bubble aren't the customers. They're the supply chain.

They build the tools. The business is being the bridge between the builders and the billion people who need what they built but will never touch a terminal.

My Takeaway

I'm still going to run experiments in the bubble.

I'm still going to buy the Mac Studio when M5 Ultra drops.

I'm still going to bundle OSS AI for my own use and open source whatever I build.

But I'm not going to confuse being inside the bubble with being at the center of a market.

The bubble is maybe 10,000 people globally.

The market — the real one — is the accountant in Saskatoon, the doctor in Brampton, and the partner on Bay Street who all need the same thing: AI that works, that they own, that someone they trust set up for them.

The path runs through translation, not through the repo.

If you're inside this bubble with me, nod, and then look up and ask who you're actually building for.