

AI Real Estate: From Apartment to Mansion

George Pu · 6 min read

27 · Toronto · Building to own for 30+ years


A note: when you shop through links in this post, I earn a commission, at no extra cost to you. It doesn't affect what I recommend.


A $3,999 Mac Studio runs a 122 billion parameter AI model on your desk.

Running the same model on AWS costs roughly $5 an hour. About $3,700 a month if you leave it on.

The Mac pays for itself in five weeks.
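The payback claim is just arithmetic on the essay's own figures ($3,999 for the Mac, roughly $5/hour for an always-on cloud instance). A quick sketch, using an average of 730 hours per month:

```python
# Back-of-envelope payback for a $3,999 Mac Studio vs. renting
# comparable always-on GPU capacity at roughly $5/hour.
# These are the essay's figures, not quotes from any vendor.

MAC_PRICE = 3_999        # one-time hardware cost, USD
CLOUD_RATE = 5.0         # USD per hour, instance left on 24/7
HOURS_PER_MONTH = 730    # average hours in a month
WEEKS_PER_MONTH = 4.33   # average weeks in a month

monthly_cloud = CLOUD_RATE * HOURS_PER_MONTH           # ~$3,650/month
payback_weeks = MAC_PRICE / (monthly_cloud / WEEKS_PER_MONTH)

print(f"Cloud cost: ${monthly_cloud:,.0f}/month")
print(f"Payback in about {payback_weeks:.1f} weeks")   # ~4.7 weeks
```

At these rates the break-even lands just under five weeks; any idle-time savings on the cloud side stretch it, any spot-price premium shortens it.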

I posted this as a tweet this morning. The response was immediate, but the pattern in the comments mattered more than the numbers. Over and over, people said the same thing.

I didn't know this was possible.

That's the real problem. Not the technology — the awareness.

The AI you use every day lives on someone else's servers.

Under someone else's rules. At someone else's pricing.

Almost nobody realizes there's another option, because nobody is explaining it in terms that make sense.

So let me try.

Think about it like real estate

Everyone understands the difference between renting an apartment and owning a mansion.

You don't need to know how mortgages or property taxes work to feel the gap.

The same spectrum exists for AI.

Most people don't know they're standing on the bottom rung.

There are four tiers. Each one is a legitimate answer for different people.

The point isn't that everyone needs to climb to the top.

The point is that the ladder exists and almost no one is pointing at it.

Tier 1: Rent the apartment

This is what you're doing right now.

You open ChatGPT or Claude. Type a question. Get an answer.

Someone else handles everything: the servers, the model, the updates, the infrastructure.

It's convenient. It's fast. It works.

But you own nothing.

Your conversations live on their servers.

Your data trains their models unless you opt out, and good luck verifying that.

They can raise prices whenever they want. They can change terms overnight.

They can shut off access tomorrow and you have no recourse.

It's a nice apartment. The landlord is friendly. The furniture is included.

But when the lease changes, you find out by email.

If this is you, that's fine. Most people are here.

You should just know it's not the only floor in the building.

Tier 2: Rent the house

You move to a bare-metal GPU server: a physical machine in a datacenter you choose.

DigitalOcean in Toronto. Lambda in the US. Hetzner in Europe.

You pick the jurisdiction. You pick the model. You run your own software stack.

Nothing proprietary touches the machine.

It costs somewhere between $500 and $5,000 a month depending on the hardware.

You still don't own the server, but you own everything that runs on it.

Your data stays in the jurisdiction you chose.

Your models are open-source and replaceable.

If one provider raises prices, you move to another.

You have leverage because you're not locked in.

It's a house instead of an apartment. More space, more control. If the landlord gets difficult, you pack up and leave without losing your furniture.

For most businesses handling sensitive data (law firms, healthcare, financial services, anyone with client confidentiality to protect), this is the practical path.

You don't need to own hardware. You just need to own your stack and choose where it lives.

Tier 3: Buy the house

This is what I've been doing.

You buy a Mac Mini or a Mac Studio. Install Ollama. Download an open-source model. Run it on your desk.

$599 to $10,000 depending on the spec.

After that? Inference is free. No monthly bill. No API calls. No metering.

A $3,999 Mac Studio with 192GB of unified memory runs a 122B parameter model.

That's not a toy. That's a production-capable AI sitting in your office, running on your power, answering to nobody.
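If the mechanics sound abstract: once Ollama is installed and a model is pulled (e.g. `ollama pull llama3.1`), it runs a small HTTP API on localhost. A minimal Python sketch, assuming the default port and using a model name that is only an example; substitute whatever you actually pulled:

```python
# Talk to a locally running model through Ollama's HTTP API.
# Ollama listens on localhost:11434 by default; nothing leaves your machine.
import json
import urllib.request

def build_payload(model: str, prompt: str) -> dict:
    """Build a non-streaming request body for Ollama's /api/generate."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return its reply."""
    body = json.dumps(build_payload(model, prompt)).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires a running Ollama server and a pulled model):
#   print(ask("llama3.1", "Summarize this contract clause: ..."))
```

No API key, no metering, no per-token bill: the only cost is the electricity the machine draws.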

No landlord. No terms of service. No one can raise your rent because there is no rent.

This is the purest form of ownership on the ladder.

You own the chip. You own the software. You own the building it's sitting in. Your home. Your desk. Your rules.

The limits are real. Your home internet isn't redundant.

If the power goes out, your AI goes down.

You're bounded by the hardware you bought: no elastic scaling, no burst capacity. You maintain everything yourself.

And the machine lives in your space, which means your space has to accommodate it.

But for a solo founder, a content creator, or a small team that wants to experiment with ownership without a monthly bill, this is the starter home.

Small. Yours. And it appreciates every time open-source models get smaller and faster, which is happening fast.

Tier 4: Buy the mansion

This is where it gets interesting. This is where almost nobody is looking.

You buy an NVIDIA GPU or a high-spec chip.

You ship it to a colocation facility.


They rack it in a cabinet with redundant power, industrial cooling, dual network carriers, and physical security.

You SSH in from anywhere in the world.
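In practice, "SSH in from anywhere" is one entry in your `~/.ssh/config`. A sketch; the host alias, IP address, user, and key path below are placeholders, not real values:

```
# ~/.ssh/config — example entry; every value here is a placeholder
Host colo-gpu
    HostName 203.0.113.17          # static IP the facility assigns you
    User george
    IdentityFile ~/.ssh/colo_ed25519
```

After that, reaching the machine is just `ssh colo-gpu`, from a laptop, a phone, or the other side of the world.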

Total cost: the chip up front, plus roughly $1,000 a month for the rack.

This is the Bobby Axelrod play.

You own the mansion outright.

The grounds crew keeps the lights on, the HVAC running, the security tight.

You show up when you want. The place runs whether you're there or not.

The tier-3 house is yours, but it's bounded by your home's infrastructure.

Your Wi-Fi. Your power grid. Your physical space. Your tolerance for the noise of a GPU running at full tilt.

The tier-4 mansion is yours too, but it lives where industrial equipment belongs: in a facility built to handle it.

You've outgrown residential. Residential can't support what you're trying to do.

Most founders I talk to don't know this option exists.

They think the choice is "rent from OpenAI forever" or "buy a Mac and hope your home Wi-Fi holds."

There's a whole tier above that, and the people renting you the apartment have no incentive to mention it.

The math works at surprisingly small scale.

$10,000 for the chip.

$12,000 a year for the rack.

$22,000 all-in for the first year.

$12,000 a year after that, for as long as the chip lasts.

For inference workloads, chips last a long time.

Compare that to $3,700 a month on AWS for similar capability. $44,400 a year.

The mansion pays for itself in six months and keeps paying you back every month after.
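The same numbers as straight arithmetic, so you can plug in your own quotes:

```python
# Tier-4 cost math from the essay: owned chip in a colo vs. AWS rental.
CHIP = 10_000         # up-front GPU purchase, USD
RACK = 1_000          # monthly colocation fee, USD
AWS_MONTHLY = 3_700   # comparable monthly AWS spend, USD

first_year = CHIP + 12 * RACK                       # $22,000 all-in
aws_year = 12 * AWS_MONTHLY                         # $44,400
payback_months = first_year / AWS_MONTHLY           # ~5.9 months of AWS spend
savings_after_year_one = aws_year - 12 * RACK       # $32,400 saved per year

print(f"Year one: ${first_year:,} vs ${aws_year:,} on AWS")
print(f"Payback: ~{payback_months:.1f} months")
print(f"Every year after: ${savings_after_year_one:,} kept")
```

Swap in your own chip quote and rack rate; the shape of the result rarely changes, only how many months the payback takes.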

This isn't a tier reserved for hedge fund managers. Any serious agency, law firm, or mid-size consultancy handling sensitive client data can afford it.

The reason they haven't is that nobody has walked them up to the gate and shown them it's open.

Why this matters right now

AI is not like other software. It's not a tool you use and put away.

It's becoming the layer that touches everything you do.

Your communication. Your analysis. Your decisions. Your client work. Your thinking.

The AI layer will be as fundamental to your business as your operating system. Probably more.

And right now, almost everyone is renting it from three or four companies who can change the terms at any time.

I'm not saying everyone needs the mansion.

The apartment is fine for casual individual use.

Renting the house makes sense for businesses that want sovereignty without capital expense.

Buying the house is the entry point for anyone who wants to experience ownership on their desk.

The mansion is for anyone who has done the math and decided they want the infrastructure their business depends on to actually belong to them.

The point is that all four floors exist.

Most people only know about the first one. The companies renting you the apartment are not going to be the ones to tell you the mansion is on the market.

Where I am on the ladder

I'm climbing it myself, in real time.

Right now I'm running models on cloud infrastructure and on a Mac on my desk. Tier 2 and Tier 3, in parallel.

In two months, when Apple ships the new Mac Studio, I'm upgrading the house. Bigger unified memory, faster inference, more headroom for the models that are coming.

After that, the mansion. I'm buying NVIDIA chips and putting them in a colo 50 steps from my office.

Close enough to walk over when something needs hands.

Far enough that the noise, heat, and power draw aren't my problem.

Fifty steps. That's the whole gap between residential and industrial.

Between "buy the house" and "buy the mansion."

Most people think the mansion is some abstract thing that lives in AWS us-east-1. It's not. It's a building you can walk to.

I'll document every step. Real costs. Real tradeoffs. Real experience.

Because right now, almost nobody is doing that for people who don't already speak the language.

The first act of sovereignty isn't buying the mansion.

It's knowing the mansion exists.