It's Over

GPT-4.5 Flops

Hello readers,

Welcome to the AI For All newsletter! GPT-4.5 came and went, SoftBank is going into debt to keep OpenAI afloat, Microsoft canceled over a gigawatt of data center capacity, and Cortical Labs fused human brain cells with silicon hardware. Is this hell? Let’s find out!

It’s Over

Sam Altman leaving OpenAI headquarters

I’m not going to mince words today. GPT-4.5 is a catastrophic disappointment and a damning confirmation of everything I have been warning you about. It landed to underwhelming reviews and made no real waves. Remember, this model was supposed to be GPT-5. It is marginally better than GPT-4o, bad at coding, and so expensive to run and use that OpenAI doesn’t even consider it a true successor.

OpenAI wrote, “GPT‑4.5 is a very large and compute-intensive model, making it more expensive⁠ than and not a replacement for GPT‑4o. Because of this, we’re evaluating whether to continue serving it in the API long-term as we balance supporting current capabilities with building future models.” Um, GPT-4.5 was your future model, and it’s only sort of better than GPT-4o while being way more expensive (as much as 30x).
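To put that multiplier in perspective, here’s a minimal back-of-the-envelope sketch in Python. The per-token prices are assumptions based on the list prices widely reported around GPT-4.5’s launch (roughly $75 in / $150 out per million tokens for GPT-4.5 versus $2.50 / $10 for GPT-4o); treat the exact figures as assumptions and check OpenAI’s pricing page before quoting them.

```python
# Back-of-the-envelope cost comparison. The prices below are assumptions
# based on list prices reported around GPT-4.5's launch; check OpenAI's
# pricing page for current figures.
PRICES_PER_1M_TOKENS = {
    "gpt-4o":  {"input": 2.50,  "output": 10.00},   # USD per 1M tokens (assumed)
    "gpt-4.5": {"input": 75.00, "output": 150.00},  # USD per 1M tokens (assumed)
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Cost in USD of one request under the assumed price list."""
    p = PRICES_PER_1M_TOKENS[model]
    return input_tokens / 1e6 * p["input"] + output_tokens / 1e6 * p["output"]

# A typical chat turn: ~2,000 tokens in, ~500 tokens out.
for model in PRICES_PER_1M_TOKENS:
    print(f"{model}: ${request_cost(model, 2_000, 500):.4f} per request")

# On input tokens alone the ratio is 75 / 2.50 = 30, which is where the
# "as much as 30x" figure comes from.
```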

Listen to me, reader. I need you to absorb the following: OpenAI has been working on this model since the release of GPT-4 (almost two years ago), and it’s a failure. They burned an obscene amount of money to build GPT-5, only to downgrade it to GPT-4.5 when it didn’t meet expectations. There won’t be a GPT-5 model, but rather some unholy Gesamtkunstwerk of a product that combines OpenAI’s current models.

GPT-4.5 did receive some faint praise. Andrej Karpathy wrote, “Everything is a little bit better and it's awesome but also not exactly in ways that are trivial to point to.” Well, that’s reassuring. Not vague and subjective at all. While doing damage control, Sam Altman wrote, “it is the first model that feels like talking to a thoughtful person to me. a heads up: this isn't a reasoning model and won't crush benchmarks. it's a different kind of intelligence and there's a magic to it i haven't felt before.” Meaningless drivel from the least objective person imaginable. What a chad this GPT-4.5 is, huh?

“it's awesome but also not exactly in ways that are trivial to point to”

“it's a different kind of intelligence and there's a magic to it i haven't felt before”

If alarm bells aren’t going off in your head right now, they should be. GPT-4.5 confirms the worst about generative AI. Contrary to popular belief, AI is not getting better. LLMs have flatlined. It’s over. The economics are hopeless. Never forget the names of the horrible men who led you down this ruinous path (unless you wish to forget them, and I assure you, I would have the greatest sympathy if you did) — Sam Altman, Dario Amodei, Elon Musk, Mark Zuckerberg, Jeff Bezos, Satya Nadella, Sundar Pichai.

But wait, it gets worse. In a desperate move, OpenAI is planning to charge $20,000 a month for “PhD-level agents” — a product it doesn’t have and very few (if any) people would buy. SoftBank is trying to fund OpenAI with money it doesn’t have, saddling itself with up to $24 billion in debt. Microsoft canceled over a gigawatt of data center capacity, which might suggest that it has doubts about the future growth prospects of generative AI. Satya Nadella has tacitly admitted that AI hasn’t generated much value, and we know that revenue has been modest and profits non-existent.

One possible explanation for Microsoft’s decision is DeepSeek-R1, an unremarkable but efficient model that made even normies question the amount of money being spent on GPUs and data centers. DeepSeek definitely has investors rethinking their strategies. “DeepSeek's lower-priced chatbot highlighted how competition will drive down profits for direct-to-consumer AI products and enterprise software companies may find it easier to monetize new technology,” said portfolio manager Brian Mulberry.

What’s ironic is that the people who are burning billions of dollars on chatbots are the same people who fetishize efficiency. Generative AI is efficient the way taking a sledgehammer to a house spider is efficient. If your job involves little more than reading and writing emails, of course generative AI is going to seem revolutionary. Whatever mundane use cases generative AI has, they’re not valuable enough to sustain the investment. Even if the cost of compute were to drop significantly, this would only further commoditize AI, splintering any profit and deflating the industry.

Barring some miracle nerd making a breakthrough, AI won’t amount to more than a ubiquitous feature. The slower and more expensive reasoning models (which don’t reason reliably and have more niche use cases) aren’t the answer here. Also, AGI is not imminent. Anyone who tells you otherwise is trying to sell you something.

🌈 Now for some positivity …

Cortical Labs introduced the CL1, a code-deployable biological computer. Real neurons are grown across a silicon chip, which sends electrical impulses into the neural structure and reads the signals it produces, allowing for computation. The neurons exist in a simulated world created by the OS, which reminds me of Joscha Bach’s ideas. Not since The End of Evangelion have we seen such biomechanical horror. I love it! The CL1 is basically a Raspberry Pi made out of HUMAN BRAINS!!! Just what the world needs.
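Cortical Labs ships its own software stack for this, and I’m not going to pretend I know its API, so treat the following as a minimal conceptual sketch of the closed loop rather than real CL1 code: the host OS maintains a simulated world, writes its state into the culture as stimulation, reads the electrical response back, and uses it to advance the simulation. Every name here (SimulatedWorld, NeuronCulture, stimulate_and_read) is hypothetical.

```python
import random

class NeuronCulture:
    """Hypothetical stand-in for the neurons-on-a-chip. Real hardware would
    drive electrodes and record spike activity; here we fake a noisy response."""
    def stimulate_and_read(self, pattern: list[float]) -> list[float]:
        # Placeholder: pretend the culture answers with a noisy echo of the stimulus.
        return [p + random.gauss(0, 0.1) for p in pattern]

class SimulatedWorld:
    """Hypothetical environment maintained by the host OS (think a Pong-like game)."""
    def __init__(self) -> None:
        self.state = [0.0, 0.0]

    def encode(self) -> list[float]:
        # The world state becomes the stimulation pattern written into the culture.
        return list(self.state)

    def step(self, response: list[float]) -> None:
        # Decode the culture's activity into an "action" and advance the world.
        action = 1.0 if sum(response) > 0 else -1.0
        self.state = [s + 0.1 * action for s in self.state]

# The closed loop: stimulate with the world state, read the electrical
# response back, and let it steer the simulation.
world, culture = SimulatedWorld(), NeuronCulture()
for _ in range(10):
    response = culture.stimulate_and_read(world.encode())
    world.step(response)
print("final world state:", world.state)
```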

🔥 Rapid Fire

🔎 Industry Insights

“Telecom operators have long provided the infrastructure to power communication and connect people. Now they are poised to take on a new role: building the AI infrastructure that enables enterprises, governments, and consumers to unlock AI’s full potential. But finding the sweet spot to capture meaningful revenue and renewed success will require speed and precision amid complex market dynamics, uncertain demand, and significant competitive headwinds.”

Source: McKinsey

💻️ Tools & Platforms

  • Qatalog → AI assistant for work

  • Anterior → AI platform for clinicians

  • Langbase → Serverless AI development

  • Raycast → Productivity tools with AI

  • Nesh → Sales AI for industrial companies