the memory moat

I've spent the last week properly getting stuck into GPT-5. It's a noticeable step up in raw intelligence, and as someone who spends a fair amount of time coding, the difference has been really pronounced there.

I haven't personally tripped over the stricter prompt adherence everyone's talking about. It seems like others have, though, and prompt optimisation is suddenly back in the spotlight. It got me thinking: is the harsher response to lazy prompts actually down to the new routing mechanism? Perhaps poor prompts lead to bad routing decisions, penalising users for a feature intended to improve the overall experience. That would be classic tech irony.

Anyway, one standout feature for me is GPT-5’s newfound knack for referencing past conversations with much greater nuance and accuracy. It's pulled out some subtle throwbacks to chats from ages ago, which has genuinely impressed me. Is it just the bigger context window, or have they been quietly tinkering under the hood?

Scrolling X earlier this week, I noticed Google’s Gemini announcement: they’re finally bringing memory to the chatbot. Cool, I thought; it’s been ages since I opened the Gemini app (though I do use the API a fair bit), so I went to check it out. And it was at that moment that I realised, for the first time, how hard it will be to shift away from ChatGPT as my daily driver. All those conversations stored up over the past few years are becoming genuinely valuable. If other users feel the same, OpenAI could end up with a seriously wide moat. Given they've hit 700 million weekly active users - nearly 9% of everyone on Earth, in about two years(!) - that could be a wiiiiide moat.

There's a lot of noise about AI valuation bubbles right now, and those multi-hundred-billion-dollar price tags often feel punchy. But if memory becomes the glue keeping users loyal, early movers like OpenAI and Anthropic might end up looking smart rather than overpriced. Even Google, with its vast resources and reach, could find it tough to break through the practical, emotional, even nostalgic ties of ♥ making-memories ♥!
