Is Memory Going to Become the New iMessage Lock-in?
Chances are that if you are reading this, you know all about iMessage lock-in. In case you do not, here is the gist: iMessage serves as both a technical and a cultural lock-in keeping users on iPhone, making it more complicated to switch to an Android device. Of course it is entirely possible to go through all of the steps to switch, but there is so much friction that most simply choose not to.

A few weeks ago OpenAI rolled out an advanced memory system to ChatGPT, which lets the model remember all of your previous conversations together. It is a tremendously useful feature that I wrote about already. But over the past several weeks, I have wanted to once again explore using other models in my workflow. I had been pretty steady with ChatGPT, but the latest version of Gemini and new apps like Raycast for iPhone have sparked my interest. I have been attempting to use them in my daily life, but ChatGPT's extensive knowledge of me, my interests, my current life situation, and the projects that I have been working on with it has created a great deal of, you guessed it, friction.
I can ask ChatGPT about things we were discussing the other day, reference things on the fly, and pick up where I left off without having to return to the previous chat thread. But when I open up Gemini or Claude, they know nothing about me. They cannot reference previous chats, and they certainly cannot simply be hot-swapped into your workflow if you are a heavy ChatGPT user. Those other models just are not in tune with you the way that ChatGPT is after some extended use.

This leads me to the main difference between memory lock-in and iMessage lock-in. iMessage lock-in is an Apple-specific issue widely argued about across the industry. Switching from Android to iPhone is easy: Google does not lock users into a proprietary messaging tool that forces you to disable it before you switch. There is also no cultural stigma associated with switching to iPhone from Android. I would argue that in most cases, the general public in the US would go straight to "congratulations, you upgraded to an iPhone!" Blue bubbles are a big part of the culture in the US, but they are a uniquely social thing. Memory is a personal thing. No one else cares, at least for now, which LLM you choose to use in your daily life. Except for you. You care. Each one is unique in its own way. But none of them at the moment match the power of ChatGPT with its memory. So, a few years from now, maybe even sooner, you can easily imagine a world where each of the LLM providers locks you in by making memory exclusive.
We can catch this before it happens, though. How? Memory should be exportable and importable from every provider. We do not need a new kind of proprietary format or anything like that. We just need OpenAI, Google, Anthropic, xAI, Microsoft, and others to implement memory and incorporate a way to migrate all of your chat history over in a click or two. It does not even have to be easy; it just needs to be possible.
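To make the idea concrete, here is a rough sketch of what a provider-neutral memory export could look like. Everything here is invented for illustration: the field names, the `portable-memory/v1` identifier, and the function names are assumptions, not any provider's actual schema or API. The point is only that a plain, documented format would be enough; nothing proprietary is required.

```python
import json

def export_memories(memories, provider):
    """Serialize a list of memory strings into a portable JSON document.

    The schema here is hypothetical: a format identifier, the source
    provider's name, and a flat list of memory entries.
    """
    return json.dumps({
        "format": "portable-memory/v1",  # illustrative format identifier
        "source": provider,
        "memories": [{"text": m} for m in memories],
    }, indent=2)

def import_memories(document):
    """Read a portable export back into a plain list of memory strings."""
    data = json.loads(document)
    return [entry["text"] for entry in data["memories"]]

# Round trip: what one provider exports, another could import.
doc = export_memories(
    ["Prefers concise answers", "Is learning woodworking"],
    "chatgpt",
)
print(import_memories(doc))
```

A real version would need to handle richer structure (timestamps, full chat transcripts, provider-specific metadata), but even a minimal list-of-strings format like this would remove most of the switching friction.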
This might seem like I am getting ahead of myself, particularly because ChatGPT is the only option that has memory like this. Others have ways to manually store information about you, but those just are not natural, and they generally have small context windows. I hope that once all of these companies finally implement memory (to me it seems essential), they build in these migration tools. I imagine there are conversations to this effect happening in conference rooms in San Francisco; they probably do not want to make it easy to switch. It would mean that they would have to compete continuously for users. But here is the thing: I do not want this new cohort of tech giants to end up coasting the way that legacy ones often have because of similar factors. If they really are all the future of technology, and I think they are, they should do things differently.