Craig Saves the Day, Gives Engineers the Green Light to Use Off-the-Shelf LLMs

According to The Information, Apple is now letting engineers build products using third-party LLMs. This is a huge change that could seriously alter the course of Apple Intelligence. I have been a proponent of Apple shifting its focus from building somewhat mediocre closed first-party models to using powerful open-source third-party ones. The company is already so far behind its competitors that I would rather they focus on making really good products than really good models. They may be able to do both simultaneously, but I suspect Apple could ship spectacularly good products sooner using off-the-shelf models while other teams build first-party models for the future in secret.
Now that Gemini can see your screen on Android and Copilot can see your screen on Windows, these competing assistants are on another level. They are assistants in the truest sense of the word, able to visually understand your context while you use your devices. That also makes Android and Windows devices more compelling alternatives. I am hopeful that Apple will be able to catch up much faster by doing what they do best, which is productizing advanced technology. The only difference is that this time, at least for now, they will be productizing someone else’s tech.
I imagine they will largely be looking at Mistral, Gemma, and Phi. Llama would be an obvious contender too if the company’s relationship with Meta were not so contentious. DeepSeek would be another option, though the optics of using it outside of China would likely not be ideal. We will just have to wait and see!