Apple's models are not competitive. Apple has not demonstrated any leadership in foundation models so far, and I don't expect that to change any time soon.
If anything, I'd expect Google, OpenAI, Anthropic... or even Meta to have a better on-device "lite" model before Apple.
2. Apple’s been using machine learning, neural networks, and other AI technologies in its operating systems since before most of these AI companies even existed.
3. Apple was the first to ship dedicated AI hardware in a phone (the Neural Engine), with the iPhone X in 2017.
> Google, OpenAI, Anthropic... or even Meta to have a better on-device "lite" model before Apple.
Google's "lite" model Gemma 2B has 2 billion parameters and their Gemini Nano has 1.5 billion. The largest lite model Meta tops out at 1.5 billion.
The model Apple demonstrated at WWDC has 3 billion parameters [1], quantized to 2-bit and 4-bit weights. So on parameter count, Apple has already surpassed Meta and Google when it comes to LLMs on smartphones, right from the jump. Seems pretty competitive.
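To put the quantization in perspective, here's the back-of-the-envelope math. The actual per-layer bit allocation isn't public, so the 50/50 split between 2-bit and 4-bit weights below is just an assumption:

```swift
import Foundation

// Rough memory estimate for a ~3B-parameter model with mixed 2-bit / 4-bit
// weight quantization. The per-layer bit allocation is not public; the
// 50/50 split here is purely an illustrative assumption.
let parameters = 3_000_000_000.0
let share2Bit = 0.5          // assumed fraction of weights stored at 2 bits
let share4Bit = 0.5          // assumed fraction of weights stored at 4 bits

let weightBits = parameters * (share2Bit * 2.0 + share4Bit * 4.0)
let weightGiB  = weightBits / 8.0 / 1_073_741_824.0
let fp16GiB    = parameters * 2.0 / 1_073_741_824.0   // same model, unquantized 16-bit

print(String(format: "Quantized weights: ~%.2f GiB", weightGiB))   // ~1.05 GiB
print(String(format: "Same model @ fp16: ~%.2f GiB", fp16GiB))     // ~5.59 GiB
```

Roughly 1 GiB of weights instead of ~5.6 GiB at fp16, which is why a 3B model is plausible on a phone at all.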
Third-party developers will have access to it as well, and if a task requires a more capable model, it can hand off to Apple's Private Cloud Compute [2] seamlessly. Sounds like a win-win to me.
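Apple hasn't shown the developer API yet, so the following is purely a hypothetical sketch of that on-device-first / cloud-fallback flow; every type and method name in it (TextModel, OnDeviceLLM, CloudComputeFallback, AssistantClient) is made up for illustration:

```swift
// Hypothetical sketch only -- Apple has not published this API.
// All names are invented; the point is the flow: try the on-device
// ~3B model first, hand off to Private Cloud Compute for heavier tasks.
protocol TextModel {
    func generate(prompt: String) async throws -> String
}

struct OnDeviceLLM: TextModel {
    func generate(prompt: String) async throws -> String {
        // Would run the local quantized model.
        "on-device result for: \(prompt)"
    }
}

struct CloudComputeFallback: TextModel {
    func generate(prompt: String) async throws -> String {
        // Would route the request to Private Cloud Compute.
        "cloud result for: \(prompt)"
    }
}

struct AssistantClient {
    let local: TextModel = OnDeviceLLM()
    let cloud: TextModel = CloudComputeFallback()

    func respond(to prompt: String, needsLargeModel: Bool) async throws -> String {
        if needsLargeModel {
            // Escalate only when the task exceeds the local model.
            return try await cloud.generate(prompt: prompt)
        }
        return try await local.generate(prompt: prompt)
    }
}
```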
This model is also exposed to automation; you'll be able to create custom workflows that incorporate AI very easily.
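The automation hook-in presumably rides on App Intents, which already exists for exposing app actions to Shortcuts; how the model itself gets invoked hasn't been shown, so the summarize(_:) helper below is a hypothetical stand-in for that call:

```swift
import AppIntents

// Hypothetical stand-in for whatever on-device model call Apple exposes.
func summarize(_ text: String) async throws -> String {
    String(text.prefix(120))   // placeholder; a real version would call the model
}

// Exposing an AI-backed action to Shortcuts-style automation via App Intents.
struct SummarizeTextIntent: AppIntent {
    static var title: LocalizedStringResource = "Summarize Text"

    @Parameter(title: "Text")
    var text: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        let summary = try await summarize(text)
        return .result(value: summary)
    }
}
```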
[1]: From Apple's press release:
> A key component of Apple Intelligence is a new on-device Foundation Model. This model, with approximately 3 billion parameters, has been engineered for efficiency and is optimized to run directly on Apple's latest silicon. To achieve this, Apple has employed advanced quantization techniques, including 2-bit and 4-bit quantization for the model's weights and an 8-bit KV cache, which significantly reduce the model's memory and computational footprint without compromising performance.