Resolves yes if Apple announces that Siri has been upgraded to use an LLM which runs on-device. Note that the size of the model is not a consideration for the market; a LLaMA-sized model or smaller is fine.
https://www.macrumors.com/2023/12/07/apple-iphone-16-microphone-upgrade-siri-ai/
Writing in his latest Medium post, Kuo says that "strengthening Siri's hardware and software features and specifications is the key to promoting AI-generated content," adding that Apple has generative AI ambitions and plans to integrate large language models (LLMs) into Siri.
This didn't happen today, but it seems that with iOS 17, an on-device LLM will be powering the autocomplete feature.
@FranAbenza LLMs are not using a new architecture - "Attention Is All You Need" was published in 2017. I cannot find confirmation that Siri uses a transformer, because Apple publishes absolutely nothing about the technical details of the algorithm, but it would not be surprising if Siri is already transformer-based.
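For context, the core of that 2017 architecture is scaled dot-product attention: softmax(QKᵀ/√d_k)V. A minimal NumPy sketch with toy shapes (purely illustrative - nothing here reflects published details of Siri's internals):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V,
    as defined in "Attention Is All You Need" (2017)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Numerically stable softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V  # attention-weighted sum of the values

# Toy example: 3 tokens, 4-dimensional embeddings
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

A full transformer stacks this (in multi-head form) with feed-forward layers, residual connections, and normalization, but the attention step above is the piece that distinguishes the architecture.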
@vluzko afaik Siri is powered by Wolfram, but probably uses a transformer model to glue everything together. I am aware of the 2017 paper, but ChatGPT, for example, adds some extensions to that architecture.