Welcome @soleblaze
Thanks for sharing your thoughts.
Apple would have to do something either very limited in scope or very magical to make a local LLM (1) broadly useful, (2) private, and (3) performant.
You can run Ollama on a Mac, but it’s not up to the tasks our userbase is expecting. For example, you can’t feed it a PDF to summarize. And it still guides you toward “stay online for better results.”
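To make “can’t feed it a PDF” concrete: you end up doing the extraction yourself and posting plain text to Ollama’s local HTTP API. A minimal sketch of that workaround, assuming a local Ollama install on its default port with some model pulled (the `llama3` model name and `report.pdf` filename are placeholders, and `pypdf` is a third-party dependency):

```python
# Ollama won't ingest a PDF directly, so pull the text out yourself
# and send it to the local API as a plain-text prompt.
import json
import urllib.request

from pypdf import PdfReader  # pip install pypdf

# Extract the raw text from the PDF (placeholder filename).
reader = PdfReader("report.pdf")
text = "\n".join(page.extract_text() or "" for page in reader.pages)

# Ollama's local HTTP API (default port 11434); model name is a placeholder.
payload = {
    "model": "llama3",
    "prompt": "Summarize this document:\n\n" + text,
    "stream": False,
}
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```

Which works, but it’s exactly the kind of glue a non-technical user won’t write, and with a long PDF you hit the model’s context limit anyway.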