Apple wants AI to run directly on its hardware instead of in the cloud::iPhone maker wants to catch up to its rivals when it comes to AI.
I’ve been playing with llama.cpp a bit over the last week and it’s surprisingly workable on a recent laptop using just the CPU. It’s not hard to imagine Apple and others adding (more) AI accelerators to their mobile chips.
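For anyone curious, here's a minimal CPU-only sketch using the llama-cpp-python bindings (the model path, thread count, and prompt are just placeholders; a 4-bit quantized 7B model fits comfortably in a laptop's RAM):

    from llama_cpp import Llama

    # Load a quantized model and run inference on CPU threads only
    llm = Llama(
        model_path="./models/llama-7b-q4_0.gguf",  # placeholder path
        n_ctx=2048,    # context window
        n_threads=8,   # match your physical core count
    )

    out = llm("Q: What is on-device AI useful for? A:", max_tokens=64)
    print(out["choices"][0]["text"])

Token throughput obviously isn't GPU-class, but for short prompts it's fast enough to be usable, which is what makes the on-device angle plausible.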
Oh yes, and the CPUs on phones have been getting more powerful every year with nothing that could really take advantage of their full potential. Local AI will be great for privacy and responsiveness.