Apple may be working on a way to let LLMs run on-device and change your iPhones forever

Apple researchers have apparently discovered a method that’ll allow iPhones to host and run their own large language models (LLMs).

With this tech, future iPhone models may finally have the generative AI features people have been eagerly waiting for. This information comes from a pair of papers published on arXiv, a research-sharing platform owned by Cornell University. The documents are pretty dense and can be tricky to read, so we’re going to break things down for you. But if you're interested in reading them yourself, the papers are free for everyone to check out.

One of the main problems with putting an LLM on a mobile device is the limited amount of memory on the hardware. As VentureBeat explains in their coverage, recent AI models like GPT-4 “contain hundreds of billions of parameters”, a quantity smartphones have difficulty handling. To address this issue, Apple researchers propose two techniques. The first is called windowing, a method where the on-board AI reuses data it has recently processed instead of loading fresh data from storage for every step, taking some of the load off the hardware.
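To make the windowing idea concrete, here is a minimal, purely illustrative sketch in Python. It is not Apple's implementation; the cache class, the neuron IDs, and the window size are all hypothetical, and the point is just that consecutive tokens tend to need overlapping data, so caching a sliding window of recent activity avoids most repeat loads from slow storage.

```python
# Hypothetical sketch of the "windowing" idea: keep the data used by
# the last few tokens cached in fast memory, and only fetch what is
# missing from slow storage. All names and numbers are illustrative.

WINDOW_SIZE = 3  # how many recent tokens' data sets to keep cached

class WindowedCache:
    def __init__(self, window_size=WINDOW_SIZE):
        self.window = []           # data-id sets for recent tokens
        self.window_size = window_size
        self.loads_from_storage = 0

    def cached_ids(self):
        # Union of everything the recent tokens already pulled in.
        return set().union(*self.window) if self.window else set()

    def process_token(self, needed_ids):
        # Only data not already in the window must be fetched slowly.
        missing = needed_ids - self.cached_ids()
        self.loads_from_storage += len(missing)
        self.window.append(set(needed_ids))
        if len(self.window) > self.window_size:
            self.window.pop(0)     # evict the oldest token's data
        return missing

cache = WindowedCache()
# Consecutive tokens overlap heavily, so after the first token
# only the new items trigger slow loads.
cache.process_token({1, 2, 3, 4})   # 4 slow loads
cache.process_token({2, 3, 4, 5})   # only item 5 is newly loaded
cache.process_token({3, 4, 5, 6})   # only item 6 is newly loaded
print(cache.loads_from_storage)     # 6 total, versus 12 with no caching
```

With no cache, each of the three tokens above would trigger four loads (twelve in total); the sliding window cuts that in half in this toy example.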

The second is called row-column bundling. This groups related data into bigger chunks for the AI to read, a method that will boost the LLM’s ability to “understand and generate language”, according to MacRumors. The paper goes on to say these two techniques will let iPhones run AI models “up to twice the size of the available [memory]”. It’s a technology Apple must nail down if it wants to deploy advanced models “in resource-limited environments”. Without it, the researchers’ plans can’t take off.
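The bundling idea can also be sketched in a few lines. This toy example is an assumption about the general technique, not Apple's code: it pairs each row of one small matrix with the matching column of another and stores them side by side, so one contiguous read fetches both instead of two scattered reads.

```python
# Hypothetical illustration of "row-column bundling": for each unit,
# store a row from one matrix together with the matching column from
# another, so a single contiguous read returns both. Toy-sized data.

up   = [[1, 2], [3, 4], [5, 6]]    # 3 units x 2 dims (read by row)
down = [[7, 9, 11], [8, 10, 12]]   # 2 dims x 3 units (read by column)

# Bundle: for unit i, concatenate up's row i with down's column i.
bundled = [up[i] + [row[i] for row in down] for i in range(len(up))]

# One bundle now holds everything needed for that unit in a single
# chunk, which suits storage that prefers large sequential reads.
print(bundled[0])   # [1, 2, 7, 8] -> up row 0 followed by down column 0
```

The payoff is fewer, larger reads: storage hardware generally delivers far better throughput on one big sequential read than on two small scattered ones.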

On-device avatars
The second paper is centered around iPhones potentially getting the ability to create animated 3D avatars. The content would be made from videos taken by the rear cameras through a process called HUGS (Human Gaussian Splats). This kind of tech has existed in some form before. However, Apple’s version is said to render the avatars 100 times faster than older approaches while capturing finer details like clothing and hair.

It’s unknown exactly what Apple intends to do with HUGS or any of the techniques mentioned earlier. However, this research could open the door for a variety of possibilities, including a more powerful version of Siri, “real-time language translation”, new photography features, and chatbots.

Powered up Siri
These upgrades may be closer to reality than some might think.

Back in October, rumors surfaced claiming Apple is working on a smarter version of Siri that'll be boosted by artificial intelligence and sport some generative capabilities. One potential use case would be an integration with the Messages app, letting users ask it tough questions or have it finish sentences “more effectively.” As for chatbots, there have been other rumors of the tech giant developing a conversational AI called Ajax. Some people have also thrown around “Apple GPT” as a potential name.

There's no word on when Apple’s AI projects will see the light of day, though there has been speculation that something could roll out in late 2024 alongside the launch of iOS 18 and iPadOS 18.

Source: https://www.yahoo.com
