Our smart devices take voice commands from us, check our heartbeats, track our sleep, translate text, send us reminders, capture photos and movies, and let us talk to family and friends continents away.
Now imagine turbocharging those capabilities: holding in-depth, natural-language exchanges on academic or personal queries; running our vital signs through a global database to check for imminent health issues; tapping massive databases to provide comprehensive real-time translation among two or more parties speaking different languages; and conversing with GPS software that offers details on the best burgers, movies, hotels or people-watching spots trending along your route.
Harnessing the power of large language models and natural language processing, we've witnessed tremendous progress in how we communicate with the technology we increasingly rely on in our daily lives.
But there's been a stumbling block when it comes to AI and our portable devices. Researchers at Apple say they are ready to do something about it.
The issue is memory. Large language models need lots of it. With models demanding storage for potentially hundreds of billions of parameters, a commonly used smartphone such as Apple's iPhone 15, with a scant 8GB of memory, falls far short of the task.
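To put rough numbers on that squeeze, here is a minimal back-of-envelope sketch in Python; the 7-billion-parameter model and 16-bit precision are illustrative assumptions, not figures from Apple's paper:

```python
# Back-of-envelope memory check (assumed figures, not from the paper):
# even a modest 7-billion-parameter model stored in 16-bit precision
# needs far more memory than an 8GB phone can spare.
params = 7e9                 # assumed model size: 7 billion parameters
bytes_per_param = 2          # fp16: 2 bytes per parameter
gb_needed = params * bytes_per_param / 1e9
print(f"~{gb_needed:.0f} GB for weights alone")  # ~14 GB, vs. 8GB of DRAM
```

A model with hundreds of billions of parameters multiplies that shortfall many times over.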
In a paper uploaded to the preprint server arXiv on Dec. 12, Apple announced it had developed a method that uses data transfers between flash memory and DRAM to let a smart device run a powerful AI system.
The researchers say their method can run AI models up to twice the size of a device's DRAM and speed up CPU inference by as much as 500%. GPU inference, they say, can run up to 25 times faster than with current loading approaches.
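To make the general idea concrete, here is a minimal, illustrative Python sketch of keeping a full weight matrix in flash and pulling only the rows needed for the current computation into a DRAM cache. The file layout, row-level caching, and sparsity figures below are assumptions chosen for demonstration, not Apple's actual implementation:

```python
import os
import tempfile
import numpy as np

# Illustrative sketch only; not Apple's implementation. The idea:
# keep the full weight matrix in "flash" (here, a file on disk) and
# copy only the rows needed for the current step into DRAM.

rows, cols = 4096, 1024  # hypothetical layer dimensions
path = os.path.join(tempfile.gettempdir(), "weights.dat")

# One-time setup: write random fp16 weights to disk, standing in for flash.
np.random.rand(rows, cols).astype(np.float16).tofile(path)

# memmap exposes the on-disk weights without loading them all into RAM.
flash_weights = np.memmap(path, dtype=np.float16, mode="r", shape=(rows, cols))

dram_cache = {}  # row index -> row currently resident in DRAM

def load_rows(active_rows):
    """Copy only the requested rows from flash into the DRAM cache."""
    for r in active_rows:
        if r not in dram_cache:                         # reuse resident rows
            dram_cache[r] = np.array(flash_weights[r])  # one flash read

# A sparse activation pattern touches ~3% of rows; only those rows are
# transferred, so DRAM holds a small fraction of the full layer.
active = np.random.choice(rows, size=rows // 32, replace=False)
load_rows(active)

# Compute with the cached rows only.
x = np.random.rand(cols).astype(np.float16)
partial_output = {int(r): float(dram_cache[r] @ x) for r in active}
print(f"rows resident in DRAM: {len(dram_cache)} of {rows}")
```

The design point the sketch illustrates: because only a fraction of a model's weights are needed at any one step, flash can hold the whole model while DRAM acts as a fast working cache, which is how a model can exceed a device's DRAM budget.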