In a new development, Apple is reportedly working on a new kind of chip, potentially for future iOS devices, that will be used solely for AI processing.
According to media reports, the chip is known internally as the Apple Neural Engine, and it could be used to offload facial recognition in the Photos app, parts of speech recognition, and the iPhone’s predictive keyboard from the main processor.
By moving AI processing to a dedicated chip, battery life in devices could also see a boost since the main CPU and GPU wouldn’t be crunching as much data and gobbling as much power.
The report says Apple plans to integrate the chip into its devices, but it’s unclear when that’ll happen, and if any iOS devices launching this year will have it.
Apple’s work on an AI chip shouldn’t surprise anyone who’s been paying attention to the competition. Virtually every major tech company is working on improving AI processing on mobile devices.
Qualcomm’s latest Snapdragon 835 chip, which is already in devices like the Samsung Galaxy S8, has a special module dedicated to processing AI tasks.
Years ago, Apple started designing its own mobile processors to improve performance and reduce power consumption, and it’s really paid off.
Despite having fewer cores, the iPhone 7 still beats the Galaxy S8 in raw benchmark performance.
iPhones and iPads also include an Apple-designed M-series motion coprocessor that collects data from the built-in sensors (accelerometer, gyroscope, compass, etc.). It’s this M chip that helps with tracking health and fitness data.
Furthermore, in addition to the main Intel processors in the new MacBook Pros, there’s a small Apple-made T1 chip that powers the Touch Bar. Apple’s AirPods likewise contain a custom W1 chip that helps with pairing them to iOS devices.
Clearly, Apple loves making custom chips for things. We’re all for it, especially if that means longer battery life.