Rumors have long suggested that Apple is developing a version of the AirPods Pro equipped with infrared (IR) cameras (via MacRumors). However, the purpose behind the cameras has so far remained quite vague.
Why did Apple spend $2 billion on a company earlier this year?
Earlier this year, the Cupertino giant paid $2 billion for Q.ai (its second-largest acquisition after Beats), an Israeli AI startup that develops technology to interpret microfacial movements.
I’m talking about reading whispered or unspoken words by analyzing skin and muscle movements in real time. At the time, the acquisition caused quite a stir but yielded few answers. Now there is a growing theory that connects the dots.
The idea is simple: IR cameras on AirPods Pro could potentially track microfacial movements around the mouth and jaw, while Q.ai’s software could translate those movements into commands or text.
What do earbuds have to do with silent speech?
In other words, facial movements would allow users to compose messages, control apps, or talk to Siri without actually making a sound. In July 2025, Apple also received a patent covering camera-based systems, similar to Face ID’s dot projector, for proximity detection and 3D depth mapping.
AirPods already have accelerometers and skin detection sensors, which could mean the hardware foundation is already in place. For regular users, this could mean interacting with devices privately, especially in noisy environments, or without disturbing those around them.
Exactly how Apple will implement the technology, and how it will surface in iOS, remains unknown. The AirPods Pro 3 with IR cameras are currently expected to launch in September 2026.