Apple’s big AI gamble: The iPhone could soon understand users by reading their facial expressions
Your iPhone may soon be able to read your face and understand what you’re trying to say without you speaking at all.
Every year, Apple adds new and more advanced features to its iPhones to improve the user experience. Now the company is working on technology that would allow the iPhone not only to listen to users’ voices but also to interpret their facial expressions and gestures.
Apple has acquired Q.ai, an Israeli artificial intelligence startup that works on advanced audio and imaging technology. Apple confirmed the acquisition on Thursday, although the financial terms were not disclosed.
Q.ai uses machine learning to improve audio processing on devices, especially in noisy or otherwise challenging environments. Its technology can pick up whispers and isolate a speaker’s voice amid background noise. Notably, Q.ai is also working on recognizing softly spoken words from even the slightest movements of the facial skin.
Last year, Q.ai also filed a patent describing how subtle movements of facial skin can be used to read speech, identify a person, and estimate their emotional state, heart rate, and breathing. This could allow future Apple devices to understand what a user wants to say even when their voice is barely audible or entirely silent.
However, Apple has not yet clarified how it will integrate Q.ai’s technology into its products. Over the past year, the company has added several AI-based audio features to its devices. Its AirPods already support live translation.
Apple is working on new ways to adapt its devices to challenging real-world conditions. The company is also developing a system that recognizes subtle movements of facial muscles, which could further enhance the user experience on headsets like the Vision Pro.
