Industry reports from the smartphone front suggest that Apple will eventually launch an iPhone with its facial recognition camera tucked under the display, while Google’s Pixel 6 Pro was reportedly set to ship with face unlock before the feature was pulled. And while smartphone makers mostly use our faces to biometrically unlock phones and make custom emojis, researchers from Carnegie Mellon University have previewed technology that allows users to perform actions with eye movements and simple hand gestures.
Under-display facial recognition coming for iPhones: industry figures
Avid iPhone users should expect Apple’s smartphones to keep the screen cut-out design that accommodates the Face ID camera until a future iPhone can incorporate one beneath the display, say two notable Apple analysts.
Ming-Chi Kuo, an analyst with TF International Securities known for his forecasting of Apple’s future products, tweeted that he anticipates the first true full-screen iPhone – the iPhone 16 – to be released in 2024 with under-display Face ID. He also writes that “A low-light condition is detrimental to front camera quality, and ISP & algorithm are critical for quality improvements.”
It largely corroborates a report from Mark Gurman, Bloomberg’s technology correspondent, who writes in his ‘Power On’ newsletter that a front-facing Face ID camera fully embedded under the display is still “three or four years away.” He adds that the upcoming iPhone 14 will continue to feature a pill-shaped cut-out for Face ID and a circular cut-out for the camera as a stop-gap while Apple is unable to place its face biometric system’s camera under the display.
The swirl of industry analysis around Apple’s under-display Face ID camera began in March, when South Korean tech news site The Elec reported that Samsung would develop an under-display camera for the iPhone 15 Pro and iPhone 16. Ross Young, CEO of Display Supply Chain Consultants, challenged the report, saying the technology would first debut on the iPhone 16.
Pixel 6 Pro face unlock retracted; may still be added: report
9to5Google reports that Google’s Pixel 6 Pro smartphone was set to launch with facial recognition as an unlocking method, but the feature was pulled close to release for unknown reasons.
The news site cites sources saying the Pixel 6 Pro was intended to feature facial recognition alongside its under-display fingerprint sensor. 9to5Google also found pre-launch advertising materials that mention face unlock as an option. The sources did not divulge why face unlock was pulled.
More interestingly, 9to5Google says a source informed it that Google still plans to add face unlock to the Pixel 6 Pro in the next major quarterly Android update, though plans may change. The site also predicts that face unlock could become a default biometric feature on Pixel smartphones going forward. If true, it would be an unexpected U-turn for the Pixel line: face unlock was dropped from the Pixel 5 despite predictions to the contrary, and Google’s flagship Android smartphone appeared to be settling on fingerprints as its sole biometric unlock option.
Carnegie Mellon research develops gaze-tracking smartphone tool
A research team at Carnegie Mellon University (CMU) has developed a way for people to manipulate their smartphones with eye movements and hand gestures in front of the camera, making progress toward an entirely hands-free control method for mobile devices.
The researchers in the Future Interfaces Group at Carnegie Mellon University’s Human-Computer Interaction Institute sought a more natural way to interact with smartphones, and decided that where our eyes are focused is the “precursor” to our impending actions. Also recognizing that a second hand or voice commands could be unwieldy, the team developed the EyeMU tool, which allows users to perform actions on a smartphone with gaze control and hand gestures. EyeMU was presented in a research paper at the International Conference on Multimodal Interaction in 2021.
The use of gaze-tracking and prediction is not new, with pre-existing applications for accessibility and gaming. But the research from CMU would be a leap forward in functionality on a smartphone. “Current phones only respond when we ask them for things, whether by speech, taps or button clicks,” says Andy Kong, an undergraduate researcher who was the lead author of the paper. “If the phone is widely used now, imagine how much more useful it would be if we could predict what the user wanted by analyzing gaze or other biometrics.”
Karan Ahuja, a doctoral student in human-computer interaction who participated in the research, says a challenge was making the system compact enough to fit on a smartphone. “That’s a resource constraint. You must make sure your algorithms are fast enough. If it takes too long, your eye will move along.”
Chris Harrison, an associate professor in the HCII and director of the Future Interfaces Group who participated in the paper, says there are significant complications with gaze recognition on smartphones. Describing it as a “Midas touch problem,” he says, “You can not have a situation in which something happens on the phone everywhere you look. Too many applications would open.”
His interest in eye-tracking piqued, Kong first developed a prototype of EyeMU that uses a laptop’s camera to track the user’s eyes and move a cursor around the screen. He and Ahuja then built on that framework with Google’s Face Mesh tool, analyzing the gaze patterns of users looking at different areas of the screen and mapping the resulting data. From there, the team developed a gaze predictor that uses a smartphone’s front camera to register where the user is looking and identify it as the target, and merged the predictor with the smartphone’s motion sensors to enable commands. For example, a user can stare at a notification long enough to lock it as a target, then flick the phone to swipe it away.
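The two-step interaction described above – a gaze dwell locks a target, then a motion gesture acts on it – can be sketched as a small state machine. This is an illustrative assumption of how such a controller might be structured, not CMU’s actual EyeMU code; the class name, threshold, and gesture labels are all hypothetical.

```python
# Hypothetical sketch of a gaze-dwell + motion-gesture controller.
# A target is "locked" only after sustained gaze, which addresses the
# "Midas touch problem": glancing at an item alone triggers nothing.

DWELL_THRESHOLD_S = 0.5  # assumed dwell time needed to lock a gaze target

class GazeGestureController:
    def __init__(self):
        self.target = None   # currently locked on-screen target, if any
        self._dwell = {}     # accumulated gaze time per on-screen item

    def on_gaze(self, item, dt):
        """Accumulate gaze time on `item`; lock it once dwell is long enough."""
        self._dwell[item] = self._dwell.get(item, 0.0) + dt
        if self._dwell[item] >= DWELL_THRESHOLD_S:
            self.target = item
        return self.target

    def on_flick(self, direction):
        """A motion-sensor flick acts on the locked target, then resets state."""
        if self.target is None:
            return None      # no target locked: the gesture is ignored
        acted_on, self.target = self.target, None
        self._dwell.clear()
        return (acted_on, "dismissed" if direction == "left" else "saved")
```

In use, feeding the controller repeated short gaze samples on a notification eventually locks it, after which a single flick dismisses it; a flick with no locked target does nothing, mirroring the multimodal design Harrison describes.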
Harrison notes that big tech companies like Google and Apple have made advances in gaze prediction, but “just staring at something alone does not get you there.” The associate professor concluded, “The real innovation in this project is the addition of a second modality, such as flicking the phone left or right, combined with gaze prediction. That’s what makes it powerful. It seems so obvious in retrospect, but it’s a clever idea that makes EyeMU much more intuitive.”
The biometrics industry is looking into eye movements as a new modality. Grand View Research projected a compound annual growth rate of 26.3 percent from 2018 to 2025 for the global eye-tracking industry, which would translate into $1.75 billion in generated revenue.
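As a back-of-the-envelope check, a 26.3 percent CAGR over the seven compounding years from 2018 to 2025 ending at $1.75 billion implies a 2018 base the report does not state; the figure below is derived from the standard CAGR formula, not taken from Grand View Research.

```python
# CAGR: final = base * (1 + rate) ** years, so base = final / (1 + rate) ** years
cagr = 0.263
years = 2025 - 2018            # 7 compounding years
final = 1.75e9                 # projected 2025 revenue in USD
implied_2018_base = final / (1 + cagr) ** years  # roughly $340 million
```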
accessibility | Android | biometrics | eye tracking | face biometrics | Face ID | facial authentication | gesture recognition | Google | iPhone | research and development | smartphones | under display