Intel unveils RealSense ID, an on-device facial authentication system that will launch in Q1 2021 at $99, targeted at ATMs, smart locks, and kiosks (Kyle Wiggers/VentureBeat)
Intel has announced the latest addition to RealSense, its range of tracking and depth sensors built to give machines the ability to perceive depth. Called RealSense ID, it is an on-device system that combines an active depth sensor with a machine learning algorithm to perform facial authentication.
Intel claims RealSense ID adapts to users as physical features like glasses or facial hair change over time, and that it works in various lighting conditions for people “with an array of sizes or skin tones.”
However, several studies, along with VentureBeat’s own analyses of public benchmark data, have shown that facial recognition algorithms are susceptible to a range of biases. One issue is that the datasets used to train these algorithms skew heavily toward white and male faces.
IBM found that 81% of the faces in the three facial image collections most widely used in academic research have lighter skin. Researchers have also found that photographic techniques and technologies can favor lighter skin tones, from sepia-tinted film to digital cameras with low contrast. Concerns like these are why Amazon, IBM, Microsoft, and others have imposed moratoriums on the sale of facial recognition software.
In response to such critiques, Intel says RealSense ID, which will cost $99 and ship in the first quarter of 2021, has built-in anti-spoofing protection against fraudulent entry attempts using photographs, videos, or masks. The company claims a “one-in-1-million” false acceptance rate and says the device processes facial images locally, protects user data, and activates only through “user awareness,” meaning it won’t authenticate until asked to by an authenticated user.