Facial recognition technology is increasingly used for everything from convenient online logins to government surveillance, particularly in China. A new test reported by Fortune, however, casts doubt on the accuracy of such systems by showing that they can be fooled by people wearing masks.
To conduct the test, AI firm Kneron created high-quality 3D masks imitating another person's face and checked whether someone wearing one could trick facial recognition systems. Researchers were able to make purchases from another person's account using the WeChat and AliPay payment systems.
The team even managed to fool the technology at airports. At Schiphol Airport in Amsterdam, they tricked a self-boarding terminal using only a photo of another person's face. They also fooled systems at train stations in China, where passengers use facial recognition to pay for their journeys. All tests were conducted with permission and under supervision.
This is not the first time concerns have been raised about the accuracy of facial recognition technology. In August, the American Civil Liberties Union reported that the technology misidentified 26 California lawmakers, most of them people of color. A separate report from the UK found that police facial recognition technology had an error rate of 81%.
Of course, some facial recognition systems are better than others. Kneron noted, for instance, that its test did not fool the iPhone X. Other systems, however, were easily bypassed by this kind of deception, which could open the door to fraud. The concern is that criminals could impersonate wealthy individuals and access their bank accounts, or that terrorists could evade security measures.
“This shows the danger to consumers' privacy posed by sub-par facial recognition that is marketed as ‘AI,’” Kneron CEO Albert Liu told the media.