“Biometrics” has become today’s buzzword for authentication. Everywhere you turn, instead of entering a password you’re asked to swipe your finger or look into a camera to prove that it’s really you. But the risks associated with our biometric data are very real. Only a few states have biometric laws to protect users. Are these laws adequate?
Concerns About the Use of Your Finger or Your Face
One of the major concerns regarding the use of biometrics centers on one question: What happens when my biometric profile is stolen? A threat actor could then use it to log in to every single account that I have. And how could I ever correct that? I can’t grow a new finger or face. Am I then prevented from using biometric systems for the rest of my life?
And if this sounds like just a good plot line for a movie, it has already happened. Earlier this year, security researchers from vpnMentor discovered that the online database of BioStar 2, a “web-based biometric security smart lock platform,” had been left unprotected. The researchers were able to access almost 28 million records (23 gigabytes of data) containing the personal information of employees of BioStar 2 customers: home addresses and emails, security levels and clearances, employment start dates, usernames and passwords, and biometric data such as fingerprint information and facial scans, among other data.
Restrictions on Biometrics
To protect users, some states have started to pass legislation restricting how biometric data can be used. Texas and Washington have biometric privacy laws on the books, and Arizona, Florida, and Massachusetts are considering legislation to address biometric privacy. The California Consumer Privacy Act (CCPA), which goes into effect very soon (January 1, 2020), treats biometric information the same as all other personal information: residents of California can access their information, delete it, take it with them, and tell businesses not to sell it.
However, the state with the most comprehensive law is Illinois. Passed back in 2008, the Illinois Biometric Information Privacy Act mandates that companies collecting biometric data must first obtain user consent. Companies must also notify users about why and how their data will be used, how it will be stored, and for how long. What’s more, companies are restricted from selling users’ biometric data without consent.
But a recent court ruling has made the Illinois law even more powerful. In January 2019 the Illinois Supreme Court ruled that plaintiffs do not have to first prove that harm came to them as a result of a violation of the law. Rather, they only have to prove that the law was broken, whether or not they suffered any harm, in order to bring a suit against the company. This has resulted in a large number of lawsuits filed in Illinois regarding biometric data: on average, about three to five are filed each day.
What’s interesting is the relationship between the plaintiffs and the companies. It’s not just customers or clients who are suing; it’s also employees suing their employers for collecting their biometric data. Last year (2018), a survey by Gartner revealed that about 6 percent of companies in the U.S., Canada, and Europe track employees using biometrics. Workers are often required to scan a fingerprint or use facial recognition to enter a building or to clock in to work.
Of course, it’s not just employers who are using biometrics to authenticate people. Several financial institutions now use biometrics, often without informing their clients. Fidelity Investments and Charles Schwab use the unique voice patterns of their customers to identify them on the phone. On the one hand, this can act as a form of multifactor authentication; on the other hand, are those voice patterns used in other ways? How are they protected? And will clients be allowed to permanently delete this biometric data?
Biometrics may indeed be a more common means of authentication going forward. But are there adequate safeguards in place to protect users?