As you deal with mounting pressure to implement artificial intelligence (AI) in the workplace, you should be aware of laws in several states and localities that may impact the type of AI product you ultimately choose or how you implement the AI product into your workflow. These laws place limitations on the use and collection of biometric information—data commonly used in workplace AI products.
Certainly, the implementation of AI can no longer be avoided; however, knowing the applicable laws and litigation risks allows you to implement procedures to comply with the law while still using AI as a tool for preventing workplace accidents, keeping employees safe, and monitoring productivity. If you are considering implementing AI in the workplace—or already utilize AI now—consider asking the following questions before proceeding.
What to Consider
- Are you subject to a state or locality with a law limiting the use or collection of biometric information?
Various forms of biometric information privacy laws exist in Illinois, Texas, Washington, and New York City. Employers should consider not only the laws of the state in which they are located or the states in which their workers reside, but also the laws in the states where their workers perform work. Also, expect more states and localities to implement similar laws as AI usage becomes more common.
- Does your AI product utilize biometric information?
Each applicable law utilizes a different definition of “biometric information” or “biometric identifier.” Therefore, you should check the definition in the applicable jurisdiction.
For example, a “biometric identifier” under Illinois’ Biometric Information Privacy Act (BIPA) means “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry,” but does not include “writing samples, written signatures, photographs, human biological samples used for valid scientific testing or screening, demographic data, tattoo descriptions, or physical descriptions such as height, weight, hair color, or eye color.” Courts interpreting BIPA have found that a photograph, and perhaps, by extension, a video, is not itself a biometric identifier.
It might not be obvious that your AI product is actually using biometric information, but common AI products in the workplace that claim to monitor employee safety, check productivity, and prevent accidents do utilize forms of biometric information in the process. For example, AI products connected to cameras can detect an employee’s improper use or lack of personal protective equipment, and new AI products claim to detect an employee’s fatigue or use of substances like alcohol or drugs. These AI products may monitor an employee’s heart rate, speech (including pronunciation and tone), or facial biomarkers around the eyes or jaw. AI products may also collect data for monitoring employee safety and productivity from sensors or devices worn by employees, real-time scanning of face and body movements, or analysis of video footage.
- Does your AI product collect, store, or use biometric information in violation of the law?
Each applicable law also contains different restrictions regarding the use of biometric information, so you should check the law in your applicable jurisdiction. The laws tend to broadly limit the use of biometric information, including its collection, capture, storage, retention, destruction, and use. However, some laws provide exceptions for security purposes.
- Do you have a consent or notice policy in place?
Many of the applicable laws require employers to properly notify employees of the use of biometric information and to obtain the employees’ consent. Therefore, when using such systems, employers should first ensure they have provided the required notice or obtained the necessary consent.
- What are the consequences for failing to follow the law?
Some of these laws provide for private causes of action, while others are enforced by the state attorney general. The monetary consequences for violations vary by state. Under these biometric information privacy laws, employers that implement AI products that collect biometric data without following consent procedures are at risk of lawsuits from employees or an investigation by the attorney general.
BIPA has sparked the majority of litigation in this area because, unlike some of the other laws, individuals harmed by a violation of BIPA may file a claim against the employer and obtain monetary relief. A successful plaintiff can recover their attorney’s fees and costs, an order to stop the violation, and monetary damages for each violation. The monetary damages that a plaintiff can recover are either $1,000 per violation ($5,000 for reckless or intentional violations) or the actual damages, whichever is greater.
A Cautionary Tale
On March 30, 2025, the Northern District of Illinois declined to dismiss a lawsuit by a former truck driver against his employer, a motor transportation company. The former driver brought a proposed class action over the company’s installation of a video monitoring system inside its trucks that used AI to detect distracted and risky driving, including cell phone use, eating or drinking, smoking, failure to wear a seatbelt, and general inattentiveness. The former employee alleged the technology learned the drivers’ habits by capturing a driver’s face and recognizing their facial features using the individual’s unique facial geometry. The former employee sued under BIPA, alleging the video monitoring system captured his biometric identifiers while he was driving, compared his identifiers with other drivers’ identifiers to enhance the facial recognition software, and stored the identifiers online for the employer to view—all without the employer obtaining the drivers’ consent to collect or store the biometric identifiers or biometric information.
The employer asked the court to dismiss the claims, arguing, in part, that the video monitoring system did not violate the language of BIPA because it was not using “biometric identifiers.” The court did not dismiss the case. Instead, the court noted that if the former employee’s allegations were true, the employer violated BIPA because a scan of face geometry that can identify a particular driver qualifies as a “biometric identifier” that requires the driver’s consent before it may be collected. The case remains ongoing.
While a decision has not yet been reached, the case described above has implications for employers with workers who perform any work in Illinois, even if they do not live or regularly work in Illinois. First, the court declined—at least at the current stage—to remove non-Illinois residents from the proposed class definition because discovery is needed first and there is no bright-line exclusion of non-Illinois residents from this particular BIPA claim. Second, the court refused to dismiss the claim based on the employer’s argument that resolution would require an application of Illinois law to individuals outside of the state. Though the court acknowledged that BIPA does not apply to conduct that happens outside of Illinois, the court stated, “the Illinois state legislature might very well have intended for BIPA to cover non-resident visitors and workers to encourage them to visit and do business here.”
The nuances of this case should cause employers to be cautious when implementing similar AI products that collect biometric data. While we do not know whether the Northern District of Illinois will ultimately find in favor of the former employee, the employer must continue spending time, money, and resources to defend against the claims and engage in discovery. It is better to consider the legal implications and take steps to ensure compliance before someone asks you to defend your actions in court.
If you need help reviewing whether your usage of AI products creates legal liability or are ready to implement practices to adhere to local biometric privacy laws, consider consulting with your legal counsel.
For more information, contact any member of Ice Miller’s Workplace Solutions Group.
This publication is intended for general information purposes only and does not and is not intended to constitute legal advice. The reader should consult with legal counsel to determine how laws or decisions discussed herein apply to the reader’s specific circumstances.