Facial Recognition Technology in the Workplace: Employers Use It, Workers Hate It, Regulation Is Coming for It

Only 3 U.S. states currently regulate facial recognition tech in the workplace. That may soon change.

by Henry Kronk
March 3, 2021
in Data Privacy, Ethics, Featured
[Illustration: a facial recognition scan of a face]

Using facial recognition technology to track employees in the workplace is largely unregulated in the U.S. While the E.U.'s GDPR, a handful of states and some industry organizations have provided leadership, experts believe face-scanning will receive attention from lawmakers in the near term.

On a recent weekend morning in January, Christian Godwin walked into the restaurant on the Jersey Shore where he works as a bartender and was called over to see his manager. He was informed they would be using a new time clock system to punch in and out of work. It operates, in part, by scanning and automatically identifying each worker’s face. “We’re putting this in place so no one clocks in for you,” his manager told him. Christian asked if the time clock used facial recognition technology. “It’s just a picture,” his manager responded. Christian pushed back. He asked how his data would be used, where it would be stored and other details about the device. Christian said his manager told him he didn’t have that information available. After more back-and-forth, according to Christian, his manager ended the conversation by saying, “If you don’t do it, give your two weeks’.”

The restaurant already had other processes in place to vet workers and secure the premises, Christian said, including security cameras and an employee management system that required him to sign in with a secure login. And getting his bartender’s license had required submitting to a background check and putting his fingerprints on file with police. Christian said he had a good relationship with his employer prior to this incident.

“With the face-scanning time clock, they crossed a line,” he said. “While I have no issue with accountability, that was too personal. It gave them too much control.”

Christian decided to file for OSHA whistleblower protection. His name has been changed in this article to protect his identity, and specific details about his employer have been withheld. He also remains employed at his place of work and says he has not yet used the biometric time clock.

Neither New Jersey state law nor federal regulation currently prevents employers from using facial recognition as described here. But many believe that will soon change. In the opinion of Kayvan Alikhani, CEO of Compliance.ai, federal regulation governing a situation like Christian’s is imminent.

“There is a very strong case right now for national policies in the United States relating to data protection, and biometric would be a subset of that,” Alikhani said. “I think that we can expect that to happen over the next one or two years.”

Billions of Devices Now Give Companies the Ability to Track Workers. Research Indicates Execs Aren’t Confident They’re Doing So Responsibly.

The use of facial recognition technology by law enforcement and government agencies has garnered widespread criticism in the U.S. and around the world. Large-scale data mining operations that scan and catalog faces from publicly available images on the internet have also drawn fire. In perhaps its most extreme form, China has adopted face-scanning systems to track its citizens, including its Uyghur Muslim community. Human rights groups have accused China of interning dissident members of this population en masse in government-run re-education camps. The U.S. government has claimed these actions constitute genocide.

But private uses of face-scanning cameras — such as those used to surveil employees while at work — are both legal and widespread in most states and many countries around the world.

Use of facial recognition technology goes well beyond clocking hourly workers in and out. Some employers scan faces to monitor worker productivity. Others use the technology for security. Still others use it to create image databases that can be leveraged for internal purposes or sold to third parties.

Twenty-three U.S. states have passed or are currently considering legislation pertaining to facial recognition technology. But current laws focus mainly on uses by government agencies like law enforcement or on consumer protection. Just three states regulate face-scanning systems in private contexts.

Illinois broke ground in 2008 with the Biometric Information Privacy Act, which primarily requires that entities obtain consent before collecting biometric data, store the data securely and destroy it in a timely fashion. Since then, both Texas and Washington have passed similar laws.

In the absence of governing regulation in most states, companies have begun to use face-scanning systems on larger scales. And in many instances, it’s clear that technological practices and capabilities are developing faster than their practitioners can grasp. Accenture researchers who compiled the 2019 “Decoding Organizational DNA” report surveyed more than 1,400 C-suite executives who operate in 13 different sectors. While 62% of respondents said that their company was using emerging technologies to collect data on their employees, just 30% were confident they were using that data responsibly.

“Securing biometric data involves a heightened risk,” said Sarah Pearce, a global data privacy compliance expert and partner at Paul Hastings in the U.K. “And if you are relying on a third-party to do so, you need to vet them. A number of breaches that we’ve seen have arisen because of the failings of third-party software that a company has been using.”

Many Support Banning Face-Scanning Devices Outright

Because of this uncertain landscape, some favor engaging the emergency brake. Caitlin Seeley George, who works as a campaign director for technology watchdog Fight for the Future, is among this number.

“Facial recognition technology should absolutely not be used in the workplace. Period. End of sentence. No one should have to give up their biometric data in order to get a paycheck,” Seeley George said.

The #SurveillanceState is NOT inevitable. Watch Amazon, Microsoft, and IBM back away slowly while Congress finally tunes in— let's #BanFacialRecognition: https://t.co/lXUm7DnWco ✅#M4BL #BlackLivesMatter #surveillance #BoycottAmazon #InvestigateAmazon pic.twitter.com/gPcUJdcWH5

— Fight for the Future (@fightfortheftr) June 13, 2020

Some companies agree. Following the killing of George Floyd by the Minneapolis Police Department last summer, IBM halted all development of face-scanning and analysis software.

In a letter to Congress, company CEO Arvind Krishna wrote, “IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms or any purpose which is not consistent with our values and Principles of Trust and Transparency. We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies.”

Identification Versus Verification

Alikhani, an expert in AI and facial recognition, does not believe that facial recognition technology should be banned outright. A key distinction, he argues, needs to be made between face-scanning systems used to identify an individual by cross-referencing a database of images and those used only to verify a specific person’s identity.

“With the former example, I’m sitting on a repository of millions of face profiles. You pass by a camera. I come back and say, with a degree of confidence, ‘This is you.’ With the other, I use the system to say, ‘You have come in. I can say with certainty that you are you.’”

In Alikhani’s view, using face-scanning systems for identification is far more questionable than using them to verify individuals, provided other measures like consent and data security have been met.
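To make that distinction concrete, the sketch below contrasts the two modes: verification compares a fresh scan against the single template enrolled for the person claiming an identity, while identification searches the scan against an entire repository of face profiles. It is an illustrative assumption only; the random vectors stand in for the embeddings a real face-recognition model would produce, and the threshold, dimensions and function names are hypothetical rather than taken from any vendor's system.

# Illustrative sketch only: random vectors stand in for the embeddings a real
# face-recognition model would produce; the threshold and names are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
EMBEDDING_DIM = 128     # assumed size of a stored face "template" vector
MATCH_THRESHOLD = 0.6   # similarity cutoff; real systems tune this carefully

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled_template: np.ndarray) -> bool:
    """1:1 verification: 'are you who you claim to be?' Compares one scan
    against the single template stored for the person claiming that identity."""
    return cosine_similarity(probe, enrolled_template) >= MATCH_THRESHOLD

def identify(probe: np.ndarray, database: dict):
    """1:N identification: 'who, out of everyone on file, is this?' Searches a
    whole repository of face profiles for the best match above the threshold."""
    best_name, best_score = None, MATCH_THRESHOLD
    for name, template in database.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# A time clock doing verification needs only the one template belonging to the
# employee punching in; an identification system needs a database of many faces.
alice_template = rng.normal(size=EMBEDDING_DIM)
probe_scan = alice_template + rng.normal(scale=0.1, size=EMBEDDING_DIM)
face_database = {"alice": alice_template, "bob": rng.normal(size=EMBEDDING_DIM)}

print(verify(probe_scan, alice_template))   # True: the scan matches the claimed identity
print(identify(probe_scan, face_database))  # "alice": best match found in the repository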

The GDPR and Consent

“As of yet, no one has come out and said, ‘This is illegal,’” said Sarah Pearce. “To date, the strictest rules that we’ve seen exist in the E.U. with the [General Data Protection Regulation] (GDPR).”

Specifically, Article 9 of the GDPR governs the “processing of special categories of personal data,” which includes biometric information. Under this measure, face-scanning is generally prohibited but permitted under specific conditions, chief among them the data subject’s consent.

But even if an employee agrees to have their face scanned, Pearce says that this arrangement is not ideal.

“There’s an imbalance of power in the relationship,” she said. “What the GDPR is saying is that consent can’t be freely given in the employment context.”

Proportionate Use Under the GDPR

Alikhani is also quick to note that the circumstances under which facial recognition technology is used matter greatly under the GDPR. Entities are essentially prohibited from monitoring individuals by scanning faces if doing so is not specifically necessary or can be accomplished by other means.

Precedent in this regard has already been set. In 2019, the Skellefteå municipality school board in Sweden launched a pilot program that used facial recognition technology to take attendance. The Swedish Data Protection Authority (DPA) fined the municipality SEK 200,000 (equivalent to $29,000). Even though the municipality had obtained parental consent in advance, the DPA found it had violated numerous GDPR requirements, chief among them that attendance could have been taken by other, less intrusive means.

The Swedish Data Protection Authority has served the country’s first #GDPR fine after a Skellefteå high school used facial recognition tech to monitor attendance w/o consulting the DPA or conducting a proper impact assessment: https://t.co/PR1pWMdukt

— The Engine Room (@EngnRoom) August 29, 2019

“That really gave me pause,” Alikhani said. “On the surface, you think, ‘What’s wrong with that?’ Kids don’t want to stand in line to take attendance every day. You can have that much more time to teach them. But Sweden’s DPA fined them nevertheless.”

With Little U.S. Regulation, Companies, States and the Public Are Driving Compliance.

In the absence of federal regulation in the U.S., other forces are shaping the use of facial recognition technology in the workplace.

“We’re going to see more and more compliance issues involving this topic,” Sarah Pearce said. “That isn’t to say that companies can’t use facial recognition technology altogether. They just need to be mindful of the compliance issues involved.”

Before founding Compliance.ai, Alikhani worked as the CTO of Litescape Technologies, which develops a range of biometric identification capabilities. In that capacity, he served as a board delegate to the FIDO Alliance, an industry association that develops authentication standards. Its governing philosophy is that passwords are inherently insecure; as the organization’s website states, the use of passwords “needs to be reduced, if not replaced.”

Part of the organization’s work includes maintaining industry standards for the use of facial recognition technology. Members include Apple, Google, Samsung, Facebook, Microsoft and dozens more.

But at the same time, an opt-in industry group does not have any regulatory teeth. Some members have attracted criticism for uses of facial recognition technology, including systems used to surveil employees.

Intel, for example, is a board-level member. On numerous company campuses around the U.S., according to The Oregonian, the company scans its employees’ faces and license plates for security purposes. It keeps employee information on file for two years after they leave the company and stores visitors’ data for 30 days after a visit.

A Potential ESG Concern

Given the race and gender biases of existing facial recognition systems, their use has threatened to become a social, or ESG, compliance issue. (The acronym stands for environmental, social and governance.)

“It could very much become that,” Alikhani said. “I think this pushback and the controversy is more about mis-identification and the use of it without consent. Lack of consent and the disproportionality of inaccurate verification alone leads me to think that it’s not a proper use of technology in public environments where there’s a very large population of users. But when it comes to the workplace, where there’s a very limited pool of individuals and the technology is being specifically used for gainful access or attendance with proper disclosure and with proper communication — I think that makes sense.”

Opening the Door for Abuse

Expanding surveillance capabilities at work also creates the potential for their improper use. This is a main concern for Fight for the Future’s Caitlin Seeley George.

“Managers can use facial recognition technology to spy on workers, creating an environment ripe for harassment and abuse. This can have consequences ranging from frustrating inconveniences to major impacts on people’s lives and well-being,” Seeley George said.

Examples of the latter have already emerged. Verkada manufactures security systems with facial recognition capability for use in the workplace and uses its own technology in its offices. In October 2020, reports came to light that Verkada managers were using the system the company develops, including its facial recognition capabilities, to take pictures of female employees, highlight their faces and make sexually explicit comments about them in a Slack channel.

“It’s Not About Big Brother.” Outside of Facial Recognition, Employee Surveillance Is Widespread.

Other means of monitoring employees at work are ubiquitous. The research firm Gartner has tracked corporate use of digital technologies to keep watch over employees. In 2013, the share of large corporations it surveyed using “nontraditional monitoring techniques” stood at 30%. That figure rose to 50% in 2019 and is projected to be roughly 80% today.

With many professionals working from home due to the coronavirus pandemic, that figure may be even higher.

Some privacy experts describe these measures as invasive. Compliance officers, meanwhile, might view adoption of emerging technology as part of a broader effort to demonstrate that the company’s compliance function is proactive or has “evolved” per the Department of Justice’s recent updates to its Evaluation of Corporate Compliance Programs guidance.

“Companies need to have a sense of responsibility,” Alikhani said. “Somebody might steal. Somebody might break something. Somebody might harass a co-worker. Well, I’d like to be able to see the trail of that and not be clueless when it happens. It’s not about Big Brother.”

Christian, who faced the decision to use a face-scanning time clock or quit his job, also acknowledges that companies need to monitor their workers. But he thinks facial recognition technology has no place at work. After nearly two months of discussion with his managers and a company HR representative, he has managed to keep his job and has been allowed to opt out of having his face scanned. But it hasn’t been easy.

“Ultimately, this is a part-time bartending job I’m working while I’m pursuing professional education,” he said. “I’m not working for the CIA handling classified information here. Compelling me to have my biometric data recorded to punch in for work — that level of micromanagement is extremely invasive and definitely overkill. I have no objection to tracking employees and output via reasonable means. This, however, is extreme.”


Tags: Emerging Technologies, GDPR, Technology

Henry Kronk

Henry Kronk is the former Managing Editor of Corporate Compliance Insights. His previous reporting has appeared in Exclaim!, the Burlington Free Press, International DJ, eLearning Inside and more. He produced the radio show Code Burst, an investigation into a coding bootcamp that sought to retrain out-of-work coal miners for jobs in tech, for CKUT 90.3 FM in Montreal.

