The Impact of Privacy Regulations on Digital Workplace Technology
Over the past two years, new technologies have entered the enterprise at a marked pace. While many companies adopted them out of necessity in a pandemic world, many also moved to harness emerging technologies like edge computing, AI, machine learning and the metaverse to improve their data collection efforts and achieve business goals.
But as companies stockpile an increasing amount of data, governments around the world are introducing regulations to protect user privacy. Much of the focus thus far has been on regulating the collection and use of external customer data. But the accelerating digital transformation of work over the last two years means that companies are now sitting atop a massive stockpile of employee data, too. How and why they use that data to make organizational decisions will likely be put under the microscope.
The Need for GDPR and Privacy Laws
New technologies serve a wide range of purposes across the enterprise, but they share one common link: They are all driven by large amounts of data. Because the data collected can be sensitive in nature, a wave of new regulations governs where data can be sourced, how and where it can be used, and what penalties can be imposed for its misuse. This has had considerable impact on technology deployment decisions across companies.
The EU's General Data Protection Regulation (GDPR) and other privacy acts enacted by local governments from California to Brazil to South Africa to China have had a large impact on technology decisions and how organizations manage their data. Sarah Hospelhorn, senior vice president of global product marketing at data management company BigID, said regulation has put a spotlight on data rights and sparked conversations around what constitutes personal or sensitive data. Clarifying these definitions is important because organizations must now be able to identify the data they collect and process, down to the individual level.
“These privacy regulations are a catalyst for better data protection practices [and] stronger data management requirements," Hospelhorn said. "They are amplifying the need for organizations to map, monitor and proactively manage the data that they collect."
The regulations touch everything from:
- Access rights: who has access to what data, and who should.
- Data breach disclosure: notification is required within 72 hours of discovering a breach.
- Data rights requests and requests to be forgotten: organizations must be able to identify and report on the data they collect.
- Consent: making sure they have permission to hold that data in the first place.
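The 72-hour breach disclosure window is a concrete deadline companies must operationalize. A minimal sketch in Python of how such a deadline might be computed (the function name and example timestamps are illustrative, not from any specific compliance tool):

```python
from datetime import datetime, timedelta, timezone

# GDPR Article 33: notify the supervisory authority within 72 hours
# of becoming aware of the breach.
BREACH_NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(discovered_at: datetime) -> datetime:
    """The clock runs from when the breach is discovered,
    not from when it actually occurred."""
    return discovered_at + BREACH_NOTIFICATION_WINDOW

# Example: a breach discovered on March 1 at 09:00 UTC
discovered = datetime(2024, 3, 1, 9, 0, tzinfo=timezone.utc)
deadline = notification_deadline(discovered)
# deadline is 2024-03-04 09:00 UTC
```

The key detail, easy to get wrong in practice, is that the window starts at discovery, so breach-detection latency directly eats into the time available to investigate and report.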
Related Article: What to Know About Regulation of AI at Work
The Problem with the Data Consent Model
In force since May 2018, GDPR quickly became the gold standard of privacy laws but, like other privacy laws from Canada, the US and many other nations, it relies heavily on the consent model that data protection authorities have endorsed, which some perceive as futile.
Sharon Polsky, president of the Privacy and Access Council of Canada, said the problem is this consent model is an all-or-nothing conundrum that gives individuals remarkably little choice about whether, when and how their information is collected, used and shared. It leaves responsibility entirely with the individual to check privacy policies, navigate websites and links, and indicate preferences on every website visited.
“As the meteoric growth of the data brokering industry clearly indicates, that blame-the-victim approach has been wildly successful — for technology companies that trade and monetize information,” she said. The all-encompassing “consent” that allows companies and governments to use a person’s information for a variety of business purposes lists examples, but not a definitive or limited list of purposes, she added.
For users, consenting to a website's generic terms often means accepting the company's broad definition of "business purposes," giving it permission to use their personal data in ways they may not agree with. In other words, those consents can give organizations carte blanche to do as they wish with information provided or collected. That includes sharing it with other companies and individuals, in any country, and relinquishing control over who has the information or what they're going to do with it.
So far, governments have put the onus on technology companies to reduce the amount of information they collect or use the information only in specific ways. But the consent model gives companies a way around those limits.
“Perpetuating the consent hoax is good for business," Polsky said. "Companies across automotive, medical, surveillance, insurance and most other industries profit from collecting, analyzing and monetizing personal information, and include data as an asset in their financial statement."
Related Article: Why Congress Fails to Regulate Big Tech
The Evolution of Technology in Response to Regulation
According to Gary LaFever, CEO of New York City-based Anonos, the first few years of GDPR-related technology investments dealt with:
- Data discovery and inventory management technology required for companies to discover data in their possession, where it came from and how it is used.
- Consent management technology necessary to track consents received from data subjects and respond to data subject access requests (DSARs), including requests to withdraw consent to process their data.
However, recent enforcement actions, such as Luxembourg's Data Protection Authority's $843 million fine against Amazon for processing data collected from its customers beyond the scope of GDPR-compliant consent, are causing companies to invest in technologies that satisfy heightened statutory requirements.
“While data may be the world's most precious resource, its value cannot be fully realized unless it can be shared, combined and processed with other data,” LaFever said. “However, most data sharing, combination and processing rely on outdated and non-scalable consent models that fail to satisfy the heightened GDPR requirements.”
Related Article: How to Ensure Data Privacy in the Digital Workplace
The Repercussions of Failing to Adapt
Companies that don't invest in new controls to overcome the limitations of the consent approach to data protection are setting themselves up for a regulatory headache. They'll be increasingly unable to lawfully:
- Repurpose data (secondary processing) beyond the original purpose of collection, because consent obtained at collection time cannot describe with sufficient specificity what will later be done with the data.
- Process data in multi-cloud environments outside of localized jurisdictions due to concerns over privacy and the ability of foreign governments to surveil the data and identify individuals directly and indirectly.
- Achieve desired levels of accuracy beyond that possible using traditional approaches to protection, which suffer from limited use cases, errors that degrade the accuracy and value of data, and the need for significant increases in time or resources.
Privacy cannot be achieved with consent alone anymore. It now requires embedding technical controls into the data so it can travel, be shared and combined. The focus is therefore shifting from using technology such as encryption to protect data only in transit and at rest, to technical safeguards that embed statutory GDPR pseudonymization controls, which protect data while in use by replacing personal identifiers with tokens.
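One common way to implement such protection-in-use is to replace direct identifiers with keyed tokens before data is shared or combined, so records remain linkable across datasets without revealing who they describe. A minimal sketch in Python using a keyed HMAC as the tokenization primitive (the field names, key handling and record layout here are illustrative, not a specific product's method):

```python
import hashlib
import hmac

def pseudonymize(record: dict, identifier_fields: set, key: bytes) -> dict:
    """Replace direct identifiers with keyed HMAC tokens so the record
    can be shared and joined with other data without exposing who it
    refers to. Re-identification requires access to the key, which is
    held separately under its own access controls."""
    out = {}
    for field, value in record.items():
        if field in identifier_fields:
            token = hmac.new(key, str(value).encode(), hashlib.sha256).hexdigest()
            out[field] = token[:16]  # shortened token for readability
        else:
            out[field] = value
    return out

key = b"held-separately-under-access-controls"
employee = {"name": "A. Person", "email": "a.person@example.com", "role": "analyst"}
safe = pseudonymize(employee, {"name", "email"}, key)
# Non-identifier fields pass through unchanged; identifiers become opaque tokens.
```

Because the same input always yields the same token under a given key, two datasets pseudonymized with that key can still be joined on the tokenized field, which is what lets the data "travel, be shared and combined" while the identifiers stay protected.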
Investments in these technologies will enable companies to use privacy regulation as a competitive advantage. If done correctly, this creates a new sustainable asset by embedding trust and accountability into data flows.
What It Means for Employee Data
External customer data has been the focus of much of the regulatory action so far. But with the increasing digital transformation of work, companies are now stockpiling huge amounts of employee data, too.
The question is to what extent employees are aware of these technologies, said Andrew Taylor, founder and CEO of UK-based Net Lawman, given that employers can use data processing methods without first informing employees — and the boundaries between home and workplace have become increasingly blurred.
New technologies can help prevent the loss of company assets, improve employee productivity and protect personal data, but they also pose significant privacy and data protection challenges. What's needed is a careful assessment of the balance between the legitimate interest of the employer to protect the business and reasonable expectations regarding the privacy of the data subjects, namely employees.
“Users should be very careful in determining a valid basis for the processing of personal data, especially when it comes to new and modern technologies, and thoroughly reconsider all possibilities, necessity and impact on the privacy and protection of personal data of employees,” Taylor said.