Note: This event was conducted under the Chatham House Rule; the conversation is therefore reported without attribution.
On Thursday, November 14th, the One World Identity (OWI) team headed to Seattle with their partners Uniken, Blueprint, and Sentinel to discuss the role of privacy in the data economy.
If blockchain was the buzzword of 2018, a series of high-profile incidents has brought data privacy to the forefront in 2019.
Big tech’s tight-lipped data practices continue to make headlines. Whether it is Google collecting health data on 50 million Americans, Amazon’s Alexa and Ring cameras spying on homeowners, or the still-unfolding story of Cambridge Analytica and Facebook, it has become clear that how we collect, manage, and store data will be a controversial question dominating conversations and headlines for years to come.
In July 2019, we also saw GDPR show its teeth, with British Airways facing a $229M fine, the biggest GDPR penalty to date for data privacy violations. In the U.S., a slew of state-level privacy laws has followed, most notably in California, Vermont, and Washington; in fact, roughly half of U.S. states now have data privacy laws of some kind.
All of these threads have brought data privacy to the main stage, and companies are starting to build out data privacy strategies while still capitalizing on the benefits of data collection. For any company that collects and maintains data, the next five years are about learning to walk a tightrope, balancing the power of data with the concerns, and growing expectations, of customers.
At the KNOW Identity Forum in Seattle, we brought together privacy experts who are building the privacy-focused technologies and workflows of the future. Our expert panel included Darren Louie, Senior Director of Product (Identity) at DocuSign; Aaron Weller, Co-Founder and VP of Strategy at Sentinel; Chris Carter, Managing Partner at Blueprint; Bethan Cantrell, Privacy Officer; and Nishant Kaushik, CTO at Uniken. There was also a live State of Identity podcast featuring Kelsey Finch, Chief Legal Counsel at the Future of Privacy Forum, and Cameron D’Ambrosi, principal here at OWI.
Please visit here for the full State of Identity podcast recording and show notes.
Ethics of Data Privacy
The first question posed to the panel concerned the argument for data privacy. For identity professionals, the justifications for strong data privacy measures are self-evident. However, for companies competing with large tech firms that generate billions of dollars in revenue from identity data, strong data privacy measures often mean giving up what can be considered a competitive differentiator.
The panel’s responses were mixed.
On the pro-legislation side, panelists suggested that collecting user data in a GDPR and CCPA compliant manner could establish stronger brand-user connections and make their datasets more valuable.
When companies are more transparent about what data they are collecting, why they are collecting it, what they are going to do with it, and how it will be stored, customers are likely to be more truthful in their attestations. More accurate data empowers companies to apply their resources more effectively, opening up value streams that would otherwise have been obscured by lower-quality data.
On the other hand, the panel suggested that data privacy regulations can sometimes have adverse effects. For example, the right to be forgotten is a statute specified in GDPR that gives a user the right to ask a company to delete their personal data history. This concept seems beneficial in theory.
In practice, however, companies must now be able to attribute every single piece of data they hold to a specific user, eliminating the possibility of anonymous data collection. One panelist recounted that they had to effectively step down their cybersecurity standard for GDPR compliance. The irony of a data privacy standard creating less secure processes was not lost on the audience.
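To illustrate the panel's point, the attribution requirement behind the right to be forgotten can be sketched with a toy data store. This is a hypothetical example, not any panelist's system: every record must be tagged with its owner at collection time, because a record with no owner key cannot be erased on request.

```python
from dataclasses import dataclass, field

@dataclass
class DataStore:
    """Toy store showing why deletion requests require every record
    to be attributable to a specific user."""
    # Records are keyed by the user they belong to; truly anonymous
    # records would have no key and could not be erased on request.
    records_by_user: dict = field(default_factory=dict)

    def collect(self, user_id: str, record: dict) -> None:
        # Tag each piece of data with its owner at collection time.
        self.records_by_user.setdefault(user_id, []).append(record)

    def forget(self, user_id: str) -> int:
        # A "right to be forgotten" request removes the user's
        # entire data history; returns how many records were erased.
        removed = self.records_by_user.pop(user_id, [])
        return len(removed)

store = DataStore()
store.collect("alice", {"page": "/pricing", "ts": "2019-07-01"})
store.collect("alice", {"page": "/signup", "ts": "2019-07-02"})
store.collect("bob", {"page": "/home", "ts": "2019-07-01"})
print(store.forget("alice"))  # → 2
```

The trade-off the panelist described falls out of the design: the very index that makes erasure possible is itself a comprehensive map of whose data is whose.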
Although the arguments for privacy standards were mixed, the panel agreed that more emphasis should be placed on data privacy, with a focus on creating new value streams.
User-centric Privacy Design
The next question for the panel was how to design privacy-focused products with the user in mind. Much to the chagrin of privacy professionals, users persistently prioritize a seamless experience over security. Any additional friction added to the user journey risks increasing dropoff rates for service providers.
The panel was unanimous in their conviction that privacy needs to be seamlessly integrated into the user journey. Security steps need to be unobtrusive and their benefits obvious. Moreover, the onus should not be on the user to evaluate service providers’ technical processes. Customers are not going to read a terms-and-conditions page that is thousands of words long, nor should they be asked to.
The panel urged technologists to consider accessibility. Personal data privacy shouldn’t be restricted to affluent internet users who can run Tor and configure VPN connections. For most people, efficient access to the internet is a requirement of everyday life, and they shouldn’t have to sacrifice access for privacy.
The Cost-benefit Analysis of Data Debt
The panel transitioned into a conversation about reducing the costs of data debt: the implied cost of remediating improper data management processes for GDPR and CCPA compliance. Companies that lack a single view of their customers, whose datasets have inconsistent formats, no single identity, and uneven quality and structure, face tremendous resource strain and remediation costs.
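The remediation the panel described usually starts with building that single customer view. The sketch below is a hypothetical illustration, with made-up record shapes and field names: two datasets hold the same customers under inconsistent key formats, and normalizing the key lets them be merged into one profile per person.

```python
# Hypothetical datasets holding the same customers under inconsistent
# formats and separate identifiers -- the "data debt" the panel described.
crm_records = [
    {"Email": "Jane.Doe@Example.com", "name": "Jane Doe"},
    {"Email": "bob@example.com", "name": "Bob"},
]
billing_records = [
    {"email_address": " jane.doe@example.com ", "plan": "pro"},
]

def normalize_email(raw: str) -> str:
    # Remediation step one: agree on a single canonical key format.
    return raw.strip().lower()

def build_single_view(*datasets_with_key):
    """Merge datasets keyed by differently named email fields into
    one customer profile per person."""
    profiles = {}
    for records, key_field in datasets_with_key:
        for rec in records:
            key = normalize_email(rec[key_field])
            profile = profiles.setdefault(key, {})
            # Fold every non-key attribute into the unified profile.
            profile.update({k: v for k, v in rec.items() if k != key_field})
    return profiles

profiles = build_single_view(
    (crm_records, "Email"),
    (billing_records, "email_address"),
)
print(profiles["jane.doe@example.com"])  # → {'name': 'Jane Doe', 'plan': 'pro'}
```

A single keyed view like this is also what makes compliance workflows such as subject access and deletion requests tractable, which is why the panel framed the upfront cost as an investment.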
Panelists agreed that fixing data debt is worth the upfront cost. A good example they pointed to was Microsoft’s announcement that it would treat CCPA compliance as a de facto standard for all of its U.S. operations. Having made the strategic investment to update its systems and processes for CCPA, maintaining high-quality data processing standards positions Microsoft well for the data privacy regulations to come.
Identifying Dark Patterns
The evening was rounded out with a conversation about dark patterns. A dark pattern is when a website tricks a user into signing up for a product or feature without knowing they are doing so, or when a user doesn’t realize the full extent of their consent. CCPA 2.0, due on the California 2020 ballot, specifically calls out that consent collected through dark pattern methods will be considered invalid.
There is no denying that platforms continue this practice, but the panel suggested it is thankfully falling out of favor. Consumers are wising up, and the sleight-of-hand tricks companies typically use are not as effective as they once were.
The panel did debate how California lawmakers plan to define a dark pattern. There is no clear and concise definition yet, but companies trying to avoid regulatory scrutiny should focus on collecting explicit user consent.
The KNOW Identity Roadshow Continues
Many thanks to the speakers and attendees who participated in the Seattle KNOW Identity Forum! And a special thanks to our partner Uniken for their continued work in the digital identity and security space. At the front lines of innovation, they continue to push the conversation forward and provide the industry with best-in-class solutions. We’re looking forward to continuing conversations like these at our next Forum in London. The KNOW roadshow culminates in the annual KNOW Identity Conference in April 2020. We hope to see you there!