12 Steps to Building Digital Trust
By Guest Author | Posted 2016-08-12
Weak ethical data practices can damage consumer trust in a brand, so companies must embed strong data ethics practices throughout decision-making processes.
By Steven Tiell
In today’s digital economy, “digital trust”—a widely accepted belief that a brand is reliable, capable, safe, transparent and truthful in its digital practices—is paramount. But digital trust is difficult to build and startlingly easy to lose.
The digital economy is built on data, but analyzing and acting on insights from data can introduce entirely new classes of risk. While digital risk used to be limited to cyber-security threats, it now includes internal threats from lackluster ethical data practices. These practices include obscuring unethical or even illegal processes from inspection, amplifying biases that exacerbate issues of social and economic justice, and using data without consent, for purposes its original disclosers would never have approved.
Left unchecked, these risks can permanently damage consumer trust in a brand. However, when organizations embed strong data ethics practices throughout their decision-making processes, risks can be identified and contained.
With input from Jacob Metcalf, a founding partner of ethics consulting and research firm Ethical Resolve, Accenture recommends 12 principles for organizations to consider as they build their own code of data ethics:
· Respect the people behind the data. When insights derived from data could affect the human condition, the potential harm to individuals and communities should be the paramount consideration. Big data can produce compelling insights into populations, but those same insights can be used to unfairly limit an individual’s possibilities.
· Account for the downstream uses of data sets. Data should be used in ways consistent with the intentions and understanding of the disclosing party. Many regulations govern data sets based on the data’s status—e.g., public, private or proprietary. But what is done with data sets is more consequential to subjects and users than the type of data or the context in which it’s collected. Correlative use of repurposed data in research and industry represents the greatest promise—and the greatest risk—of data analytics.
· The consequences of using data and analytical tools today are shaped by how they’ve been used in the past. There’s no such thing as raw data: All data sets and accompanying analytic tools carry a history of human decision making. As far as possible, that history should be auditable—and include mechanisms for tracking the context of collection, methods of consent, chains of responsibility, and assessments of data quality and accuracy.
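The auditable history this principle calls for can be sketched as a simple provenance record attached to a data set. The structure below is a minimal illustration, not a standard schema; the field names (collection_context, consent_method, and so on) are assumptions chosen to mirror the items listed above.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ProvenanceEntry:
    """One step in a data set's history: who did what, when, and why."""
    actor: str      # person or system responsible (chain of responsibility)
    action: str     # e.g. "collected", "cleaned", "repurposed"
    timestamp: str  # UTC ISO-8601 time the action was logged
    notes: str = ""

@dataclass
class DataSetProvenance:
    """Auditable history for a data set.

    Field names are illustrative assumptions, not a standard schema.
    """
    collection_context: str   # where and why the data was gathered
    consent_method: str       # how disclosers gave consent
    quality_assessment: str   # latest data-quality/accuracy review
    history: list = field(default_factory=list)

    def record(self, actor: str, action: str, notes: str = "") -> None:
        """Append an entry so every human decision stays traceable."""
        ts = datetime.now(timezone.utc).isoformat()
        self.history.append(ProvenanceEntry(actor, action, ts, notes))

# Usage: log each decision about the data set as it happens.
prov = DataSetProvenance(
    collection_context="customer survey, 2016",
    consent_method="opt-in checkbox, survey terms v2",
    quality_assessment="spot-checked 5% sample",
)
prov.record("analyst-01", "cleaned", "dropped rows with missing ages")
prov.record("analyst-02", "repurposed", "reused for churn model; consent re-reviewed")
```

Even a lightweight record like this makes the history auditable: anyone inspecting the data set can see the context of collection, the method of consent, and the chain of decisions that shaped it.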
· Match privacy and security safeguards with privacy and security expectations. Data subjects hold a range of expectations—often context-dependent—about the privacy and security of their data. Designers and data professionals should give due consideration to those expectations and align safeguards with them as much as possible.
· Understand that simply following the law is the lowest bar. Because laws have largely failed to keep up with the rapid pace of digital innovation and change, existing regulations are often miscalibrated to current risks. In this context, mere compliance means complacency. To excel in data ethics, leaders must define their own compliance frameworks to outperform legislated requirements.