What does ethical and transparent data use look like?

The Cambridge Analytica-Facebook scandal and allegations that YouTube is improperly collecting and using children’s data shone a light on how companies collect, manage and use data -- whether tech companies like Google, Facebook and YouTube or third parties that leverage these platforms to target users for marketing and advertising. As users become savvier about the information collected about them on these and other platforms, data analytics will increasingly become a bone of contention, with users asking more questions about how and why their data is collected and used.

For the C-suite, the real question is what ethical and transparent data use really looks like. Getting it right can spell the difference between a company that consumers view positively and one that is seen as part of the problem.

 

A changing mindset

Data plays a role in so many aspects of business, from marketing and advertising to employee relations, and a consistent approach to data is crucial across all of these. Kriti Sharma, VP of Bots and Artificial Intelligence at Sage, says recent events add power to a global push for the technology industry to address rising public concerns over data privacy and transparency. They reinforce the need to rethink how technology platforms that pull from data sources are designed, governed and used by people and organizations.

“To immediately address the new dynamic, the business world needs to install workforce ethics training throughout company ranks, develop corporate transparency frameworks and hire diverse teams to interact with, create and improve upon these technologies. In practice, organizations must establish an ethical framework that people creating technologies can work within at the outset of a product’s development cycle,” she says.

Dimitri Sirota, CEO of BigID, says ethical and transparent data use starts with a change in mindset in how companies view their relationship with customer and employee data. Historically, Sirota notes, many companies followed the path of collecting as much subject data as possible, storing it and then using it for any purpose deemed beneficial to the organization.

“Companies were essentially allowed to operate as owners of that data, instead of stewards. This needs to change. Personally identifiable information (PII) should always remain the ownership of the person it belongs to, and it’s up to the organization to ethically use and protect that data until the individual implements their ‘right to be forgotten’ or requests erasure,” he says. 

Key to this is creating new roles within organizations, such as a chief data officer, who is solely responsible for all aspects of an organization’s data. Those individuals understand the power of data, and the need to view it as the currency of business. Sirota says that from there, organizations can follow three steps to better manage and control their data operations:

1. Knowing what data they have;
2. Figuring out what data belongs to whom; and
3. Being able to document how data is being processed, shared or exposed (a minimal inventory sketch follows below).
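
Sirota’s three steps amount to a living data inventory. The snippet below is a minimal, hypothetical Python sketch of what such an inventory record might look like; the DataRecord fields and the in-memory registry are illustrative assumptions, not a description of BigID’s product.

```python
from dataclasses import dataclass, field

# Hypothetical inventory record covering Sirota's three steps: know what data you
# hold (step 1), whose data it is (step 2), and how it is processed or shared (step 3).
@dataclass
class DataRecord:
    dataset: str                    # where the data lives, e.g. "crm"
    subject_id: str                 # the individual the data belongs to
    categories: list[str]           # e.g. ["email", "purchase_history"]
    processing_purposes: list[str] = field(default_factory=list)
    shared_with: list[str] = field(default_factory=list)   # third parties with access

inventory: list[DataRecord] = []

def register(record: DataRecord) -> None:
    """Record a dataset so its use can be documented and audited."""
    inventory.append(record)

def records_for_subject(subject_id: str) -> list[DataRecord]:
    """Step 2 in practice: everything held about one individual."""
    return [r for r in inventory if r.subject_id == subject_id]

register(DataRecord("crm", "user-123", ["email"], ["marketing"], ["mail-provider"]))
print(records_for_subject("user-123"))
```

In a real organization the registry would be a governed catalogue rather than an in-memory list, but the same three questions drive its schema.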

For Felix Marx, CEO of Truata, the issue is that while data analytics is critical to every aspect of day-to-day operations, few organizations ask themselves why they are processing that data and whether it is in the best interest of the individuals they are serving.

Others are eager to continue embracing analytics but have not yet found a way of doing so within the scope of the GDPR, the new data protection law Europe introduced on 25 May 2018. “Companies need to not only comply with the GDPR, but to also think about what analytics are in the best interest of not only their company but their customers as well,” he says.

 

Clear and consistent communication

In the consumer domain, for example, much of the focus is on the collection of data through company apps, websites and platforms, where users are often unclear about what is being collected and how that data will be used.

Tim Michaels, Director: Enterprise Technology Strategy and Innovation at Grant Thornton LLP, explains that the first thing companies need to understand is their customers’ expectations and comfort zone for data usage and data sharing. He says it is very difficult to limit data collection and still serve customers proactively.

“Being upfront with constituents that their data will not be shared – and that you will implement it for their benefit – through clear and concise communication goes a long way to building and maintaining trust,” he says.

For Michaels, companies can move in the right direction by:

  1. Choosing partners wisely and holding them accountable;
  2. Offering customers anonymous digital channels through which to interact with the business;
  3. Making safeguarding data a top cultural priority; and
  4. Requiring customers and prospects to opt in rather than opt out (see the consent sketch below).
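
On the fourth point, the practical difference between opt-out and opt-in is simply where the default sits. The sketch below is a hypothetical Python illustration; the ConsentStore class and the purpose names are invented for the example, not taken from any of the companies quoted here.

```python
# Opt-in consent: absence of a record always means "no", never "yes".
# ConsentStore and the purpose names are hypothetical illustrations.
class ConsentStore:
    def __init__(self) -> None:
        self._consents: dict[tuple[str, str], bool] = {}

    def opt_in(self, customer_id: str, purpose: str) -> None:
        self._consents[(customer_id, purpose)] = True

    def opt_out(self, customer_id: str, purpose: str) -> None:
        self._consents[(customer_id, purpose)] = False

    def has_consent(self, customer_id: str, purpose: str) -> bool:
        return self._consents.get((customer_id, purpose), False)

consents = ConsentStore()
print(consents.has_consent("cust-42", "email_marketing"))  # False until the customer acts
consents.opt_in("cust-42", "email_marketing")
print(consents.has_consent("cust-42", "email_marketing"))  # True only after an explicit opt-in
```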

Rocket Lawyer's CEO and founder Charley Moore believes that the main way businesses can increase transparency in data collection methods is by simplifying the language laid out in the user terms and conditions. Companies protect themselves by burying the potential risks in pages and pages of legal jargon, incomprehensible to the average consumer.

“If companies are really developing their products with the user experience in mind, they’ll need to reconsider their responsibility in informing users in a way that makes sense for the audience. You wouldn’t give an audience of Chinese people warnings only in English, and that’s essentially how the average consumer would feel reading the standard end user agreement,” he says.

The value of data is undeniable, and it is unlikely that data harvesting and analysis will suddenly disappear from the corporate agenda as a result of the recent data scandals. Julia Shullman, VP: Chief Privacy Counsel at AppNexus, explains that advertising is what powers much of the open internet.

Open, transparent marketplaces help marketers reach consumers with relevant and engaging advertising, she says, which in turn helps independent publishers monetize their creativity and content and, ultimately, allows those same publishers to offer that content free of charge to billions of consumers. “Responsible use of consumer data in digital advertising helps power this virtuous cycle,” Shullman adds.

In Shullman’s eyes, platforms need to respect personal privacy, collect only the data that they truly need, protect it and properly manage it while it is in their control – including managing which third parties have access to it – and purge that data as soon as it is no longer useful.

“But if platforms like Facebook continue to violate their social contract, the virtuous cycle will become a vicious circle. Consumers will opt out of the ecosystem, either through ad blocking or demands for stringent regulation. Marketers will pull back budgets. And publishers will be forced to erect paywalls around their content. The internet will become smaller and less democratic. That’s not a future that any of us should want,” she warns.
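
Shullman’s prescription to collect only what is needed and purge it once it is no longer useful can be made concrete with a retention schedule tied to the purpose each record was collected for. The following is a rough, hypothetical Python sketch; the purposes, retention windows and record fields are assumptions for illustration only.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention schedule: each purpose has a window after which data is purged.
RETENTION = {
    "ad_measurement": timedelta(days=90),
    "fraud_prevention": timedelta(days=365),
}

def purge_expired(records: list[dict]) -> list[dict]:
    """Keep only records still within the retention window for their stated purpose."""
    now = datetime.now(timezone.utc)
    return [
        r for r in records
        if now - r["collected_at"] <= RETENTION.get(r["purpose"], timedelta(0))
    ]

records = [
    {"purpose": "ad_measurement", "collected_at": datetime.now(timezone.utc) - timedelta(days=200)},
    {"purpose": "fraud_prevention", "collected_at": datetime.now(timezone.utc) - timedelta(days=30)},
]
print(len(purge_expired(records)))  # 1: the stale ad-measurement record is dropped
```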

Securing that data is an important part of the overall puzzle and key to building trust with users. Moore says that one of the biggest mistakes companies make is overlooking the privacy and security measures that would prevent a breach in the first place. “Sure, a three-step verification system might take more time from the consumer, but they’ll always appreciate the extra sense of security. If a company can’t properly provide protections over user data, what makes them think users will feel like they have control over their own information?”

 

Defining data strategy

In the end, it boils down to a strategic approach to data collection and use, predicated on principles of ethics and transparency. The CIO role is crucial to that process.

For Ben Lorica, Chief Data Officer for O’Reilly Media, there are three things CIOs should consider when it comes to data collection, use and management:

  1. Training of data professionals should incorporate ethics: they need strategies for making sure that what they collect and put out is aligned with the ethics of their company. Industry groups are starting to assemble resources to do just that.
  2. For companies operating in the EU, GDPR and privacy by design are the starting points and will set an example for transparent data management.
  3. Tools for augmenting data professionals are essential, particularly as the amount of data and the number of models used in production explode. It is difficult for people to spot anomalies even when only a few models are involved, and it will be impossible when the number of machine learning models in production grows into the thousands (see the monitoring sketch after this list).
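
Lorica’s third point lends itself to automation. Below is a deliberately simple, hypothetical Python sketch of the kind of check a monitoring job might run: compare each production model’s live metric against its baseline and flag anything that has drifted. The model names, metrics and five-point tolerance are invented for the example.

```python
# Hypothetical monitoring check: with thousands of models in production, nobody can
# eyeball every dashboard, so a job flags models whose live accuracy has drifted.
baseline_accuracy = {"churn_model": 0.91, "fraud_model": 0.88, "recs_model": 0.74}
live_accuracy     = {"churn_model": 0.90, "fraud_model": 0.79, "recs_model": 0.73}

def flag_drifting_models(baseline: dict[str, float],
                         live: dict[str, float],
                         tolerance: float = 0.05) -> list[str]:
    """Return models whose live metric fell more than `tolerance` below baseline."""
    return [name for name, base in baseline.items()
            if live.get(name, 0.0) < base - tolerance]

print(flag_drifting_models(baseline_accuracy, live_accuracy))  # ['fraud_model']
```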

Frank Bien, CEO at Looker, says there are a number of sides to ethical and transparent data use. He believes it means clarity on where your data will reside, what it will be used for, who can access it and with whom it will be shared.

“Even as we’ve witnessed seemingly benign data being harvested and weaponized, we believe there is a clear opportunity for data to be harnessed for good,” Bien says. “Businesses of all sizes need to prioritize the long-term greater good over the short-term bottom line. In short, businesses need to be willing to leave money on the table if partners or potential partners don’t protect user data.” 

Bien believes that putting ethical decision-making at the heart of business strategies will dictate the right path.

“CIOs must work to ensure there is one single source of truth throughout their organizations – especially when it comes to customer data. CIOs need governance measures on their data so they can restrict the data that people see without erecting unnecessary roadblocks that make it too difficult to access the appropriate amount of data,” he says.

Bien adds that GDPR gives individuals ‘a right to have personal data erased and to prevent processing … where the personal data is no longer necessary’. With that in mind, CIOs must have a consistent and accurate view of where data is stored, who has access to it and whether or not they can take it and store it elsewhere. 

“If a company has ‘data sprawl’ this can make it an almost impossible challenge, whereby businesses are tackling data disconnected from the central source – relying on, for example, systems of servers, cloud storage, laptops and more. These disparate ‘swamps’ are impossible to search and even harder to manage. Tackling this issue must be high priority for CIOs – who must then use tools that ensure the data remains clean, searchable and deletable on an ongoing basis,” he warns.
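
One way to keep erasure workable despite that sprawl is to require every system holding personal data to register a delete handler, so a single ‘right to be forgotten’ request fans out to all of them. The Python sketch below is a hypothetical illustration of the pattern; the store names and handlers are placeholders, not real integrations.

```python
from typing import Callable

# Hypothetical erasure fan-out: each data store registers a delete handler so one
# request reaches every system that holds the subject's data.
erasure_handlers: dict[str, Callable[[str], None]] = {}

def register_store(name: str, delete_fn: Callable[[str], None]) -> None:
    erasure_handlers[name] = delete_fn

def erase_subject(subject_id: str) -> None:
    """Fan a 'right to be forgotten' request out to every registered store."""
    for store, delete_fn in erasure_handlers.items():
        delete_fn(subject_id)
        print(f"erased {subject_id} from {store}")

register_store("crm", lambda sid: None)        # stand-in for a real CRM deletion call
register_store("warehouse", lambda sid: None)  # stand-in for a warehouse deletion job
erase_subject("user-123")
```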

Moore adds that CIOs should consider how they are sharing data with third-party vendors or partners and the systems they are putting in place to keep the user terms and agreements from being violated. It is important that these organizations are able to manage user data and know exactly how it is being used by outside parties.

 

Look to the data experts

A robust data strategy is core to ethical and transparent data use and can help to avoid other pitfalls of data analytics. Vijay Raghavan, chief technology officer of LexisNexis Risk Solutions, emphasizes that modellers, statisticians and data scientists play a pivotal role in avoiding those pitfalls. “They may have different titles, but these are the data experts who create the analytics strategy and the analytics,” he says.

A common pitfall to watch out for is the issue of ‘confirmation bias’, which is the phenomenon by which a data expert may cherry-pick the data (or algorithms) to prove what he or she already believes to be true. According to Raghavan, data experts should build or choose algorithms and data dispassionately and not just make those choices that support their preconceived notions.
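
One practical discipline against confirmation bias is to fix the evaluation protocol (the metric, the held-out data, the seed) before comparing candidates, so nobody can quietly pick the slice that flatters a favoured model. The toy Python sketch below illustrates the idea; the data and candidate ‘models’ are invented stand-ins, not anything Raghavan prescribes.

```python
import random

# Toy illustration: freeze the held-out set and the metric before any candidate is
# compared, so results cannot be cherry-picked after the fact. Data and "models"
# are hypothetical stand-ins.
random.seed(0)
data = [(x, x > 50) for x in random.sample(range(100), 60)]  # (feature, label) pairs
holdout = data[:20]   # fixed up front; in a real workflow the rest builds the candidates

candidates = {
    "threshold_40": lambda x: x > 40,
    "threshold_50": lambda x: x > 50,
}

def accuracy(model, examples):
    return sum(model(x) == y for x, y in examples) / len(examples)

# Every candidate is judged on the same pre-committed held-out set and metric.
for name, model in candidates.items():
    print(name, round(accuracy(model, holdout), 2))
```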

When this is done the right way, data experts are better able to explain their conclusions and the inner workings of a model. Raghavan says that explainability is another pitfall to watch out for, in the sense that analytics need to be explainable to a regulator, consumer or customer.

“For this reason, data experts need to watch out for the gratuitous usage of powerful techniques – be it machine learning or any other algorithmic tool – which may make their work easier, but which detracts from the explainability and transparency of the analytics,” he says.

“Techniques such as machine learning can be enormously useful, but a good data expert avoids the pitfall of assuming that machine learning obviates any and all human oversight or common sense. It is the responsibility of the data expert to ensure that the analytics are empirically valid, and free of obvious or deliberate biases, regardless of what underlying techniques are used.”

 

Ethics training is crucial

Sharma believes that efforts to harvest personal data submitted to technology platforms reinvigorate the need for ethics training for people in all positions at companies that handle sensitive data. “The use of Facebook and third-party platforms raises the importance of building backend technologies distributing and analyzing human data, like AI, to be ethical and transparent. We also need the teams actually creating these technologies to be more diverse, as diverse as the community that will eventually use them,” she adds.

GDPR offers a good foundation for an ethical and transparent approach to data use, but there are potential pitfalls that might not be immediately apparent. For example, Satyen Sangani, CEO of data cataloguing firm Alation, points out that hidden deep in Article 22 is a clause which means that if you are using training data that includes personally identifiable information (PII) to train an algorithm, and a consumer requests the removal of their PII, that training data might need to be replaced.

“If you think about how most organizations have implemented their data science practices today, they have very loose governance practices around training data. This opens up a pretty deep challenge for any organization implementing machine learning,” Sangani says.
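
In practice, closing that gap means knowing which data subjects’ PII went into which training sets, so an erasure request can identify the datasets, and therefore the models, that may need to be rebuilt. The Python sketch below is a hypothetical illustration of such a lineage record; the dataset names and IDs are invented, and this is one possible reading of the requirement Sangani describes rather than settled guidance.

```python
# Hypothetical training-data lineage: which subjects' PII feeds which training set.
training_lineage: dict[str, set[str]] = {
    "churn_training_v3": {"user-123", "user-456"},
    "recs_training_v7": {"user-456", "user-789"},
}

def datasets_affected_by_erasure(subject_id: str) -> list[str]:
    """Training sets containing the subject's PII, i.e. candidates for replacement and retraining."""
    return [name for name, subjects in training_lineage.items() if subject_id in subjects]

print(datasets_affected_by_erasure("user-456"))  # ['churn_training_v3', 'recs_training_v7']
```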

A holistic view of data, with deep insight into how it is being used and why, is needed to ensure the balance between compliance, privacy and innovation. As Bien says: “Maintaining good relationships is all about communication and transparency. In the age of the GDPR, the companies that embrace this kind of vision – and communicate its importance to their customers – will build the strongest relationships. Those taking on its values of privacy, transparency, trust and security will reap the rewards of doing so. A culture and operational process that embraces transparency breeds trust and competitive advantage.”

Bianca Wright

Bianca Wright is a UK-based freelance business and technology writer, who has written for publications in the UK, the US, Australia and South Africa. She holds an MPhil in science and technology journalism and a DPhil in Media Studies.
