Research Briefing

To Develop Acceptable Data Use, Build Company Norms

Our research has found that leading companies are introducing management practices that help them build norms of acceptable data use.
Abstract

Data use is exploding within organizations—yet most companies still have in place data governance that guides behaviors regarding data at rest, not data in use. MIT CISR research has found that leading companies are introducing management practices that help them build norms of acceptable data use. These norms guide leaders in developing rules and procedures that reinforce appropriate employee and partner behaviors, which over time shape an acceptable data use culture. This briefing describes IAG’s best practice acceptable data use journey, and introduces five principles to consider when putting in place acceptable data use practices.

The latest big data challenge is managing the big use of data. Companies are establishing evidence-based cultures whereby employees pervasively use data to make decisions; and they are automating business processes across the enterprise with data-fueled predictive algorithms, machine learning, and cognitive bots. When it comes to data about people—customers, employees, end users, and the public—is companies’ management of data use keeping up? 

Even if it were as straightforward as “keeping up,” it would not be easy. Laws and regulations that protect personal data are insufficient because they were formulated to protect static data, not data in use. Moreover, most companies have not yet established cultures of acceptable data use, and they are not ready to do so. Companies need a grasp of what is possible in order to define what is acceptable; and to achieve consensus and buy-in on appropriate action, they must first consider the perspectives of myriad stakeholders inside and outside the firm. 

Leading companies are building norms of acceptable data use to guide employees and partners in using people-centered data and analytics in ways that are informed by values of the organization and ecosystem actors. 

Since 2015, MIT CISR has investigated how companies, especially large ones, are able to use and innovate with data confidently in the face of concerns about acceptability.[foot]From 2015–2017, MIT CISR researchers conducted forty-five executive interviews; produced two company case studies; and hosted two online discussions, one each with the 2015 and 2017 MIT CISR Data Research Advisory Boards, composed of executives in data leadership roles and representing 40 and 85 organizations respectively.[/foot] We have found that leading companies are relying on management practices that help them build norms. Norms of acceptable data use guide employees and partners in using people-centered data and analytics in ways that are not just compliant with existing laws and regulations but are informed by organizational values and those of ecosystem actors. Company norms help leaders develop rules and procedures that reinforce appropriate employee and partner behaviors, which over time shape an acceptable data use culture. 

IAG is a company whose best practice journey to acceptable data use illustrates responsiveness to stakeholder interests and mindful governance. 

DEVELOPING ACCEPTABLE DATA USE AT IAG 

Insurance Australia Group Limited (IAG) is an Australia-based insurance company with operations in Australia and New Zealand and a growing presence in Asia. The $12 billion company strives to make its customers’ lives safer by supporting its core business with innovative insurance products and complementary offerings. Industry disruption is transforming insurance markets; the auto insurance market, for example, is being reshaped by the development of driverless cars and the rise of car sharing. This transformation is stimulating an expansion of innovation at IAG. 

Investing in Data-Driven Innovation 

Beginning in 2015, IAG invested in enhancements to its enterprise data and analytics capabilities. The company established a Hadoop-based advanced analytics platform onto which it consolidated customer data from thirty operational systems. It also acquired forty-person analytics company Ambiata for its data science talent and analytics leadership. And in December 2016, IAG created a division—Customer Labs—under Chief Customer Officer Julie Batch that merged data, analytics, marketing, customer experience, digital, design, venturing, and product innovation. 

As data resources and structures were established, the IAG Leadership Team encouraged an organizational culture in which employees valued data and leveraged it in great digital processes and offerings. CEO Peter Harmer led the culture change and communicated regularly both inside and outside the company what it meant for IAG to be a data-driven company. Harmer and his team recognized that for IAG to continuously build and maintain trust with its customers, there would need to be effective frameworks for and transparency into the process of collecting, using, and managing data. 

Defining Guidelines on Data Use 

IAG began defining guidelines by creating a framework for how it would collect, manage, use, and disclose customer information. The framework proposed three lenses: Legal—what does the law allow? Ethical—how is it appropriate for IAG to use data, in accordance with the company’s corporate purpose and values? And Social—what do IAG customers expect or demand? According to Julie Batch, “The law is actually quite broad in Australia—and could be open to interpretation. Our business is founded on trust, so it’s important that we consider the needs of our customers when making data decisions.” And per Michelle Pinheiro, Director of Intellectual Property and Data Governance, “Our mission is to make our customers’ world a safer place, which extends to keeping their data safe as well.” 

IAG began to build norms of acceptable data use by creating a framework that defined what the law allows, what the company's values dictate, and what its customers expect. 

Four groups have been involved in establishing, evolving, and executing the guidelines: The Compliance team evaluates data and analytics efforts to ensure they meet legal requirements and company standards; members of this team are assigned to projects to head off compliance issues before they arise. The Futures team evaluates projects regarding customer interests and ethical concerns; through workshops, focus groups, and surveys, its members actively engage with customers and represent them on projects. The IAG IP and Data Committee—with senior representatives from across all of the company’s business divisions—makes foundational decisions, such as how the company may combine data sets and how to achieve clarity regarding customer consent. Finally, the IAG leadership team addresses emergent issues and handles concerns that have been escalated. As each group has encountered new issues and resolved them, the framework has been adapted accordingly. 

IAG expected that the framework would change as technology progressed and customer expectations shifted. This has indeed been the case—and leaders have discovered that while the legal and ethical lenses have been influential, the customer (social) lens has been dominant in shaping the framework. 

Protecting Personal Data 

To reinforce IAG’s acceptable data use framework, leaders centralized the management of customer data. The IAG Data Governance Team has coordinated all access to the Hadoop data lake, ensuring that all customer information is governed before the company releases it. The team created a privacy rating process to classify all of the data in the lake according to its personally identifiable information (PII) content and sensitivity. Michelle Pinheiro notes, “The information that we have about vehicles (like their make and model) is not as sensitive as our customers’ names [and their] date of birth and gender. Making that distinction is fundamental to governing the data effectively.” 
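The briefing does not describe IAG’s actual rating scheme; as a minimal sketch under stated assumptions, a privacy rating of this kind might map data attributes to sensitivity tiers and rate a data set by its most sensitive attribute. All tier labels and attribute names below are illustrative, not IAG’s.

```python
from enum import Enum


class Sensitivity(Enum):
    """Hypothetical privacy tiers, ordered from least to most sensitive."""
    PUBLIC = 1         # e.g., vehicle make and model
    INTERNAL = 2       # operational data with no direct identifiers
    PII = 3            # directly identifies a person (name, date of birth)
    SENSITIVE_PII = 4  # identifiers whose misuse carries higher risk


# Illustrative attribute-to-tier catalog; a real catalog would be far larger
# and maintained by the data governance team, not hard-coded.
ATTRIBUTE_RATINGS = {
    "vehicle_make": Sensitivity.PUBLIC,
    "vehicle_model": Sensitivity.PUBLIC,
    "policy_start_date": Sensitivity.INTERNAL,
    "customer_name": Sensitivity.PII,
    "date_of_birth": Sensitivity.PII,
    "health_disclosure": Sensitivity.SENSITIVE_PII,
}


def rate_dataset(attributes: list[str]) -> Sensitivity:
    """A data set is rated as sensitive as its most sensitive attribute."""
    ratings = [
        # Unknown attributes default to the strictest tier until reviewed.
        ATTRIBUTE_RATINGS.get(a, Sensitivity.SENSITIVE_PII)
        for a in attributes
    ]
    return max(ratings, key=lambda r: r.value)


print(rate_dataset(["vehicle_make", "vehicle_model"]))  # Sensitivity.PUBLIC
print(rate_dataset(["vehicle_make", "customer_name"]))  # Sensitivity.PII
```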

To request data extracts, employees must articulate what data is needed, how it will be used, whether it will be disclosed outside the company, and what the desired benefit is. The Data Governance Team reviews each request and recommends whether and how to move forward. If the data request challenges any of IAG’s ethical data principles, the team will suggest an alternate way to achieve the business objective more safely, or put the project on a data risk register to be reviewed by the Data Governance Council. The team wants to be seen as an enabling rather than disabling function. According to Pinheiro, “The success of our governance processes for data usage hinges upon our role in enabling our colleagues to achieve their goals for using data, not in preventing them from moving forward.” 
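A request of this shape lends itself to a simple triage step before human review. The following is a hypothetical sketch only: the field names, the PII list, and the decision outcomes are assumptions that mirror the four questions and the escalation path described above, not IAG’s actual process.

```python
from dataclasses import dataclass

# Hypothetical set of attributes the governance team treats as PII.
PII_ATTRIBUTES = {"customer_name", "date_of_birth", "gender", "home_address"}


@dataclass
class ExtractRequest:
    # The four questions the briefing says requesters must answer.
    data_requested: list[str]   # what data is needed
    intended_use: str           # how it will be used
    disclosed_externally: bool  # will it be disclosed outside the company?
    expected_benefit: str       # the desired benefit


def review(request: ExtractRequest) -> str:
    """Hypothetical first-pass triage before the Data Governance Team's review."""
    contains_pii = any(a in PII_ATTRIBUTES for a in request.data_requested)
    if contains_pii and request.disclosed_externally:
        # Challenges the ethical data principles: suggest a safer alternative
        # or place the project on the data risk register for Council review.
        return "escalate_to_data_risk_register"
    if contains_pii:
        return "approve_with_deidentification_controls"
    return "approve_with_standard_controls"


request = ExtractRequest(
    data_requested=["customer_name", "vehicle_model"],
    intended_use="features for a renewal-propensity model",
    disclosed_externally=True,
    expected_benefit="improve renewal rates",
)
print(review(request))  # escalate_to_data_risk_register
```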

Privacy and Security leaders also rely on good data management practices to guide employees on acceptable data use. A team of ten dedicated data analysts monitors data quality metrics and remediates the underlying data when metrics drop below tolerable thresholds, thus ensuring accurate data for employees’ analyses. Another group oversees the tagging of data with contextual metadata that might indicate how and when the data was collected, its lifespan, or limitations to its use. 
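To make the monitoring and tagging concrete, here is a minimal sketch of threshold-based quality checks and contextual metadata of the kind described. The metric names, thresholds, and tag values are hypothetical assumptions for illustration.

```python
# Illustrative quality metrics with tolerable floors (not IAG's actual measures).
QUALITY_THRESHOLDS = {
    "completeness": 0.98,  # share of records with no missing mandatory fields
    "validity": 0.99,      # share of values passing format/range checks
    "timeliness": 0.95,    # share of records refreshed within the agreed window
}


def breaches(observed: dict[str, float]) -> list[str]:
    """Return the metrics that have dropped below their tolerable threshold."""
    return [
        name for name, floor in QUALITY_THRESHOLDS.items()
        if observed.get(name, 0.0) < floor
    ]


# Contextual metadata tags indicating how and when the data was collected,
# its lifespan, and limits on its use (values are hypothetical).
dataset_tags = {
    "collected_via": "online quote form",
    "collected_on": "2017-03-01",
    "retention_period_days": 2555,
    "usage_limits": ["no external disclosure", "consented marketing only"],
}

print(breaches({"completeness": 0.97, "validity": 0.995, "timeliness": 0.96}))
# ['completeness'] -> flagged for remediation by the data quality analysts
```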

Protecting Person-Centered Analytics 

Acceptable analytics use is as important to IAG as acceptable data use, but it has represented more of a grey area. Inferences about a customer such as smoking status and gender—and the creation, storage, management, and sharing of such inferences—are not governmentally regulated. IAG has found it impossible to anticipate all potential analytical use cases, applications, and effects. Chief Analytics Officer Rami Mukhtar advises pragmatism when managing analytics: “You need to have the frameworks and the capabilities to ensure that if issues arise, you can deal with them appropriately. But ultimately, [acceptable use of analytics] is about pushing the boundaries of where analytics can take you with very clear principles in mind.” One principle, for example, is not using machine learning techniques to infer customer identity when you don’t have consent to do so. 

Regarding person-centered analytics, IAG has chosen to be conservative: It does not share analytics results outside of the company. The Privacy and Security team classifies analytics similarly to data according to the IAG PII sensitivity scale. And the team reviews how analytics are stored, how IAG’s insights are combined with third parties’, and how insight usage is tested with customers. 

Governing Data and Analytics Use by Partners 

IAG leaders consider customers’ data to be the company’s responsibility even after it has been shared with business partners. As such, the company maintains tight control over the data it provides to external parties. IAG reviews partner privacy agreements to learn where and how data will be stored or shared; establishes contractual controls regarding where data can flow and when it needs to be destroyed or removed; and audits its partners’ actions. The team has become adept at identifying non-specific terminology embedded in some partners’ privacy policies, such as terms that allow a partner to use data for unspecified reasons or for an unlimited period of time. 

Growing an Acceptable Data Use Culture at IAG 

IAG leaders have built acceptable data use capabilities by establishing a framework shaped by a multitude of perspectives. The company is executing this framework through management of its data assets, analytical activities, and partnerships. As a result, IAG’s culture has changed: employees have become more conscious of the importance of protecting personal data, the risks and benefits of leveraging person-centered analytics, and the value of questioning the stated goals and potential outcomes of projects. Chief Customer Officer Batch remarked, “At IAG, we recognise that data is our lifeblood; insurance is, after all, just a form of predictive analytics. Keeping our customers’ trust and being there when they need us is core to our purpose. We see access to our customers’ data as a privilege that’s critical to our effective operation, and we are continually ensuring we never take that for granted.” 

BUILDING NORMS 

Management practices that facilitate the development of norms of acceptable data use are supported by five basic principles: 

  • Challenge current thinking and envision future alternatives to surface value conflicts inside the company. Example practices: devil’s advocacy, ethics review boards 
  • Incorporate direct and indirect stakeholder perspectives to surface value conflicts and perceptual risks outside the company. Example practices: involving stakeholders in pilots, appointing a stakeholder advocate on project teams 
  • Monitor and evaluate the way data is being converted into insight and action to understand employee data needs and behaviors. Example practices: usage request and approval processes, digital rights management technology 
  • Establish shared understanding, consensus, and buy-in regarding conflicts and resolutions to identify desired behaviors. Example practices: oversight boards, privacy impact assessment processes, employee training programs 
  • Accept responsibility for the data use of your ecosystem partners to surface risks that may result from organizational relationships. Example practices: partner data audits, partner data privacy reviews 

This is not a time to just write rules; more rules won’t grow a shared understanding of acceptable data use in your company or prepare the company for emergent issues. It is the time to put practices in place to build norms—to guide your employees to appropriate, confident action amidst transformation. 

Figure 1: Building an Acceptable Data Use Culture

© 2017 MIT Sloan Center for Information Systems Research, Wixom and Markus. MIT CISR Research Briefings are published monthly to update the center's patrons and sponsors on current research projects. 

About the Authors

MIT CISR Researcher

M. Lynne Markus, The John W. Poduska, Sr. Professor of Information and Process Management, Bentley University

MIT CENTER FOR INFORMATION SYSTEMS RESEARCH (CISR)

Founded in 1974 and grounded in MIT's tradition of combining academic knowledge and practical purpose, MIT CISR helps executives meet the challenge of leading increasingly digital and data-driven organizations. We work directly with digital leaders, executives, and boards to develop our insights. Our consortium forms a global community that comprises more than seventy-five organizations.


Find Us
Center for Information Systems Research
Massachusetts Institute of Technology
Sloan School of Management
245 First Street, E94-15th Floor
Cambridge, MA 02142
617-253-2348