Enterprise Data & Business Analytics: A Holistic Approach to Insights-Driven Innovation

Elizabeth Mixson | 10/30/2020

Every minute of every day, organizations of all types accumulate almost unfathomable amounts of data. For example, GPS systems collect data on where we are going, by which route and how long it takes. Major ecommerce sites keep track of what products we're browsing, clicking on and purchasing. Social media sites track what people are viewing, sharing and commenting on.

Behind the scenes companies are also collecting huge amounts of data ranging from day-to-day financial transactions to changes in their workforce to industrial performance data. The totality of all of the digital information any given organization collects is known as enterprise data.

Centralized and shared by many users throughout the organization, enterprise data tells the story of your business: past, present and future. However, data in and of itself is not valuable. To harness the power of enterprise data, it must be transformed into insights.

Business Analytics (BA) is the application of statistical methods and technologies to raw, historical data to derive actionable “insights” from it. This “business intelligence” can then be used to guide decision making, enhance strategic planning, develop evidence-based solutions to complex problems and even predict future outcomes.

BA typically falls into one of three categories (a brief illustrative sketch follows the list):

  • Descriptive analytics – describes what happened
  • Predictive analytics – provides insight into what could happen based on historical outcomes
  • Prescriptive analytics – combines descriptive and predictive analytics with artificial intelligence (AI) and machine learning (ML) technology to convey what will happen, when it will happen and why it will happen
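As a rough, hypothetical illustration of how these categories build on one another, the Python sketch below (pandas and numpy assumed, sales figures invented) computes a descriptive summary, fits a simple linear trend as a stand-in for a predictive model, and applies a plain business rule where a real prescriptive system would layer in AI/ML.

```python
import numpy as np
import pandas as pd

# Hypothetical monthly sales figures (units sold) -- illustrative data only
sales = pd.Series([120, 135, 150, 160, 172, 185],
                  index=pd.period_range("2020-01", periods=6, freq="M"))

# Descriptive analytics: what happened?
print("Average monthly sales:", sales.mean())
print("Month-over-month growth:\n", sales.pct_change().dropna())

# Predictive analytics: what could happen next, based on the historical trend?
months = np.arange(len(sales))
slope, intercept = np.polyfit(months, sales.values, deg=1)  # simple linear trend
forecast = slope * len(sales) + intercept
print("Forecast for next month:", round(forecast, 1))

# Prescriptive analytics (greatly simplified): what should we do about it?
# Real prescriptive systems layer AI/ML and optimization on top of the forecast.
reorder_threshold = 180
action = "increase inventory" if forecast > reorder_threshold else "hold inventory steady"
print("Recommended action:", action)
```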


While data analytics refers to the science of identifying meaningful patterns in data, enterprise data analytics (EDA) is an umbrella term that encompasses the strategies, processes, and tools surrounding the discipline. In other words, it refers both to the overarching strategy and the IT infrastructure involved in managing the end-to-end data analysis process, from data acquisition to visualization.

What is Data Mining?

Data Mining refers to the process of identifying patterns, anomalies, relationships and trends within large sets of data and using these insights to predict future outcomes. By combining statistical and artificial intelligence tools (such as neural networks and machine learning) with database management, businesses are able to analyze very large data sets.
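As a minimal sketch of the kind of pattern discovery data mining involves (not any specific vendor's method), the snippet below clusters synthetic customer records with scikit-learn's KMeans; the spend and visit figures are invented for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Synthetic customer records: [annual spend, visits per month] -- invented for illustration
customers = np.vstack([
    rng.normal(loc=[500, 2], scale=[80, 0.5], size=(100, 2)),    # occasional shoppers
    rng.normal(loc=[2500, 8], scale=[300, 1.0], size=(100, 2)),  # frequent shoppers
    rng.normal(loc=[9000, 3], scale=[900, 0.8], size=(20, 2)),   # high-value accounts
])

# Unsupervised clustering surfaces previously unknown groupings in the data
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(customers)

for cluster in range(3):
    segment = customers[labels == cluster]
    print(f"Segment {cluster}: {len(segment)} customers, "
          f"avg spend ${segment[:, 0].mean():,.0f}, "
          f"avg visits {segment[:, 1].mean():.1f}/month")
```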

For example, Air France-KLM used data mining techniques to integrate information from trip searches, bookings and flight operations with web, social media, call center and airport lounge interactions. Using these insights, the airline was able to create a holistic, 360-degree view of customer behavior, which it used to deliver personalized travel experiences.

Though data analysis and data mining are interrelated, they are not the same thing. Generally speaking, data mining seeks to identify previously unknown relationships in data, whereas data analysis seeks to deliver more focused insights that enhance decision making or help solve specific business challenges.

*Infographic sourced from "Data Mining vs Data Analysis," https://www.educba.com/data-mining-vs-data-analysis/ 

 

What is Data Monetization?

Data monetization is the process of using data to increase revenue or drive quantifiable economic profit. Though the term “data monetization” can refer to the literal sale of company data to third parties (known as direct data monetization), it also pertains to the internal use of data-driven insights to optimize processes, develop new products or drive innovation (indirect data monetization).

One example of direct data monetization is the sale of location data from a ride share app to local retailers who in turn use this information to develop targeted, geo-specific marketing campaigns.

An example of indirect data monetization is the use of past purchase and browsing data by e-retailers (e.g. Amazon) to provide customers with personalized product recommendations. Another would be a credit card company using internal fraud data to build an AI-powered tool that automates the fraud detection and response process, thereby increasing operational efficiency, curtailing fraud-related financial losses and boosting consumer confidence.
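The credit card scenario above can be sketched, very loosely, with an off-the-shelf anomaly detector. The snippet below is not how any particular issuer works; it simply trains scikit-learn's IsolationForest on synthetic transaction amounts and times, then flags incoming transactions automatically.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Synthetic transactions: [amount in dollars, hour of day] -- invented for illustration
normal = np.column_stack([rng.gamma(shape=2.0, scale=40.0, size=1000),
                          rng.normal(loc=14, scale=4, size=1000)])
fraud = np.array([[4200.0, 3.2], [3800.0, 2.7], [5100.0, 4.1]])  # large, late-night charges
transactions = np.vstack([normal, fraud])

# Train an anomaly detector on the transaction history
model = IsolationForest(contamination=0.005, random_state=0).fit(transactions)

# Score incoming transactions automatically; -1 means "looks anomalous"
incoming = np.array([[35.0, 13.0], [4700.0, 3.5]])
for txn, flag in zip(incoming, model.predict(incoming)):
    status = "HOLD for review" if flag == -1 else "approve"
    print(f"${txn[0]:,.2f} at hour {txn[1]:.0f}: {status}")
```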

Though, as of 2020, only about 17% of companies have truly monetized data, this discipline is certainly poised to grow. However, according to the 2019 Business Application Research Center (BARC) Data Monetization Survey, the number one inhibitor of data monetization is poor data quality.

 

Video sourced from "How do companies monetize their data?", https://bi-survey.com/data-monetization

 

For many organizations, establishing a robust enterprise data management and analytics approach is the first step in understanding and capitalizing on the value of their own internal data.


Types of Enterprise Data

Transactional data – Operational data generated from day-to-day business transactions such as purchases, orders, payroll, invoices, etc. Think the number of products sold to customers, the number of employees, money owed, etc. Transactional data is typically created by Enterprise Resource Planning (ERP), Employee Management Systems (EMS) and Customer Relationship Management (CRM) systems.

Master data – As defined by Gartner, master data "is the consistent and uniform set of identifiers and extended attributes that describes the core entities of the enterprise including customers, prospects, citizens, suppliers, sites, hierarchies and chart of accounts.” In other words, the key business information that supports business transactions. Generally, master data does not change and does not need to be created with every transaction. However, it is often low quality and disparate, as it is generated, stored and accessed across multiple systems throughout the enterprise.

Strategic data – Complex data related to business growth and competitiveness that can be used to make business decisions: for example, the change in purchase volumes over time, stock market trends, customer retention, time to hire and so on. Though this data can be pulled from multiple systems, it's typically stored in Online Analytical Processing (OLAP) repositories, such as data warehouses and data lakes.

Data warehouses and data lakes represent two different approaches to storing data. Data lakes are, essentially, giant pools of unstructured and structured data from various company data sources. Data warehouses are repositories for structured, filtered historical data that fits a relational database schema. Though this is changing, in the past data lakes were really only used by engineers and data scientists, while data warehouses were the more user-friendly option for business leaders.

However, in a business environment where everyone needs to think like a data scientist, organizations are developing new ways to give everyone access to the advanced, strategic enterprise data analytics that data lakes enable, such as deep learning and real-time analytics.
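One way to see the lake-versus-warehouse distinction described above is the toy sketch below, which uses only Python's standard library: raw JSON event files stand in for a data lake (structure applied when the data is read), while a SQLite table with a fixed schema stands in for a data warehouse (structure applied when the data is loaded). The file names and fields are invented.

```python
import json, sqlite3, tempfile
from pathlib import Path

# "Data lake": raw, loosely structured event files kept as-is (schema applied on read)
lake = Path(tempfile.mkdtemp())
(lake / "events_2020_10.json").write_text(json.dumps([
    {"user": "a1", "action": "view", "sku": "X9", "meta": {"device": "ios"}},
    {"user": "b7", "action": "purchase", "sku": "X9", "amount": 29.99},
]))

# "Data warehouse": structured, filtered data loaded into a fixed relational schema
warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE purchases (user TEXT, sku TEXT, amount REAL)")

for f in lake.glob("*.json"):
    for event in json.loads(f.read_text()):
        if event["action"] == "purchase":  # filter and conform on the way in
            warehouse.execute("INSERT INTO purchases VALUES (?, ?, ?)",
                              (event["user"], event["sku"], event["amount"]))

print(warehouse.execute("SELECT sku, SUM(amount) FROM purchases GROUP BY sku").fetchall())
```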

 

The Fundamentals of Enterprise Data Management

As mentioned before, organizations of all kinds generate massive amounts of data on a daily basis. However, only a fraction of it is actually put to use. According to one recent global study, companies utilize, on average, only about 45% of the data they produce. Why? Often because they lack the robust enterprise data management processes, tools and activities necessary to ensure data accuracy, quality, security, availability and good governance.

Enterprise data management (EDM) refers not only to the process of inventorying and governing enterprise data, but as the folks at Tableau put it, EDM also “means making sure your people have the accurate and timely data they need, and that they follow your standards for storing quality data in a standardized, secure, and governed place.”

By ensuring enterprise data is controlled, integrated and usable, EDM serves as the foundation for big data analytics and data monetization.

Key processes, practices, and activities that comprise EDM include:

Data Integration – At its simplest, the process of combining data from multiple sources into one central location to provide a single, unified view of the data. The goal of data integration is to ensure that highly accurate, timely and consistent data is easily accessible to employees no matter what function they work in or where they are located. Not only does it save time by eliminating the need to manually look up data in different systems; by combining different types of data, it also enables more complex, strategic data analysis. Without data integration, accurate analytics would be impossible to achieve.
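As a minimal sketch of that "single, unified view," the snippet below joins invented CRM and ERP extracts on a shared customer key with pandas, so one table can answer questions that previously required both systems.

```python
import pandas as pd

# Invented extract from a CRM system (customer profiles)
crm = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "name": ["Acme Corp", "Globex", "Initech"],
    "region": ["EMEA", "NA", "NA"],
})

# Invented extract from an ERP system (orders)
erp = pd.DataFrame({
    "customer_id": [101, 101, 103],
    "order_total": [1200.0, 450.0, 980.0],
})

# Integrate: one unified view instead of two disconnected systems
unified = crm.merge(erp, on="customer_id", how="left")
print(unified)

# The combined view supports analysis neither system could answer alone
print(unified.groupby("region")["order_total"].sum())
```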

Master data management (MDM) – The core process used to manage, centralize, organize, categorize, localize, synchronize and enrich master data according to a succinct set of business rules. Similar to data integration, the goal of MDM is to provide a single, trusted view of master data across the enterprise.
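A toy illustration of that "single, trusted view": below, duplicate customer records from two invented systems are matched on a normalized email key and collapsed into one golden record with pandas. Real MDM platforms use far richer matching and survivorship rules.

```python
import pandas as pd

# The same customer captured slightly differently in two systems -- invented records
records = pd.DataFrame({
    "source": ["CRM", "Billing", "CRM"],
    "name":   ["ACME Corp.", "Acme Corp", "Globex GmbH"],
    "email":  ["ops@acme.com", "OPS@ACME.COM", "info@globex.de"],
    "phone":  [None, "+1-555-0100", "+49-89-555-0101"],
})

# Match on a normalized identifier, then merge each group into a single golden record.
# agg("first") keeps the first non-null value per column, a very simple survivorship rule.
records["match_key"] = records["email"].str.lower()
golden = (records.groupby("match_key", as_index=False)
                 .agg({"name": "first", "email": "first", "phone": "first"}))
print(golden)
```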

Data governance (DG) – A set of rules, standards and policies that control or manage the availability, usability, integrity and security of enterprise data. DG frameworks are comprehensive, outlining the people, processes and technology required to ensure data is properly and uniformly handled across the entire enterprise. One of the major benefits of data governance is that it enables the democratization of data. In other words, by breaking down silos and driving collaborative data practices, it makes it possible for non-data scientists to access and utilize accurate, timely and relevant data.

Data quality management – Data quality management refers to the processes, methods and technologies put in place to ensure enterprise data meets specific business requirements; in other words, that data is high quality and usable. This can be done in a variety of ways. Data cleansing, for example, is the process of fixing or removing incorrect, corrupted, incorrectly formatted, duplicate or incomplete data within a dataset. Another DQM strategy, data enrichment, involves enhancing, refining and improving raw data, typically by combining it with data from outside sources.
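A minimal data-cleansing sketch with pandas on an invented, deliberately messy dataset: formats are standardized, incomplete rows and duplicates dropped, and unparseable dates coerced to NaT for review. Enrichment would then join in outside reference data in much the same way as the integration example above.

```python
import pandas as pd

# Invented, deliberately messy customer data
raw = pd.DataFrame({
    "email":   ["A@X.COM", "a@x.com", "b@y.com", None],
    "country": ["us", "US", "de ", "DE"],
    "signup":  ["2020-01-05", "2020-01-05", "2020-02-14", "2020-03-01"],
})

clean = (raw
         .assign(email=lambda d: d["email"].str.strip().str.lower(),    # standardize formatting
                 country=lambda d: d["country"].str.strip().str.upper())
         .dropna(subset=["email"])                                      # remove incomplete records
         .drop_duplicates(subset=["email"]))                            # remove duplicates

# Standardize dates; any value that cannot be parsed becomes NaT so it can be reviewed
clean["signup"] = pd.to_datetime(clean["signup"], errors="coerce")
print(clean)
```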

Data stewardship – A key component of data governance and master data management, data stewardship is the practice of assigning responsibility for making sure data usage, management and security policies are adhered to. Data stewards essentially serve as a liaison between the analytics team and the business.

Data warehouse – A type of data management system that serves as a repository for large amounts of historical data pulled from multiple systems. Not only do these platforms centralize and consolidate data, they also include analytical capabilities.

ETL/ELT – Standing for extract, transform and load, ETL is one approach to data integration. Often used to build data warehouses, the process extracts data from a source system, converts it into a format that can be analyzed and loads it into a data warehouse or other target system. ELT reverses the last two steps: raw data is loaded first and then transformed inside the target system.
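A minimal ETL sketch under the definition above, assuming Python's standard library and an invented CSV layout: each stage is a separate function, and the comment on the load step notes where ELT would differ.

```python
import csv, io, sqlite3

# Extract: pull raw records from a source system (a CSV export is assumed here)
RAW_EXPORT = io.StringIO("order_id,amount,currency\n1001,19.99,usd\n1002,250.00,EUR\n")

def extract(source):
    return list(csv.DictReader(source))

# Transform: convert records into the analysis-ready format the warehouse expects
def transform(rows, usd_rates={"USD": 1.0, "EUR": 1.08}):  # illustrative fixed rates
    return [(int(r["order_id"]),
             round(float(r["amount"]) * usd_rates[r["currency"].upper()], 2))
            for r in rows]

# Load: write the conformed records into the warehouse
# (an ELT pipeline would load the raw rows first and transform them inside the warehouse)
def load(records, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount_usd REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", records)

warehouse = sqlite3.connect(":memory:")
load(transform(extract(RAW_EXPORT)), warehouse)
print(warehouse.execute("SELECT * FROM orders").fetchall())
```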

______________________________

 

Where are you on your enterprise analytics journey?

We at ADA are thrilled to announce our latest survey on the current state and future outlook of enterprise data and analytics. We invite you to take two minutes to complete it.

 


Can't view the embedded survey above? Access it here: https://www.surveymonkey.com/r/JXF5H82

 
