Five Organizational Capabilities Needed Before Jumping into Big Data Analytics

Big data. Everyone is talking about it, everyone wants it, but what exactly is “it”? The simple answer: big data is lots and lots of little data collected and stored in all sorts of ways and places. For oil and gas companies, it’s the detailed daily data from SCADA systems as well as from the systems that support land, drilling, well management, production, reserves, and administration.

This data is stored in various files and databases throughout the company, and some of it has probably been shaken, stirred and poured into a data warehouse or data lake to feed an expensive reporting system. Organizationally, most business units or regions manage their own applications and data for use in the data warehouse/lake. However, those departmental measures and variables are defined for localized reporting and are not optimized for global reporting.

This siloed approach to data management hinders a company’s ability to use its data globally for better decision-making, greater operational efficiency, improved data understanding and higher data quality.

What’s more, the “as is” situation has a direct bearing on big data’s potential impact on our industry. Why is that? Because you can’t talk about big data without also talking about big data analytics – which, in simple terms, is capturing all that little data and analyzing it to uncover hidden patterns, unknown correlations and other nuggets of business intelligence. And that’s why big data has been the big buzzword in IT for the past several years.

Hype aside, big data is a real thing. And big data analysis tools are already in use today in many industries. But what about in oil and gas? Are oil and gas companies using big data analytics? And are most companies even ready?

With a rare exception or two, the honest answer is no. The reason is that the practice of data management in oil and gas companies has not kept pace with technology advances like big data. The organizational capabilities needed for big data analytics are not going to magically appear with the purchase of expensive analytic tools. These skill sets must be grown and matured.

Before oil and gas companies can do big data analytics, they must first learn to manage and use the data they have now. Here are the five data management practices that must be ingrained in the organization before any company can succeed with big data analytics.

  1. Business Glossary – A business glossary standardizes business terms and measures, allowing the business to communicate performance and metrics globally. A business glossary differs from a data dictionary in that it is owned by the business, not IT. The business glossary enhances data governance by giving the organization a shared business vocabulary and an organized list of accepted terms and measures.
  2. Business Rules – Where the business glossary is the business vocabulary, business rules are the accepted business practices. These rules define the criteria and conditions for making business decisions. Reporting entities, business unit roll-ups and data relationships are outlined, communicated and reflected in data reporting. Business rules must align operational decision-making with the strategic objectives of the company.
  3. Master Data Management – Every company has data redundancy, whether from the use of multiple systems or from growth through mergers and acquisitions. Providing one point of reference for critical information eliminates costly redundancies and data errors. A system must be in place to implement the company’s business glossary and business rules through the standardization of dates (such as spud date, first production date and completion date) and measures (such as the lateral length of a well, working interest and currency conversions). The business data must reflect the defined business glossary and business rules.
  4. Data Cleansing – Data is inherently flawed. There are data entry errors, variances in how departments capture or define data, duplicate data, incomplete data, etc. Data cleansing is the process of detecting, correcting and standardizing this common data. Quality data requires that a set of quality criteria be applied as business data is aggregated, filtered, merged, appended, deduplicated and transformed. A defined data cleansing process for improving data capture and quality is required to ensure confidence at the corporate level that data throughout the organization is accurate, up to date and consistent. If the business glossary and business rules are the “what”, then data cleansing is the “how” they are implemented (a brief sketch of this kind of standardization and de-duplication follows this list).
  5. Data Testing – Testing is a process, not an activity. Defining how data will be tested ensures consistency among data sources. Other than gut instinct or matching data back to the source system, how does the business know when data is correct? The business must define a testing process and determine what the data will be tested against, what time periods are tested, what variances are allowed, etc. Consistently vetting the data builds greater confidence in the data. Successful data testing is the confirmation that the business glossary, business rules, master data management and data cleansing were correctly identified, interpreted and implemented (a simple variance check is sketched below as well).
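
To make items 3 and 4 more concrete, here is a minimal sketch of what standardizing and de-duplicating well header data from two departmental systems can look like. It uses Python with pandas; the column names, well identifiers, date formats and unit conversions are hypothetical, and a production master data management implementation would of course be far more involved.

```python
# Minimal sketch (not production MDM): apply a shared glossary and simple
# cleansing rules to well header data pulled from two departmental systems.
# All column names, identifiers, dates and units below are hypothetical.
import pandas as pd

# Two departmental extracts that name and format the same facts differently
land = pd.DataFrame({
    "WELL_ID": ["42-123-00001", "42-123-00002"],
    "SpudDate": ["03/15/2021", "07/01/2021"],
    "LatLen_ft": [7500, 9800],
})
drilling = pd.DataFrame({
    "api_number": ["4212300001", "4212300003"],
    "spud": ["2021-03-15", "2021-09-20"],
    "lateral_length_m": [2286.0, 3048.0],
})

# The "business glossary" expressed as code: one accepted name, format and
# unit per term (a padded identifier, spud_date, lateral_length_ft)
def to_standard(df, id_col, spud_col, lateral_col, lateral_unit):
    out = pd.DataFrame()
    out["api14"] = (df[id_col].astype(str)
                    .str.replace("-", "", regex=False)
                    .str.zfill(14))
    out["spud_date"] = pd.to_datetime(df[spud_col])   # one date format
    length = df[lateral_col].astype(float)
    out["lateral_length_ft"] = length * 3.28084 if lateral_unit == "m" else length
    return out

master = pd.concat([
    to_standard(land, "WELL_ID", "SpudDate", "LatLen_ft", "ft"),
    to_standard(drilling, "api_number", "spud", "lateral_length_m", "m"),
], ignore_index=True)

# Cleansing step: collapse duplicate records for the same well into a
# single point of reference
master = master.drop_duplicates(subset="api14", keep="first")
print(master)
```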

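And to make item 5 concrete, here is an equally minimal sketch of a single automated data test: monthly oil volumes in the reporting layer reconciled against the source system within an allowed variance. The figures and the 0.5% tolerance are illustrative assumptions, not a recommendation.

```python
# Minimal sketch of one automated data test: reconcile monthly oil volumes
# in the warehouse against the source system within an allowed variance.
# The volumes and the 0.5% tolerance below are illustrative assumptions.
source_monthly_bbl = {"2021-01": 152_400, "2021-02": 138_975, "2021-03": 149_810}
warehouse_monthly_bbl = {"2021-01": 152_400, "2021-02": 139_200, "2021-03": 141_000}

ALLOWED_VARIANCE = 0.005  # 0.5% of the source value

for month, source_value in source_monthly_bbl.items():
    warehouse_value = warehouse_monthly_bbl.get(month)
    if warehouse_value is None:
        print(f"{month}: FAIL - missing from warehouse")
        continue
    variance = abs(warehouse_value - source_value) / source_value
    status = "PASS" if variance <= ALLOWED_VARIANCE else "FAIL"
    print(f"{month}: {status} (variance {variance:.2%})")
```
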
Simply put, a gap in good data management cannot be filled by technology alone. Good data does not just happen, and big data analytics in and of itself is not going to change the data you have today. By first adopting sound data management practices, we lay the foundation for big data analytics and digital transformation. Once we learn to manage data with confidence and to treat it as an essential business asset, the transition to big data analytics will be successful and worth the investment in new technology.

About the Author

Charity Queret is a senior consultant at Stonebridge Consulting. Charity has over 20 years of experience designing and developing end-to-end business intelligence and data warehousing/lake solutions. Her data management expertise includes business intelligence services, such as Cognos and Crystal development, requirements gathering, data verification, data mapping and testing. She also provides documentation of existing systems, user manuals and training, and BI roadmaps for future development.