
Efficient Data Management Operations at NNPC

The last two and a half centuries have brought about stunning changes in industrial
productivity. With each passing generation, new and more advanced equipment and
methodologies have arrived on the scene to push production capabilities to new heights. And as
each new methodology or technological advancement has matured, the quest for greater
operational efficiency has honed and advanced its capabilities, setting the stage for the next
leap. While it would be easy to lump these achievements together over this period, the
distinctions that set them apart are important in their own right. Because of this
distinctiveness, we now recognize four industrial revolutions, each lifting efficiency to a new
level and again setting the stage for the next.

And as the new century bloomed, yet another host of disruptive technologies appeared on the
horizon. The use of software and advanced computer technology has led to the merging of
machine and computer. Accompanied by the advent of AI, machine learning, and deep
analytics, these disruptive technologies have set the stage for explosive growth in efficiency
through the creation of the “smart” factory and the Industrial Internet of Things (IIoT).

In implementing all these robust technologies, data is the fundamental asset, since the
analytics run over both current and historical data. Many geoscientific techniques have evolved
to increase recovery from subsurface reservoirs. NNPC conducts extensive analyses in various
phases, such as geological, geophysical, geochemical and drilling, in the exploration and
production of oil and gas, both to improve yield and to remain competitive. In the competitive
and challenging O&G industry, rigorous data management and analytical tools are essential,
and with their deployment, storing and accessing data has become much easier than with the
earlier printed formats.

Data plays a vital role in analysis at every point, informing valuable decisions in every
aspect of hydrocarbon exploration and production. However, the data available in NNPC is
usually found in complex formats, and most of it is contained in hard physical files collected
over a long period of time (possibly a few decades). This makes it cumbersome to search for
the correct files among a large volume of data, especially when the data is not in any specific
order. That is where Data Quality Management and Analytics can play an important role: they
help NNPC manage the collection, analysis, loading and quality of data, and ensure that the
uploaded data meets corporate standards. Data Analytics is then applied to the qualified data,
generating reports and charts for better decision-making in the upstream industry.

Data Quality Management is the process of providing solutions, in each of the phases above,
for the collection, processing and storing of data. Processing or analyzing the data involves
applying various business rules to qualify and quantify it against the organization’s
requirements. Data management tools make capturing, storing and accessing valuable data
easy. Data Analytics aggregates data from the various sources and improves its visualization
for industry needs.

Data Quality Methodology


To improve poor-quality data, we must implement a process for measuring data defects and
systematically eliminating them. This is where a predefined data management process plays a
crucial role in establishing where the data stands on a quality scale. The steps below illustrate
this using data from an oil well as an example.

Identify the data that needs to be improved


During this phase the team works to define the data types to be improved. Examples of such
data types are logs, well headers, well paths, etc. After assessing the data types to be improved
in consultation with the business stakeholders, the data team can dive deeper and focus on the
specific fields to improve. Examples of specific fields in well header data are the surface
location of a well, the well UWI, the operator, etc. The next step is to work out quality
requirements for those fields, i.e. to decide whether an 80% accuracy level is acceptable for a
field or whether it needs to be at least 99% accurate. Finally, a target accuracy percentage
should be agreed for each data type, as captured in the sketch below.
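
As a minimal sketch of what this phase might produce (the field names and thresholds are
illustrative assumptions, not NNPC's actual standards), the agreed requirements can be
recorded in a simple Python structure:

    # Output of the identification phase: which well-header fields to
    # improve and the accuracy target agreed with the stakeholders.
    # Field names and thresholds are assumptions for this example.
    QUALITY_REQUIREMENTS = {
        "well_header": {
            "surface_location": {"required": True, "target_accuracy": 0.99},
            "uwi":              {"required": True, "target_accuracy": 0.99},
            "operator":         {"required": True, "target_accuracy": 0.80},
        },
    }
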
Measuring the quality of the well data

During this phase of the project, the team translates the measurements defined in step one
into specific quality rules and then loads the set of rules into the software tool designed to
measure data quality. The tool measures the quality metrics of the data before and after the
cleanup process, so the team can gauge how much the data has improved.
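
In practice, such rules are simply executable checks. A minimal sketch (the rule logic and the
16-character UWI length are assumptions for illustration; real UWI formats vary by
jurisdiction):

    # Hypothetical quality rules for well-header records: each rule takes
    # a record (a dict) and returns True when the record passes.
    def has_surface_location(record):
        location = record.get("surface_location")
        return location is not None and location != ""

    def has_well_formed_uwi(record):
        # Assume a UWI is well formed if it is a 16-character string;
        # this is a placeholder, not an actual UWI specification.
        uwi = record.get("uwi")
        return isinstance(uwi, str) and len(uwi) == 16

    QUALITY_RULES = {
        "surface_location_present": has_surface_location,
        "uwi_well_formed": has_well_formed_uwi,
    }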

Analyze the existing well data


During this phase, the rules are run against the dataset and, based on the rules defined, the
erroneous data is identified. At this point, a root cause analysis is conducted to understand
why so many wells failed a certain rule. If there is an underlying systematic problem, changes
are recommended to the work process to avoid future problems with the data.
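
One way this step could look in code, as a hedged sketch: run every rule over the dataset and
tally failures per rule, so the most frequently broken rules surface first for root cause
analysis (the rule and the records are illustrative):

    def surface_location_present(record):
        # Illustrative rule: a record passes if it has a surface location.
        return bool(record.get("surface_location"))

    RULES = {"surface_location_present": surface_location_present}

    def analyze(records, rules):
        # Tally which wells fail which rule.
        failures = {name: [] for name in rules}
        for record in records:
            for name, rule in rules.items():
                if not rule(record):
                    failures[name].append(record.get("uwi", "<unknown>"))
        return failures

    wells = [
        {"uwi": "W-001", "surface_location": "6.45N 3.39E"},
        {"uwi": "W-002", "surface_location": ""},   # will fail the rule
    ]
    for name, failed in analyze(wells, RULES).items():
        print(name, "failed by", failed)   # -> failed by ['W-002']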

Group problem wells


Once a set of problem wells is identified, they are grouped by similar problems. If a group of
wells is failing a certain rule, the team creates a correction rule that is always applied to
the data. For example, if the surface location is missing for a group of wells, a rule is
created that pulls the surface location from the database and loads it into the drilling
database automatically at the end of each day. Grouping wells also helps in dividing up the
cleanup work in the next step.
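
The end-of-day correction described above might look like this minimal sketch (the database
handles are plain dictionaries standing in for real systems; all names are assumptions):

    def backfill_surface_locations(problem_uwis, master_db, drilling_db):
        # Correction rule for one problem group: wells with a missing
        # surface location. Intended to run automatically at day's end.
        for uwi in problem_uwis:
            location = master_db.get(uwi)   # assumed authoritative source
            if location:
                drilling_db[uwi]["surface_location"] = location

    master = {"W-002": "6.45N 3.39E"}
    drilling = {"W-002": {"surface_location": ""}}
    backfill_surface_locations(["W-002"], master, drilling)
    print(drilling["W-002"])   # location now filled from the master copy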

Improve the quality of the well data


This is the most important step. Once the list of wells with data issues has been compiled, the
team systematically works out how to eliminate the defects, using software tools to compare the
data with other data sources and performing corrections either with the tool or manually. The
team then returns to analyzing the existing well data and checks whether it now meets the
quality goals defined in the first step.
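
A hedged sketch of this correct-and-remeasure loop (the reference source, the field and the
99% target are assumptions carried over from the earlier examples):

    def improve_and_remeasure(records, reference, field, target):
        # Correct each record from a trusted reference source, then
        # re-measure the pass rate against the target from phase one.
        for record in records:
            ref_value = reference.get(record["uwi"], {}).get(field)
            if ref_value and record.get(field) != ref_value:
                record[field] = ref_value   # correction from the reference
        passed = sum(1 for record in records if record.get(field))
        rate = passed / len(records)
        return rate >= target, rate

    records = [{"uwi": "W-001", "surface_location": ""}]
    reference = {"W-001": {"surface_location": "5.32N 6.47E"}}
    print(improve_and_remeasure(records, reference, "surface_location", 0.99))
    # -> (True, 1.0) once the reference value has been applied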

Control future well data problems


Now that the data has been cleaned and deemed fit for use, how will NNPC ensure that it is
properly maintained and kept error-free, and that any new data added is also of good quality?
Software tools can be set to automatically reassess the data and send data quality reports
whenever it falls short of established standards. It is therefore very important for a data
quality governance strategy to be applied to all future data migrating to the corporate
database.
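
As a sketch of what such automatic reassessment might look like (the scheduling mechanism is
omitted and the threshold is an assumption; a real deployment would hook this into the
corporate tooling):

    def quality_report(records, rule, threshold):
        # Re-run a rule over the data and raise an alert when the pass
        # rate falls below the governance threshold.
        passed = sum(1 for record in records if rule(record))
        rate = passed / len(records) if records else 1.0
        if rate < threshold:
            print(f"ALERT: pass rate {rate:.1%} is below {threshold:.0%}")
        return rate

    wells = [{"surface_location": "6.45N 3.39E"}, {"surface_location": ""}]
    quality_report(wells, lambda r: bool(r.get("surface_location")), 0.99)
    # -> ALERT: pass rate 50.0% is below 99%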

Conclusion
Effective data management facilitates desktop access to numerous up-to-date databases,
including those related to surface and subsurface land, wells, pressure, temperature,
production, pipelines, core, reserves, seismic and logs. Combining data quality with analytics,
raising data quality standards and applying analytical techniques to the data ensure that
business users get high-quality data, and allow them to better visualize and analyze it for
better decision-making.
