Field Data Management - From Wellhead To Workstation
by
R. A. (Bob) Keller
Vice President
Asset Management and Optimization
Abstract
The hydrocarbon production industry has come a long way from the days when
measurements were manually recorded on “horse blankets” and transcribed from
department to department. The current state of the industry utilizes a mixture of
automated, semi-automated, and manual systems to measure, record, validate,
and disseminate production data throughout an organization. However, there is
still room for improvement. Advances in automation, information technology, and
telecommunications make it possible to enjoy seamless integration of production
data – from the wellhead to the workstation. This paper examines the current
state of production data integration within an operating company, and the
direction this integration will take over the next few years.
Introduction
It can be said that the hydrocarbon production industry has two primary
functions:
A Little History
In the early days of this industry, simple mechanical meters and charts were
used to measure production. Operators would read these devices and record the
measurements on large paper production reports known as “horse blankets”.
These reports would be collected from the production areas and the numbers
would be transcribed, aggregated, and validated as they made their way through
the organization. Eventually, the data would make its way to the Production
Accounting, Engineering, Marketing, and Operations Departments.
There were two innovations that were intended to reduce and/or eliminate the
massive number of production reports. These were SCADA systems which were
introduced in the early 1970’s and Field Data Capture systems that were
developed in the 1980’s.
SCADA systems were developed to monitor and control wells, facilities and
pipelines. Reporting was a function provided by these early systems, but the reports
were not of production accounting quality. SCADA systems now generate
accurate production data on a timely basis.
The systems that have built up within many organizations are typically a
heterogeneous mixture of legacy systems. Some of these systems are custom-
built to solve an industry or company-specific problem. Others are mass-market
applications that have been adapted to the needs of our industry. And some of
the systems are legacy solutions that were acquired along with the purchase of
producing properties. This mixture of technologies within an organization is the
rule rather than the exception - and it is not expected to change.
How can an operating company deal with a diverse mixture of technologies, all
intended to make production data management more efficient? There are two
approaches:
The nature of this business is such that many processes and systems are
developed with a goal of lowest initial cost and quickest turnaround. Although this
makes good business sense in the short term, there are some problems with this
in the long term. In particular, the quickest/cheapest short term solution often
requires larger on-going maintenance or operational costs. The flow of
production data through a company is a classic example of this situation. A
typical production company may have a data flow diagram similar to the one
shown in Figure 1.
Figure 1: Many-to-many data flow. Each data producer (Field, SCADA System, DCS CPU, DCS I/O) is connected directly to each data consumer (Production Accounting, Engineering, Marketing, Operations Management).
In this example, there are a total of 16 data paths that must be created and
maintained in order to connect the four data producers with the four data
consumers. Each data path represents some method of transferring data from
one system to another. Some example data paths that may exist include:
telephone notification, fax, database connection, web-based forms, or email.
This means that every time a new producer is added to the mix (i.e., whenever a
new producing property is bought or developed), N new data connections must be
built, where N is the number of existing consumers. It should be obvious that the
more data paths that exist, the higher the operating costs become. As well, each
data path has the potential for errors in transmission and delays.
A production company utilizing a central data warehouse would have a data flow
diagram similar to the one shown in Figure 2.
Figure 2: Hub-and-spoke data flow. Each data producer (Field, SCADA System, DCS CPU, DCS I/O) and each data consumer (Production Accounting, Engineering, Marketing, Operations Management) connects once to a central Data Warehouse.
This means that every time a new producer or consumer is added to the mix,
only one new connection must be implemented. A single data path for each
producer and consumer dramatically reduces the ongoing operations and
maintenance cost associated with managing production data. Even in this very
simple example, the total number of data paths is half that of the many-to-many
arrangement. The savings grow rapidly as the system expands: a many-to-many
architecture needs one path per producer-consumer pair, while the warehouse
architecture needs only one path per system.
Several commercial data historians are available to fill this central role, including:
• iHistorian by Intellution
• PI by OSI Software
• InfoPlus.21 by AspenTech
The selected data historian should have a number of collection, archive, analysis,
and management features that are particularly well-suited to this application.
Collection
The chosen historian should provide a flexible and robust collection system. In
particular the collection system should be able to interface with a variety of
legacy systems using industry-standard interfaces. As a minimum, the historian’s
collection system should be able to extract data from existing field systems using:
• OPC (OLE for Process Control): This is the interfacing standard most
commonly employed within the automation industry. Most SCADA systems
utilize OPC servers to collect data from a myriad of flow computers and other
automation equipment such as PLCs and RTUs.
As well, the data collection sub-system should employ a “store and forward”
technique. Such a technique makes the system more robust by allowing
historical data to continue to be collected even though the link between the data
source and the data warehouse has been temporarily disabled. When this
happens, the collection system continues to gather and store production data.
When the link to the historian is reestablished, the collected data is transmitted.
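The store-and-forward behaviour described above can be sketched as follows. This is a simplified illustration, not the implementation used by any particular historian: real collectors persist the buffer to disk so data survives a power loss, and the class and tag names here (`StoreAndForwardCollector`, `WELL_01.FLOW`) are hypothetical.

```python
from collections import deque

class StoreAndForwardCollector:
    """Buffers samples locally while the historian link is down,
    then flushes the backlog in order once the link is restored."""

    def __init__(self, send):
        self.send = send       # callable that transmits one sample; raises on link failure
        self.buffer = deque()  # locally stored samples awaiting transmission

    def collect(self, timestamp, tag, value):
        self.buffer.append((timestamp, tag, value))
        self.flush()

    def flush(self):
        while self.buffer:
            try:
                self.send(self.buffer[0])
            except ConnectionError:
                return             # link still down; keep the data buffered
            self.buffer.popleft()  # transmitted successfully; discard local copy

# Simulate a temporary link outage:
sent = []
link_up = True

def send(sample):
    if not link_up:
        raise ConnectionError("historian link down")
    sent.append(sample)

collector = StoreAndForwardCollector(send)
collector.collect("08:00", "WELL_01.FLOW", 51.2)  # link up: transmitted immediately
link_up = False
collector.collect("08:01", "WELL_01.FLOW", 51.4)  # link down: buffered
collector.collect("08:02", "WELL_01.FLOW", 51.3)  # still buffered
link_up = True
collector.flush()                                 # link restored: backlog transmitted
print(len(sent))  # 3 samples arrive, in order, with none lost
```

The key property is that an outage delays data rather than destroying it, which is what makes the collection layer robust.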
Archive
The chosen historian should provide a high performance and expandable data
archive system. The archive system should be able to:
• Handle an unlimited number of data points to allow for growth and future
needs
• Store the data for an indefinite period of time (data compression techniques
are important)
• Apply time stamps to all incoming data and be able to resolve data generated
from different time zones.
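The compression requirement above can be illustrated with a simple deadband (exception-reporting) filter: a sample is archived only when it differs meaningfully from the last archived value. This is a minimal sketch of the general idea only; commercial historians use more sophisticated schemes, such as swinging-door compression, and the threshold here is an assumed value.

```python
def deadband_compress(samples, deadband):
    """Archive a (timestamp, value) sample only when the value moves
    more than `deadband` away from the last archived value."""
    archived = []
    for ts, value in samples:
        if not archived or abs(value - archived[-1][1]) > deadband:
            archived.append((ts, value))
    return archived

# A slowly drifting flow signal with one genuine step change:
raw = [(0, 100.0), (1, 100.1), (2, 100.05), (3, 102.0), (4, 102.1)]
print(deadband_compress(raw, deadband=0.5))
# keeps only (0, 100.0) and (3, 102.0)
```

Discarding the small fluctuations is what makes it practical to store data for an indefinite period while still preserving every significant process change.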
Analysis
The chosen historian should provide a variety of data extraction and analysis
tools to allow any qualified stakeholder to view the archived data. Some data
analysis features to look for include:
• Support ad-hoc, casual interrogation from users within the organization. The
production data within the data warehouse is valuable to users such as those
in the Engineering or Marketing departments if they can get the data they are
looking for quickly, without being a nuisance to the Operations department.
As such, the historian’s analysis tools should include an easy to use web-
based trending and extraction system. This system should allow the user to
select the data points of interest, the collection times, and the presentation
format.
• Support integration with Excel-based reports. This will allow users within the
organization to define data reports in a format and schedule that suit their
individual requirements. As well, a familiar tool like Excel will reduce the
amount of additional training that is required.
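The kind of ad-hoc query a web-based trending tool issues on a user's behalf can be sketched as below. The archive layout and the tag name `WELL_01.FLOW` are assumptions for illustration; a real historian exposes this through its own query API rather than an in-memory dictionary. Note the use of timezone-aware timestamps, which is what lets the archive resolve data generated in different time zones.

```python
from datetime import datetime, timezone

def extract_trend(archive, tag, start, end):
    """Return the archived (timestamp, value) samples for one tag
    within [start, end), sorted by time.
    `archive` is assumed to map tag -> list of (timestamp, value)."""
    return sorted((ts, v) for ts, v in archive.get(tag, [])
                  if start <= ts < end)

utc = timezone.utc
archive = {
    "WELL_01.FLOW": [
        (datetime(2024, 1, 1, h, tzinfo=utc), 50.0 + h) for h in range(24)
    ]
}

# A user selects one tag and a six-hour window for trending:
morning = extract_trend(archive,
                        "WELL_01.FLOW",
                        datetime(2024, 1, 1, 6, tzinfo=utc),
                        datetime(2024, 1, 1, 12, tzinfo=utc))
print(len(morning))                               # 6 hourly samples
print(sum(v for _, v in morning) / len(morning))  # 58.5, the average flow
```

Because the query runs against the warehouse, the Engineering or Marketing user gets the answer directly, with no call to the Operations department.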
Management
Finally, the selected data historian should be cost effective to maintain. That is, it
must be able to fit effortlessly into your existing IT Department’s infrastructure. To
ensure this, the historian should:
• Utilize thin-client, web-based maintenance tools. These tools should allow the
IT personnel to define new data collection sources with a minimal amount of
training.
Conclusion
With a central data warehouse for production data, only one data path needs to
be established for each data producer and consumer. The resulting architecture
is flexible and is able to expand to accommodate new users or new producing
properties with a minimal amount of IT intervention. As well, the data warehouse
architecture makes it easy for casual users of the data to create ad-hoc queries
and reports. This allows them and other affected personnel, such as those in
Operations, to work more productively with fewer interruptions and delays.
The end result of a central data historian implementation is production data from
the wellhead to the workstation.
Biographical Sketch
held various positions within Kenonic including Vice President of Oil and Gas
Operations Services and Vice President Integrated Petroleum Industry. Bob is a
member of CGPSA, CGA and AGA. Bob is presently Vice President of Asset
Management and Optimization with Emerson Performance Management,
Kenonic Controls Division.