
5. Field Data Management – From Wellhead to Workstation

by

R. A. (Bob) Keller
Vice President
Asset Management and Optimization

and

Robert Cottingham, P. Eng.
Senior Automation Engineer

Emerson Process Management
Kenonic Controls Division

Abstract

The hydrocarbon production industry has come a long way from the days when
measurements were manually recorded on “horse blankets” and transcribed from
department to department. The current state of the industry utilizes a mixture of
automated, semi-automated, and manual systems to measure, record, validate,
and disseminate production data throughout an organization. However, there is
still room for improvement. Advances in automation, information technology, and
telecommunications make it possible to enjoy seamless integration of production
data – from the wellhead to the workstation. This paper examines the current
state of production data integration within an operating company, and the
direction this integration will take over the next few years.

Introduction

It can be said that the hydrocarbon production industry has two primary
functions:

1. Produce oil and gas

2. Manage enormous amounts of data to account for and report on the
production of said oil and gas

The focus of this paper is directed at the latter.

A Little History

In the early days of this industry, simple mechanical meters and charts were
used to measure production. Operators would read these devices and record the
measurements on large paper production reports known as “horse blankets”.
These reports would be collected from the production areas and the numbers
would be transcribed, aggregated, and validated as they made their way through
the organization. Eventually, the data would make its way to the Production
Accounting, Engineering, Marketing, and Operations Departments.

As technological innovations permeated the industry, some of the steps in this
original process became automated, or at the very least were made more
efficient by augmenting the process with computer systems.
Each innovation that was introduced into the system shared common objectives:
reduce operating costs, eliminate data errors, and make the data available to an
ever-increasing theatre of stakeholders.

Two innovations in particular were intended to reduce or eliminate the
massive number of paper production reports: SCADA systems, introduced in the
early 1970s, and Field Data Capture systems, developed in the 1980s.

SCADA systems were developed to monitor and control wells, facilities, and
pipelines. Reporting was a function provided by the early systems, but the
reports were not of production accounting quality. SCADA systems now generate
accurate production data on a timely basis.

Field Data Capture systems were designed to replace manually generated
production reports. They were originally designed as an operator tool, but
have since expanded beyond that original design.

However, neither of these systems alone provides a "wellhead to workstation"
solution. Instead, they have become integral parts of a complete solution.

The Current Situation

The systems that have built up within many organizations are typically a
heterogeneous mixture of legacy systems. Some of these systems are custom-
built to solve an industry or company-specific problem. Others are mass-market
applications that have been adapted to the needs of our industry. And some of
the systems are legacy solutions that were acquired along with the purchase of
producing properties. This mixture of technologies within an organization is the
rule rather than the exception - and it is not expected to change.

How can an operating company deal with a diverse mixture of technologies, all
intended to make production data management more efficient? There are two
approaches:

1. Link every data source to every data consumer

2. Link all data sources and consumers to a central data warehouse

Many-to-Many Connections

The nature of this business is such that many processes and systems are
developed with a goal of lowest initial cost and quickest turnaround. Although this
makes good business sense in the short term, there are some problems with this
in the long term. In particular, the quickest/cheapest short term solution often
requires larger on-going maintenance or operational costs. The flow of
production data through a company is a classic example of this situation. A
typical production company may have a data flow diagram similar to the one
shown in Figure 1.
[Figure 1: Many-to-Many Data Flow Diagram. Four data producers (a field SCADA
system fed by electronic flow measurement devices, RTUs, and PLCs; a chart
reading service; field operators with Excel spreadsheets; and a plant DCS) are
each connected directly to four data consumers: Production Accounting,
Engineering, Marketing, and Operations Management.]

In this example, there are a total of 16 data paths that must be created and
maintained in order to connect the four data producers with the four data
consumers. Each data path represents some method of transferring data from
one system to another. Some example data paths that may exist include:
telephone notification, fax, database connection, web-based forms, or email.

Mathematically, as additional producers and consumers are added to this
system, the number of data paths increases according to the simple equation:

D = N × P

where:
D is the number of data paths
N is the number of data consumers
P is the number of data producers

This means that every time a new producer is added to the mix (i.e., whenever a
new producing property is bought or developed), a total of N data connections
must be built to satisfy all of the existing consumers. The more data paths
that exist, the higher the operating costs become; each data path also carries
the potential for transmission errors and delays.

Many-to-One Connections

An alternative to the many-to-many connection method is the many-to-one
system. In this system, a central repository is introduced to act as a unified
data collection and dissemination point. Although not usually the cheapest or
the quickest solution, a central data repository provides long term benefits
in the way of reduced operating costs, elimination of most errors, and timely
distribution of data to the stakeholders.

A production company utilizing a central data warehouse would have a data flow
diagram similar to the one shown in Figure 2.
[Figure 2: Many-to-One Data Flow Diagram. The same four data producers and
four data consumers as in Figure 1, with every connection routed through a
central data warehouse.]

Mathematically, as additional producers and consumers are added to this
system, the number of data paths increases according to the simple equation:

D = N + P

with D, N, and P defined as before.

This means that every time a new producer or consumer is added to the mix,
only one new connection must be implemented. A single data path for each
producer and consumer dramatically reduces the ongoing operations and
maintenance costs associated with managing production data. Even in this very
simple example, the total number of data paths is half that of the many-to-many
case, and the savings grow rapidly as the system expands.

Several commercially available data warehouse or "historian" systems can meet
the needs of the oil and gas industry, including:

• iHistorian by Intellution

• PI by OSI Software

• InfoPlus.21 by AspenTech

• Trend Link by Canary Labs

• Uniformance (PHD) by Honeywell

The selected data historian should have a number of collection, archive, analysis,
and management features that are particularly well-suited to this application.

Collection

The chosen historian should provide a flexible and robust collection system. In
particular the collection system should be able to interface with a variety of
legacy systems using industry-standard interfaces. As a minimum, the historian’s
collection system should be able to extract data from existing field systems using:

• OPC (OLE for Process Control). This is the interfacing standard most
commonly employed within the automation industry. Most SCADA systems
utilize OPC servers to collect data from a myriad of flow computers and other
automation equipment such as PLCs and RTUs.

• CSV (comma-separated values) files. A CSV file is typically created by
spreadsheet programs such as Microsoft Excel. The input to the spreadsheet
may be an automated report from a legacy SCADA system, or it may be
manually entered by field operators.

• A variety of communication systems including the corporate WAN, dial-up,
leased line, cellular, CDPD, microwave, or satellite.
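As a sketch of the CSV route described above, the snippet below parses a small report in the style a field spreadsheet might export. The column names, well identifiers, and units are hypothetical, not taken from any particular system:

```python
import csv
import io

# A hypothetical daily production report exported from a field spreadsheet.
report = io.StringIO(
    "well_id,date,gas_e3m3,oil_m3,water_m3\n"
    "100-01-04-022-05W4,2003-06-01,12.4,3.1,0.8\n"
    "100-02-04-022-05W4,2003-06-01,9.7,2.6,1.2\n"
)

records = []
for row in csv.DictReader(report):
    # Validate and convert each field before handing it to the historian.
    records.append({
        "well_id": row["well_id"],
        "date": row["date"],
        "gas_e3m3": float(row["gas_e3m3"]),
        "oil_m3": float(row["oil_m3"]),
        "water_m3": float(row["water_m3"]),
    })

total_gas = sum(r["gas_e3m3"] for r in records)
print(round(total_gas, 1))  # 22.1
```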

As well, the data collection sub-system should employ a “store and forward”
technique. Such a technique makes the system more robust by allowing
historical data to continue to be collected even though the link between the data
source and the data warehouse has been temporarily disabled. When this
happens, the collection system continues to gather and store production data.
When the link to the historian is reestablished, the collected data is transmitted.
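The store-and-forward behaviour described above can be sketched as a minimal collector with a local queue; the tag names and transport are hypothetical:

```python
from collections import deque


class StoreAndForwardCollector:
    """Minimal sketch of a store-and-forward field collector.

    Samples are queued locally; whenever the link to the historian is up
    they are flushed in order, so nothing is lost during an outage.
    """

    def __init__(self, send):
        self.send = send       # callable that transmits one sample
        self.buffer = deque()  # local store, oldest sample first

    def collect(self, sample, link_up: bool):
        self.buffer.append(sample)
        if link_up:
            self.flush()

    def flush(self):
        while self.buffer:
            self.send(self.buffer.popleft())


# Usage: the link drops for two samples, then recovers.
received = []
c = StoreAndForwardCollector(received.append)
c.collect({"tag": "FT-101", "value": 12.4}, link_up=True)
c.collect({"tag": "FT-101", "value": 12.6}, link_up=False)
c.collect({"tag": "FT-101", "value": 12.5}, link_up=False)
c.collect({"tag": "FT-101", "value": 12.7}, link_up=True)
print(len(received))  # 4: all samples arrive once the link is restored
```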

Archive

The chosen historian should provide a high performance and expandable data
archive system. The archive system should be able to:

• Handle an unlimited number of data points to allow for growth and future
needs

• Store the data for an indefinite period of time (data compression techniques
are important)

• Apply time stamps to all incoming data and be able to resolve data generated
from different time zones.

• Operate in a fault tolerant manner by using techniques such as online backup
and automatic reconnection.
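The time-zone requirement above can be illustrated with a minimal sketch using standard date handling; the tag names and site zones are hypothetical:

```python
from datetime import datetime, timezone, timedelta

# Hypothetical samples arriving from sites in different time zones.
mst = timezone(timedelta(hours=-7))  # e.g. a Mountain Standard Time site
cst = timezone(timedelta(hours=-6))  # e.g. a Central Standard Time site

samples = [
    ("FT-101", datetime(2003, 6, 1, 8, 0, tzinfo=mst), 12.4),
    ("FT-205", datetime(2003, 6, 1, 9, 0, tzinfo=cst), 9.7),
]

# Normalizing every time stamp to UTC on arrival lets the archive order
# and compare data generated in different zones unambiguously.
archived = [(tag, ts.astimezone(timezone.utc), v) for tag, ts, v in samples]
print(archived[0][1] == archived[1][1])  # True: both are 15:00 UTC
```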

Analysis

The chosen historian should provide a variety of data extraction and analysis
tools to allow any qualified stakeholder to view the archived data. Some data
analysis features to look for include:

• Support for interrogation by other database systems using ODBC (Open
Database Connectivity). This "hook" will allow various consumers of the data,
such as Production Accounting, to retrieve the archived data and insert it into
their system for further manipulation and reporting.

• Support ad-hoc, casual interrogation from users within the organization. The
production data within the data warehouse is valuable to users such as those
in the Engineering or Marketing departments if they can get the data they are
looking for quickly, without being a nuisance to the Operations department.
As such, the historian’s analysis tools should include an easy to use web-
based trending and extraction system. This system should allow the user to
select the data points of interest, the collection times, and the presentation
format.

• Support integration with Excel-based reports. This will allow users within the
organization to define data reports in a format and schedule that suit their
individual requirements. As well, a familiar tool like Excel will reduce the
amount of additional training that is required.
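The ODBC-style extraction described above can be sketched with a stand-in database. Here sqlite3 substitutes for a real ODBC connection to a historian, and the table and column names are hypothetical rather than from any particular product:

```python
import sqlite3

# Stand-in for an ODBC connection to the historian's archive.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE production (well_id TEXT, date TEXT, gas_e3m3 REAL)")
conn.executemany(
    "INSERT INTO production VALUES (?, ?, ?)",
    [
        ("W-01", "2003-06-01", 12.4),
        ("W-01", "2003-06-02", 11.9),
        ("W-02", "2003-06-01", 9.7),
    ],
)

# A consumer such as Production Accounting pulls a per-well aggregate
# for further manipulation and reporting in its own system.
rows = conn.execute(
    "SELECT well_id, ROUND(SUM(gas_e3m3), 1) FROM production "
    "GROUP BY well_id ORDER BY well_id"
).fetchall()
print(rows)
```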

Management

Finally, the selected data historian should be cost effective to maintain. That is, it
must be able to fit effortlessly into your existing IT Department’s infrastructure. To
ensure this, the historian should:

• Be platform independent and be able to work with virtually any hardware or
network infrastructure.

• Utilize thin-client, web-based maintenance tools. These tools should allow the
IT personnel to define new data collection sources with a minimal amount of
training.

• Support bulk definition and modification of large numbers of data points.

Conclusion

Wellhead-to-workstation production data connectivity is needed in today's oil
and gas industry to operate efficiently and meet legislative requirements. Two
methods of achieving that connectivity have been discussed: many-to-many and
many-to-one. This paper concluded that the many-to-one architecture, using a
data historian, provides a streamlined system with a number of tangible
benefits, including reduced system maintenance and operations costs, more
timely movement of data, and broader accessibility to the production data.

Using a central data warehouse for production data, only one data path needs to
be established for each data producer and consumer. The resulting architecture
is flexible and is able to expand to accommodate new users or new producing
properties with a minimal amount of IT intervention. As well, the data warehouse
architecture makes it easy for casual users of the data to create ad-hoc queries
and reports. This allows them and other affected personnel, such as those in
Operations, to work more productively with fewer interruptions and delays.

The end result of a central data historian implementation is production data from
the wellhead to the workstation.

Biographical Sketch

Bob Keller graduated from the Northern Alberta Institute of Technology with a
diploma in Gas Technology in 1966. He worked in the oil and gas industry for 36
years with Chevron Standard, the Energy Resources Conservation Board (now the
EUB), and BMP Energy prior to joining Kenonic Controls. Bob joined Kenonic
Controls as Manager, Measurement and Production Services in 1992. He has
held various positions within Kenonic including Vice President of Oil and Gas
Operations Services and Vice President, Integrated Petroleum Industry. Bob is a
member of the CGPSA, CGA, and AGA. Bob is presently Vice President of Asset
Management and Optimization with Emerson Process Management,
Kenonic Controls Division.

Robert Cottingham graduated from the University of Calgary with a Bachelor of
Science degree in Electrical Engineering in 1987. He has 14 years of industry
experience in design, implementation, and management of SCADA, PLC and
DCS systems. Rob is an automation engineering consultant with Emerson
Process Management, Kenonic Controls Division.
