
Implementation of API & Other Standards

and Trends in Tertiary Measurement Devices

Presented by: Alan L. McCartney, President, Omni Flow Computers, Inc.


Kenneth D. Elliott, Executive Vice President, Omni Flow Computers, Inc.

Introduction

As the energy industry continues to re-engineer itself through "downsizing", "rightsizing", and mergers, standardization of product selection and outsourcing of elements of engineering, operations and maintenance is much in evidence. This does not relieve the user of the responsibility of being knowledgeable of his own processes and the underlying technologies employed.

There is collateral consolidation in vendor groupings, too. This is resulting in product rationalization and the emergence of only a few expert companies in various custody measurement specialties with sufficient commitment to maintain product excellence and integrity. A lack of commitment can lead to a loss of quality control which, in turn, impacts on customer relationships. One can be easily persuaded that changes in our perception will continue to occur in what constitutes a typical tertiary device, e.g. flow computers. It has already been established that they are capable of multiple uses. Current and emerging technologies will have a profound effect on their ultimate role in the future of measurement, control, and data acquisition systems.

Part 1: Software & Hardware Architecture of Computational Devices

In this microprocessor age, software is a key ingredient in virtually every aspect of our environment. In custody transfer, software implementation and certification of API Standards in several generations of microprocessor-based devices has always been a subject of debate by a number of users. The expert knowledge required both by manufacturers and users, contrary to expectation, is less in evidence today than a decade ago, due to cyclical and structural changes in the energy industry. One cannot consider software development independently of hardware architecture design.

32-bit Processors & Memory Requirements

It is estimated that the leading manufacturer of panel-mount flow computers may have as much as 70% of 32-bit flow computer installations over the past four years. This would suggest that only manufacturers that have achieved critical mass in field-proven custody applications with 32-bit based units worldwide have the breadth of experience and resources to use these faster microprocessors effectively. For example, a flow computer which uses a 32-bit processor can not only perform calculation-intensive tasks in 200 ms but can also undertake concurrently a variety of related control and communication functions, with minimal processing impact, if properly programmed by the manufacturer. Only one manufacturer has successfully achieved that to date.

There also remains a significant difference between the requirements of realtime, continuously calculating 110/220 VAC / 24 VDC powered systems and interval processing for 6 VDC low-voltage systems, e.g. continuous sampling, say every 1 second, but with a 1-minute calculation cycle. These low-voltage flow computers, particularly solar-powered systems in gas measurement, are the norm because of the absence of continuous power (cf. chart recorders!). The processor, to conserve power, is maintained in a 'sleep' mode with minimal activity occurring until such time as a calculation routine needs to be activated. Heat dissipation of the processor is also a major concern for many applications and impacts on the reliability and longevity of the electronics.

There are distinct differences between third-party 32-bit PC chipset modules for industrial applications and embedded processor hardware designed entirely from the fundamentals by the manufacturer, with a tight coupling between hardware and software being achieved. The latter

approach is used in the majority of industrial applications, including DCS systems. The memory addressing capability of processors can differ significantly. Compare Intel's 64K segmented architecture addressing constraints to Motorola's 16 MB linear addressing capability. This can have a major impact on the programmer's ability to write efficient code.

Programming language selection has a direct impact on memory requirements for firmware code. Optimized code written in assembler language operates more efficiently than higher-level languages using compilers, whose memory requirements will increase as a result. The principal task of the flow computer can be lost in the "shuffle".

Microprocessors and Math Coprocessors

The primary function of a flow computer is still to undertake "number crunching". Consequently, a math co-processor chip can speed up math processing significantly. Because 'same chip family' parts are used, there is usually a tight integration between the CPU and the math co-processor chip, or it is already an integrated chip function, as is found in Intel 486 PC and later chip architectures. Major improvement in performance with little added programming complexity can be obtained.

The math co-processor supports single precision (32 bits), double precision (64 bits) and extended precision (80 bits) floating point formats. Floating point calculations can make use of the 80-bit extended precision format by keeping intermediate results in floating point registers, thus increasing the precision of the total calculation.

Dedicated floating-point math hardware chips will always outperform comparable software solutions. This does not mean that the hardware and software math methods will produce differing results; it simply means that more processor time will be available for other tasks.

In a realtime, multitasking system, time-dependent functions have their own processing allotment. Software tasks are divided into time intervals based on their priority. They are managed by an interrupt-driven scheduler which executes each task during the appropriate time interval. Execution times for each interval are maintained and recorded for system evaluation. Tasks may run every 10 ms, 50 ms, 100 ms, 200 ms or 500 ms. Lower priority tasks can be preempted by higher priority tasks. Very low priority tasks, such as formatting a report for printing, are handled in "background" and are given CPU time when other tasks are not running. Asynchronous events such as serial communications are interrupt driven and are given a priority based upon their individual interrupt levels.

Another way of increasing processing throughput is to operate with multiple processors with assigned tasks. Using a separate microprocessor as a communications controller or I/O controller, for example, requires the programmer to write two separate programs, which must be able to access the same database without data synchronization concerns. This can be a source of continuing problems when designing and testing the application code. Consider also the challenge of system-oriented vendors who must meet the burden of project custom specifications and require staff or contract programmers writing the necessary code on a one-off project basis, including testing and documentation. This goes a long way to explaining why there is increased standardization and configuration flexibility, a trend that has been led by this vendor.

Software Mapping

There is a lot more that goes into the development of a flow computer than the calculations. In fact, that is the easiest part of the development. Other auxiliary functions such as PID control, meter proving, valve and sampler control, archiving, batch processing and data communications, just to name a few, are the challenging parts. Most of these can be logically broken into tasks based on their function. These tasks are broken into modules for easier maintenance (Omni's Software Coding Standard limits modules to below 1000 source code lines).

How these sections of code are put in memory is important if they are to work seamlessly together. Shared data and entry points must be automatically resolved at build time to eliminate human error. Separate executable sections ease debugging and allow code to be upgraded a section at a time. Multiple programmers must be able to work on sections of a product's code without interfering with one another's memory space.
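The interval-driven, priority-based scheduling described earlier in this section can be illustrated with a short simulation. The task names and periods below are illustrative stand-ins, not any vendor's actual task table, and one loop pass stands in for a 10 ms timer tick:

```python
# Simplified simulation of an interval-driven task scheduler.
# Shorter period = higher priority; task names are hypothetical.
TASKS = {                      # period in milliseconds
    "pulse_input_scan": 10,
    "analog_sampling": 50,
    "flow_calculation": 100,
    "pid_control": 200,
    "report_formatting": 500,  # lowest priority, 'background' style
}

def simulate(duration_ms, tick_ms=10):
    """Count how often each task is dispatched over the simulated
    interval; ready tasks are dispatched shortest-period (highest-
    priority) first, approximating preemption of low-priority work."""
    runs = {name: 0 for name in TASKS}
    for now in range(tick_ms, duration_ms + 1, tick_ms):
        ready = [n for n, p in TASKS.items() if now % p == 0]
        for name in sorted(ready, key=TASKS.get):  # priority order
            runs[name] += 1
    return runs

runs = simulate(1000)   # one second: pulse scan runs 100x, report 2x
print(runs)
```

In real firmware the dispatch is driven by a hardware timer interrupt and a lower-priority task is preempted mid-execution; the sorted dispatch order here only approximates that priority relationship.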

Moving memory allocation and usage around because of poor planning can cause unnecessary delays in the programming cycle. Stack space must be allocated for worst-case conditions and be monitored for overflow. Interrupt routines must be able to locate buffers and variables regardless of where the CPU was when interrupted (this can be difficult in segmented architectures). Getting the most out of memory requires a well-considered design and careful coding. "Wasting" memory because there seems to be plenty available can have consequences down the road. Software engineers not only need to think about what goes into the product now but also to consider how enhancements will be added in the future.

The database holds information for configuration and real-time results as well as intermediate results and raw data. Two goals of creating this RAM area are to keep related information together so every module can access it efficiently and to ensure that variable locations do not change between software upgrades.

The above discipline and experience distinguishes and elevates product acceptance and performance in the marketplace and goes some way towards explaining why there are fewer suitable flow computer products available to the custody market today.

Flow Computer Capabilities

The minimum one should expect when assessing the capabilities of flow computers in meeting the requirements of current and emerging standards:

♦ 'Near' real-time computation: computational cycles of 500 ms or less, with continuous process sampling

♦ Concurrent computations involving multiple meter runs measuring different fluid products

♦ Resident algorithms and measurement tables able to correct flows of Crude Oils, Refined Products, LPGs, NGLs, Olefins, Chemicals and Aromatics

♦ Resident algorithms and tables to measure Natural Gas, Speciality Gases, Steam and Water Condensate

♦ Automatic control of a meter prover with the ability to provide API proving reports and meter factor curve storage and maintenance

♦ New meter factor validation against the flow/viscosity performance curve and against historical meter factor data

♦ Ability to flow-weight average all relevant measurement input data and computational results based on hourly intervals, daily and batch transactions; FWAs allow reasonable verification of real-time computed results

♦ Automatic reporting and storage of data for multiple batches, hours and days

♦ Handling of all security issues: configuration access, totalizer tampering

♦ Diagnostic functions and displays to aid in certification of the calculated results process, including viewing inputs from field devices

API Calculation Procedure

So how does a flow computer implement the current API Volume Correction Standards for Crudes & Refined Products? See Appendix A, CTL/CPL Flow Chart. There are actually six steps.

Step 1: From the observed product density at flowing conditions (RHOobs), the density at standard temperature, 60°F or 15°C, and flowing (i.e. elevated) pressure is calculated using the API MPMS Chapter 11.1, Table 23A/B or 53A/B algorithm.

Step 2: With the calculated reference density from Step 1 and the flowing temperature, the initial compressibility factor 'F' is calculated using the API MPMS Chapter 11.2.1 algorithm.

Step 3: Using the initial 'F' from Step 2 and the equilibrium and flowing pressures, the initial pressure correction factor 'Cpl' is calculated using API MPMS Chapter 12.2 rounding and truncating rules.

Step 4: The product density at standard temperature and equilibrium pressure is calculated by adjusting the density at standard temperature and flowing pressure obtained at Step 1 by the 'Cpl' factor calculated at Step 3.

Step 5: Steps 2, 3 & 4 are iterated until the change in the Cpl factor obtained is less than 0.00005.
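The iteration of Steps 2 through 5 can be sketched as follows. The correlations below use illustrative constants only; they are stand-ins, not the published API MPMS Ch. 11.1/11.2.1 coefficients, which must be taken from the standards themselves. The loop structure and the 0.00005 convergence test on Cpl, however, follow the procedure described above:

```python
import math

def compressibility_F(rho_base, temp_F):
    """Placeholder for the API MPMS Ch. 11.2.1 'F' correlation;
    illustrative constants only, not the published coefficients."""
    return math.exp(-1.9947 + 0.00013427 * temp_F
                    + (793920.0 + 2326.0 * temp_F) / rho_base**2)

def cpl_factor(F, p_flowing, p_equilibrium):
    """Pressure correction Cpl = 1 / (1 - F*(P - Pe)); the 1e-5
    scaling of F per psi is likewise illustrative."""
    return 1.0 / (1.0 - 1e-5 * F * (p_flowing - p_equilibrium))

def base_density(rho_std_flowing_p, temp_F, p_flowing, p_equil,
                 tol=0.00005, max_iter=50):
    """Steps 2-4 iterated until the change in Cpl is below 0.00005
    (Step 5). rho_std_flowing_p is the Step 1 result: density at
    standard temperature but still at flowing pressure."""
    cpl_prev = 1.0
    rho_b = rho_std_flowing_p
    for _ in range(max_iter):
        F = compressibility_F(rho_b, temp_F)       # Step 2
        cpl = cpl_factor(F, p_flowing, p_equil)    # Step 3
        rho_b = rho_std_flowing_p / cpl            # Step 4
        if abs(cpl - cpl_prev) < tol:              # Step 5
            return rho_b, cpl
        cpl_prev = cpl
    raise RuntimeError("Cpl did not converge")

rho_b, cpl = base_density(850.0, 80.0, 300.0, 0.0)  # kg/m3, degF, psig
```

With realistic liquid compressibilities the loop converges in two or three passes; note that, per the revised procedures discussed in Part 2 below, no rounding is applied inside the loop.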

Step 6: Using the flowing temperature, the product density at standard temperature (RHOb) and the equilibrium or base pressure (Pba), the volume correction factor, Ctl, is calculated using Table 24A/B or 54A/B, depending on whether U.S. customary or metric measurement units are in use.

Part 2: API Volume Correction Standards and API ELM Standard Development

API Physical Properties Changes

The 64-bit floating-point operations capability of currently available PC processors and flow computers, which already incorporate math coprocessing capability, has encouraged the API into developing more complex algorithms and simplifying arithmetic operations in the revised API MPMS Ch. 11.1 currently being balloted. Using increased decimal precision and floating-point math routines in place of the less rigorous implementation procedures using integer math and the low processing capabilities existing in the 1970s and 1980s, discrepancies between the 60°F, 15°C and 20°C Tables have surfaced. These differences were not identified previously because of the rounding and truncation procedures required to implement the algorithms, which were devised to ensure reproducibility of results from different devices. In practice, the new routines will result in approximately 16 or more decimal places for all calculations.

The procedures by which volume correction factors are calculated to a consistent five decimal places for all VCF factors are also being revised. Due to the widespread use of realtime density measurement, temperature and pressure corrections will be performed as one procedure. It will now be possible to improve the convergence methodology for the correction of observed density to base density; API is adopting a more advanced convergence methodology than was previously possible. The procedures have been written without rounding because they can be part of an iterative loop, and rounding of factors could mean slow or non-convergence of iterative calculations.

Older flow computers embodying 16-bit technology may not reproduce factors exactly to a 5-decimal-place resolution. The standard requires that all calculations be executed using double-precision arithmetic and that all constants be carried to the exact number of digits required by the standard.

The underlying data, equations and associated constants have not been changed, but there are increased density and temperature ranges that accommodate lower temperatures and higher densities. None of the above applies to the 1952 Tables, which were derived from empirical data developed during the 1930s and 40s.

ELM Standard

The API Standard dealing with Electronic Liquid Measurement, MPMS Ch. 21.2, was published in 1998. The Standard builds on the foundation of MPMS Ch. 21.1 but goes into considerably more detail on calculations, security, calibration and configuration issues. Topics covered are:

Guidelines for selection and use of system components:
  Primary, Secondary and Tertiary
  Installation and Wiring
  Instrument Calibration Periods and Procedures
  Total System Uncertainty

Algorithms:
  Frequency of Calculation
  Averaging Methods
  Verification of Calculated Results
  Integration Methods
  References to Physical Properties Standards

Auditing and Reporting Requirements:
  The Configuration Log
  The Quantity Transaction Record (Batch Ticket)
  Alarm and Error Logs
  Calibration versus Verification Issues

Security:
  Restricting Access
  Integrity of Logged Data
  Protecting the Algorithms
  Memory Protection

Computer Math Hardware and Software:
  Limitations
  Software versus Hardware
  Number Types and their Limitations
  Integers and Floating Point Numbers
  Integer Overflow and Underflow Issues
  Floating Point Resolution Errors
  Integration Errors
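The "Floating Point Resolution Errors" topic listed above, and the double-precision requirement discussed earlier, are easy to demonstrate. A single-precision (32-bit) float carries a 24-bit mantissa, so once a value passes about 1.67e7 the spacing between representable numbers exceeds 1, and small additions are silently lost. The sketch below emulates 32-bit arithmetic by round-tripping through the IEEE-754 single format:

```python
import struct

def to_f32(x):
    """Round a Python double to the nearest IEEE-754 single."""
    return struct.unpack('f', struct.pack('f', x))[0]

# A running total held in single precision: at 2**24 the spacing
# between adjacent representable floats is 2.0, so adding 0.5 has
# no effect at all.
total32 = to_f32(16_777_216.0)        # 2**24
total32 = to_f32(total32 + 0.5)       # increment is silently lost

# The same accumulation in 64-bit double precision is exact.
total64 = 16_777_216.0 + 0.5

print(total32)   # 16777216.0
print(total64)   # 16777216.5
```

This is precisely the failure mode a 16-bit device with single-precision math can exhibit when trying to hold a 5-decimal-place factor or a large accumulating quantity.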

The electronic liquid measurement standard explains, for example, how performing a single calculation for a custody transfer transaction, compared with multiple real-time calculations in a flow computer, will yield different quantities. Rounding and truncating rules were designed for reproducibility using a one-time calculation, regardless of the equipment used. Unfortunately, the ability to compare a 'one-time' result against 'real-time' results is impaired when many thousands of calculations are performed each hour using the existing rounding rules.

Users can refer to API MPMS Chapter 21.2, Section 9.2.12 and Appendix A for a fuller appreciation of verification of quantities calculated by realtime flow computation devices.

Any 32-bit flow computer today is capable of providing the necessary security access and algorithm protection that prevents alteration of measurement or calculation parameters. Although not specified in API MPMS 21.2, the Institute of Petroleum's Petroleum Measurement Manual, Part XII, Section 3 also recognizes the need for special diagnostic and self-test routines. Custody transfer totals, meter factors and other data and constants can be stored in a redundant RAM area as a single register containing a checksum. This allows for alarm detection of suspect data and permits automatic correction of custody totals when RAM bits are found to be faulty, e.g. corruption due to constant power fluctuations interrupting processing activity.

Part 3: Software and Hardware Developments and Trends

Multiple Functions

There is increasing consolidation of functions into the tertiary device. PID control of flow, back pressure and delivery pressure is becoming more common. Single-stream gas flow totalizers are also being replaced by multistream flow computers which interface directly to gas chromatographs, provide direct serial interface to multiple ultrasonic meter types and also provide master meter proving capability using precision gas turbine meters. British Gas in the U.K. recently upgraded its entire transmission system with over 200 flow computers with the capability to meter any combination of orifice and turbine meters as well as direct interface to gas analyzers. These multifunction multistream flow computers also operate as their own station controller and database.

PC-based Realtime Operating Software

There is an emerging trend by some process system builders to make use of PC-based third-party realtime operating system (RTOS) software as the backbone of their new instrumentation software designs. This trend has occurred due to the need to shorten development cycles and reduce investment cost, and has no doubt been accelerated by the unavailability of embedded processor programmers and company reengineering:

Advantages
♦ Multitasking - separate tasks/programs
♦ Oversees system resources - transparent to programmers
♦ Program can be split into independently developed tasks
♦ Speeds up product and software development
♦ Improves maintainability

Disadvantages
♦ Operating systems are third-party vendor-supplied - possible slow support
♦ Adds overhead switching tasks
♦ Adds overhead servicing interrupts
♦ Black box - difficult to troubleshoot RTOS-related problems
♦ Requires full understanding of system and task scheduling to achieve good performance

As a consequence, many manufacturers are now reliant on PC-oriented programmers who make extensive use of high-level third-party software code. Making modifications to achieve an optimum coupling with application code requires expert staff. These issues are not immediately apparent to the unsuspecting user until the system goes wrong, with a resultant cost spiral.
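The redundant, checksum-protected RAM storage of custody totals described earlier (under the discussion of security and self-test routines) can be sketched as follows. The register layout and the CRC-32 checksum are illustrative choices, not any particular flow computer's actual format:

```python
import struct
import zlib

def pack_register(total):
    """Store a custody total as its 8-byte double followed by a
    CRC-32 of those bytes (illustrative register layout)."""
    data = struct.pack('<d', total)
    return data + struct.pack('<I', zlib.crc32(data))

def read_register(blob):
    """Return the total if its checksum verifies, else None."""
    data, (crc,) = blob[:8], struct.unpack('<I', blob[8:])
    return struct.unpack('<d', data)[0] if zlib.crc32(data) == crc else None

def read_redundant(primary, backup):
    """Prefer the primary copy; fall back to the backup when the
    primary fails its checksum -- the 'automatic correction' of
    custody totals described above. Returns (value, corrected)."""
    value = read_register(primary)
    if value is not None:
        return value, False
    value = read_register(backup)
    if value is None:
        raise RuntimeError("both copies corrupt: raise alarm")
    return value, True

primary = bytearray(pack_register(123456.789))
backup = pack_register(123456.789)
primary[3] ^= 0x04                    # simulate a faulty RAM bit
value, corrected = read_redundant(bytes(primary), backup)
```

A real implementation would also rewrite the repaired value back to the corrupt copy and raise the suspect-data alarm; both are omitted here for brevity.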

Part 3: Software and hardware developments As a consequence, many manufacturers are now
and trends reliant on PC-oriented programmers who make
extensive use of high-level third party software code.
Multiple Functions There is increasing consolidation of To make modifications to achieve an optimum
functions into the tertiary device. PID control of flow, coupling with application code requires expert staff.
back pressure and delivery pressure is becoming more These issues are not immediately apparent to the
common. Single stream gas flow totalizers are also unsuspecting user until the system goes wrong, with
being replaced by multistream flow computers which a resultant cost spiral.
interface directly to gas chromatographs, provide direct
serial interface to multiple ultrasonic meter types and
also provide master meter proving capability using
precision gas turbine meters. British Gas in U.K.
recently upgraded its entire transmission system with
over 200 flow computers with capability to meter any
combination of orifice and turbine meters as well as
direct interface to gas analyzers. These multifunction

Embedded Processor Software

It is important to note the differences between the preceding approach and the established use of embedded processor software design. It remains the leading methodology for achieving a secure, efficient processing environment:

♦ Completely interrupt-driven
♦ Code can be 100% company-generated
♦ No context switching times in software code
♦ No Operating System "kernel" with interrupt latency supplied by third parties
♦ Field-proven code can be used
♦ "Tight" hardware/software integration

Field-Mounted Flow Computers

There is much interest in skid-mounted flow computers today. Such computers can provide the dual benefits of reducing system capital costs and more easily guaranteeing system performance by performing FATs (factory acceptance tests) with all instrumentation wired in place.

[Figure: Field-mountable flow computer with integrated multivariable sensor]

Systems requiring separate control room facilities to house flow computers usually have to be disconnected from the metering skid after the FAT, ready for shipment to the customer. This introduces an added uncertainty: 'Will the equipment be correctly re-connected at the customer's site?' Field-mounted flow computers, because of their small size, are usually limited in I/O and meter run capability.

With the advent of serial-based meters such as ultrasonic and mass meters, skid-mounted flow computers adjacent to the meter's secondary electronics on the same spool piece provide for simplified connectivity, with minimal wiring required between the metering skid and the host supervisory system.

Serial data link connectivity between field-mounted computers will be needed to transfer the data and commands required to provide an integrated station function on larger multi-run metering systems. It is now possible to network field-mounted flow computers and provide remote wireless access. A remote display may be needed when a field-mounted computer is mounted in an inaccessible location.

Another challenge for field-installed flow computers being used in custody measurement is their stability under a wide range of temperature conditions and their tolerance to extreme electrical effects. For this reason, OIML (International Organization for Legal Metrology) and European Norms are currently the best guideline that users have for ensuring that products are certified to acceptable levels of performance, in the absence of extensive tests conducted by knowledgeable, accredited users.

Internet Applications

Major pipeline systems in the U.S. are now looking at the Internet for connectivity solutions that can simplify the hierarchy of MIS/SCADA/Leak Detection/Tank Farm/Metering systems and minimize their exposure to obsolescence. TCP/IP connectivity is fast becoming accepted and

products now exist which preserve Modbus-based communications wrapped in TCP/IP high-speed connections.

Windows-based Interfaces

Configuration programs have proliferated as notebook computers have emerged as a standard field technician tool. The key to successful field configuration remains the core firmware; this is where flexibility must be built in. Industry expectations are that only Windows-based programs are acceptable. This is both an advantage and a disadvantage, depending on the PC literacy of the user.

Programs now exist to retrieve archive data and export it to a Windows-based spreadsheet, as well as auditing and calculation-checking software incorporating the multiple measurement calculation standards, in metric or customary U.S. units. The need for extensive training of measurement technicians to ensure proficiency in these new disciplines is self-evident.

This is best exemplified by the Windows-based control system supervisory software much in vogue in both process control and metering systems. Although graphically appealing and seemingly user-friendly, there are considerable uncertainties due to the programming and configuration skills of the integrators who promote sophisticated software. They frequently lack in-depth knowledge of the application and of the communications and data handling capabilities required for metering systems. Consequently, many metering systems languish in semi-working order, and some remain uncommissioned worldwide due to users' unfulfilled or unrealistic expectations of "realtime" systems.

Part 4: New Metering Technologies, Measurement Fidelity and Security

New Metering Technologies

The average user can expect to confront new meter devices using serial digital/numeric data streams with relatively little technical guidance. Not only must a testing engineer be a knowledgeable electrical and instrument professional, today he should also be an experienced electronics engineer, intimately knowledgeable in processor architecture and in the technical specifications from individual manufacturers' sources, in the absence of functional electrical/microprocessor standards. Anything less makes a mockery of many company approvals processes and leads to second-rate process performance.

In respect of serial (a.k.a. digital) periodic numeric communication, as distinct from conventional instantaneous pulse-based data acquisition, issues such as "bandwidth", "baud rate", "cross-talk" and signal integrity principles, including transmission line properties and connections, should give any metrology-based user significant concern until the uncertainties of the technology have been determined by analysis and/or by practical use as a system component.

This is not to reject the technologies involved. Users should stay abreast of technology developments, experiment where possible, and adopt when the results, as with well-established technologies, will drive the value decision. It is obvious, however, that use of these communications-based technologies should be validated under a range of controlled parameters that best represent typical field conditions, so as to benefit the end users. The main objective must be to maintain the integrity of the custody measurement data received from the primary measurement device. The old technologies still have value.

Measurement Fidelity

Error checking, as originally envisioned by API MPMS Chapter 5.5 and in widespread use worldwide, risks being relegated to a reliance on secondary devices which, under a variety of operating conditions and electrical influences, may not accurately represent the primary signal. At a secondary level, numeric representations of the measurement value could impact the measured results in readouts. However, flow computers implement CRC (Cyclic Redundancy Check) error checking and other checking methods to ensure that data messages to and from other connected devices are not corrupted. Security, configuration settings, alarming, data logging and audit issues are central to Weights & Measures/Excise approvals and sometimes take their cue from API & IP Standards.

It remains best practice for the tertiary device to use the instantaneous flow rate value to calculate and

totalize the flow. This forms the basis for the totalizer integration within the electronics of these new metering systems. Data transfer is preferably accomplished through serial data transmission. This is particularly relevant in the case of gas ultrasonic meters.

By complementing the serial data with the use of the "manufactured" pulse output train - it is not identical to a turbine- or densitometer-generated pulse train - the user can obtain some pseudo-compliance with the data security issues commonly associated with API Manual of Petroleum Measurement Standards (MPMS) Chapter 5.5 regarding signal security, and API MPMS Chapters 21.1 and 21.2, respectively, regarding gas and liquid electronic metering systems.

Some of the new electronic meters provide totalizers, but be warned - these can be difficult to use unless they are provided in a numeric format which increments and rolls over predictably. Floating point variables, for example, normally keep increasing in value and do not roll over to zero at any point. This causes a problem because, as the totalizer increases in size, a point is reached when the bit resolution of the mantissa portion of the number is exceeded, and the totalizer begins to increment in larger and larger steps.

The tertiary computing device can compare the totalizer values received between successive serial transmissions, but even this can prove difficult because of totalizer rollover and resolution problems in some digital flow meters, and the impracticability of synchronizing, with any degree of certitude, the reading of successive totalizer values with the calculation cycle of the tertiary host calculating device.

Metrological Testing

A majority of U.S.-sourced measurement products are not exposed to international metrological norms, a number of which are European-inspired, such as OIML R117 or EN 50081/2, unless it is intended to obtain a significant installed base overseas or to compete with European-based competitors. Appendix B shows that there is considerable similarity in some areas of testing, many of which are derived from IEC Standards.

It has been the experience of the authors that significant benefit is derived from considering such standards in the design and testing of instrumentation. They provide a safeguard for users by establishing minimum requirements of performance and indicate a commitment to quality by the manufacturer. For example, how many integrators and users alike invest in quality equipment such as a Hewlett-Packard 8904A Multifunction Synthesizer (DC-600 kHz)? There are numerous other test instruments that users should acquire if they want to be in the business of obtaining believable, traceable results.

System Electromagnetic Immunity & Electromagnetic Compatibility (EMI/EMC)

Selecting suitable electronic instruments with good EMI/EMC performance can be a challenge. Appendix B can again be referenced for testing procedures for CE approvals. But bringing them all together in a measurement and control 'system' which has good EMI/EMC performance is even more difficult for many engineers. Questions such as "Why does my totalizer increment a couple of barrels when the pump starts up?" or "Why does the displayed flowing temperature change when I talk on my handheld radio?" should cause the system designer to re-evaluate the EMI/EMC performance of the total system, including components, wiring and shielding, and to set proper operating procedures for the metering system.

Manufacturers of electronic equipment can take reasonable steps to minimize, but cannot eliminate, the exposure of the system to disturbances such as lightning events and switching surges, or to operational practices such as permitting high-wattage radios generating RF interference in close proximity to critical measurement devices. A CE certification is the minimum that manufacturers should provide.

It is assumed that system integration engineers are sufficiently knowledgeable in the system grounding and shielding techniques required to correctly link the components together. By the appropriate use of grounding and shielding planes, separation of analog and digital signals, and modular circuitry where possible, an instrument manufacturer can provide a number of extra benefits to the user. Users' instrument engineers are encouraged to review their design and testing associated with EMI/EMC.

In older measurement systems, scant attention was paid to EMI/EMC. Consequently, operational

difficulties were frequently experienced.

Conclusion

Industry consolidations will continue to impact the world of measurement. There will be less choice available to users, despite user preferences. Technological challenges will confront the average measurement instrumentation user, given the quantum leaps in electronic technologies and the pace of meter developments that have occurred since the original API MPMS Chapter 5.5 was published.

Technical training will be at a premium. Energy companies are mistaken if they believe that system vendors alone can bear responsibility in a "low-bid wins" environment. There is no evidence that adequate training budgets are included to ensure vendor and user proficiency in the latest technologies. Developments will continue to occur in signal processing, microprocessor, memory and communication technology during the next 5 years, i.e. serial baud rates, ultrasonic meters, digital signal processing, Internet, cellular, and satellite means. The expertise is already limited, so when will management again support the "value" of measurement?

The petroleum industry has yet to fully come to terms with existing serial digital technology, much less account for any device that could ultimately communicate at up to 100 Mbps (cf. IEEE 1394). It is certain that field devices implementing the Fieldbus communication standard will enter into widespread use. But it would be ill-advised for any user to endorse products, technologies and methodologies for acquiring meter data for no other reason than that they exist, until and unless they are fully tested in operational conditions.

Fieldbus, and other technologies that may supersede it, appear to offer many benefits of network connectivity in new projects, such that the flow computer may well emerge as a network node, processing all metering and valve data and diagnostics and passing them on to central host systems. This will drive more products to a field environment. It will therefore be essential that proper metrological and electrical certification is obtained before a product is permitted into field use beyond user trials.

This same potential exists for transferring to entirely mass-based systems using flow computers and proving systems. Currently, this is more in evidence in process than in custody transfer. Until then, the continuing efforts to improve accuracy and reduce uncertainties in volumetric systems, the mainstay of custody transfer transactions for both static and dynamic transfer systems, will continue.

Metrological data can still only be deduced, defined and proven at a metering system level that incorporates the proposed combinations of primary, secondary and tertiary devices. The probability will be for complex simulation equipment being made available by typical users to emulate real-life electrical disturbances when using electronic-based systems. The testing methodologies employed to validate the metrological results are a currently debated issue, as are the uncertainty limits that should be established. These are the continuing challenges for designers and users of metering instrumentation.

© Copyright 1999 Omni Flow Computers, Inc.

