LoggerNet
Revision: 03/2021
Copyright © 1999 – 2021
Campbell Scientific, Inc.
Software End User License Agreement
(EULA)
COPYRIGHT: This software is protected by United States copyright law and
international copyright treaty provisions. This software may not be sold,
included or redistributed in any other software, or altered in any way without
prior written permission from Campbell Scientific. All copyright notices and
labeling must be left intact.
This software can be installed as a trial version or as a fully licensed copy. All
terms and conditions contained herein apply to both versions of software unless
explicitly stated.
This trial may be freely copied. However, you are prohibited from charging in
any way for any such copies and from distributing the software and/or the
documentation with any other products (commercial or otherwise) without
prior written permission from Campbell Scientific.
(1) The purchase of this software allows you to install and use a single
instance of the software on one physical computer or one virtual machine
only.
(2) This software cannot be loaded on a network server for the purposes of
distribution or for access to the software by multiple operators. If the
software can be used from any computer other than the computer on which
it is installed, you must license a copy of the software for each additional
computer from which the software may be accessed.
(3) If this copy of the software is an upgrade from a previous version, you
must possess a valid license for the earlier version of software. You may
continue to use the earlier copy of software only if the upgrade copy and
earlier version are installed and used on the same computer. The earlier
version of software may not be installed and used on a separate computer
or transferred to another party.
(4) This software package is licensed as a single product. Its component parts
may not be separated for use on more than one computer.
(5) You may make one (1) backup copy of this software onto media similar to
the original distribution, to protect your investment in the software in case
of damage or loss. This backup copy can be used only to replace an
unusable copy of the original installation media.
Limited Warranty
The following warranties are in effect for ninety (90) days from the date of
shipment of the original purchase. These warranties are not extended by the
installation of upgrades or patches offered free of charge:
Campbell Scientific warrants that the installation media on which the software
is recorded and the documentation provided with it are free from physical
defects in materials and workmanship under normal use. The warranty does not
cover any installation media that has been damaged, lost, or abused. You are
urged to make a backup copy (as set forth above) to protect your investment.
Damaged or lost media is the sole responsibility of the licensee and will not be
replaced by Campbell Scientific.
Campbell Scientific warrants that the software itself will perform substantially
in accordance with the specifications set forth in the instruction manual when
properly installed and used in a manner consistent with the published
recommendations, including recommended system requirements. Campbell
Scientific does not warrant that the software will meet licensee’s requirements
for use, or that the software or documentation are error free, or that the
operation of the software will be uninterrupted.
Campbell Scientific will either replace or correct any software that does not
perform substantially according to the specifications set forth in the instruction
manual with a corrected copy of the software or corrective code. In the case of
significant error in the installation media or documentation, Campbell
Scientific will correct errors without charge by providing new media, addenda,
or substitute pages. If Campbell Scientific is unable to replace defective media
or documentation, or if it is unable to provide corrected software or corrected
documentation within a reasonable time, it will either replace the software with
a functionally similar program or refund the purchase price paid for the
software.
This warranty does not cover any software that has been altered or changed in
any way by anyone other than Campbell Scientific. Campbell Scientific is not
responsible for problems caused by computer hardware, computer operating
systems, or the use of Campbell Scientific’s software with non-Campbell
Scientific software.
Licensee’s sole and exclusive remedy is set forth in this limited warranty.
Campbell Scientific’s aggregate liability arising from or relating to this
agreement or the software or documentation (regardless of the form of action;
e.g., contract, tort, computer malpractice, fraud and/or otherwise) is limited to
the purchase price paid by the licensee.
Table of Contents
PDF viewers: These page numbers refer to the printed version of this document. Use the
PDF reader bookmarks tab for links to specific sections.
3. Introduction............................................................. 3-1
3.1 What is LoggerNet? ......................................................................... 3-1
3.1.1 What Next? ............................................................................... 3-1
3.2 Overview of Major LoggerNet Functions and Associated
Software Applications .................................................................. 3-2
3.2.1 The Heart of it All – LoggerNet Toolbar .................................. 3-2
3.2.1.1 Toolbar Views ................................................................ 3-2
3.2.1.2 Favorites Category.......................................................... 3-3
3.2.1.3 Toolbar Menus ............................................................... 3-4
3.2.1.4 Command Line Arguments ............................................ 3-5
3.2.1.5 Alternate Language Support ........................................... 3-5
3.2.2 LoggerNet Admin/LoggerNet Remote...................................... 3-6
3.2.3 Setting Up Datalogger Communication Networks.................... 3-6
3.2.4 Real Time Tools ........................................................................ 3-7
3.2.5 Network Status and Problem Solving ....................................... 3-7
3.2.6 Network Management Tools ..................................................... 3-8
3.2.7 Creating and Editing Datalogger Programs .............................. 3-9
3.2.8 Working with Data Files ......................................................... 3-10
3.2.9 Automating Tasks with Task Master ...................................... 3-10
3.2.10 Managing External Data Storage Devices .............................. 3-10
Appendices
Tables
7-1. Formats for Output Data ................................................................ 7-49
7-2. Formats for Entering Numbers in CRBasic.................................... 7-50
7-3. Synonyms for True and False......................................................... 7-50
7-4. Rules for Names ............................................................................. 7-52
7-5. Operators and Functions................................................................. 7-61
7-6. Editor Keystrokes ........................................................................... 7-66
8-1. Comma Separated, Field Formatted, Printable ASCII, and Table
Oriented ASCII Input File Format Types ..................................... 8-8
8-2. Example of Event Driven Test Data Set......................................... 8-17
8-3. Processed Data File Using Option C .............................................. 8-17
8-4. Input File Entries to Process the First Data Point for each Test ..... 8-18
8-5. Effects of Out of Range Values for Given Output Options ............ 8-21
8-6. Split Operators and Math Functions ............................................... 8-22
8-7. Time Series Functions .................................................................... 8-24
Preface — What’s New in LoggerNet 4?
Product History
LoggerNet 4 continues the original design of client-server functionality that
first appeared when Version 1.0 was released for Windows to replace Real
Time Monitoring Software (RTMS) that ran on OS/2 operating systems.
Versions in the 1.x series supported only table-based dataloggers and provided
large network users with sophisticated capabilities to develop clients to the
server to move data without having to store it in interim files.
Version 2.0 added support for dataloggers with mixed-array operating systems,
the CRBasic dataloggers, and additional communications devices. It also
supported client applications’ requests for data via TCP/IP, and automatically
created files on the PC for final storage data. Subsequent revisions in the 2.x
series added support for hardware as it was released and refined the client-
server architecture to make it more robust and flexible. Software development
kits and standalone clients were released to provide additional functionality.
LoggerNet 3.2 added support for our new CR3000 datalogger. In addition,
LoggerNet Admin and LoggerNet Remote were developed, which provide
tools to support larger networks. These tools include security and remote
management capabilities, and the ability to run LoggerNet as a service.
LoggerNet 3.3 added support for the CR800 datalogger. A new file output
option was also added for table-based dataloggers. This CSV file format option
allows the creation of data files similar to those from mixed array dataloggers.
LoggerNet 4.0 introduces a new look and feel to the LoggerNet Toolbar.
Applications are divided into categories to make navigating the Toolbar easier.
You can also organize a Favorites category for the applications that you use
most often. A new file viewing application, View Pro, is introduced which
allows multiple data files to be opened, multiple graphs to be created, and
graphing in a variety of formats (Line Graph, X-Y Plot, Histogram, Rainflow
Histogram, and FFT). Another new application, the Network Planner, is
included. This is a graphical application that assists the user in designing
PakBus datalogger networks. It allows the development of a model of the
PakBus network, proposes and verifies valid connections between devices, and
allows integration of the model directly into LoggerNet 4.0.
See below for more details on what is new in LoggerNet 4.0 and each
individual application.
One of the main efforts in the development of LoggerNet 4.1 was the ability to
use LNDB databases with View Pro. The ability to lock the timestamp column
on the left of the data file has also been added to View Pro. This keeps the
timestamp visible as you scroll through columns of data. The Device
Configuration Utility adds an off-line mode which allows you to look at the
settings for a certain device type without actually being connected to a device.
The CRBasic Editor now has the capability to open a read-only copy of any
file. This gives you the ability to open multiple copies of a program and
examine multiple areas of a very large program at the same time. You can also
now continue an instruction onto multiple lines by placing the line continuation
indicator (a single space followed by an underscore “_”) at the end of each
line that is to be continued; a brief sketch follows this paragraph. Also,
bookmarks in a CRBasic program are now persistent from session to session. In
the Troubleshooter and the Setup Screen (Standard View), you can now click on a
potential problem to bring up a menu that allows you to go to the Setup Screen
or Status Monitor to fix the potential problem, bring up help describing the
problem, or in some cases fix the problem directly. Campbell Scientific’s new
wireless sensors have been added to the Network Planner. An option to provide
feedback on LoggerNet is now available from the LoggerNet Toolbar’s Help menu.
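As a minimal sketch of the line-continuation indicator described above (the
variable name and values here are arbitrary, not taken from the manual):

    Public Total

    BeginProg
      Scan (5,Sec,0,0)
        'The trailing space-and-underscore continues the assignment onto the next line
        Total = 1 + 2 + 3 + _
                4 + 5
      NextScan
    EndProg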
LoggerNet 4.2 adds support for IPv6 addresses. IPv6 addresses are written as
eight two-byte address blocks separated by colons and surrounded by brackets
(e.g., [2620:24:8080:8600:85a1:fcf2:2172:11bf]). Prior to LoggerNet 4.2, only
IPv4 addresses were supported. IPv4 addresses are written in dotted decimal
notation (e.g., 192.168.11.197). Leading zeroes are stripped for both IPv4 and
IPv6 addresses. Note that while LoggerNet now supports IPv6 addresses and
they can be used to specify servers, CR1000/CR3000/CR800-series
dataloggers will not support IPv6 until a future OS release. Check the OS
revision history on our website to determine when IPv6 support is added to the
OS. (Starting in LoggerNet 4.2.1, IPv6 connections are disabled by default.
They can be enabled from LoggerNet’s Tools | Options menu item.)
The ability to set up subnets of the network map has been added to LoggerNet
Admin. The Setup Screen’s View | Configure Subnets menu item is used to
configure the subnets. Within each subnet, you can also specify groups of
dataloggers. The datalogger groups create folders that can be collapsed or
expanded when viewing the subnet. Once subnets have been configured, you
can choose to view a subnet rather than the entire network in the Setup Screen,
Connect Screen, and Status Monitor.
You can now set up defaults for the Setup Screen’s Schedule, Data Files,
Clock, and File Retrieval tabs that will be used when new stations are added to
the network. There is also the ability to copy these defaults to existing stations.
The ability to use 24:00 (rather than the default of 00:00) for the timestamp at
midnight has been added. (This is accessed from the button next to the
Output Format field on the datalogger’s Data Files tab in the Setup Screen. It
is also available in the Connect Screen’s Custom Collection options.)
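For example (the dates here are illustrative only), with 24:00 selected the
record logged at midnight at the end of June 30 is stamped 2021-06-30 24:00:00
rather than 2021-07-01 00:00:00, so the final record of the day stays
associated with the day it closes out.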
You can now access a datalogger’s Settings Editor from the Connect Screen
either by right-clicking on the datalogger or from the Datalogger menu. You
can also manually set the datalogger’s clock from the Connect Screen either by
double-clicking in the Station Date/Time field or from the Datalogger menu.
Boolean values displayed in the Connect Screen’s Numeric Display now have
an LED icon next to them to allow for easy toggle.
You can now view additional statistics in the Status Monitor for table-based
dataloggers including watchdog errors, skipped scans, and battery errors. (Note
that there is a Poll for Statistics check box on the datalogger’s Schedule tab in
the Setup Screen that must be enabled to poll for these statistics.)
The Task Master has been integrated into the LoggerNet server. This allows for
remote administration of the Task Master. (See Section 9.1.3, Remote
Administration of the Task Master (p. 9-17), for conditions that must be met for
remote administration of the Task Master.)
NOTE Integrating the Task Master into the server involved extensive
changes. When upgrading to LoggerNet 4.2 from a previous
version, an attempt will be made to import all previously-
configured tasks. However, imports have only been tested back to
LoggerNet 3.4.1. After upgrading (from any previous version of
LoggerNet), you should verify that all of your tasks have imported
correctly.
Calendar-based scheduling has been added to the Task Master. This allows for
non-interval task execution (including data collection). See Example #3 in
Section 9.1.1.4, Define What the Task Does (p. 9-9), for an example of calendar-
based data collection.
A Constant Customization feature has been added to the CRBasic Editor. This
allows you to define values for one or more constants in a program prior to
performing a conditional compile. The constants can be set up with an edit box,
a spin box field for selecting/entering a value, or with a list box. A step
increase/decrease can be defined for the spin box, as well as maximum and
minimum values.
The CRBasic Editor now allows you to Save and Open Display Settings.
Display settings affect the look and feel of the CRBasic Editor. This includes
font and background, as well as syntax highlighting.
View Pro has a new View Record option in the right-click menu that can be
used to view an entire record in a new window.
The main effort in the development of LoggerNet 4.3 has been support for the
new CR6-series datalogger. Support was also added for the CRS451 Series
Water Level Recording Sensor and the CRVW Series Vibrating Wire
Recording Interface.
LoggerNet 4.4 adds support for the CR300-series datalogger. Support was also
added for the CDM-A108 and CDM-A116.
A UDP Search button was added to the IPPort in the Setup screen to initiate a
UDP discovery to search for PakBus dataloggers in the network. Also in the
Setup screen, the ComPort now has an Install USB Driver button to allow you
to install the USB drivers for our dataloggers and peripherals that require them.
The ability to display line numbers has been added to the CRBasic Editor.
(This is done from the View | Editor Preferences menu item.)
LoggerNet 4.6 adds support for the new GRANITE Data Logger Modules.
LoggerNet 4.7 adds support for the new CR350 Series datalogger. (Note that in
the main LoggerNet interface, the CR350 is considered a CR300-series
datalogger.)
An Allowed Neighbors tab has also been added to the PakBusPort device.
This allows you to specify a list of PakBus addresses that the port will accept
as neighbors.
Also, a TCP Password field has been added to the Security Settings screen in
the EZSetup Wizard to control IP access to a datalogger.
In LoggerNet Admin, you can now add, delete, and edit stations while viewing
a subnet. Previously, you had to change your view to the whole network before
making station changes.
NOTE If you are using an older version of RTMC Pro and plan to
continue creating RTMC projects, we recommend that you
upgrade to RTMC Pro 5.0.
LoggerNet Products
Campbell Scientific offers three LoggerNet software packages (LoggerNet,
LoggerNet Admin, and LoggerNet Remote) as well as several standalone client
products. Each of these packages is purchased separately. LoggerNet is the
main software application and comes with all of the applications needed to set
up and configure a network of dataloggers including tools to write programs
and monitor retrieved data. LoggerNet Admin and LoggerNet Remote enhance
the capabilities of LoggerNet by providing management tools for more
complex networks. The difference between the two is that LoggerNet Admin offers a
complete LoggerNet package, while LoggerNet Remote, which was designed
to be run remotely, does not include the LoggerNet server.
Toolbar
The LoggerNet 4.0 Toolbar has a completely new look and feel. Applications
are divided into categories to make navigating the Toolbar easier. You can also
organize a Favorites category for the applications that you use most often.
Then, if you prefer a smaller version of the toolbar, you can select Favorites
View from the View menu. This will switch to a small view of the toolbar
containing only icons for applications in the Favorites category.
Setup Screen
The Setup Screen now has the option of being used in an EZ View or a
Standard View. The Standard View is similar to the Setup Screen in older
versions of LoggerNet. In the EZ View, the EZ Setup Wizard is used to add
dataloggers and edit their settings.
A new menu item has been added to enable a Stations Only view. When this is
enabled, only stations will be shown in the Network Map and they will be
listed in alphabetical order. This can be especially helpful when working with
a large network.
A Scheduled Backup menu item has also been added. This opens a dialog box
from which you can set up automatic backups of LoggerNet on a user-defined
schedule.
A new root device, the PakBus TcpServer, has been added. This device can
accommodate multiple incoming PakBus/TCP connections to service the
stations attached to it. Therefore, it allows the same IP port to be used to listen
for incoming connections from multiple dataloggers. The device has a Routing
tab that can be used to specify IP addresses and port numbers to be used for
outgoing connections to specific dataloggers attached to the PakBus TcpServer.
The Routing tab can also be used to cause LoggerNet to maintain a connection
with a range of dataloggers, once an incoming connection has been established.
An Image Files tab has been added to the Setup Screen for the CR1000,
CR3000, and CR800-series dataloggers. This tab provides an easy way to
retrieve image files from the datalogger on a specified interval. (Note this tab is
renamed File Retrieval in LoggerNet 4.1.)
A Notes tab has been added to all devices to allow the user to keep notes about
the device for future reference. This is purely for the user’s convenience. (The
information in a datalogger’s Notes tab is displayed in the Connect Screen,
when that datalogger is selected.)
A new File Output Format option, CSIXML, has been added. When this option
is selected, data is stored in XML format with Campbell Scientific defined
elements and attributes.
Individual devices and/or device branches of the Network Map can be copied
and pasted into the network.
Various other settings have been added including BMP1 Station ID, BMP1
Low Level Delay, PakBus Verify Interval, TCP Password, Enable Automatic
Hole Collection, Stay on Collect Schedule, and Collect At Most. See Section
4.2.4, Device Settings (p. 4-7), or LoggerNet’s online help for information about
these settings and what devices they apply to.
Connect Screen
The Connect Screen has been reorganized with most of the buttons now
residing on a toolbar at the top of the window.
A Table Monitor has been added in the middle of the window that can be used
to monitor the values for one entire table.
A Notes field has been added that displays the information from the Setup
Screen’s Notes tab of the selected datalogger.
The Pause Data Displays option has been moved to the Edit menu. (In previous
versions it was available as a check box on the Connect Screen.)
The Update Interval for data displays has been moved from the Options dialog
to the display’s main window.
New options added to the data displays include an Auto Format option (rather
than specifying the number of decimal places to display), the ability to format
the timestamp for numeric displays, and the ability to specify what will happen
when a NAN is encountered in a graph.
A new File Format option, CSIXML, has been added to Custom Collection.
When this option is selected, data is stored in XML format with Campbell
Scientific defined elements and attributes. For Custom Collection, you also
now have the option of whether or not to include timestamps and/or record
numbers.
Status Monitor
Two new statistics are now available to be monitored: Link Time Remaining
(the time remaining, in milliseconds, until Maximum Time On-Line is reached
and the device is automatically disconnected) and RFTD Blacklisted (indicates
that a station has been blacklisted by an RF Base because of a failed
communication attempt).
Task Master
A new event type, After File Closed, has been added to the Task Master. Using
this event type, a task will be executed anytime a data file being written to is
closed.
Along with the above event type, FTP and SFTP capabilities have been added
which allow the just closed file to be transferred to a designated FTP server.
Short Cut
Support has been added to Short Cut for the CR9000X datalogger, the ET107
Evapotranspiration Monitoring Station, and the AVW200 Two-Channel
Vibrating Wire Spectrum Analyzer.
New sensor files have been added for the CMP3 Pyranometer, the IRR-P
Precision Infrared Temperature Sensor, the JC Ultrasonic Depth Sensor, the
CNR2 Net Radiometer, the CS106 Barometric Pressure Sensor, the OBS-3+
Turbidity Sensor, the 03002 Wind Speed and Direction Sensor, the 105E
(chromel-constantan) Thermocouple, the WindSonic1 (RS-232) Two-
Dimensional Ultrasonic Wind Sensor, the WindSonic4 (SDI-12) Two-
Dimensional Ultrasonic Wind Sensor, the HMP155 Temperature and Relative
Humidity Sensor, the SR50A Sonic Ranging Sensor (SDI-12 Output), the
CS450/455 Pressure Transducer, a Vibrating Wire Sensor (for generic
vibrating wire sensors and the AVW200), and a saturation vapor pressure
calculation.
An Advanced tab has been added to the Finish screen for CRBasic
dataloggers, which allows the user to view the CRBasic code and launch the
CRBasic Editor.
There is now an option to send the program to the datalogger from the Results
tab on the Finish screen.
The user now has the ability to create custom sensor files using existing sensor
files as templates.
The user can now manually set advanced outputs to high or low resolution.
The Add Device button has been removed. Peripheral devices are now listed in
and selected directly from the Available Sensors and Devices tree.
CRBasic Editor
The CRBasic Editor now gives you the option to Save and Encrypt a file.
Encrypted files can be compiled in the datalogger but cannot be read by a user.
Dim variables can now be declared within a subroutine or function and are
local to that subroutine or function. The same variable name can be used within
other subroutines or functions or as a global variable without conflict. The F9
and F10 pop-up pick list will include the local variables for a specific
subroutine or function if the cursor is within that subroutine or function.
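A minimal sketch of this scoping behavior (the names below are arbitrary): the
Dim declared inside the subroutine is local to it and does not conflict with
the global declaration of the same name.

    Public Count
    Dim temp              'global declaration

    Sub Bump
      Dim temp            'local to Bump; no conflict with the global temp
      temp = Count + 1
      Count = temp
    EndSub

    BeginProg
      Scan (1,Sec,0,0)
        Call Bump
      NextScan
    EndProg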
F11 can now be used to bring up a pop-up pick list that contains all user-
defined functions found in the program.
A new button has been added to the toolbar (blue arrow) which takes the cursor
to user-defined functions and subroutines.
A new shortcut, CTRL-Y, has been added that will delete the current line.
Several options have been added to the Editor Preferences dialog box
including:
• Create .TDF File at Compile – The user can then associate a .TDF file
with a datalogger. This can be useful if communication is taking place
over a slow or unreliable communication link where the attempt to receive
table definitions back from the datalogger fails.
• Clear Undo/Redo List on File Save – Clears the change tracking in the
program when the file is saved. Otherwise change tracking is kept until
the file is closed.
You can now drag and drop a file onto the CRBasic Editor workspace to open
the file. Also, multiple files can be selected from the File | Open dialog box.
All selected files will be opened.
Support has been added for custom voice files for the VoiceSpeak instruction.
When inserting a VoiceSpeak instruction, the user then has the option of
choosing words from the standard Voice.txt file or from a user-created custom
voice file.
RTMC
Many new functions have been added that may be used when building
expressions in RTMC. These include string functions, time functions, start
option functions, and functions with state. The ability to declare aliases for data
values used in expressions has also been added. See Section 5.2.1.4,
Expressions (p. 5-43), or the Expressions topic in RTMC’s online help for more
information.
RTMC has a new Layout Toolbar which gives quick access to the Align, Space
Evenly, Make Same Size, Center, and Order menu items from RTMC’s
Component menu.
Graphics Options have been added to the Edit | Preferences menu item that
allow you to choose the maximum number of frames per second, whether
animation is enabled, and whether high quality or high speed is more
important. From this menu item, you can also choose the visual theme for
RTMC. This determines the look and feel of the application (i.e. colors, button
appearance, etc.). These options are available in both RTMC Development and
RTMC Run-Time.
An Edit | Customize menu item has been added which allows you to
customize RTMC’s toolbars and menus. This menu item is available in both
RTMC Development and RTMC Run-Time.
Miscellaneous other changes have been made to the settings for specific
components.
View Pro
View Pro is included for the first time in LoggerNet 4.0. It maintains the ease
of use of our former data file viewer with greatly enhanced capabilities.
Large files can be loaded more quickly. Scrolling is more responsive for large
files.
View Pro allows you to have multiple data files opened at one time. Multiple
graphs can be created from the same file or from multiple files. There is no
limit to the number of traces per graph. Data can be graphed in a variety of
formats including a Line Graph, X-Y Plot, Histogram, Rainflow Histogram, or
FFT (2D or 3D).
You have the ability to create a Line Graph containing multiple strip charts.
This allows you to simultaneously display data from multiple files (one strip
chart per file) to compare data from multiple stations. The X-axes (timestamps)
of the strip charts can be synchronized to facilitate cross file comparisons.
A Line Graph can use record numbers rather than timestamps on the X-Axis.
This allows you to display data files containing gaps in the timestamps.
From the toolbar of a Line Graph, you can bring up a Statistics box which
shows the average, standard deviation, minimum, and maximum of the
displayed points. From the toolbar you can also add a graph cursor to a Line
Graph. The cursor can be scrolled across the graph and the data values and
timestamp at the current cursor position will be shown.
View Pro has zoom capability to allow you to zoom in on a certain area of a
graph. You can also scroll a graph either from the graph itself or from the
opened data file.
You can print a graph either from a preview screen or directly from the graph
toolbar. The graph can also be saved in a variety of formats (BMP, JPEG,
WMF, EMF, or PCX).
Binary files (TOB1, TOB2, TOB3) can be opened directly in View Pro.
Split
A Time Offset option has been added. This allows the user to specify a time
offset that will be applied to each item on the Select line that uses the Date or
Edate function to output a date. This may be useful when adjusting for different
time zones.
Split now maintains a log file, splitr.log, each time Splitr is run. The main
purpose of this log file is to enable users running Splitr in command line mode
to identify what happened with each execution of Splitr. If a second instance of
Splitr is started when one is already running, another log file, splitrunning.log,
will be written. This file simply identifies the time that the second instance of
Splitr was started and that Splitr was already running.
CardConvert
A new File Format option, CSIXML, has been added to the Destination File
Options. When this option is selected, data is stored in XML format with
Campbell Scientific defined elements and attributes.
Troubleshooter
The Troubleshooter now allows the user to customize the possible problems for
which warnings will be given. In addition, you can click on any highlighted
warning to bring up additional information about the warning.
For array-based dataloggers, you now have the option to do a full hardware
reset. You can also now bring up Station Status information for array-based
dataloggers. (Previously this was only available for table-based dataloggers.)
Capability has been added to the Comm Test to report Invalid Datalogger
Security and Invalid LoggerNet Security.
Network Planner
The Network Planner, a graphical application that assists the user in designing
PakBus datalogger networks, is introduced for the first time in LoggerNet 4.0.
The Network Planner allows the development of a model of the PakBus
network, proposes and verifies valid connections between devices, and allows
integration of the model directly into LoggerNet 4.0.
Data Filer
A new File Format option, CSIXML, has been added. When this option is
selected, data is stored in XML format with Campbell Scientific defined
elements and attributes.
You now have the option of whether or not to include timestamps and/or
record numbers in the data file.
If an alternate language package is installed on your machine, you will see
the language in the list for the Languages menu (Options | Languages).
Section 1. System Requirements
1.1 Hardware and Software
LoggerNet is a collection of 32-bit programs designed to run on Intel-based
computers running Microsoft Windows operating systems. The recommended
operating systems for running LoggerNet are Windows 7, Windows 8, or
Windows 10, because they offer the most stable operating
environment. LoggerNet runs on both 32-bit and 64-bit versions of these
operating systems.
Section 2. Installation, Operation and
Backup Procedures
NOTE You must have administrator rights on your computer to install
Campbell Scientific software.
2.1 Installation
If you are installing LoggerNet from a download, run the executable file,
LoggerNet_version.exe, to begin the installation.
If you are installing from a CD, place the installation disk in your computer’s
CD-ROM drive. If autorun is enabled for the drive, the LoggerNet installation
will start automatically. If the installation does not start automatically, use the
Browse button to access the CD-ROM drive and select the autorun.exe file
from the disk.
The first screen displayed by the installation is a Welcome screen. Click Next
to proceed to the licensing agreement. After reading the licensing agreement,
select the “I Accept…” option and select Next to proceed to the User
Information screen. At the bottom of the User Information screen is a field for
entering the CD key for the software. The CD key is found on the back of the
CD case in which LoggerNet is shipped. Use the drop-down list box for the
first part of the CD key to select the software being installed: LGRNET
(LoggerNet), LGNADM (LoggerNet Admin), or LGNRMT (LoggerNet
Remote). Note that you must select the correct LoggerNet version for your CD
key or you cannot proceed further in the installation. After entering the CD
key, select Next and continue through the remaining screens, following the on-
screen prompts to complete the installation.
Items are added to your computer’s Start menu under All apps | Campbell
Scientific that start the Toolbar and some other selected utilities. At the end of
installation you also have the option to add a desktop shortcut to LoggerNet.
In addition to placing files in the Program Files directory of your computer, the
installation also creates working directories for the LoggerNet server and the
individual LoggerNet applications under C:\CampbellSci. Section 2.3.1,
LoggerNet Directory Structure and File Descriptions (p. 2-3), provides more
detail on the directories that are created.
If you are installing the trial version of LoggerNet, you will have 30 days to
use this fully functional trial version. Each time you run LoggerNet, you will
be advised as to how many days are remaining on your trial version. At the end
of the 30 days, the trial version of LoggerNet will no longer function.
If you choose to purchase LoggerNet, you will need to uninstall the trial
version, run the install program on the LoggerNet CD, and input the CD Key
from the back of your CD case. This can be done either before or after the 30-
day trial period has expired.
Note that the trial version will install applications in the C:\Program
Files\Campbellsci\Demo directory. When the purchased version of LoggerNet
is installed, the applications will each be installed in their own directory under
C:\Program Files\Campbellsci.
When you upgrade an existing installation, LoggerNet will continue to use the
network map, data collection schedules, data file locations, etc., of the existing
installation. Essentially, you will be able to “pick up where you left off” the
last time you used LoggerNet.
LoggerNet 4.x can readily use the network map from LoggerNet 2.x or 3.x.
However, network maps are not backwards compatible. If you upgrade your
existing version, once LoggerNet 4.x is opened, the network map will no longer
be compatible with LoggerNet 2.x or 3.x. For this reason, the upgrade installation
will automatically make a copy of the <WorkingDirectory>\LoggerNet\sys
directory and all of its contents. The copy will reside in
<WorkingDirectory>\LoggerNet\NetworkMapBackup\<version>\sys. If it then
becomes necessary to revert back to a previous version of LoggerNet, you will
need to remove the <WorkingDirectory>\LoggerNet\sys directory and replace it
with the <WorkingDirectory>\LoggerNet\NetworkMapBackup\<version>\sys
directory.
This scheme was implemented because we use the underlying tools and many
of the applications (the server itself, library files, datalogger program editors,
etc.) in a number of different products. By providing a common working
directory for each major application, we hope to make it easier to keep track of
files and information as you move from one product to another.
The following sketch outlines the typical working directories for LoggerNet if
the default options were selected during installation.
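(This sketch is reconstructed from the descriptions in the following
paragraphs; the exact set of folders depends on which applications are
installed, and placing the Lib folder directly under C:\Campbellsci is an
assumption.)

    C:\Campbellsci\
        Lib\
            Compilers\
            CR200Compilers\
            CRBasicDefFiles\
            RTMCMediaLib\
        LoggerNet\
            *.dat              (data files from scheduled collection)
            Logs\              (server and communication logs)
            Sys\
                CsiLgrNet.xml  (network map)
                20\ 28\ 33\    (binary data cache folders, one per datalogger)
        ...                    (one working folder per additional application)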
By default, the files that you create in each of the applications will be stored in
their respective folders in the working directory. You can override that default
and store the files in a different location. Each application “remembers” the last
directory in which a file was saved and will default to that directory until a
different directory is selected.
Note that almost all applications have one or more subdirectories in which
configuration files are saved.
Lib directory – The Lib directory is a library directory for several of the
LoggerNet applications. The Compilers folder holds all of the compilers for the
CRBasic Editor, except for the CR200 compilers, which are stored in the
CR200Compilers directory. The CRBasicDefFiles folder holds the definition
files and help files for all dataloggers supported by the CRBasic Editor. The
definition files are the files which provide the unique instructions and
parameters for each datalogger. The RTMCMediaLib directory contains all of
the media files that can be used by RTMC to provide graphics and sound for
your RTMC projects. Any custom graphics or sounds that you create and wish
to use in your project should be stored in one of these directories.
LoggerNet directory – The ASCII data files that are saved to disk as a result
of data collection from the dataloggers are stored to the LoggerNet directory
with a *.dat extension. The Logs directory holds the logs that are created when
communication takes place between the LoggerNet server and client
applications, and the LoggerNet server and the dataloggers. These logs are
used to help troubleshoot communication problems.
The Sys directory holds the network map description (CsiLgrNet.xml) and the
binary data cache. (The data cache is a repository for the data which is
collected from the dataloggers by the LoggerNet server, and which each client
application accesses when processing that data. In the example above, folders
20, 28, and 33 represent the data caches for different dataloggers. See
Appendix C, Software Organization (p. C-1), for additional information.)
The maximum interval for backing up data files depends primarily on the
amount of data maintained in the datalogger memory. The datalogger’s final
storage is configured as ring memory that will overwrite itself once the storage
area or table is full. If the data is backed up more often than the oldest records
in the datalogger are overwritten, a complete data record can still be maintained
by restoring the data from the backup and then re-collecting the newest records
from the datalogger.
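As a hypothetical example (the numbers are illustrative only): a table
allocated 100,000 records and written once per minute wraps after roughly
100,000 minutes, or about 69 days, so backing up at least every few weeks
leaves enough overlap to restore the backup and re-collect the newest records
before any of them are overwritten.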
You can choose to back up only the network map, or to back up the network
map and data cache. The network map will restore all settings and data
collection pointers for the dataloggers and other devices in the network. The
data cache is the binary database which contains the collected data from the
datalogger. Other files can be added as well.
The files included in the backup will be based on a saved backup configuration
file. To save a backup configuration, choose Backup | Manual Backup from
the Setup Screen’s menu. Proceed through the Backup wizard. At the last step,
choose Save Configuration For Later. The configuration will be saved to
C:\Campbellsci\LoggerNet\Backup.Configuration.
To set up the task, open the Task Master and add a task. For an “Add After”
task, choose the event type that will trigger the backup. For a scheduled task,
enter the interval on which you want the backup to be performed. Press the
Configure Task button and enable the Execute File check box. In the File
Name field, type (or browse for) the file LNBackup.exe. Make sure to include
the path (if LoggerNet was installed with the default directory structure, this
will be C:\Program Files\CampbellSci\LoggerNet\LNBackup.exe). Once the
changes have been applied, the backup will be performed based on the defined
event or schedule.
To create a unique filename based on date and time each time the task is run,
enter -AppendTime in the Command Line options field.
If you are running LoggerNet Admin and have security enabled, the command
line options must also include the username and password as shown below:
-username="username" -password="password"
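Putting these options together (the path assumes the default installation
directory, and the user name and password shown are placeholders), the Task
Master fields might look something like:

    File Name:            C:\Program Files\CampbellSci\LoggerNet\LNBackup.exe
    Command Line options: -AppendTime -username="admin" -password="secret"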
If you have used a LoggerNet command line argument (see Section 3.2.1.4,
Command Line Arguments (p. 3-5)) to change LoggerNet’s default port number,
the command line options must also include the server address and port number
as shown below:
NOTE This process DOES NOT append to the existing network — the
existing network will be overwritten when the restore is
performed.
The configuration files contain information about each device in the datalogger
network, including collection schedules, device settings, and other parameters.
These files are written to frequently to make sure that they reflect the current
state and configuration of each device. The configuration files are only opened
as needed.
If computer system power is lost while the LoggerNet server is writing data to
the active files, the files can become corrupted, making the files inaccessible to
the server.
While loss of power won’t always cause a file problem, having files backed up
as described above will allow you to recover if a problem occurs. If a file does
get corrupted, all of the server’s working files need to be restored from backup
to maintain the synchronization in the server state.
If you have problems restarting the LoggerNet server after a program crash or
it crashes as soon as it starts, make sure that the LoggerNet server has not left a
process running. You can check this by going to the Windows Task Manager
and selecting the Processes tab. In the list of processes look for the Toolbar or
one of the client applications. If one of these processes exists but the Toolbar is
not running, select this process and click End Process; you will be asked to
confirm the end process.
Note that when running LoggerNet as a service, tasks being run by the Task
Master cannot interact with the desktop. Therefore, any tasks set up in the Task
Master should not require any user interaction.
NOTE The LoggerNet user account will not show up in your list of users
when logging on to your computer. It can be viewed from the
Windows Control Panel. (In Windows 7, for example, from the
Control Panel, click User Accounts | Manage User Accounts |
Advanced tab | Advanced button, and then select Users from the
list of Local Users and Groups.) Although it is not available in the
drop-down list when logging on to your computer, you can
manually enter the user name (LoggerNet), enter the password,
and then select your local machine. (In Windows 7, the local
machine name is entered with the user name, i.e.,
machine_name\LoggerNet.)
This is the process for giving the LoggerNet user write access to a designated
directory in Windows 10. The process in other operating systems is similar.
• Go to the Security tab of the Properties dialog box and select Edit. In the
Permissions for directoryname dialog box, press Add. This will open the
Select Users, Computers, Service Accounts, or Groups dialog box.
• In the Locations dialog box, select the computer name and press OK.
• From the Select Users or Groups dialog box press the Advanced button.
Then press the Find Now button. Select LoggerNet in the list of names
that appears at the bottom of the dialog box and press OK. Note that
<COMPUTER-NAME>/LoggerNet has been added to the Object Names
on the Select Users or Groups dialog box. Press the OK button to close
the Select Users or Groups dialog box.
• The LoggerNet user should now have full access to the designated
directory.
Section 3. Introduction
3.1 What is LoggerNet?
LoggerNet is a software application that enables users to set up, configure, and
retrieve data from a network of Campbell Scientific dataloggers and share this
data over an Ethernet communications network. This software application is
designed to run under Windows 7, Windows 8, and Windows 10.
One significant benefit of the software design is that some of the client
applications (RTMC, for instance) can be run on any computer that connects to
the main computer by a TCP/IP network connection, such as a Local Area
Network (LAN), a Wide Area Network (WAN), or the
Internet. If you have LoggerNet Admin or LoggerNet Remote, any of the client
applications can log on to a remote LoggerNet server. Another benefit is the
efficiency that is gained, since several client applications can simultaneously
request and receive information from the software server.
The first step is to set up a communication link between your computer and the
datalogger station. This step may also include the configuration of peripheral
communication devices. Next you’ll need to develop a program for the
datalogger, and then send the program to the datalogger and ensure that
measurement results are viable. Once the datalogger has been storing data for a
period of time, you will want to collect that data and store it to a file on your
computer for further analysis.
When you run LoggerNet from the Windows Program menu or from a desktop
shortcut, you are launching the LoggerNet Toolbar. Some options on the
Toolbar launch applications that connect to the server and allow you to set up
the network or view the collected data. Other options launch stand-alone
applications to perform other functions, such as program editing. As you hover
over a category in the list on the left side of the toolbar, applications related to
that category will be shown on the right. Selecting an application in the right-
hand list will launch the application.
If you prefer a smaller version of the toolbar, you can select Favorites View
from the View menu. This will switch to a small view of the toolbar containing
only icons for applications in the Favorites category. (For additional
information on the Favorites category, refer to the following section.)
By default, the LoggerNet menus are not shown on the toolbar. Press the arrow
button in the upper right corner to view the LoggerNet menus as shown below.
The arrow button will change direction and can then be used to hide the menus.
The Available Applications column shows all applications that are available in
LoggerNet. (Press the + sign next to a category to show the applications in that
category.) An application can be added to the Favorites category by selecting it
in the Available Applications column and pressing the right arrow key. An
application can be removed from the Favorites category by selecting it in the
Favorites column and pressing the left arrow key.
The applications will appear on the Toolbar’s Favorites category in the same
order as they appear in the Favorites column. The up and down arrow keys can
be used to rearrange the order of applications in the Favorites column. To move
an application up in the Favorites column, select the application and press the
up arrow until the application is in the desired location. Use the down arrow
key in a similar manner to move the application down in the Favorites column.
File Menu
View Menu
Full View – This option is only available when in Favorites View and brings
up the full view of the Toolbar.
Favorites View – This option is only available when in Full View. It switches
from a full view of the Toolbar to a smaller view which shows icons for only
the Favorites category.
Hide Main Menu – Hides LoggerNet’s main menu. The main menu can be
displayed again by pressing the arrow key in the upper right-hand corner of the
Toolbar.
Tools Menu
Options – This option brings up the LoggerNet Options dialog box. From this
dialog box you can specify various options such as whether the toolbar always
stays on top, the behavior of the system tray icon, whether legacy applications
(Edlog and Transformer) are shown on the toolbar, language, whether to
automatically check for LoggerNet updates, whether remote connections are
allowed, and whether IPv6 connections are allowed. You are also able to
specify which applications are included in the Favorites category.
Launch Menu
Help Menu
Check for Updates – Opens the Campbell Scientific Software Updater to check
for LoggerNet updates.
Give Feedback on LoggerNet – Opens a form on our website which allows you
to provide feedback on LoggerNet to Campbell Scientific.
/WorkDir Sets the working directory to something other than the default.
Usage:
"C:\Program Files\CampbellSci\LoggerNet\ToolBar.exe" /WorkDir=C:\CampbellSci\test
"C:\Program Files\CampbellSci\LoggerNet\ToolBar.exe" /m
NOTE If you are running LoggerNet Admin, which requires that you log
in to a particular server with each client, you must specify this
alternate port number when entering the server address in the login
window (e.g., LocalHost:6700 or 192.168.7.123:6700).
LoggerNet Admin and LoggerNet Remote also have the ability to launch more
than one of the same client screens. In LoggerNet, you can open only one client
window at a time. In LoggerNet Admin/LoggerNet Remote, if Launch Multiple
Clients is selected on the Toolbar’s Options menu, you can open two or more of
the same window. For instance, you can open one Connect Screen and connect
to datalogger A, and open a second Connect Screen and connect to datalogger
B.
The most basic tool in setting up your network is the Setup Screen. The Setup
Screen can be used in either an EZ View or a Standard View. (For simplicity,
in this manual, references to the Setup Screen that do not specify EZ View or
Standard View will refer to the Standard View of the Setup Screen.)
The EZ View of the Setup Screen uses the EZSetup Wizard which provides a
simple step-by-step sequence of screens, with on-screen help and many pre-set
values that make it easy to add a new datalogger and communications devices
to your LoggerNet network. You start with the type of datalogger you wish to
add, and then enter the settings for the communications devices used to reach
it, ending with a communications test and an opportunity to set the clock, send
a program, and set up an automatic data collection schedule.
The Standard View of the Setup Screen accomplishes the same tasks, but
allows you a bit more control when setting up your network, and allows for
more complex network configurations.
LoggerNet also ships with a command line scripting tool, CoraScript, which
can be used to configure the datalogger network from a command prompt.
RTMC is used for real-time displays of the data collected by the
LoggerNet server. You can create customized graphic displays that include
graphs, tables, dials, alarms, digital values and other graphic elements. These
displays automatically update when LoggerNet collects new data. Graphical
elements are also available for toggling ports or flags, or setting variables (or
input locations) in a datalogger. The displays created in RTMC can be
distributed to other users who have licenses to run RTMC Run-time software
(purchased separately). This allows a remote computer, accessible via TCP/IP,
to connect to the LoggerNet server and display the real-time data.
LogTool is also available to view operational log messages for the server as well as the low-
level communication between the datalogger and the server. A Comm Test
window can also be launched from the Status Monitor.
The LoggerNet Service Manager is a utility that allows you to install and run
LoggerNet as a service. Refer to Section 2.4, Installing/Running LoggerNet as
a Service (p. 2-8), for additional information.
The Hole Monitor utility is used to monitor the hole collection activity in
LoggerNet. Holes are instances in the data cache where records are missing.
Holes are most often seen in large RF networks where data is being collected
via a data advise operation.
NOTE While Short Cut programs can be imported into Edlog or the
CRBasic Editor, once they have been edited in one of these
programs, the modified program cannot be imported back into
Short Cut.
Edlog is the tool to create and edit datalogger programs for all Campbell
Scientific dataloggers except the CR1000X series, CR6 series, CR300 series,
GRANITE 6, GRANITE 9, GRANITE 10, CR1000, CR3000, CR800 series,
CR200 series, CR5000, CR9000, and CR9000X. Instructions are available for
sensor measurement, intermediate processing, program and peripheral control,
and data storage. The built-in precompiler provides error checking and warns
of potential problems in the program. For Edlog dataloggers with PakBus
operating systems, you can include settings for PakBus routing in the
datalogger program itself.
For those users of CR10X or CR510 dataloggers and Edlog programming who
are switching to CR1000 dataloggers (or CR23X users switching to CR3000
dataloggers), a Transformer utility has been developed. The Transformer reads
in an Edlog CSI or DLD file and generates a CRBasic program file. The two
files are displayed side-by-side for comparison purposes; double-click an
instruction in the Edlog program, and the associated instruction is highlighted
in the CRBasic program. Edlog program instructions that cannot be converted
directly to a CRBasic program instruction are listed in a Messages window and
are included as commented text in the CR* file. After conversion, the newly
created CR* file can be opened in the CRBasic Editor for further editing.
View Pro is used to inspect data files, from either mixed-array or table data
dataloggers. You can also view data from an LNDB database. The data is
displayed in a tabular format by record or array. Data values can then be
chosen to display graphically on a line graph, histogram, XY plot, rainflow
histogram, or FFT as appropriate for the data type. You can also print graphs or
save them to disk in a variety of formats.
Split is used to post process and generate reports from collected data files from
either mixed-array or table-based dataloggers. Traditionally it has been used to
separate mixed-array data files into individual files based on the array ID, but it
can also create files in custom formats for use in reports or as input to other
data applications, including converting mixed-array datalogger time stamps
(year, Julian day, Hour/Minute) to more conventional date/time stamp formats.
Split includes time series functions, which can be used to provide summary
information from more frequent data (e.g., hourly summaries from one-minute
data).
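For instance (illustrative values only), a mixed-array time stamp of year
2021, Julian day 60, and time 1330 corresponds to a conventional date/time
stamp such as 2021-03-01 13:30.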
The CardConvert file converter is used to convert TOB1, TOB2, and TOB3
files to TOA5, Array Compatible CSV, or CSIXML format (TOB2/TOB3 files
can also be converted to TOB1 format). TOB files are binary files that are
either created by LoggerNet during collection or are collected directly from a
compact flash, microSD, or PCMCIA card installed in a CRX000 datalogger.
A command line file converter, toa_to_tob1, is also included in LoggerNet.
(Refer to Appendix B, Campbell Scientific File Formats (p. B-1), for additional
information on file formats.)
The File Control functionality, accessed from the Connect Screen, can be used
to manage files created by CRX000 dataloggers on other, non-proprietary-
formatted PC cards.
The CSI Web Server is used to display RTMC projects using a web browser.
Once the web server is in place, the only thing required to view the data is a
browser such as Internet Explorer or Firefox.
Popup hints are available for many of the on-screen controls. Let the mouse
pointer hover over the control, text box or other screen feature; the hint will
appear automatically and remain visible for a few seconds. These hints will
often explain the purpose of a control or a suggested action. For text boxes
where some of the text is hidden, the full text will appear in the hint.
Section 4. Setting up Datalogger Networks
The EZ and Standard Views of the Setup Screen provide ways to create and maintain the
communications link and data collection schedules for a network of dataloggers. The EZ
View uses the EZSetup Wizard which walks you through the setup step-by-step. In the
Standard View, you add devices and configure their settings on your own. Either method
will result in a network map with all of the devices and communications links to reach the
datalogger stations.
The Network Planner is a graphical application that assists the user in designing a PakBus
datalogger network.
The Device Configuration Utility, or DevConfig, is a stand-alone tool that can be used to
configure settings in the dataloggers themselves, as well as in communication devices such
as RF401A radios or NL201s.
Open the wizard by pressing the Add button. The EZSetup Wizard starts with
the page shown below.
Subsequent pages are similar. Previous and Next buttons are provided to move
through each step of the wizard. Progress is shown by the blue arrow next to
each step displayed at the left. Field descriptions and helpful tips are displayed
on the wizard page. If additional help is needed, the on-line help can be opened
by pressing F1 or the Help button on the bottom right of each page.
In the Communication Setup step you first select the datalogger type and give
it a name. (This name will also become the default file name for data files
collected from that datalogger.) Next you choose the connection type from the
possible communications methods supported for that datalogger. EZSetup
Wizard fills in as many communications settings as possible; in many cases
you can use the default settings. It also provides fields for user-entered
communications settings such as phone numbers and RF radio addresses.
The Datalogger Settings step is provided for fine tuning the connection to the
datalogger. The baud rate offered is typically the maximum baud rate
supported by that datalogger and communications medium; lower rates may be
required for cell phones or noisy telephone links. Enter a Security Code, TCP
Password (IP Port connections only), or PakBus Encryption Key only if the
datalogger is configured to use it. Note that the default Max Time On-Line
setting for most communications links is zero (“0 d 00 h 00 m”), which means
that LoggerNet will never hang up until you click Disconnect. For telephone
links, the default Max Time On-Line setting is 10 minutes in order to reduce
the possibility of inadvertent and expensive long distance or cellular telephone
charges. There are, however, other links that can result in expensive connection
charges, such as digital cellular links using TCP/IP that charge by the byte.
Leaving the datalogger connected also uses battery power, so if the datalogger
power supply is not recharged from a reliable source, it may discharge its
battery below safe levels. Be sure, therefore, that you do not leave the
datalogger connected beyond the time necessary to do the tasks you need to do.
Use the Neighbor PakBus Address field to enter the PakBus address of a
neighbor you will go through to connect to your datalogger. For example, this
can be used to enter the PakBus address of a Konect PakBus router. A value of
0 means you will connect directly to your datalogger.
The Setup Summary step provides a list of the settings entered. You can use
the Previous button to return to a page and change these settings if necessary.
The Communications Test step allows you to test the communications link
before going any further. If the datalogger is not installed, you can skip this and
the next two steps.
If communication succeeds, you can move to the Datalogger Clock step where
you can check or set the datalogger’s clock to match the PC’s system time. If
the datalogger is in a different time zone, you can enter an offset in hours and
minutes.
The Send Program step allows you to send a program to the datalogger. This
may be a program you created with Short Cut, Edlog or the CRBasic Editor or
a program supplied by someone else. If it is a mixed-array datalogger, and the
datalogger is already running a program, you should associate the .DLD file so
that LoggerNet will use the labels for input locations and final storage.
Dataloggers with table-based operating systems (TD, PakBus, and CRx000)
will know their program if one is running and will provide table definitions that
contain the labels. If you don’t have a program for the datalogger you can skip
this step and send a program later from the Setup Screen or Connect Screen.
The Data Files step is where you define what data tables, or final storage areas,
should be collected by LoggerNet and saved to disk. If you used the EZSetup
Wizard to send a program to a table-based datalogger, the software will already
be aware of the data tables that exist in the datalogger. If the program was
already loaded, or for some reason no tables are displayed, press the Get
Table Definitions button to retrieve the table names.
The Data Files step also has a Table Collected During Data Collection (or
Enabled for Scheduled Collection) field. When enabled, LoggerNet will collect
that table or final storage area from the datalogger on a manual or scheduled
data collection attempt.
The Scheduled Collection step is where you can define a schedule on which
LoggerNet will automatically call the datalogger and collect data.
Once a datalogger station has been configured, it can be edited by pressing the
Edit button to open the EZSetup Wizard. When editing in the EZSetup Wizard,
click a step in the Progress column to go directly to that step, or walk through
each wizard page using the Next button.
The number of tabs will vary based upon the type of device that is selected.
Some devices may have only Hardware and Notes tabs, while other devices,
such as dataloggers, have several tabs.
To add a ComPort to the network map, either right click in the blank area of
the network map or click the Add Root button. Once the ComPort is in place you
can click the Add button to bring up the Add window. If you used the Add
Root button to add the ComPort, the Add window will automatically be
displayed.
The contents of the Add Device window will change as each device is added to
the network map. Only those devices that are valid components to add to the
last device added will be shown. Continue to add devices in this manner until
your network map is complete.
An alternative to the Add Device window is to press the right mouse button
while your cursor is on a device within the main device map window. A
shortcut menu like the one shown below will appear that will provide a list of
valid devices for connection to the device you have right clicked. For instance,
if you right click within the white space of the device map, the list will present
options for root devices such as ComPorts or IPPorts. When you right click a
ComPort, only valid connections for ComPorts will be presented.
To delete a device from the network map, select the device and click the Delete
button. This will delete the device and any devices that were connected below
it. The keyboard shortcut Ctrl+D will also delete the selected device.
Changing the network map or any of the device settings enables the Undo
button. Clicking the Undo button will roll back each change in reverse order to
the originally saved network and settings. If you undo a change that you really
wanted to keep, you can click the Redo button and restore the change.
Once the changes to the network map and device settings have been applied,
they can no longer be rolled back or restored using the Undo or Redo button.
Clicking the Cancel button before changes are applied will undo all of the
changes to the network map and settings, and restore the saved configuration.
All devices have a Notes tab which is only for the user’s convenience. It may
be used to keep notes about the device for future reference.
As with changes to the network map, the changes made to the device settings
are not used until they have been applied.
4.2.4.1 ComPort
The ComPort (or serial port) has only Hardware and Notes tabs. Following is
an explanation of each of the fields on the Hardware tab.
Standard
Communications Enabled – Before communications can take place, all
devices in the communications chain must be enabled. The default setting for
this check box is Enabled.
Install USB Driver – This button can be used to select and install the USB
drivers for our dataloggers and peripherals that require them.
Advanced
Call-Back Enabled – Enabling call-back tells LoggerNet to watch for a call-
back from the datalogger on this port. If there is a phone modem attached, it
will be set to accept incoming calls.
NOTE LoggerNet waits a certain amount of time for a response from each
device in a communications path. The extra response times
defined for the communications link are cumulative. Therefore,
the amount of time spent waiting for a device to respond is the
sum of all Extra Response Times defined, plus the default
response time for each device in the link. Add only the minimum
time necessary since very long response times can delay other
scheduled events while waiting for a device that is not responding.
Standard
Communications Enabled – Before communication can take place, all
devices in the chain must be enabled. When this box is selected, the Internet
protocol serial port is enabled for communication.
Internet IP Address – In this field, enter the TCP/IP address and port through
which LoggerNet will communicate with the datalogger network. The address
is entered in the form ###.###.###.### for an IPv4 address or
[XXXX:XXXX:XXXX:XXXX:XXXX:XXXX:XXXX:XXXX] for an IPv6
address. (Alternately, a valid machine name can be entered.) The port is in the
form of :####. A typical IPv4 entry might be 123.123.123.123:1024. A typical
IPv6 entry might be [2620:24:8080:8600:85a1:fcf2:2172:11bf]:1024.
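For illustration only, the following Python sketch (hypothetical; not part of
LoggerNet) splits an entry in the forms described above into its address and
port components; the example entries are the ones shown above.

import ipaddress

def parse_ip_entry(entry):
    # Split an 'address:port' entry into (address, port). Bracketed entries
    # are treated as IPv6; anything that is not a valid IP literal is assumed
    # to be a machine name.
    if entry.startswith('['):
        addr, _, port = entry[1:].partition(']:')
    else:
        addr, _, port = entry.rpartition(':')
    try:
        ipaddress.ip_address(addr)          # validates IPv4/IPv6 literals
    except ValueError:
        pass                                # assume a valid machine name
    return addr, int(port)

print(parse_ip_entry('123.123.123.123:1024'))
print(parse_ip_entry('[2620:24:8080:8600:85a1:fcf2:2172:11bf]:1024'))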
If your network supports UDP, the UDP search button next to this field can be
used to search for PakBus dataloggers in the network. As devices are
discovered, they are listed in the resulting dialog box along with their device
type and IP address. They can then be selected and added to the selected IPPort
in the network map. If the IPPort currently selected is used by another device, a
new IPPort will be created when a selected device is added. (Note that in
LoggerNet Admin with security enabled, the user must have at least Network
Administrator rights to use the UDP search.)
NOTE By default, IPv6 connections are not allowed. They can be enabled
from LoggerNet’s Tools | Options menu item.
Advanced
Call-back Enabled – Enabling call-back tells LoggerNet to watch for a call-
back from the datalogger on this port.
TCP Listen Only – When selected, LoggerNet will never attempt to make an
outgoing TCP Link on this IPPort. It will only listen for an incoming TCP call-
back. This option is useful for a datalogger doing TCP call-back from behind a
firewall. In this case, it is not possible to create a TCP connection from
LoggerNet to the datalogger and any time spent attempting to do so will be
wasted and may result in missing incoming connection attempts.
Extra Response Time – In this field, specify the additional time that
LoggerNet should delay before terminating the communications link if there is
no response from the IPPort. Additional time may be needed in instances
where the communications link is noisy or network traffic is heavy.
IP Port Used for Call-back – If call-back is enabled for the IP port, enter the
port number that LoggerNet should open and monitor for incoming call-back
messages.
AirLink Modem Name – If an AirLink modem is being used, enter the Device
ID set in it. By default, this is the 11-digit Electronic Serial Number of the
device.
NOTES When entering the IP address, do not use leading zeros for the
address numbers. For example use 123.123.2.34 instead of
123.123.002.034.
Standard
TAPI Line – Select the modem you want to use for communication. The
modems listed are defined by Windows as part of the computer’s Modem
Setup. All of the parameters for the modem, including the baud rate, have to be
set using the Windows Modem Setup dialog. If you are using the same modem
for dialup access, you may have to change the settings for the different
applications.
Advanced
Extra Response Time – In this field, specify the additional time that
LoggerNet should delay before terminating the communications link if there is
no response from the modem. Additional time may be needed in instances
where the communications link is noisy or network traffic is heavy.
The communication port (i.e., the root device) must be configured for call-back
as well. Enable the root device’s Call-Back Enabled check box to accomplish
this.
Maximum Time On-Line – This field is used to define a time limit for
maintaining a connection to the device. (This may be useful in avoiding costly
communication charges, in the event that a connection to a station is inadvertently
maintained for a long period of time.) Maximum Time On-Line applies to both
scheduled connections and manual connections. However, for manual
connections from the Connect Screen, it is always best to manually disconnect
rather than relying on LoggerNet to disconnect for you.
When the device is connected in the Connect Screen and the time limit
approaches, a dialog box is displayed warning the user that Max Time On-Line
is about to be exceeded. The dialog box has Reset Max Time and Don’t Reset
buttons. If the Reset Max Time button is pressed, the Max Time On-Line
counter will be reset. If the Don’t Reset button is pressed or if no button is
pressed, the connection will be terminated when Max Time On-Line is
reached.
NOTE If you are using LoggerNet Admin or LoggerNet Remote 4.0 and
using the Connect Screen to connect to a remote server that is
running an older version of LoggerNet, the behavior will be
different than described above. When connecting to a LoggerNet
3.4.1 server, you will be disconnected with no advance warning
when Max Time On-Line is reached. A message will be displayed
indicating that Max Time On-Line has been reached. When
connecting to servers older than LoggerNet 3.4.1, the behavior
will be variable. Generally, you will be disconnected at some
point, but the timing of the disconnect will not be predictable.
Maximum Baud Rate – Select the arrow to the right of this field to choose a
maximum baud rate for communication with this datalogger. Note that the
actual rate of communication may be limited by the capability of other devices
in the communications chain.
Advanced
Extra Response Time – In this field, specify the additional time that
LoggerNet should delay before terminating the communications link if there is
no response from the datalogger. Additional time may be needed in instances
where the communications link is noisy or network traffic is heavy.
Security Code – A datalogger can have a security code to restrict access to the
datalogger. This helps prevent inadvertent changes to the datalogger’s program
or memory. A valid security code is any four-digit, non-zero number. The
security code is set by the datalogger program, through a keyboard display, or
through the remote keyboard utility. If a datalogger program that sets or changes
security is loaded into the datalogger, the Security Code in LoggerNet must be
changed to match so that the server can access the datalogger. (Security is not
available in the CR5000, CR9000, and CR200-series dataloggers.)
command to the device is issued before the delay has expired, communication
will not be terminated.
BMP1 Station ID – The address that will be used for the device in the BMP1
network. When adding a new device to the network, this field will not show up
until after the Apply button has been pressed. The ID will be assigned
automatically by LoggerNet, but can be changed by the user. This allows the
user to designate unique addresses for all BMP1 devices across multiple
LoggerNet networks.
BMP1 Low Level Delay – The amount of time, in milliseconds, that LoggerNet
should delay after receiving a valid low-level serial acknowledgement packet
before sending out the next low-level serial query packet. If the value is zero,
the query packet will be sent immediately.
PakBus Encryption Key – This setting specifies text that will be used to
generate the key for encrypting PakBus messages sent to or received from this
device. The key entered here must match the PakBus Encryption Key setting in
the device. (The device setting is entered using DevConfig, PakBus Graph,
Network Planner, or a CR1000KD.)
The PakBus Encryption Key can be up to 63 bytes long and can include any
character with the exception of the Null character. Note that if Unicode
characters are included in the key, those characters may take up to three bytes
each.
If the PakBus Encryption Key device setting is specified as an empty string, the
device will not use PakBus encryption. If the PakBus Encryption Key device
setting is specified as a non-empty string, however, the device will not respond
to any PakBus message unless that message has been encrypted. AES-128
encryption is used.
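The byte-counting rules above can be checked with a short sketch. The
following hypothetical Python example is not part of LoggerNet; it assumes the
key is encoded as UTF-8, which is consistent with Unicode characters taking up
to three bytes each.

def check_pakbus_key(key):
    # Reject the Null character and keys whose encoded length exceeds 63 bytes;
    # return the number of bytes the key occupies.
    if '\x00' in key:
        raise ValueError('the Null character is not allowed in the key')
    encoded = key.encode('utf-8')            # assumption: key stored as UTF-8
    if len(encoded) > 63:
        raise ValueError('key exceeds the 63-byte maximum')
    return len(encoded)

print(check_pakbus_key('station-42-key'))    # 14 bytes
print(check_pakbus_key('cl\u00e9-station'))  # the two-byte character makes this 12 bytes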
NOTES LoggerNet waits a certain amount of time for a response from each
device in a communications path. The extra response times
defined for the communications link are cumulative. Therefore,
the amount of time spent waiting for a device to respond is the
sum of all Extra Response Times defined, plus the default
response time for each device in the link. Add the minimum time
necessary since very long response times can delay other
scheduled events while waiting for a device that is not responding.
Refer to your datalogger operator’s manual for complete
information on its security functions.
Scheduled Collection Enabled – This check box activates the data collection
schedule defined on this tab. No data will be automatically collected if the
schedule is disabled.
Apply to Other Stations – This button allows the schedule setup for this
datalogger to be copied to other stations in the network. Clicking the button
brings up a window that lists all of the dataloggers in the network. You can
select one or more dataloggers and then press OK to use the entered schedule.
To select more than one datalogger, hold down the Ctrl key while clicking the
dataloggers to select.
Base Date – The base date field is used to define the first date for scheduled
data retrieval. If the date entered in this field has already passed, a data
collection attempt will be made when the schedule is enabled and applied.
Base Time – This field is used to define the first time for scheduled data
retrieval. As with the Base Date field, if the time has already passed, a data
collection attempt will be made when the schedule is enabled and applied. This
setting is also used with the Collection Interval to determine the time data
collection will be performed.
CAUTION Entering a zero for any of the intervals below will cause
LoggerNet to try to collect data as fast as possible.
Example: If the Base Date and Time are 1/1/99, 12:15 p.m., with an
interval of one hour, data collection attempts will be made at 15 minutes
past the hour, each hour.
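The schedule arithmetic in the example above can be sketched as follows. This
is a hypothetical Python illustration, not LoggerNet code; it simply projects
the Base Date/Time forward by whole Collection Intervals.

from datetime import datetime, timedelta

def next_collection_times(base, interval, now, count=3):
    # Return the next `count` scheduled collection times at or after `now`,
    # anchored to the Base Date/Time and spaced by the Collection Interval.
    if now <= base:
        first = base
    else:
        periods = -((base - now) // interval)   # ceiling of (now - base) / interval
        first = base + periods * interval
    return [first + n * interval for n in range(count)]

base = datetime(1999, 1, 1, 12, 15)             # Base Date and Time from the example
print(next_collection_times(base, timedelta(hours=1), now=datetime(1999, 1, 2, 9, 0)))
# collections at 09:15, 10:15, and 11:15 on 1/2/99, i.e. 15 minutes past each hour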
Primary Retry Interval – If a data collection attempt is made but fails, you
can specify an interval on which another attempt will be made. This primary
retry interval starts at the time of failure, not on the original calling time and
interval. “Failures” may be caused by busy phone lines, noisy RF
environments, low batteries, damaged hardware, etc.
Number of Primary Retries – The number entered into this field is the
number of times the server will attempt to contact the datalogger on the
Primary Retry Interval. If all the collection attempts fail, then the server will
commence calling on the Secondary Retry Interval if it is enabled.
Secondary Retry Interval – If the secondary retry interval box is checked, the
specified interval is a calling interval that will be followed if all Primary
Retries fail. Data collection attempts will continue on the Secondary Interval
until a data collection attempt is successful, at which time all retry statistics
are reset. The Secondary Retry Interval is based on the initial date and time
settings, not the time of the last failure. If the box is not checked, data
collection will return to the normal collection schedule and continue through
the primary retry schedule until communications are restored.
Typical use is to set the Primary Retries fairly close together, and the
Secondary Retry at a longer interval. For instance, if you are calling on an
hourly basis, the Primary Retries might be set to three tries, spaced five
minutes apart. The Secondary Interval then might be set at 2 hours.
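The retry behavior described above can be modeled with a simplified sketch
(hypothetical Python, not LoggerNet code): primary retries count from the time
of the failure, while secondary retries stay anchored to the original base
date/time. The sketch ignores the reset that occurs once a collection succeeds.

from datetime import datetime, timedelta

def retry_times(failure_time, primary_interval, primary_retries,
                secondary_base, secondary_interval):
    # Yield retry attempt times after a failed scheduled collection.
    for n in range(1, primary_retries + 1):
        yield failure_time + n * primary_interval
    last_primary = failure_time + primary_retries * primary_interval
    t = secondary_base
    while True:                       # secondary retries, anchored to the base time
        t += secondary_interval
        if t > last_primary:
            yield t

attempts = retry_times(datetime(1999, 1, 1, 13, 15), timedelta(minutes=5), 3,
                       secondary_base=datetime(1999, 1, 1, 12, 15),
                       secondary_interval=timedelta(hours=2))
print([next(attempts) for _ in range(5)])
# three primary retries at 13:20, 13:25, 13:30, then secondary retries at 14:15 and 16:15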
Reschedule On Data – When this box is selected, each time that data is
received for a station, the data collection schedule will be reset using the
current system time as the base time.
Poll for statistics – The Status Monitor displays information about datalogger
data collection and communication status. Some potentially useful statistics
(columns) are available for some dataloggers but not for other datalogger
types. Some statistics are obtained automatically as part of data collection for
certain dataloggers, but for other dataloggers they can be obtained only with
additional communication commands. In this latter case, these statistics are not
retrieved by default, since users with slow or expensive communication may
not wish to incur the additional cost or time associated with the extra
commands. If you do want to retrieve the additional statistics, enable the Poll
for Statistics setting; the statistics will then be retrieved during scheduled or
manual data collection.
When Poll for Statistics is enabled, the Status Monitor can show the following
statistics even if the Status Table is not being collected:
NOTE If the above Server Statistic columns are not currently being
displayed in the Status Monitor, you can add them by selecting
Edit | Select Columns from the Status Monitor menu.
Collect Ports and Flags – If this box is checked, the current state of the ports
and flags is collected and stored in LoggerNet’s internal data cache. This
allows functions such as the Numeric Display and the viewing of ports and
flags to get updated data with scheduled data collection.
When the Server’s Table Definitions are Invalid – This option determines
what action should be taken in the data cache during scheduled data collection
(or upon the arrival of a One Way Data or Data Advise record) when
LoggerNet determines that the table definitions it has stored for a table-based
datalogger and the table definitions actually in the datalogger do not match.
One Way Data and Data Advise are two collection methods that can produce
holes. These collection methods both rely on the datalogger to send records to
LoggerNet. Since the transmission of these records is unacknowledged, there is
a possibility that the data will be lost. If LoggerNet doesn’t receive a record for
any reason, a hole is created. If the Data Advise or One Way Data Hole
Collection check box is selected, LoggerNet will attempt to contact the
datalogger and request the missing records. Otherwise, LoggerNet will not
attempt to collect records missing from the data cache.
Please note that LoggerNet puts records from Data Advise or One Way Data
hole collection in the .dat files as they are received. If there are holes in the
data that are retrieved later, the records will not be in sequential order in the
.dat file created by LoggerNet.
NOTE Data Advise or One Way Data hole collection will not occur at a
time when doing so would force the communication link to be
dialed.
Data Advise is used within RF telemetry networks to increase the speed of data
collection. The RF polling process using the TD-RF (“Time-division polling”)
PROM or OS can take advantage of the Data Advise agreement to collect data
very quickly by broadcasting a communication packet to all dataloggers in the
RFBase-TD network concurrently. This broadcast packet triggers all
dataloggers to check for and send any new records at once. The records are
simultaneously stored in the individual RF remote modems (RFRemote-TD)
until retrieved through the RF polling process (initiated by the RFBase-TD).
4.2.4.4.3 Final Storage Area 1 and 2 Tab (Edlog Dataloggers with Mixed-array Operating
System)
Mixed-array dataloggers include the 21X, CR500, CR510, CR10, CR10X,
CR23X, and CR7. When the datalogger program stores data in a mixed-
array datalogger, the data arrays are stored in a final storage area. Some
dataloggers, such as the CR10X, have two final storage areas while others,
such as the 21X, have only one. This tab is used to define the output file name
and location, the data file format and other output options for the data stored in
the final storage area.
Enabled for Collection – The specified final storage area will be included in
the collected data if this box is checked.
Output File Name – This is the name and directory path for the output file
where the final storage data will be saved after being collected from the
datalogger during manual data collection from the Connect Screen or during
scheduled data collection. The setting can contain these predefined symbols
that will be expanded by the LoggerNet server at the time the file is opened or
created:
Use Default File Name – Checking this box will set the collected data file
name to the default value, which consists of the name of the station and
number of the final storage area.
File Output Option – This option allows you to choose whether new data
collected from the station is appended to the data file, overwrites the old data in
the data file, or is not stored to a data file. The default option is to append to the
data file so the old data is not lost. If the data file is used only for a real-time
display or such that only the last data collected is needed, overwrite can be
used to replace the old data with the new collected data. If the data is only
going to be used within LoggerNet for display on the Connect Screen graph or
numeric display, or for RTMC, you can choose no output file and a limited
amount of data will be kept in LoggerNet’s internal data cache.
− ASCII, Comma separated writes data to the file in ASCII text format, one
record per line with commas between the data values. This file can be
opened in LoggerNet View or a text editor, processed using Split, or
brought into a spreadsheet application.
− ASCII, Printable writes data to the file in ASCII text format arranged in
columns separated by tabs. The column number precedes each data
value in the record. Only 80 characters will be placed on each line;
columns that do not fit within the 80 characters are placed on the next
line. This file format can be opened in LoggerNet View or a text editor,
or processed using Split. See the example data file below.
− Binary writes the data to a file in a binary format. The advantage of the
binary format is that it is more compact so the size of the file is much
smaller than for the ASCII-based files. The disadvantage is that it is
unreadable except by using View or by post-processing with Split.
Collect Mode – The collect mode allows you to choose how much data to
collect when getting data from the datalogger.
• Most Recently Logged Arrays – This option is used when you are
interested in only the most recently stored data. When this option is
selected you can specify how many arrays back from the most recent array
should be included when data is collected from the datalogger.
4.2.4.4.4 Data Files Tab (CRBasic Dataloggers, and Edlog Dataloggers with Table Data
and PakBus Operating Systems)
Table-based dataloggers include the CR1000X series, CR6 series, CR300
series, GRANITE 6, GRANITE 9, GRANITE 10, CR1000, CR3000, CR800
series, CR200 series, CR5000, CR9000, CR9000X, CR10T, CR510TD,
CR10X-TD, and CR23X-TD. Data output to final storage is stored as records
in tables. The Data Files tab is used to define what data tables will be collected
from the datalogger, along with the output file name and format.
Tables to be Collected – All of the available tables in the datalogger are listed
in the column on the left. If no tables are listed, click the Get Table
Definitions button. The tables selected for collection are shown with a green
check mark and the excluded tables are shown with a red ‘X’. Data from the
selected tables will be collected from the datalogger during scheduled data
collection.
The individual tables can be highlighted by clicking the table name. The
settings on the right side of the window apply to the highlighted table. The
name of the highlighted table appears at the top of the settings. Double clicking
a table name will toggle collection of that table on or off.
Included for Scheduled Collection – If this box is checked the specified table
is included in data collection. This can be changed either by clicking the check
box or double clicking the name of the table in the list.
Output File Name – This setting defines the file name and path for the output
data file that contains the data collected from the datalogger. Clicking the
Browse button ( … ) at the right of the box will allow you to choose another
directory or file name for the collected data. The data from each table is stored
in a separate output file. The setting can contain these predefined symbols that
will be expanded by the LoggerNet server at the time the file is opened or
created:
Use Default File Name – Checking this box will set the collected data file
name to the default value, which consists of the name of the station and the
name of the table.
File Output Option – This option allows you to choose whether new data
collected from the station is appended to the data file, overwrites the old data in
the data file, creates a new data file with a unique name, or is not stored to a
data file. The default option is to append to the data file so the old data is not
lost.
If the data file is used only for a real-time display or such that only the last data
collected is needed, overwrite will replace the old data with the newly collected
data.
If the data is only going to be used within LoggerNet for display on the
Connect Screen graph or Numeric Display, or for RTMC, you can choose no
output file and the data will only be kept in LoggerNet’s internal data cache
(see Appendix C.2, LoggerNet Server Data Cache (p. C-1), for more about the
data cache).
Collect Mode – The collect mode allows you to choose how much data to
collect when getting data from the datalogger.
• Most Recently Logged Records – This option is used when you are
interested in only the most recently stored data. During each data
collection, the number of records specified in the Records to Collect field
will be collected.
Get Table Definitions – When this button is pressed, LoggerNet will query the
datalogger for its table definitions. This should only be needed the first time
connecting to a station or when the datalogger program has changed. New table
definitions will cause the previous output data file to be saved with a different
name and a new data file will be created to save the data.
Use Reported Station Name – Enabling this check box will cause the station
name from the Status Table to be used in the header of the data files. If this
check box is not enabled, the network map station name will be used.
NOTE This check box affects only the header of the data files. It has no
effect on the filenames.
Refer to Section 4.2.6, Setting the Clock (p. 4-55), for additional information on
setting and checking the clock.
Initial Date – The initial date field is used to define the date on which the first
clock check will occur. If the date entered in this field has already passed, the
datalogger’s clock will be checked at the next scheduled data collection.
Initial Time – This field is used to define the time at which the first clock
check will occur. As with the Initial Date field, if the time has already passed,
the clock will be checked at the next scheduled data collection.
Check Clocks – Press this button to manually initiate a clock check of the
LoggerNet server and datalogger clocks. The two values are displayed in the
Adjusted Server Date/Time and Station Date/Time fields, respectively.
Set Station Clock – Press this button to manually set the clock to that of the
LoggerNet server.
NOTE The Allowed Clock Deviation setting will prevent a manual clock
set from being carried out if the difference between the
datalogger’s and server’s clocks is less than the specified
deviation.
4.2.4.4.7 File Retrieval Tab (CR6-Series, CR300-Series, GRANITE Data Logger Modules,
CR1000, CR3000, CR800-Series, and Edlog Dataloggers with PakBus Operating
Systems)
Retrieval Mode – This option determines the schedule for file retrieval.
New Schedule – Files will be retrieved based on the Base Date/Time
and Retrieval Interval defined below. Only the new schedule will
trigger file retrieval. Attempts to retrieve files will be made following the
new schedule, whether or not scheduled collection is enabled.
Base Date/Time – Enter a date and a time that the first file retrieval attempt for
the device should occur. If the date and time reflected by these fields has
already passed, retrieval will be attempted immediately when the schedule is
enabled.
Retrieval Interval – Enter the interval on which files should be retrieved from
the device. The retrieval interval is relative to the Base Date and Time entries.
For instance, if the Base Time is set at 12:15 and the interval is set for 1 hour,
file retrieval will be attempted at 12:15, 1:15, 2:15, etc. The format for this
field is 00 d(ays) 00 h(ours) 00 m(inutes) 00 s(econds) 00 m(illi)s(econds).
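For illustration, the interval format shown above can be parsed with a short
sketch (hypothetical Python, not part of LoggerNet).

import re
from datetime import timedelta

def parse_interval(text):
    # Parse an interval such as '00 d 01 h 00 m 00 s 00 ms' into a timedelta;
    # any unit that is omitted defaults to zero.
    pattern = (r'(?:(\d+)\s*d)?\s*(?:(\d+)\s*h)?\s*(?:(\d+)\s*m(?!s))?'
               r'\s*(?:(\d+)\s*s)?\s*(?:(\d+)\s*ms)?')
    d, h, m, s, ms = (int(g or 0)
                      for g in re.fullmatch(pattern, text.strip()).groups())
    return timedelta(days=d, hours=h, minutes=m, seconds=s, milliseconds=ms)

print(parse_interval('00 d 01 h 00 m 00 s 00 ms'))   # 1:00:00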
Delete Files After Retrieval – When this box is selected, the files will be
deleted from the datalogger after they are retrieved.
Add New – When this button is pressed, a new pattern is added to the list of
files to be retrieved. The user must then designate the File Pattern, Output
Directory, Max Files, Force Retrieval, and Record If Skipped fields for this
pattern.
Delete – When this button is pressed, the selected pattern is deleted from the
list of files to be retrieved.
Edit File Pattern – Specifies a file pattern that selects the files to be
retrieved. Select an option from the drop-down list or type it in directly. This
can be an exact filename or it can contain the wildcard characters “*” or “?”.
The asterisk matches zero or more characters, while the question mark
matches exactly one character. The file pattern can also have a prefix indicating
the drive from which to retrieve the files. For example, USR:*.jpg will select
all .jpg files on the USR drive. Note that the file pattern is case insensitive.
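As a sketch of the wildcard rules above (hypothetical Python, not LoggerNet
code; it assumes the drive prefix is simply part of the matched text), the
standard fnmatch module gives the same “*” and “?” behavior.

import fnmatch

def matches_pattern(filename, pattern):
    # '*' matches zero or more characters, '?' matches exactly one character,
    # and the comparison is case insensitive (both sides are lowercased).
    return fnmatch.fnmatchcase(filename.lower(), pattern.lower())

files = ['USR:image_001.jpg', 'USR:image_002.JPG', 'CPU:program.CR1X']
print([f for f in files if matches_pattern(f, 'USR:*.jpg')])
# ['USR:image_001.jpg', 'USR:image_002.JPG']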
Output Directory – Enter the directory to store the retrieved files. It can be
entered into the field directly, or you can press the Browse button to the right
of the field to select a path from the Explorer window. The setting can contain
these predefined symbols that will be expanded by the LoggerNet server at the
time the file is opened or created:
Max Files – Specifies the maximum number of files that can be retrieved on
each retrieval. The newest files will be retrieved.
Force Retrieval – When this box is selected, a file that matches the file pattern
will be retrieved regardless of the file’s timestamp or whether the file has
already been retrieved.
Record If Skipped – When this box is selected, the names and dates of any
files that are not retrieved because of the Max Files parameter will be recorded
and they will not be retrieved later. If this box is not selected, the skipped files
can be retrieved in a later attempt.
4.2.4.5 PhoneBase
The PhoneBase is a telephone modem connected to one of the server’s
ComPorts to provide access to other devices in the datalogger network. The
PhoneBase has only Hardware and Notes tabs. This device must be properly
installed and configured in the operating system to use one of the computer’s
ComPorts before it can be used by LoggerNet.
Standard
Communications Enabled – Before communications can take place, all
devices in the chain must have the Communications Enabled box checked.
When this box is selected, communications to the phone modem are enabled.
Maximum Baud Rate – Select the arrow to the right of this field to choose a
maximum baud rate for communication with this device. Note that the actual
rate of communication may be limited by the capability of other devices in the
communications chain.
Edit Modem Database – The modem connected to the server computer may
not be listed in the database, or the user may desire to change the modem
configurations. When the Edit Modem Database button is selected, the reset
and initialization strings for the selected modem are displayed. You can change
these settings or add a custom modem to the list. If you change the settings for
one of the standard modems, you will have to save them under a new name to use them.
The only modems that can be deleted from the list are modems that have been
added by the user.
Modem Type – Use the drop-down list box to select the type of modem that is
attached to the server computer’s communications port. In most instances, the
<default modem> should work.
Advanced
Extra Response Time – In this field, specify the additional time that the
LoggerNet server should delay before terminating the communications link if
there is no response from the phone modem. Additional time may be needed in
instances where the communications link is noisy or network traffic is heavy.
NOTE LoggerNet waits a certain amount of time for a response from each
device in a communications path. The extra response times
defined for the communications link are cumulative. Therefore,
the amount of time spent waiting for a device to respond is the
sum of all Extra Response Times defined, plus the default
response time for each device in the link. Add the minimum time
necessary since very long response times can delay other
scheduled events while waiting for a device that is not responding.
4.2.4.6 PhoneRemote
The Hardware tab of the remote phone modem is used to set up the dialing
string for the attached remote device. It has the following controls:
following the specified delay. The next number will then be dialed after the
delay specified. The amount of time to delay is in milliseconds, so a 5-second
delay would be entered as 5000 milliseconds.
4.2.4.7 RFBase
The RF base modem acts as a gateway device that provides RF communication
with the remote RF modems connected to dataloggers at the field site.
Standard
Communications Enabled – Before communications can take place, all
devices in the chain must have the Communications Enabled box checked.
When this box is selected, communication to the RF base is enabled.
Maximum Time On-Line – This field is used to define a time limit for
maintaining a connection to the device. (This may be useful in avoiding costly
communication charges, in the event that a connection to a station is inadvertently
maintained for a long period of time.) Maximum Time On-Line applies to both
scheduled connections and manual connections. However, for manual
connections from the Connect Screen, it is always best to manually disconnect
rather than relying on LoggerNet to disconnect for you.
When the device is connected in the Connect Screen and the time limit
approaches, a dialog box is displayed warning the user that Max Time On-Line
is about to be exceeded. The dialog box has Reset Max Time and Don’t Reset
buttons. If the Reset Max Time button is pressed, the Max Time On-Line
counter will be reset. If the Don’t Reset button is pressed or if no button is
pressed, the connection will be terminated when Max Time On-Line is
reached.
NOTE If you are using LoggerNet Admin or LoggerNet Remote 4.0 and
using the Connect Screen to connect to a remote server that is
running an older version of LoggerNet, the behavior will be
different than described above. When connecting to a LoggerNet
3.4.1 server, you will be disconnected with no advance warning
when Max Time On-Line is reached. A message will be displayed
indicating that Max Time On-Line has been reached. When
connecting to servers older than LoggerNet 3.4.1, the behavior
will be variable. Generally, you will be disconnected at some
point, but the timing of the disconnect will not be predictable.
Maximum Baud Rate – Select the arrow to the right of this field to choose a
maximum baud rate for communication with this device. Note that the actual
rate of communication may be limited by the capability of other devices in the
communications chain.
Advanced
Extra Response Time – In this field, specify the additional time that
LoggerNet should delay before terminating the communications link if there is
no response from the RFBase. Additional time may be needed in instances
where the communications link is noisy or network traffic is heavy.
4.2.4.8 RFRemote
The RF remote has only Hardware and Notes tabs.
Standard
Communications Enabled – Before communications can take place, all
devices in the chain must have the Communications Enabled box checked.
When this box is selected, communication to the RF modem is enabled.
Advanced
Use F Command – The “F” command forces the baud rate to 9600. In the
modem enabled (ME) state, the serial I/O port of the end-of-link modem will
communicate with the datalogger at 9600 baud with the “F” command. In the
synchronous device communication (SDC) state, the baud rate from the
computer to the start-of-link modem will be 9600.
Use W Command – The “W” command will force the RF modem to wait until
there is no carrier detect before transmitting. This option is used when the
computer is connected to more than one RF base.
Custom Dial String – The custom dial string is specifically used to send
commands to an RF95 modem. The values entered into this field will be
inserted between the S command and the string of switch identifiers sent when
the modem is dialed.
4.2.4.9 RFBase-TD
The RFBase-TD device is used to represent a radio base-station modem which
uses the Time Division Polling (TDP) protocol to act as a communications link
between LoggerNet and remote-telemetry radio-datalogger stations. The base
modem hardware must run the Time Division Polling protocol via a TD PROM
or TD-enabled operating system. Hardware that can act as an RFBase-TD
includes the RF500M, RF500B, RF310M, RF310B, and RF315M. The
standard use of the Time Division Polling protocol is to communicate with
PakBus dataloggers using the OneWayData record output method or with table
data dataloggers (CR10X-TD, CR23X-TD, CR510-TD, CR10T) using the Data
Advise record output method.
The child devices of an RFBase-TD are remote radio modems. A child device
can be an RFRemote-TD (table data) or RFRemote-PB (PakBus) depending
upon the type of datalogger connected to the remote radio station’s modem.
When collecting data via radio using the TDP protocol, an RF Polling Interval
is used in conjunction with an RF Poll Offset and a Computer Offset. The
remote radio modems (RFRemote-TD or RFRemote-PB) query the dataloggers
on a time slot given to them by the base modem (RFBase-TD). The base
modem queries the remote modems for data on the specified RF Polling
Interval, factoring in the RF Poll Offset. The base modem buffers this data
until it is queried by the LoggerNet communications server. LoggerNet uses
the RF Polling Interval plus the Computer Offset when collecting the data from
the base modem.
If it is desired to have LoggerNet poll the RFBase-TD for data more frequently
than the interval established by the RF Polling Interval and Computer Offset
settings, the Computer Poll Interval should be set to a non-zero value. In this
case, how often LoggerNet polls the RFBase-TD for data is determined by the
Computer Poll Interval setting, but the timing of those polls is based on the
Computer Offset setting.
For Example:
If:
RF Polling Interval = 5 minutes
RF Poll Offset = X
Computer Offset = 4 minutes 47 seconds
Computer Poll Interval = 0
LoggerNet will query the RF Base 12 times per hour at:
XX:04:47, XX:09:47, XX:14:47, XX:19:47, XX:24:47, XX:29:47,
XX:34:47, XX:39:47, XX:44:47, XX:49:47, XX:54:47, XX:59:47
If:
RF Polling Interval = 5 minutes
RF Poll Offset = X
Computer Offset = 4 minutes 47 seconds
Computer Poll Interval = 3
LoggerNet will query the RF Base 20 times per hour at:
XX:01:47, XX:04:47, XX:07:47, XX:10:47, XX:13:47, XX:16:47,
XX:19:47, XX:22:47, XX:25:47, XX:28:47, XX:31:47, XX:34:47,
XX:37:47, XX:40:47, XX:43:47, XX:46:47, XX:49:47, XX:52:47,
XX:55:47, XX:58:47
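The query times in the two cases above can be reproduced with a short sketch
(hypothetical Python, not LoggerNet code). A Computer Poll Interval of zero
falls back to the RF Polling Interval, and the Computer Offset sets the phase of
the queries.

def computer_query_times(rf_polling_interval_min, computer_offset_s,
                         computer_poll_interval_min=0):
    # Minutes and seconds past each hour at which LoggerNet queries the RF base.
    interval_s = (computer_poll_interval_min or rf_polling_interval_min) * 60
    t = computer_offset_s % interval_s     # keep the phase set by the Computer Offset
    times = []
    while t < 3600:
        times.append('XX:%02d:%02d' % divmod(t, 60))
        t += interval_s
    return times

print(computer_query_times(5, 4 * 60 + 47))      # 12 queries: XX:04:47 ... XX:59:47
print(computer_query_times(5, 4 * 60 + 47, 3))   # 20 queries: XX:01:47 ... XX:58:47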
NOTE Neither the Computer Offset nor the Computer Poll Interval has
any effect on when, or how often, the RF Base polls the RF
Remotes for data. The polling of the RF Remotes is determined
solely by the settings of the RF Polling Interval and RF Poll Offset.
Hardware Tab
Standard
Communications Enabled – Before communications can take place, all
devices in the chain must have the Communications Enabled box checked.
When this box is selected, communication to the RF modem is enabled.
Maximum Time On-Line – This field is used to define a time limit for
maintaining a connection to the device. (This may be useful in avoiding costly
communication charges, in the event that a connection to a station is inadvertently
maintained for a long period of time.) Maximum Time On-Line applies to both
scheduled connections and manual connections. However, for manual
connections from the Connect Screen, it is always best to manually disconnect
rather than relying on LoggerNet to disconnect for you.
When the device is connected in the Connect Screen and the time limit
approaches, a dialog box is displayed warning the user that Max Time On-Line
is about to be exceeded. The dialog box has Reset Max Time and Don’t Reset
buttons. If the Reset Max Time button is pressed, the Max Time On-Line
counter will be reset. If the Don’t Reset button is pressed or if no button is
pressed, the connection will be terminated when Max Time On-Line is
reached.
NOTE If you are using LoggerNet Admin or LoggerNet Remote 4.0 and
using the Connect Screen to connect to a remote server that is
running an older version of LoggerNet, the behavior will be
different than described above. When connecting to a LoggerNet
3.4.1 server, you will be disconnected with no advance warning
when Max Time On-Line is reached. A message will be displayed
indicating that Max Time On-Line has been reached. When
connecting to servers older than LoggerNet 3.4.1, the behavior
will be variable. Generally, you will be disconnected at some
point, but the timing of the disconnect will not be predictable.
Maximum Baud Rate – Select the arrow to the right of this field to choose a
maximum baud rate for communication with this device. Note that the actual
rate of communication may be limited by the capability of other devices in the
communications chain.
Advanced
Extra Response Time – In this field, specify the additional time that
LoggerNet should delay before terminating the communications link if there is
no response from the RF95T. Additional time may be needed in instances
where the communications link is noisy or network traffic is heavy.
RF Poll Offset – The time into the Polling Interval at which the RFBase should
query the RFRemotes for new data. The data is held in the RFBase’s buffer
until LoggerNet queries the RFBase for the data.
Computer Offset – The time into the Polling Interval at which LoggerNet
should query the RFBase for data.
Computer Poll Interval – The time interval on which LoggerNet will contact
the RFBase-TD for data. When this setting is at its default value of 0,
LoggerNet will contact the RFBase-TD at the RF Polling Interval plus the
Computer Offset. If this setting is changed to a non-zero value, LoggerNet will
query the RFBase-TD for data at this interval. The timing of the queries will be
based on the Computer Offset.
BMP1 Station ID – The address that will be used for the device in the BMP1
network. When adding a new device to the network, this field will not show up
until after the Apply button has been pressed. The ID will be assigned
automatically by LoggerNet, but can be changed by the user. This allows the
user to designate unique addresses for all BMP1 devices across multiple
LoggerNet networks.
Clock Tab
Time Zone Offset – Enter an amount of time to offset the RFBase’s clock
from the PC’s clock when it is set. This feature is useful if the RFBase is
located in a different time zone than the PC, and you want the datalogger to
reflect the local time when the clock is set.
Enabled – This check box is used to turn the clock check schedule on or off.
Initial Date/Initial Time – These fields are used to specify when the first
scheduled clock check should occur. If the time reflected by these fields has
already occurred, a clock check will be performed during the next data
collection attempt with the network.
Interval – Enter an interval for how often a clock check should be performed.
Allowed Clock Deviation – Enter the amount of time, in seconds, that the
RFBase’s clock can differ from the computer’s clock before the RFBase’s
clock is corrected. If 0 is entered, the clock will be checked but not set. The
Last Clk Chk and Last Clk Diff statistics can be viewed in the Status Monitor
to determine the time of the last clock check and the amount of deviation when
this value is set to 0.
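The deviation rule above can be sketched as follows (hypothetical Python, not
LoggerNet code): a deviation of 0 means check only, and otherwise the clock is
set only when the difference exceeds the allowed deviation.

from datetime import datetime, timedelta

def clock_check(server_time, station_time, allowed_deviation_s):
    # Decide whether a scheduled clock check should also set the station clock.
    diff = abs(server_time - station_time)
    if allowed_deviation_s == 0:
        return 'checked only; difference is %s' % diff
    if diff > timedelta(seconds=allowed_deviation_s):
        return 'clock set to server time (difference was %s)' % diff
    return 'within the allowed deviation; clock left alone'

print(clock_check(datetime(2021, 3, 1, 12, 0, 10),
                  datetime(2021, 3, 1, 12, 0, 2), allowed_deviation_s=5))
# clock set to server time (difference was 0:00:08)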
Adjusted Server Date/Time – Displays the date and time for the computer on
which the LoggerNet server is running. This value will be displayed/updated
only when the Check Clocks button is pressed.
Station Date/Time – Displays the date and time for the RFBase. This value
will be displayed/updated only when the Check Clocks button is pressed.
The RFBase’s clock can be set to that of the PC’s by pressing the Set Station
Clock button.
4.2.4.10 RFRemote-TD
This device is used to configure the remote modem in an RF-TD network. This
option is used when the datalogger attached to the remote modem has a table-
data operating system.
Standard
Communications Enabled – Before communications can take place, all
devices in the chain must have the Communications Enabled box checked.
When this box is selected, communication to the RF modem is enabled.
Advanced
Extra Response Time – In this field, specify the additional time that
LoggerNet should delay before terminating the communications link if there is
no response from the RF95T remote. Additional time may be needed in
instances where the communications link is noisy or network traffic is heavy.
4.2.4.11 RFRemote-PB
This device is used to configure the remote modem in an RFBase-TD network
when the datalogger attached to the remote has a PakBus operating system.
Standard
Communications Enabled – This check box is used to turn communication on
or off. This check box must be enabled for any communication to take place
over the RF modem.
PakBus Verify Interval – The amount of time that will be used as the link
verification interval in the PakBus hello transaction messages. If no
communication has taken place during the specified interval, LoggerNet will
initiate a hello exchange with the datalogger. A verify interval of zero causes
LoggerNet to use a default verify interval of 2.5 times the beacon interval. If
the beacon interval is also zero, the default verify interval is 5 minutes.
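The default rule above can be written out as a short sketch (hypothetical
Python, not LoggerNet code).

def effective_verify_interval(verify_interval_s, beacon_interval_s):
    # Link verification interval used for the hello transaction.
    if verify_interval_s:
        return verify_interval_s
    if beacon_interval_s:
        return 2.5 * beacon_interval_s    # default: 2.5 times the beacon interval
    return 5 * 60                         # both zero: default to 5 minutes

print(effective_verify_interval(0, 60))   # 150.0 seconds
print(effective_verify_interval(0, 0))    # 300 seconds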
Advanced
Extra Response Time – The amount of additional time, in seconds, that
LoggerNet should wait for this device to respond. Note that Extra Response
Time is cumulative for all devices in the network.
PakBus Address – This field reflects the address of the LoggerNet server.
BMP1 Station ID – The address that will be used for the device in the BMP1
network. When adding a new device to the network, this field will not show up
until after the Apply button has been pressed. The ID will be assigned
automatically by LoggerNet, but can be changed by the user. This allows the
user to designate unique addresses for all BMP1 devices across multiple
LoggerNet networks.
4.2.4.12 MD9 Base
Standard
Communications Enabled – Before communications can take place, all
devices in the chain must have the Communications Enabled box checked.
When this box is selected, communication to the MD9 base is enabled.
Maximum Time On-Line – This field is used to define a time limit for
maintaining a connection to the device. (This may be useful in avoiding costly
communication charges, in the event that a connection to a station is inadvertently
maintained for a long period of time.) Maximum Time On-Line applies to both
scheduled connections and manual connections. However, for manual
connections from the Connect Screen, it is always best to manually disconnect
rather than relying on LoggerNet to disconnect for you.
When the device is connected in the Connect Screen and the time limit
approaches, a dialog box is displayed warning the user that Max Time On-Line
is about to be exceeded. The dialog box has Reset Max Time and Don’t Reset
buttons. If the Reset Max Time button is pressed, the Max Time On-Line
counter will be reset. If the Don’t Reset button is pressed or if no button is
pressed, the connection will be terminated when Max Time On-Line is
reached.
NOTE If you are using LoggerNet Admin or LoggerNet Remote 4.0 and
using the Connect Screen to connect to a remote server that is
running an older version of LoggerNet, the behavior will be
different than described above. When connecting to a LoggerNet
3.4.1 server, you will be disconnected with no advance warning
when Max Time On-Line is reached. A message will be displayed
indicating that Max Time On-Line has been reached. When
connecting to servers older than LoggerNet 3.4.1, the behavior
will be variable. Generally, you will be disconnected at some
point, but the timing of the disconnect will not be predictable.
Maximum Baud Rate – Select the arrow to the right of this field to choose a
maximum baud rate for communication with this device. Note that the actual
rate of communication may be limited by the capability of other devices in the
communications chain.
Advanced
Extra Response Time – In this field, specify the additional time that
LoggerNet should delay before terminating the communications link if there is
no response from the MD9. Additional time may be needed in instances where
the communications link is noisy or network traffic is heavy.
4.2.4.13 MD9 Remote
Standard
Communications Enabled – Before communications can take place, all
devices in the chain must have the Communications Enabled box checked.
When this box is selected, communications to the MD9 modem are enabled.
Address – The hardware for each MD9 modem is configured for a certain
address using internal hardware switches. This address acts as an identification
for the device in an MD9 network. Each MD9 modem in the network must
have a unique address; this number is entered in the Address field.
Maximum Time On-Line – This field is used to define a time limit for
maintaining a connection to the device. (This may be useful in avoiding costly communication charges in the event that a connection to a station is inadvertently
maintained for a long period of time.) Maximum Time On-Line applies to both
scheduled connections and manual connections. However, for manual
connections from the Connect Screen, it is always best to manually disconnect
rather than relying on LoggerNet to disconnect for you.
When the device is connected in the Connect Screen and the time limit
approaches, a dialog box is displayed warning the user that Max Time On-Line
is about to be exceeded. The dialog box has Reset Max Time and Don’t Reset
buttons. If the Reset Max Time button is pressed, the Max Time On-Line
counter will be reset. If the Don’t Reset button is pressed or if no button is
pressed, the connection will be terminated when Max Time On-Line is
reached.
NOTE If you are using LoggerNet Admin or LoggerNet Remote 4.0 and
using the Connect Screen to connect to a remote server that is
running an older version of LoggerNet, the behavior will be
different than described above. When connecting to a LoggerNet
3.4.1 server, you will be disconnected with no advance warning
when Max Time On-Line is reached. A message will be displayed
indicating that Max Time On-Line has been reached. When
connecting to servers older than LoggerNet 3.4.1, the behavior
will be variable. Generally, you will be disconnected at some
point, but the timing of the disconnect will not be predictable.
Maximum Baud Rate – Select the arrow to the right of this field to choose a
maximum baud rate for communication with this device. Note that the actual
rate of communication may be limited by the capability of other devices in the
communications chain.
Advanced
Extra Response Time – In this field, specify the additional time that
LoggerNet should delay before terminating the communications link if there is
no response from the MD9. Additional time may be needed in instances where
the communications link is noisy or network traffic is heavy.
4.2.4.14 RF400
If the RF400 is being used in a point-to-point network (one base radio to one
remote radio) or in a PakBus network, where all of the settings in the radios are
identical, then the communications link can be depicted on the device map as a
direct connection (COM Port with datalogger or PakBus routing device
attached — no RF400s shown in the device map). However, in a point-to-
multipoint network where all remote radios have a separate address, the
RF400s are depicted on the device map. Refer to the RF400 Users Manual for
complete information on RF400 radio setup.
Standard
Maximum Time On-Line – This field is used to define a time limit for
maintaining a connection to the device. (This may be useful in avoiding costly communication charges in the event that a connection to a station is inadvertently
maintained for a long period of time.) Maximum Time On-Line applies to both
scheduled connections and manual connections. However, for manual
connections from the Connect Screen, it is always best to manually disconnect
rather than relying on LoggerNet to disconnect for you.
When the device is connected in the Connect Screen and the time limit
approaches, a dialog box is displayed warning the user that Max Time On-Line
is about to be exceeded. The dialog box has Reset Max Time and Don’t Reset
buttons. If the Reset Max Time button is pressed, the Max Time On-Line
counter will be reset. If the Don’t Reset button is pressed or if no button is
pressed, the connection will be terminated when Max Time On-Line is
reached.
NOTE If you are using LoggerNet Admin or LoggerNet Remote 4.0 and
using the Connect Screen to connect to a remote server that is
running an older version of LoggerNet, the behavior will be
different than described above. When connecting to a LoggerNet
3.4.1 server, you will be disconnected with no advance warning
when Max Time On-Line is reached. A message will be displayed
indicating that Max Time On-Line has been reached. When
connecting to servers older than LoggerNet 3.4.1, the behavior
will be variable. Generally, you will be disconnected at some
point, but the timing of the disconnect will not be predictable.
Attention Character – Enter the character that will be used to reset the RF400
modem. By default, the radios are programmed to use the + character as the
Attention Character. However, if the RF400 is being used in a communications
link that includes a phone modem, you will most likely need to change this
character in the RF400 radio setup and on LoggerNet’s Setup Screen. Most
phone modems use + as the reset character, and sending this character
unexpectedly will reset the modem and terminate the communications link.
Advanced
Maximum Packet Size – Data is transferred in “chunks” called packets. For
most devices the default value is 2048 bytes. The value entered in this field can
be changed in 32 byte increments. If a communications link is marginal,
reducing the packet size may improve reliability.
4.2.4.15 RF400 Remote
Standard
Communication Enabled – Before communication can take place, all devices
in the chain must be enabled. When this box is selected, the RF400 radio is
enabled for communication.
Maximum Time On-Line – This field is used to define a time limit for
maintaining a connection to the device. (This may be useful in avoiding costly communication charges in the event that a connection to a station is inadvertently
maintained for a long period of time.) Maximum Time On-Line applies to both
scheduled connections and manual connections. However, for manual
connections from the Connect Screen, it is always best to manually disconnect
rather than relying on LoggerNet to disconnect for you.
When the device is connected in the Connect Screen and the time limit
approaches, a dialog box is displayed warning the user that Max Time On-Line
is about to be exceeded. The dialog box has Reset Max Time and Don’t Reset
buttons. If the Reset Max Time button is pressed, the Max Time On-Line
counter will be reset. If the Don’t Reset button is pressed or if no button is
pressed, the connection will be terminated when Max Time On-Line is
reached.
NOTE If you are using LoggerNet Admin or LoggerNet Remote 4.0 and
using the Connect Screen to connect to a remote server that is
running an older version of LoggerNet, the behavior will be
different than described above. When connecting to a LoggerNet
3.4.1 server, you will be disconnected with no advance warning
when Max Time On-Line is reached. A message will be displayed
indicating that Max Time On-Line has been reached. When
connecting to servers older than LoggerNet 3.4.1, the behavior
will be variable. Generally, you will be disconnected at some
point, but the timing of the disconnect will not be predictable.
Network Address – Enter the network address that is set up in the RF400
radio. A unique network address is required only if there is more than one
network of dataloggers within the communication range of the network you are
configuring; otherwise, the default of 0 can be used. All devices in a network
must have the same radio network address. Valid Radio Net Addresses are 0
through 63.
Radio Address – This is the unique radio address for the RF400 remote. Valid
addresses are 0 through 65,535.
Advanced
Maximum Packet Size – Data is transferred in “chunks” called packets. For
most devices the default value is 2048 bytes. The value entered in this field can
be changed in 32 byte increments. If a communications link is marginal,
reducing the packet size may improve reliability.
4.2.4.16 Generic Modem
Hardware Tab
Standard
Maximum Time On-Line – This field is used to define a time limit for
maintaining a connection to the device. (This may be useful in avoiding costly communication charges in the event that a connection to a station is inadvertently
maintained for a long period of time.) Maximum Time On-Line applies to both
scheduled connections and manual connections. However, for manual
connections from the Connect Screen, it is always best to manually disconnect
rather than relying on LoggerNet to disconnect for you.
When the device is connected in the Connect Screen and the time limit
approaches, a dialog box is displayed warning the user that Max Time On-Line
is about to be exceeded. The dialog box has Reset Max Time and Don’t Reset
buttons. If the Reset Max Time button is pressed, the Max Time On-Line
counter will be reset. If the Don’t Reset button is pressed or if no button is
pressed, the connection will be terminated when Max Time On-Line is
reached.
NOTE If you are using LoggerNet Admin or LoggerNet Remote 4.0 and
using the Connect Screen to connect to a remote server that is
running an older version of LoggerNet, the behavior will be
different than described above. When connecting to a LoggerNet
3.4.1 server, you will be disconnected with no advance warning
when Max Time On-Line is reached. A message will be displayed
indicating that Max Time On-Line has been reached. When
connecting to servers older than LoggerNet 3.4.1, the behavior
will be variable. Generally, you will be disconnected at some
point, but the timing of the disconnect will not be predictable.
Maximum Baud Rate – Select the arrow to the right of this field to choose a
maximum baud rate for communication with this device. Note that the actual
rate of communication may be limited by the capability of other devices in the
communications chain.
Modem Tab
Dial script – Enter the complete ASCII text string (initialization commands, telephone number, etc.) that is required to set up the device for communication. (Example scripts are shown following the field descriptions below.)
End script – Enter the reset string that should be sent at the end of
communication with this device. Care should be taken to ensure that the script
puts the device in an off-line state.
Raise DTR – Select this check box to set the DTR (data terminal ready) line
high. This tells the remote device that the computer is ready to receive data.
RTS CTS use – Determines what mode to use for Request to Send/Clear to
Send functions:
• Hardware handshaking will be enabled – The computer uses the RTS and
CTS hardware lines to control the flow of data between the computer and
the remote device. Most newer modems support hardware flow control.
• The RTS line will be raised – Sets the RTS (request to send) line high. This
tells the remote device that the computer is ready to send data.
• The RTS line will be lowered – Sets the RTS line low.
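For example, a Hayes-compatible phone modem might use scripts such as the following (illustrative values only; consult the modem documentation for the exact command strings required by your hardware):

Dial script: ATDT18005551234
End script: +++ATH0

Here ATDT dials the number using tone dialing, +++ returns the modem to command mode, and ATH0 hangs up, leaving the device in an off-line state.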
4.2.4.17 PakBusPort
A PakBusPort must be added to the network map if you want to add a
datalogger capable of PakBus communication (CR1000X series, CR6 series,
CR300 series, GRANITE Data Logger Modules, CR1000, CR3000, CR800
series, CR200 series, CR510-PB, CR10X-PB, or CR23X-PB). PakBus is a
packet-based communications protocol developed by CSI for its dataloggers
and some communications peripherals. PakBus offers a robust, efficient means
of communication within larger datalogger networks, and allows routing of
data from one datalogger (or other PakBus device) to another within the
network. All PakBus devices within the network are assigned a unique address.
Transmissions within the network can be broadcast to all devices or to only one
device using the unique address.
PakBus Graph can be used to visually monitor and retrieve settings from
devices in a PakBus network.
PakBus Port Always Open – The computer running the LoggerNet server is
included as a PakBus device in the network. Because of the nature of broadcast
messages within the PakBus network, the computer can keep the PakBus port
open, and therefore, can “listen” for transmissions from other PakBus devices.
In most instances, keeping this port open is not an issue. However, if there are
other hardware or software components on your computer that must have
access to the physical port to which the PakBus port is attached, you will want
to clear the PakBus Port Always Open box so that LoggerNet opens the port
only when communication is initiated as part of scheduled data collection or
manually by the user. This way, the port remains available for other uses,
except when it is in use by LoggerNet.
Maximum Time On-Line – This field is used to define a time limit for
maintaining a connection to the device. (This may be useful in avoiding costly communication charges in the event that a connection to a station is inadvertently
maintained for a long period of time.) Maximum Time On-Line applies to both
scheduled connections and manual connections. However, for manual
connections from the Connect Screen, it is always best to manually disconnect
rather than relying on LoggerNet to disconnect for you.
When the device is connected in the Connect Screen and the time limit
approaches, a dialog box is displayed warning the user that Max Time On-Line
is about to be exceeded. The dialog box has Reset Max Time and Don’t Reset
buttons. If the Reset Max Time button is pressed, the Max Time On-Line
counter will be reset. If the Don’t Reset button is pressed or if no button is
pressed, the connection will be terminated when Max Time On-Line is
reached.
NOTE If you are using LoggerNet Admin or LoggerNet Remote 4.0 and
using the Connect Screen to connect to a remote server that is
running an older version of LoggerNet, the behavior will be
different than described above. When connecting to a LoggerNet
3.4.1 server, you will be disconnected with no advance warning
when Max Time On-Line is reached. A message will be displayed
indicating that Max Time On-Line has been reached. When
connecting to servers older than LoggerNet 3.4.1, the behavior
will be variable. Generally, you will be disconnected at some
point, but the timing of the disconnect will not be predictable.
Maximum Baud Rate – Select the arrow to the right of this field to choose a
maximum baud rate for communication with this datalogger. Note that the
actual rate of communication may be limited by the capability of other devices
in the communications chain.
PakBus Verify Interval – The amount of time, in seconds, that will be used as
the link verification interval in the PakBus hello transaction messages. If no
communication has taken place during the specified interval, LoggerNet will
initiate a hello exchange with the datalogger. A verify interval of zero causes
LoggerNet to use a default verify interval of 2.5 times the beacon interval. If
the beacon interval is also zero, the default verify interval is 5 minutes.
PakBus Address – This read-only field displays the address that has been
assigned to the PakBus port. In most instances, the LoggerNet server has only
one PakBus address that is used for all PakBus ports; however, if you want to
have multiple independent PakBus networks, you can set the PakBus address
for each port separately. Multiple PakBus ports can be assigned individual
addresses using the Options | LoggerNet PakBus Settings menu option.
To add a range of addresses, enter the beginning and ending addresses of the range in the fields above the Add Range button and click Add Range. Multiple ranges can be specified. To delete a range, select the range in the list and click Delete Range.
Note that this setting will not affect the acceptance of a neighbor if that
neighbor's address is greater than 3999. A device with an address greater than
3999 will always be accepted as a neighbor.
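The acceptance rule can be sketched as follows (a sketch only; the function below is not LoggerNet or datalogger code):

def accepts_hello(sender_address, neighbor_list):
    """Whether a PakBus device answers a hello from the given address.

    Addresses above 3999 are always accepted as neighbors; lower addresses
    are accepted only when they appear in the configured neighbor list.
    """
    return sender_address > 3999 or sender_address in neighbor_list

# accepts_hello(4001, []) -> True; accepts_hello(25, [10, 25]) -> True; accepts_hello(25, []) -> False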
Select a PakBus port in the device map and press Start Search to have
LoggerNet query the network for devices. When a device is found, its PakBus
address will be added to the Node PakBus ID column. When Get Device Type
is pressed, the type of device (e.g., CR1000, CR10X-PB) will be displayed in
the Device Type column. To automatically add a device to the network map,
highlight it and press Add To Network Map. The datalogger will be added to the network map with the correct device type and PakBus ID.
4.2.4.18 PakBus Router
Standard
Communication Enabled – This check box is used to turn communication on
or off. This check box must be enabled for any communication to take place
over the PakBus router.
PakBus Address – The address of the device in the PakBus network. The valid range is 1 through 4093.
Advanced
Maximum Packet Size – Data is transferred in “chunks” called packets. The
default value is 1000 bytes; however, the value entered in this field can be
between 32 and 2048 bytes, in 32 byte increments. If a communication link is
marginal, reducing the packet size may improve the success rate.
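As a sketch of how these limits interact (illustrative only; the function name and arguments below are not part of LoggerNet):

def adjust_packet_size(current_bytes, steps, lo=32, hi=2048, increment=32):
    """Move a Maximum Packet Size value up or down by 32 byte increments,
    keeping the result within the 32 to 2048 byte range described above."""
    return max(lo, min(hi, current_bytes + steps * increment))

# adjust_packet_size(1000, -4) -> 872   (four increments smaller, for a marginal link)
# adjust_packet_size(1000, 40) -> 2048  (clamped to the upper limit)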
PakBus Encryption Key – This setting specifies text that will be used to
generate the key for encrypting PakBus messages sent to or received from this
device. The key entered here must match the PakBus Encryption Key setting in
the device. (The device setting is entered using DevConfig, PakBus Graph,
Network Planner, or a CR1000KD.)
The PakBus Encryption Key can be up to 63 bytes long and can include any
character with the exception of the Null character. Note that if Unicode
characters are included in the key, those characters may take up to three bytes
each.
If the PakBus Encryption Key device setting is specified as an empty string, the
device will not use PakBus encryption. If the PakBus Encryption Key device
setting is specified as a non-empty string, however, the device will not respond
to any PakBus message unless that message has been encrypted. AES-128
encryption is used.
4.2.4.19 PakBusPort HD
This virtual device is used to facilitate communication with a PakBus
datalogger in an RF95 or RF232 radio network or in an MD9 network.
Standard
Communications Enabled – This check box is used to turn communication on
or off. This check box must be enabled for any communication to take place
over the PakBus node.
Maximum Time On-Line – This field is used to define a time limit for
maintaining a connection to the device. (This may be useful in avoiding costly communication charges in the event that a connection to a station is inadvertently
maintained for a long period of time.) Maximum Time On-Line applies to both
scheduled connections and manual connections. However, for manual
connections from the Connect Screen, it is always best to manually disconnect
rather than relying on LoggerNet to disconnect for you.
When the device is connected in the Connect Screen and the time limit
approaches, a dialog box is displayed warning the user that Max Time On-Line
is about to be exceeded. The dialog box has Reset Max Time and Don’t Reset
buttons. If the Reset Max Time button is pressed, the Max Time On-Line
counter will be reset. If the Don’t Reset button is pressed or if no button is
pressed, the connection will be terminated when Max Time On-Line is
reached.
NOTE If you are using LoggerNet Admin or LoggerNet Remote 4.0 and
using the Connect Screen to connect to a remote server that is
running an older version of LoggerNet, the behavior will be
different than described above. When connecting to a LoggerNet
3.4.1 server, you will be disconnected with no advance warning
when Max Time On-Line is reached. A message will be displayed
indicating that Max Time On-Line has been reached. When
connecting to servers older than LoggerNet 3.4.1, the behavior
will be variable. Generally, you will be disconnected at some
point, but the timing of the disconnect will not be predictable.
PakBus Verify Interval – The amount of time, in seconds, that will be used as
the link verification interval in the PakBus hello transaction messages. If no
communication has taken place during the specified interval, LoggerNet will
initiate a hello exchange with the datalogger. A verify interval of zero causes
LoggerNet to use a default verify interval of 2.5 times the beacon interval. If
the beacon interval is also zero, the default verify interval is 5 minutes.
Advanced
Extra Response Time – The amount of additional time, in seconds, that
LoggerNet should wait for this device to respond. Note that Extra Response
Time is cumulative for all devices in the network.
PakBus Address – This field is for display only. It shows the PakBus address
that has been set up for the PakBus Port. This address can be changed by going
to the Setup Screen’s Options | LoggerNet PakBus Settings menu item.
4.2.4.20 PakBusTcpServer
The PakBus TcpServer can accommodate multiple incoming PakBus/TCP
connections to service the stations attached to it. Therefore, the same IP port
can be used to listen for incoming connections from multiple dataloggers.
The Outbound PakBus Connections portion of the Routing tab can be used to
specify IP addresses and port numbers to be used for outgoing connections to
specific dataloggers attached to the PakBus TcpServer. The Maintained Nodes
portion of the Routing tab can be used to cause LoggerNet to maintain a
connection with a range of these dataloggers, once an incoming connection has
been established.
Beacon Interval – The time interval on which the PakBus node should
transmit its routing table information in the PakBus network.
PakBus Verify Interval – The amount of time, in seconds, that will be used as
the link verification interval in the PakBus hello transaction messages. If no
communication has taken place during the specified interval, LoggerNet will
initiate a hello exchange with the datalogger. A verify interval of zero causes
LoggerNet to use a default verify interval of 2.5 times the beacon interval. If
the beacon interval is also zero, the default verify interval is 5 minutes.
PakBus Address – This field is for display only. It shows the PakBus address
that has been set up for the PakBus TcpServer. This address can be changed by
going to the Setup Screen’s Options | LoggerNet PakBus Settings menu item.
IP Port Used for Call-Back – Enter the port number that LoggerNet should
open and monitor for incoming call-back messages. If a value of zero is
entered, LoggerNet will make an outbound TCP connection and then listen on
that connection until validated data is received.
Routing Tab
Outbound PakBus Connections
This box is used to specify IP addresses and port numbers for outbound
connections to any of the dataloggers attached to the PakBus TcpServer.
To add a connection, enter the PakBus Address and IP Address in the fields
below the Outbound PakBus Connections box. Then press the Add
Connection button. To remove a connection, highlight the connection to
remove in the Outbound PakBus Connections box and press the Remove
Connection button.
PakBus Address – The address of the device in the PakBus network. This
must match the PakBus address specified on the Hardware tab of the
datalogger.
IP Address – Enter the IP address and port number that is assigned to the
device to which you will be connecting. The address and port are entered in
decimal form in the format XXX.XXX.XXX.XXX:YYYY, where the Xs represent
the IP network number and the Ys represent the port number. Leading 0s
should not be entered (e.g., 123.45.6.789:6789; note that 45 was entered
instead of 045, and 6 instead of 006).
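A short sketch of parsing such an entry is shown below (illustrative only; LoggerNet performs its own validation, and the function name is hypothetical):

def parse_outbound_address(text):
    """Split an 'XXX.XXX.XXX.XXX:YYYY' entry into an IP string and a port number."""
    ip, sep, port = text.partition(":")
    if not sep or not port.isdigit():
        raise ValueError("expected an address and a port separated by ':'")
    octets = ip.split(".")
    if len(octets) != 4:
        raise ValueError("expected four octets in the IP address")
    for octet in octets:
        if not octet.isdigit() or int(octet) > 255:
            raise ValueError("invalid octet: %r" % octet)
        if octet != str(int(octet)):
            raise ValueError("leading zeros are not allowed: %r" % octet)
    return ip, int(port)

# parse_outbound_address("192.168.4.21:6785") -> ("192.168.4.21", 6785)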
Maintained Nodes
This box is used to cause LoggerNet to maintain a connection with a range of
dataloggers attached to the PakBus TcpServer. LoggerNet waits for an
incoming connection from a datalogger in the range. Once an incoming
connection has been established with a datalogger in the range, the connection
is maintained. Multiple ranges can be specified.
To add a range, enter the Range Begin and Range End in the fields below the
Maintained Nodes box. Then press the Add Range button. To remove a range,
highlight the range to remove in the Maintained Nodes box and press the
Remove Range button.
Range Begin – The beginning of the range of PakBus addresses for which a
connection should be maintained.
Range End – The ending of the range of PakBus addresses for which a
connection should be maintained.
4.2.4.21 SerialPortPool
The SerialPortPool is used to allow LoggerNet to call, by phone, multiple
remote dataloggers, when there is more than one phone line and modem
available to make the connections. With pooled devices, multiple COM port/phone modem combinations can be specified for each datalogger. That
way, LoggerNet is not restricted to the use of a single base modem when
calling a particular station. When calling a station, LoggerNet will decide
which modem from a group (pool) of modems to use. Preference will be given
to a modem based on availability and past performance.
Each remote phone modem and datalogger has its own SerialPortPool device.
When configuring a SerialPortPool, use the Serial Ports tab to add all the
serial ports that are connected to base modems that can be used to call this site.
The dialing string entered in the phone remote must work in conjunction with
any of the modems connected to the serial ports added to the pool for this
datalogger.
NOTE When using pooled modems, the modems should all be set up so that any of the modems/phone lines can be used with any device that includes them in its modem pool. Using the same type and brand of modem is suggested. Also ensure that there is no preference for which phone lines may be used (e.g., long distance rates).
Hardware Tab
Standard
Communications Enabled – This check box is used to turn communication on
or off. This check box must be enabled for any communication to take place
over the SerialPortPool.
When the device is connected in the Connect Screen and the time limit
approaches, a dialog box is displayed warning the user that Max Time On-Line
is about to be exceeded. The dialog box has Reset Max Time and Don’t Reset
buttons. If the Reset Max Time button is pressed, the Max Time On-Line
counter will be reset. If the Don’t Reset button is pressed or if no button is
pressed, the connection will be terminated when Max Time On-Line is
reached.
NOTE If you are using LoggerNet Admin or LoggerNet Remote 4.0 and
using the Connect Screen to connect to a remote server that is
running an older version of LoggerNet, the behavior will be
different than described above. When connecting to a LoggerNet
3.4.1 server, you will be disconnected with no advance warning
when Max Time On-Line is reached. A message will be displayed
indicating that Max Time On-Line has been reached. When
connecting to servers older than LoggerNet 3.4.1, the behavior
will be variable. Generally, you will be disconnected at some
point, but the timing of the disconnect will not be predictable.
Advanced
Extra Response Time – The amount of additional time, in seconds, that
LoggerNet should wait for this device to respond. Note that Extra Response
Time is cumulative for all devices in the network.
Serial Ports
The Serial Ports tab is used to add all of the serial ports that are connected to
base modems that can be used to call this site. The dialing string entered in the
phone remote under the SerialPortPool must work in conjunction with any of
the modems connected to the serial ports added to the pool.
All of the COM ports which are set up and recognized by Windows will be
shown in the Available Ports list. Select a COM port from the list and press
Add Port to add the port to the modem pool. To delete a COM port from the
pool, select it in the Serial Port Pool box and press Remove Port.
4.2.4.22 TerminalPortPool
The TerminalServerPool is used to allow LoggerNet to call, by phone, multiple
remote dataloggers, when there is more than one phone line and modem
available to make the connections. The modems are accessed through a terminal server or similar product, such as one of the following:
• Lantronix EDS8PR
• Cisco 2500 Series
• PC Micro Net Modem Software
NOTE When these products are used, software is often installed on the
host computer (LoggerNet PC) that will define a local serial port
(virtual serial port) that redirects to the IP-based service provided
by the terminal server. Serial Port Pooling is used when virtual
serial ports are installed.
Each remote phone modem and datalogger has its own TerminalServerPool
device.
NOTE When using pooled modems, the modems should all be set up so that any of the modems/phone lines can be used with any device that includes them in its modem pool. Using the same type and brand of modem is suggested. Also ensure that there is no preference for which phone lines may be used (e.g., long distance rates).
Hardware Tab
Standard
Communications Enabled – This check box is used to turn communication on
or off. This check box must be enabled for any communication to take place
over the TerminalServerPool.
When the device is connected in the Connect Screen and the time limit
approaches, a dialog box is displayed warning the user that Max Time On-Line
is about to be exceeded. The dialog box has Reset Max Time and Don’t Reset
buttons. If the Reset Max Time button is pressed, the Max Time On-Line
counter will be reset. If the Don’t Reset button is pressed or if no button is
pressed, the connection will be terminated when Max Time On-Line is
reached.
NOTE If you are using LoggerNet Admin or LoggerNet Remote 4.0 and
using the Connect Screen to connect to a remote server that is
running an older version of LoggerNet, the behavior will be
different than described above. When connecting to a LoggerNet
3.4.1 server, you will be disconnected with no advance warning
when Max Time On-Line is reached. A message will be displayed
indicating that Max Time On-Line has been reached. When
connecting to servers older than LoggerNet 3.4.1, the behavior
will be variable. Generally, you will be disconnected at some
point, but the timing of the disconnect will not be predictable.
Advanced
Extra Response Time – The amount of additional time, in seconds, that
LoggerNet should wait for this device to respond. Note that Extra Response
Time is cumulative for all devices in the network.
Terminal Servers
The Terminal Servers tab is used to add all of the terminal servers that are
connected to base modems that can be used to call this site. The dialing string
entered in the phone remote under the TerminalServerPool must work in
conjunction with any of the modems connected to the terminal servers added to
the pool.
To add a terminal server to the pool, type in the address of the terminal server
port in the Terminal Server field and press the Add button. The address
consists of an IP or DNS address, followed by a colon, followed by the TCP port number.
To delete a terminal server from the pool, select it in the Terminal Server
Pool box and press Remove.
In some cases, LoggerNet client applications display only data that has been
collected. In other cases, LoggerNet initiates the retrieval of data to display.
These cases are described below:
Final storage data from mixed array dataloggers is retrieved only when
data collection from the datalogger occurs (initiated manually from the
Connect Screen or based on a schedule). Therefore, the final storage
information on the data displays will be updated only as often as data
collection is performed for these dataloggers. Input locations do not have
to be scheduled for collection to be displayed. When connected, these
values are updated based on the update interval of the display (but limited
by how fast measurements are actually being made in the datalogger).
When connected, data from table data dataloggers is updated based on the
Update Interval. (This is referred to as real time monitoring.) Note that
data can be updated no faster than the data values are being generated by
the datalogger. When not connected, data from table data dataloggers is updated only as often as data collection is performed. (This is referred to as monitoring data from the server’s data cache.)
RTMC
In RTMC, data displays will be updated no more frequently than data is being
collected from the datalogger, either manually or on a schedule.
4.2.5.2 Intervals
One of the most significant considerations when setting up data collection is the set of intervals associated with reading, storing, and retrieving data. The intervals
and their significance for data handling are described below.
RF networks with repeaters also add time delays, since each modem must be contacted and must then pass the message on to the next RF modem, and so on. Each of these operations takes time, so the collection schedule for RF networks with repeaters should allow enough time for the link to be established with the datalogger and for the data to be collected.
To set up a data collection schedule for a datalogger, first ensure that your
device map has been configured with all of the devices listed as they actually
exist. Next, determine which tables or final storage areas should be collected
from the datalogger each time a data collection attempt is made. If no tables are
selected on the Data Files tab or the Final Storage Area is not enabled for
collection, no data will be collected from the datalogger.
You should check the directory path and the data file options to make sure the
files are where you want them and in the right format. Note that for table-based
dataloggers, each table must be configured separately (i.e., selected for
collection, file name provided, file format specified, etc.).
The data collection schedule should be set up next. Set the initial date and time
to when you would like the first data collection attempt to occur and set the
interval at which subsequent data collection attempts should occur. Make sure
that communications are enabled for all devices in the communications path,
and that scheduled collection is enabled. If the initial date and time is set to a
time that has already passed, data collection will begin immediately.
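A minimal sketch of this behavior (illustrative only; the function name below is not part of LoggerNet):

from datetime import datetime

def first_collection_attempt(initial, now):
    """When the first scheduled collection attempt occurs.

    If the configured initial date and time has already passed, data
    collection begins immediately; otherwise the first attempt waits for
    the initial time. Subsequent attempts then repeat at the collection
    interval configured on the Schedule tab.
    """
    return now if initial <= now else initial

# first_collection_attempt(datetime(2021, 3, 1, 8, 0), datetime(2021, 3, 5, 12, 0))
# -> 2021-03-05 12:00, because the initial time has already passed

The following points affect whether scheduled data collection takes place and are worth checking: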
• The Scheduled Collection Enabled box on the Schedule tab for the
datalogger must be selected. This turns the schedule “on”. You can
temporarily disable data collection by clearing this check box and applying
the change.
• The tables or final storage areas from which you desire data should be
enabled for collection in the Data Files or Final Storage Area tab of the
Setup Screen.
• All devices in the communications path to the datalogger must have the
Communications Enabled check box on the Hardware tab selected.
• Look at the collection state data for the datalogger in the Status Monitor.
This is displayed as one of four states.
− Normal collection
− Primary retry
− Secondary retry
− Collection disabled
• Check the Status window and ensure the Pause Schedule check box is
cleared.
Your datalogger clock should deviate no more than ±1 minute per month.
Typically, this drift is less than what will be experienced with a personal computer clock.
Another point to consider is how the clock checks may affect the time stamp
for your data. Let’s say, for instance, that you have a data collection schedule
of one minute, with a clock set occurring if the two clocks deviate by more than two minutes. Over time, the clocks may drift sufficiently that the datalogger’s clock is set. If the datalogger’s clock reads 12:02:00 and the LoggerNet computer clock reads 12:04:15, the datalogger’s clock will be set to 12:04:15. Therefore, there will be
no data for the time stamps 12:03 and 12:04. Conversely, if the datalogger’s
clock is a few minutes faster than the LoggerNet computer’s clock, the result
would be duplicate time stamps that contained different data.
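The clock-set rule in this example can be expressed as a short sketch (illustrative only; the names below are not part of LoggerNet):

from datetime import datetime, timedelta

def should_set_clock(datalogger_time, server_time, allowed_deviation):
    """Return True when an automated clock check should set the datalogger clock.

    The clock is set only when the two clocks differ by more than the
    allowed deviation. Setting the clock forward leaves a gap in the time
    stamps; setting it backward produces duplicate time stamps.
    """
    return abs(server_time - datalogger_time) > allowed_deviation

# Example from the text: 12:02:00 versus 12:04:15 with a two minute limit.
# The 2 minute 15 second difference exceeds the limit, so the clock is set
# and the 12:03 and 12:04 records are skipped.
print(should_set_clock(datetime(2021, 3, 1, 12, 2, 0),
                       datetime(2021, 3, 1, 12, 4, 15),
                       timedelta(minutes=2)))   # True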
Changing the computer system clock while the display screens are
running will terminate the connection for most of the screens. This
can also affect LoggerNet operations or even crash the program.
Select the Program tab for the highlighted datalogger. If a program exists in
the datalogger that LoggerNet has knowledge of, it will be displayed in the
Current Program fields. To send a new program, press the Send button.
LoggerNet allows you to select the time option that will be used for the
LoggerNet server. There are three options:
Use local time without correction for daylight saving time – The
LoggerNet server will use the clock from the computer on which
LoggerNet is running. When the time changes to daylight saving time, the
server’s time is not affected.
Use local time with correction for daylight saving time – The
LoggerNet server will use the clock from the computer on which
LoggerNet is running. When the time changes to daylight saving time, the
server’s time will be adjusted.
Use Greenwich Mean Time (GMT) – The LoggerNet server will use
Greenwich Mean Time, regardless of whether daylight saving time is
applicable.
This box is used to set a maximum size, in bytes, for data files. When the
maximum file size is reached, the current file will be archived (with an
incrementing number and a .backup extension) and a new file will be created.
Entering a value of 0 or less indicates that no maximum data file size should be
enforced.
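The rollover behavior might be sketched as follows (a sketch only; the archive file naming below is an assumption, not the exact names LoggerNet produces):

import os

def roll_over_data_file(path, archive_count):
    """Archive a data file that has reached its maximum size and start a new one.

    The archived copy is given an incrementing number and a .backup
    extension, as described above; the exact naming is illustrative.
    """
    archive_name = "%s.%d.backup" % (path, archive_count)
    os.rename(path, archive_name)
    open(path, "w").close()       # begin a fresh, empty data file
    return archive_name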
Network Communications
If the PakBus ports are set up as separate networks, the PakBus ID for each
PakBusPort is entered in the PakBus ID/PakBus Port table.
Valid PakBus IDs are 1 through 4094, though typically numbers greater than
3999 are used for PakBus ports. This is because, when a neighbor filter is set
up, a PakBus datalogger will answer a Hello message from any device with an
ID greater than 3999, but will ignore devices with IDs less than 4000 that are
not in its neighbor list.
You can also use the Copy Defaults to Existing Stations button to apply these
defaults to existing stations. You will be asked to select the default settings to
apply and the stations to apply them to.
If you have changed settings on the LoggerNet Defaults screen, you can press
the Restore Original Defaults button to restore the settings to the original
LoggerNet defaults.
LoggerNet will “listen” for update notification messages from the modem, and
change the IP address for a device based on the information received. A setting
in the modem defines the interval at which transmissions to LoggerNet will be
made.
The IP Manager Port is the UDP port on which LoggerNet will “listen” for an
incoming transmission. If this setting is left at the default of 0, LoggerNet will
not listen for incoming notifications. The default value for AirLink modems is
17338. However, if a firewall is in use, the port value may be changed when
passed through the firewall.
The IP Manager Key is the 128-bit encryption key used for decoding
transmissions from all AirLink modems used in the datalogger network. This
setting must match the setting in the modems for the transmission to be
successfully interpreted. LoggerNet’s default value in the IP Manager Key
field is the default for the AirLink modems.
Note that in addition to these settings, an ID for each modem is set on the
IPPort’s Hardware tab of the Setup Screen (AirLink Modem Name).
LoggerNet will be unable to open the UDP port if another process is using it.
To determine what ports are being used by processes on the system, the
following command can be issued at a command prompt:
netstat -a -p UDP -n
where:
-a displays all connections and listening ports
-p UDP shows connections for the UDP protocol only
-n displays addresses and port numbers in numerical form
This dialog box allows you to copy setting(s) from the device currently
selected in the network map to other devices in the network. The list on the left
side shows the settings that can be copied. The list on the right side shows the
available devices in the datalogger network. Select the setting(s) you wish to
copy and the device(s) you wish to copy the settings to. You can press the
Select All (or Clear All) button under the Selected Settings list to select all (or
clear all) of the possible settings. You can press the Select All (or Clear All)
button under the Copy Settings to list to select all (or clear all) of the possible
devices. After selecting the desired settings and stations, press OK to copy the
settings.
4.2.8.3 Troubleshooter
This menu item opens the Troubleshooter application. This application is used
to help discover the cause of communication problems in a datalogger network.
From the application you can launch a communication test, search for PakBus
devices in the network, view the Status Table for PakBus dataloggers, or open
the LogTool Client. Refer to Section 6.5, Troubleshooter (p. 6-18), for more
information.
Choose File | Select Server from the Setup Screen’s toolbar to open a dialog
box in which to type the IP address of the server. If security has been enabled
on that server you will need to enter a user name and password as well. This
option is not available if you are running a standard version of LoggerNet.
In LoggerNet Admin, you can use the View | Configure Subnets menu item to
configure subnets of the dataloggers in your network map. You can then use
the Subnet button to view a subnet rather than the entire network in the Setup
Screen, Connect Screen, or Status Monitor.
Press the New Subnet button to add a new subnet. You will be asked to enter a
name for the subnet. All of the dataloggers that are not assigned to a subnet
will be shown in the Unassigned Dataloggers column. Select a datalogger and
press the right arrow key to add it to the current subnet. It will be moved to the
Assigned Dataloggers Column. (You can also add a datalogger to the subnet
by dragging and dropping it from the Unassigned Dataloggers column to the
Assigned Dataloggers column.) You can remove a datalogger from the
current subnet by selecting the datalogger in the Assigned Dataloggers column
and pressing the left arrow key or by dragging and dropping it from the
Assigned Dataloggers column to the Unassigned Dataloggers column.
Dataloggers in the Assigned Dataloggers column can be rearranged by
dragging and dropping a datalogger to a new location in the list.
The New Group button can be pressed to add groups to your subnet. Groups
are a way to group dataloggers together within a subnet. Note that groups only show up when viewing the network map in “Stations Only” view.
To add another subnet, press the Add Subnet button again. By default, only
the dataloggers that are not a part of another subnet will be shown in the
Unassigned Dataloggers column. Select the Show Dataloggers Assigned to
other Subnets check box to show all dataloggers.
Once subnets have been set up, you can use the Subnet button in the Setup
Screen, Connect Screen, Status Monitor, and Troubleshooter to select whether
to view the entire network or a subnet.
NOTE When viewing a subnet of the network map with Display set to
All Devices, you will only see each network branch down to the
datalogger(s) included in the subnet. If there are devices
(communication devices, other dataloggers, storage modules, etc.)
in the branch below the bottom datalogger included in the subnet,
they will not be shown when viewing the subnet. This means you
will not be able to view a storage module in a subnet.
NOTE Beginning in LoggerNet 4.7, you can now edit stations while
viewing a subnet. (Previously, you had to change your view to the
whole network before making station changes.) Adding a device
while in a subnet view will add the device to the network and the
subnet. Deleting a device while in a subnet view will delete the
device from the network and remove it from every subnet it is a
member of. Attempting to add or delete devices while in Stations
Only view will prompt you to change to All Devices view, but
will leave you in the subnet.
The first step in developing a Network Planner model is to create the stations.
A station is created by adding a root device to the Drawing Canvas. Examples
of root devices include PakBus dataloggers, the LoggerNet server, and radio and phone modems. Once the root device is added to the model, any
peripherals that connect to the root device are added. After adding the stations
to your network, you must identify links between stations and the activities that
occur over those links.
After the stations, peripherals, links, and activities have been added, all items
on the Configure Devices list will need to be completed in order to configure
the devices in the system. Check boxes are provided that allow items to be
checked off as an indication that they have been completed.
At any time, details about a station can be viewed and edited by selecting the
station on the Drawing Canvas and viewing the Station Summary.
Network Planner model. The online help also has a step-by-step example of
creating a model for a simple wireless sensor network.
Background images offer significant value in that they allow you to see the
layout of your network in relation to the geography that you are trying to cover. A good image can show landmarks and/or topographical locations that guide
placement of things like radio repeaters. That said, the Network Planner does
not derive any intelligence from the background image. It is present strictly to
satisfy user aesthetics.
Navigation buttons that can be used to scroll the model canvas are highlighted
in the image below.
The following buttons and features are associated with this control:
Arrow Buttons – Pressing any arrow button will scroll the drawing canvas in the direction of the arrow.
Collapse Button – The small black arrow in the upper left hand corner of the scroll button area allows you to hide the navigation buttons. When this is done, the black arrow will remain but point in the opposite direction and, if clicked on, will expand the scroll button area.
The model overview is a small square area in the lower right hand corner of the
canvas. If a background bitmap is associated with the model, a miniature
version of that bitmap will appear in the overview. The positions of stations on
the canvas are represented with red dots. A black rectangle is used within this
area to show the current viewing area. If you click the left mouse button while
the pointer is anywhere within this area, the canvas will be scrolled so that the
center of the canvas is positioned relative to the mouse position within the
overview area. If you depress the left mouse button and drag the mouse
pointer, the canvas will be scrolled as you move the mouse. If you click on the
small black arrow on the lower left hand corner of the overview area, most of
the area will be hidden although the arrow will remain with a reversed
direction. Pressing the arrow will make the model overview area visible again.
If the station list is shown and you select a station in this list, that station will be selected in the drawing canvas. If the station is not currently visible at that time, the canvas will be scrolled until the station is shown on the canvas.
When the Hand Tool is selected (by pressing the hand icon on the toolbar), you can scroll the drawing canvas by holding down the left mouse button while the mouse pointer is over a blank area of the canvas and dragging the mouse in the desired direction.
The scale of the drawing canvas can be changed using the toolbar controls
highlighted in the above figure. The red button with the minus symbol
decreases the zoom factor and the green button with the plus symbol increases
the zoom factor. The combo-box to the right of these buttons allows the zoom
factor to be selected directly. The Network Planner expresses zoom factors as
percentages and supports 25%, 50%, 100%, 125%, 150%, 200%, 400%, 800%,
1600%, 3200%, and 6400%.
You can delete a station by pressing the Delete key with the station selected.
To add a peripheral to an existing station, click on the peripheral in the Device Palette and then position the mouse cursor over the top of the station icon. If a compatible interface for connecting to any of the devices in that station is available, the dashed square around the mouse cursor will disappear and a small “+” icon will appear in the center of that station’s
icon. If you then click the left mouse button, the dialog box shown below will
appear.
You must select the communication interface for the new peripheral from the
dialog list box. These interfaces are prioritized by the Network Planner, such
that the links considered the best are listed at the top. Note that the Network
Planner simplifies the task of selecting a link by hiding, by default, all of the
links except those that have the highest priority. You still have the option of
seeing all available choices by clicking on the Show all possible links check
box. The peripheral will not be added to the station until an appropriate link
type is selected and the OK button is pressed.
When the canvas is operating in this mode, the mouse cursor changes from a
hand to a jagged line. While in this mode, you can click on a station icon to
indicate the first device in the link. If that device can support a new link, a
small green “+” icon will appear in the center of that station icon and, as you
move the mouse, a green rubber-band line will follow the mouse cursor. At this
point, when you move the mouse cursor over the icon for another station that
has an interface that can be linked to the first station, another green “+” icon
will appear in the middle of that station. If the mouse cursor is hovering over a
station that does not have any compatible interface, a red “-” icon will appear
in the center of that station icon. You can complete the link by clicking the left
mouse button again when the cursor is over another station with a small green
“+” icon indicating a compatible interface. At this point, the dialog shown
below will appear.
As with adding peripherals to stations, you must select the appropriate kind of
link. The dialog presents possible combinations between all the devices in both
stations. The list of link types is prioritized by the Network Planner, such that
the links considered the best are listed at the top. Note that the Network
Planner simplifies the task of selecting a link by hiding, by default, all of the
links except those that have the highest priority. You still have the option of
seeing all available choices by clicking on the Show all possible links check
box. The link will not be added until an appropriate link type is selected and
the OK button is pressed.
Station links are shown as lines between “connection points” on or around the
station icons. The end points of the link lines are small icons that represent the
nature of the link. An example of this is shown in the figure below of an RF401
based link.
In this instance, as with other radio-based links, the icon is a small Yagi antenna. If you
hover the mouse cursor over the top of this connection point icon with the
station selected, a tool-tip type window will appear that gives specific details
about that link.
Activities can be added to the model by using the Activity Tool icon
highlighted below, or by choosing Add Activity from the context menu that
results from right-clicking on a station.
When the Add Activity mode is selected, the mouse cursor will change to
indicate a linking mode and you will be expected to click on the station that
contains the device that should originate the activity. When the mouse cursor
hovers over a station in this mode and that station has a device that is capable
of originating an activity, a small “green plus” symbol will appear in the
middle of the station’s icon. When the mouse cursor hovers over a station that
does not contain devices capable of originating activities, a small “red cross”
symbol will appear in the middle of that station’s icon. You can specify the
origin of the activity by clicking the left mouse button while the mouse cursor
is over a station. At this point, a rubber band line will follow the mouse cursor
and you will be expected to click on a second station to indicate the target of
the activity. As with selecting a source station, the icon for the target station
will show the “green plus” symbol if the station contains a device that is able to
receive activities from the source station. Once the target station has been
selected, the dialog shown below will appear. This dialog will also be shown
when you add an activity from a station’s context menu.
The purpose of this dialog is to allow you to specify the devices that will act as
the source and target of the intended activity and also to select the type of
activity that will take place. (Note that if the add activity icon was used to add
the activity, the source and target devices will already be designated in the
dialog box.)
Source Device – Specifies the device that will initiate the activity. This choice
box will be populated with the list of all devices in the model that are capable of originating an activity. Devices in this list will be identified by their station names and device names separated by a colon character.
Target Device – Specifies the device that will be targeted by the activity. This
choice box will be populated with the list of all devices in the model that are
capable of receiving an activity. Like the source devices, devices in this list
will be identified by their station names and device names separated by a
colon character.
Activity Types List – This list box will be populated with the set of activities
that are compatible with the two selected devices serving as source and target.
If a given type of activity has already been specified between two devices, that
activity will not be present in this list.
Once the source device, target device, and activity type have been chosen and the OK button has been pressed, another dialog box will appear that requires you to specify the properties for that activity. At a minimum, the properties for an activity will specify the time interval for that activity. An example of such a property sheet is shown below. After specifying the properties, press the Apply button to add the activity to the model.
The station summary is a view that displays details about the station that is
currently selected on the drawing canvas. By default, it is shown in the lower
right-corner of the Network Planner window. This view provides the following
features:
• An Edit Station Properties link that, if clicked, will present you with a
dialog that contains property sheets for all of the devices and/or links in
the station.
• A list of links to the parts of the report that describe each of the devices in the station.
• A link to delete the device (does not appear for the station root device).
• An optional link to perform the Configure Devices list item for the
device.
The Configure Devices panel lists tasks that need to be completed before the
network can be deployed. These tasks include configuring any LoggerNet
servers and writing settings to devices. The Configure Devices panel is divided
into two sections. The list box at the top lists all of the Configure Devices items
and provides check-boxes that allow these items to be checked off to indicate
when an item has been completed. The bottom portion of the panel displays a
more detailed description of the selected item. If the item has been completed,
the description will include the date and time when it was completed. Unless
the selected item must be performed manually (setting a dip-switch on a
COM220, for instance), the item description will also contain a link that can be
followed to initiate the action associated with that item. When the selected
station on the drawing canvas changes, the selected item in the Configure
Devices list box will also change to the item, if any, associated with the root
device of that station.
Serial Port – Specifies the PC serial port that should be used for
communication with the device.
Baud rate – Specifies the baud rate at which communication should take place
with the device.
Revert to Defaults First – Specifies whether all of the device settings should
be automatically set to their factory default values before writing the settings
generated by the Network Planner. Typically, the Network Planner does not
generate every possible setting for a device but, rather, generates the minimal
set needed to express the model parameters.
Connect – When clicked, this button will disable the dialog controls and
initiate communication with the device in order to transmit the settings. In
order to accomplish this, the Network Planner uses its own private LoggerNet
server.
Save – This button allows you to save the generated settings for the device into
an XML file that can be read by the Device Configuration Utility as well as the
PakBus Graph LoggerNet client. This presents an alternate means of loading
settings into the device.
Connect Instructions – The HTML control in the upper right hand corner of
the dialog shows the instructions to connect to the device. These instructions
are obtained from the Device Configuration library and will thus always be the
same as those shown in the Device Configuration Utility.
Generated Settings Summary – The HTML control in the lower right hand
corner of the dialog shows a summary of the settings that have been generated
for the device. The format of this summary is exactly the same as what can be
seen when connecting to the device from within the Device Configuration
Utility or from within PakBus Graph.
This dialog allows you to specify a user name and password for an
administrative account on the local server and, if you click on the OK button,
will perform the following actions:
• The dialog will connect to the local server and scan its network map for
any serial port devices that use the same serial port name as that specified
in the device configuration dialog.
• For any local LoggerNet device that matches the criteria listed above, the
dialog will attempt to override the communications enabled setting for that
device to a value of false.
• The dialog will wait for the Line State statistics for all matching devices to
report an off-line state.
• The dialog will close once all of the Line State statistics report an off-line
state. At this point, communications will be initiated with the device. The
settings override(s) explained above will be released once configuration of
the device has been completed.
• If you click on the Cancel button at any time while the dialog is waiting
for the local LoggerNet devices, the dialog will be closed and all overrides
cancelled.
This dialog will not be shown if the Network Planner is launched outside of the
LoggerNet tool bar or if there is no local server reported to be running.
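The sequence above (override the Communications Enabled setting, wait for the Line State statistics to report off-line, then configure the device and release the overrides) can be pictured with the minimal sketch below. This is an illustration only, not LoggerNet code; the two callables are hypothetical hooks standing in for the actual server operations.

```python
import time

def quiesce_serial_port_devices(matching_devices, set_comms_enabled_override,
                                read_line_state, timeout_s=60, poll_s=1.0):
    """Illustrative sketch of the sequence above (not LoggerNet code): override
    Communications Enabled to false on every matching device, wait for each
    device's Line State statistic to report off-line, and report whether the
    serial port is now free. Both callables are hypothetical stand-ins."""
    for device in matching_devices:
        set_comms_enabled_override(device, False)   # temporary override, released later
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if all(read_line_state(d) == "off-line" for d in matching_devices):
            return True                             # safe to begin configuring the device
        time.sleep(poll_s)
    return False                                    # treat as a cancel: release the overrides
```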
Model Prefix – This field allows you to enter a string that will be placed at the
beginning of the name for every device in LoggerNet’s network map that the
Network Planner generates. If specified, this value will appear in the names
followed by an underscore character. This feature allows you to use several
different Network Planner models with the same LoggerNet server by keeping
the generated devices separate.
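As a minimal illustration of the naming pattern described above, the sketch below joins a prefix and a generated device name with an underscore. The prefix value and device name used here are hypothetical examples, not values produced by LoggerNet.

```python
def prefixed_name(model_prefix: str, device_name: str) -> str:
    """Join an optional model prefix and a generated device name with an
    underscore, mirroring the naming pattern described above."""
    return f"{model_prefix}_{device_name}" if model_prefix else device_name

# Hypothetical example: a prefix of "Site1" keeps this model's devices separate
# from those generated for other models on the same LoggerNet server.
print(prefixed_name("Site1", "CR1000"))  # Site1_CR1000
print(prefixed_name("", "CR1000"))       # CR1000
```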
Server Address – This field specifies the IP address or domain name of the
computer that is running the LoggerNet server. In order to configure
LoggerNet on a remote computer, that instance of LoggerNet must allow
remote connections and must also be reachable from the computer hosting the
Network Planner.
User Name & Password – These fields allow you to specify the account that
should be used when connecting to the LoggerNet server. The account used
should have at least Network Manager privileges. If security is not
configured on the server, these values will be ignored.
Remember Password – This check box can be checked (the default) if the
user name and password specified should be saved as part of the information
for the LoggerNet device in the model.
Connect – When this button is clicked, the Network Planner will initiate the
connection to the LoggerNet server.
Help Window – The HTML window in the upper right corner of the dialog
shows context sensitive help about the control that has the current keyboard
focus.
Generated Devices and Settings – The HTML window in the lower right
corner of the dialog shows a summary of the devices that will be created in
LoggerNet’s network map as well as the settings associated with those devices.
This summary is presented in an indented list form where the level of
indentation depends upon device links. Since the dialog has not yet
communicated with the server, this summary shows only the expected structure
and does not reflect any devices in the actual network map.
If you click on the Connect button in the Configure LoggerNet dialog, the
Network Planner will attempt to connect to the specified server address and
will log in using the specified user name and password. Once attached, the
dialog will attempt to reconcile the structure and settings that it has generated
with the structure and settings currently in LoggerNet’s network map. The
result of this reconciliation is a set of changes that will be made to the current
LoggerNet network map. An example of this set of changes is shown below.
The colors of items in this dialog indicate the impact they may have on the
operation of devices that are already in the LoggerNet network map.
Green – The change is merely additive (adding new devices, for instance) and
is unlikely to have any noticeable impact on the workings of existing devices.
Red – The change will alter the structure of the network map and will relocate
existing devices.
The Network Planner will store all of the information about the model in an
NWP file. By default, these files will be written to the
C:\CampbellSci\NetworkPlanner directory, but you can select any other
directory. Along with model information, the model file will also store the
background image, if one was associated with the model. Screen layout
and zoom options will not be stored in the model file.
The structure of these files is such that they can be easily transferred to another
computer. If a Network Planner model is created on one computer and then
used on another computer, some machine-specific properties, such as IP
addresses and serial port identifiers, may have to be adjusted to account for
differences between the two computers.
The View menu can be used to show or hide various user interface components
including the toolbar. It also includes an entry, Restore Default View, that can
be used to restore the application layout to the “standard” layout.
Section 5. Real-Time Tools
LoggerNet’s real-time tools are used to manage your stations in the datalogger network.
Tools are provided for sending new programs, setting the clock, toggling ports and flags,
collecting data, and displaying data numerically and graphically.
As noted above, you can work with a datalogger station while actively
connected to it or when you are in a disconnected state. Even when not actively
connected, you can choose to collect data, check or set the clock, etc. When a
button is pushed, LoggerNet will attempt to contact the datalogger, perform
the desired action, and then terminate communication.
If you want to perform only one task, such as collecting new data, it may be
more efficient not to actively connect to the datalogger. LoggerNet will merely
contact the datalogger, collect the data, and end communication. If you were
actively connected, LoggerNet would also update the clock displays during this
process, which, when collecting large amounts of data over a slow
communication link, could affect the speed of data collection. If you want to
perform multiple tasks — e.g., send a new program and view measurements on
a Numeric Display to ensure the program is running correctly — then it is
usually more efficient to establish an active connection, perform the tasks, and
then terminate the connection yourself. Otherwise, LoggerNet must establish
communication with the datalogger twice. Over remote communication links,
this connect/disconnect/connect sequence will increase the time to complete the
tasks.
When you select the Connect button the animated graphic will indicate an
active connection state. It will show that LoggerNet is trying to establish the
connection; the two connectors join together when the connection is made. You
can also connect to the datalogger by double clicking the datalogger name or
selecting Connect from the File menu.
NOTE When you connect to a station, LoggerNet checks for Status Table
errors. If the station has Status Table errors (skipped scans,
skipped records, and so forth), a yellow exclamation point will be
added to the Station Status button. Once you click on the Station
Status button, this indicator will be removed.
To disconnect from the datalogger, click the button that now reads Disconnect.
To work with another datalogger you must disconnect from the first one
(unless you have installed LoggerNet Admin or LoggerNet Remote). Double
clicking another datalogger will disconnect from the first datalogger and
connect to the new one without prompting.
If LoggerNet fails to make a connection to the datalogger, it will time out and
display an error message that it could not connect. It will immediately attempt
the connection again and will continue trying until the user clicks Cancel.
LoggerNet maintains separate data collection pointers for scheduled/manual
collection and for Custom Collection. The data for each of these two pointers
is, by default, stored in two separate data files on the computer. The default
directory for scheduled data
collection/manual data collection is C:\Campbellsci\LoggerNet. The default
directory for data collection via the Custom Collection window is
C:\Campbellsci\LoggerNet\Data. Because LoggerNet keeps a different pointer
for each of these collection options, each option tracks its own notion of new
data. If you select Collect Now from the main Connect Screen with your Setup
option set to collect only new data, and then perform a Custom Collection that
also collects only new data, the data retrieved by the Custom Collection
window is the data stored since the last collection made from that window.
Similarly, new data collected using Collect Now from the main Connect Screen
is the data stored since the last time you chose Collect Now, or since the last
scheduled data collection.
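The sketch below models this behavior with two independent pointers, one per collection method. It is an illustration of the behavior described above, not LoggerNet's internal implementation; the pointer names and record numbering are assumptions.

```python
# Illustrative model of LoggerNet's independent collection pointers (an
# assumption about behavior, not actual LoggerNet internals).
pointers = {"collect_now_or_scheduled": 0, "custom_collection": 0}

def collect_only_new(method, last_record_in_datalogger):
    """Return the range of records considered "new" for the given collection
    method, then advance only that method's pointer."""
    start = pointers[method]
    pointers[method] = last_record_in_datalogger
    return range(start + 1, last_record_in_datalogger + 1)

# Records 1..100 exist in the datalogger. Collect Now retrieves 1..100, but a
# later Custom Collection still sees 1..100 as new because it has its own pointer.
print(list(collect_only_new("collect_now_or_scheduled", 100))[-1])  # 100
print(list(collect_only_new("custom_collection", 100))[0])          # 1
```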
Once you have started data collection with Collect Now, you can stop it by
clicking the Cancel button on the animated screen. This might be necessary if
you started a data collection that is bringing in more data than you really
wanted, especially over a slow communications link.
NOTE
While retrieving data from the datalogger using Custom
Collection, scheduled data collection will be suspended. The
directory where the files are stored for custom collection is
separate from the files for scheduled collection data and by default
is a Data directory under the LoggerNet directory (e.g.,
C:\CampbellSci\LoggerNet\Data).
File Format
This option is used to select the file format in which to store the collected data.
Appendix B, Campbell Scientific File Formats (p. B-1), provides information on
File Formats.
Collect
− Collect All will get all of the data available in the selected final storage
areas. For a datalogger that has a lot of data stored, this could result in a
large file and take a long time.
− Last Number of Arrays specifies how many of the last stored records
will be retrieved from the datalogger.
− Advanced allows the user to specify the memory pointer address for each
of the final storage areas. Data collection will begin at the specified
memory pointer and go through the last record collected. Use of this
option depends on knowing the memory pointer values for data collection.
Pointers can be manually reset but are updated and stored with each data
collection.
File Mode
The File Mode determines if the data to be collected will be added to the end of
the data file, if it exists, or if it will overwrite an existing data file. This option
only applies if a file with the same name exists in the directory specified.
− Append to End of File – When this option is selected, the data collected
using the Custom Collection option will be appended to the end of the file
from a previous Custom Collection.
NOTE CR500, CR7, and 21X dataloggers have only one final storage
area.
Each table is saved in a separate file so there will be one file created for each
table that is selected.
Collect Mode
− Newest Number of Records will retrieve the number of records specified
in the Starting Record Information area going back from the last record
collected.
− Data Since Last Collection will retrieve the data stored since the last time
a custom collection was performed. LoggerNet keeps track of the records
collected from each table every time a custom collection is executed. This
option will work even if the last custom collection used a different option.
− All the Data will get from the datalogger all the data available from all of
the selected tables. If the datalogger is full this could take a long time,
especially with large memory dataloggers or over slow communication
links.
− Data From Selected Date and Time uses the starting and ending time and
dates to get the data from the datalogger stored between those times. If the
datalogger does not have data for the specified time range, a blank file will
be created. (CR5000, CR9000, CR9000X, and CR200-series dataloggers
do not support this collection option.)
File Mode
File Mode is used to choose whether the collected data should be appended to
the file if it exists, overwrite the existing file, or create a new file. If Create
New File is selected and the named file exists, a new file will be created with
the specified file name and a sequence number added to it.
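The sketch below illustrates the three file modes just described. Where LoggerNet places the sequence number for Create New File is not specified here, so the naming pattern used in the sketch is an assumption.

```python
from pathlib import Path

def resolve_output_file(path: Path, file_mode: str) -> Path:
    """Sketch of the three file modes described above. The placement of the
    sequence number for Create New File is an assumption, not LoggerNet's
    documented naming convention."""
    if file_mode in ("append", "overwrite") or not path.exists():
        return path
    # Create New File: append an incrementing sequence number until unused.
    n = 1
    while path.with_name(f"{path.stem}_{n}{path.suffix}").exists():
        n += 1
    return path.with_name(f"{path.stem}_{n}{path.suffix}")

# e.g. resolve_output_file(Path("CR1000_OneMin.dat"), "new") could yield
# CR1000_OneMin_1.dat when CR1000_OneMin.dat already exists.
```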
File Format
This option is used to select the file format in which to store the collected data.
Appendix B, Campbell Scientific File Formats (p. B-1), provides information on
File Formats.
Format Options
Select the Include Timestamp check box to have timestamps included in your
data. If the check box is not selected, timestamps will not be included.
Select the Include Record Number check box to have record numbers
included in your data. If the check box is not selected, record numbers will not
be included.
When Midnight is 2400 is selected, the timestamp will reflect midnight as the
current date with 2400 for the Hour/Minutes. Otherwise, the timestamp will
reflect midnight as the next day's date, with the Hours/Minutes as 0000.
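As a worked illustration of the two options, the sketch below (not LoggerNet code) shows how the same midnight record could be rendered either way; the timestamp format string is an assumption chosen for readability.

```python
from datetime import datetime, timedelta

def format_timestamp(ts: datetime, midnight_is_2400: bool) -> str:
    """Render a record timestamp, showing midnight either as 2400 on the day
    just ended or as 0000 on the following date (sketch of the option above)."""
    if midnight_is_2400 and ts.hour == 0 and ts.minute == 0 and ts.second == 0:
        previous_day = ts - timedelta(days=1)
        return previous_day.strftime("%Y-%m-%d") + " 24:00:00"
    return ts.strftime("%Y-%m-%d %H:%M:%S")

midnight_record = datetime(2021, 3, 2, 0, 0, 0)
print(format_timestamp(midnight_record, True))   # 2021-03-01 24:00:00
print(format_timestamp(midnight_record, False))  # 2021-03-02 00:00:00
```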
When the Don’t Quote Strings check box is selected, strings in the data will
not be surrounded by quotation marks. If the check box is not selected, strings
will be surrounded by quotation marks. (Note that this option is only available for
the ASCII Table Data, No Header Output Format.)
Enabling the Use Reported Station Name check box will cause the station
name from the Status Table to be used in the header of the data files. If this
check box is not enabled, the network map station name will be used. (Note
that this check box affects only the header of the data files. It has no effect on
the filenames.)
Click the button next to the flag or port to turn it on or off. You do not have to
be actively connected to the datalogger to toggle a port or flag. If you are not
connected, when the port or flag is toggled, LoggerNet will connect to the
datalogger, make the change, and disconnect.
Program variables that are declared as Boolean can also be placed on this
display, for dataloggers that support data types. For these dataloggers, an Add
button is available that, when pressed, lists all of the tables in the datalogger.
When a table is highlighted on the left side of the window, any variables that
are declared as Boolean in the program will be displayed on the right side of
the window.
To return the Ports and Flags display to its original state, press the Defaults
button. This will reset all labels to their original names, update the number of
flags based on the currently running program (for CR6-series and CRX000
dataloggers), and remove any Boolean values placed on the screen (for CR6-
series and CRX000 dataloggers that support data types).
Different datalogger models have a different number of ports and flags. The
Ports and Flags dialog box will display only those ports and flags available for
the datalogger type. Behaviors for each datalogger type are shown below.
• Mixed array dataloggers have a fixed number of ports and user flags that
are available. The ports and flags dialog box will display only the ports
and flags supported by the datalogger; no additional values can be added.
The Defaults button will reset any user-defined labels that have been
typed in.
• CR9000 and CR200 dataloggers do not have ports that can be toggled
from this display. They also do not have predefined flags or support the
declaration of variables as Boolean. The Ports and Flags dialog will
display one to three columns, depending upon the number of flags defined
in the program. Pressing Defaults will reset all labels to their original
names and update the number of flags based on the currently running
program.
NOTE A Boolean variable is a variable that can have one of two states:
high/low, off/on, –1/0, true/false. Variables for CRBasic
dataloggers can be declared as Boolean with the Public or Dim
statement.
NOTE A control port must first be configured for output in the datalogger
program before it can be toggled on or off. Consequently, if you
select a port and it doesn’t appear to change, your program may
not have the port configured for output (refer to your datalogger
operator’s manual). The CR500 and CR510 have two control
ports, but only one of the ports, control port 1, can be configured
for output. Therefore, control port 2 cannot be toggled on or off.
It is included on the display so that you can monitor its status.
Ports on the CR5000 and CR9000X cannot be controlled directly
with the Ports and Flags window. For these dataloggers, special
Flag settings tied to the ports must be set up in the datalogger
program to achieve the desired control.
You can customize the labels for the Ports and Flags display by clicking within
the label field and typing the desired text.
You can set the clock by clicking the Set button. LoggerNet attempts to set the
datalogger clock as closely as possible to the computer clock. A slight
difference in the clocks might exist after the clock is set because of the
communications time delay. Over some communication links it is impossible
to match the computer clock exactly. LoggerNet uses advanced compensation
to get the best possible synchronization between the computer and the
datalogger clocks.
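The manual does not detail LoggerNet's compensation method, so the sketch below shows only a generic round-trip correction of the kind commonly used when setting a remote clock. The two callables are hypothetical stand-ins for the actual clock transactions, not a Campbell Scientific API.

```python
import time

def set_clock_with_compensation(read_datalogger_clock, write_datalogger_clock):
    """Generic round-trip compensation sketch (not LoggerNet's documented
    algorithm): estimate the one-way link delay as half the measured round trip
    and add it to the PC time written to the datalogger. The two callables are
    hypothetical stand-ins for the actual clock transactions."""
    t0 = time.time()
    read_datalogger_clock()            # any command/response pair measures the round trip
    t1 = time.time()
    one_way_delay = (t1 - t0) / 2.0
    write_datalogger_clock(time.time() + one_way_delay)
```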
If you are not connected to the datalogger, or if you are connected but the clock
update is paused, you can press the Check button and LoggerNet will check
both the datalogger’s clock and the computer’s clock and display the results on
the screen.
An automatic scheduled clock check can be set up in the Setup Screen (see
Section 4.2.4.4.5, Clock Tab (p. 4-22)).
Setting the clock may affect the time stamps assigned to your data. Refer to
Section 4.2.6, Setting the Clock (p. 4-55), for further discussion.
NOTE Programs for the CR7, 21X, and the CR10(X), CR510, and
CR23X-series dataloggers must be compiled in the editor to create
the *.dld file that is downloaded to the datalogger. The CR200-
series datalogger also requires a precompiled file (*.bin), which
can be done in the editor or when the program is sent using
LoggerNet. CR6-series and CRX000 dataloggers compile their
program on-board.
After selecting a datalogger program file, a warning will appear to remind you
that data may be lost when the new program is sent. (For mixed-array
dataloggers, data is not lost if the memory configuration does not change;
sending a new program to table-based dataloggers always clears all data
memory.) If there is any data in the datalogger that has not been collected,
click Cancel to stop the program send, and collect the needed data.
If OK is selected at the warning, the progress bar will come up with the
program transfer progress. Once the program has been sent, the text changes to
Compiling Program. When the datalogger finishes compiling the program the
progress box will close and a Compile Results box will open. For CRBasic-
programmed dataloggers (excluding the CR200 series), this box will have a
Details button that can be pressed to bring up information about files and
tables stored in the datalogger.
The LoggerNet installation includes all of the compilers for the CR200 that
were available at the date of release. When sending a program, if you choose to
send the *.CR2 file, LoggerNet will first check the OS version of the
datalogger and then attempt to compile the *.CR2 file with the matching
compiler. If you choose to send the *.bin file, LoggerNet will not check the
CR200’s OS or precompile the file; it will just send the *.bin file. In this
instance, it is up to you to ensure that the *.bin file was created with the correct
precompiler.
This feature is useful if the original file has been corrupted, lost, or erased.
Note that programs may not be reliably retrieved over noisy or slow
communications links.
Programs created with Edlog version 2.0 and greater include both the input
location information and the final storage information in the *.dld file.
Previous versions of Edlog stored only the input location information in the
*.dld. If final storage information is not available for viewing in LoggerNet
after associating the file, you may need to recompile the program file with a
version of Edlog that stores this information in the *.dld file.
NOTE If you are using Edlog Version 2.0 or greater and labels are still
not available for use, check Edlog’s Options | DLD File Labels
menu item and ensure that labels are being stored in the file when
the program is compiled.
When a program for a table-based datalogger is compiled, a
program_name.TDF file is created along with the original program file. This
file contains the table definitions for that program. Associating the TDF file
with a datalogger can be useful if communication is taking place over a slow or
unreliable communications link where the attempt to receive table definitions
back from the datalogger fails.
Final storage data from mixed array dataloggers is retrieved only when
data collection from the datalogger occurs (initiated manually from the
Connect Screen or based on a schedule). Therefore, the final storage
information on the data displays will be updated only as often as data
collection is performed for these dataloggers. Input locations do not have
to be scheduled for collection to be displayed. When connected, these
values are updated based on the update interval of the display (but limited
by how fast measurements are actually being made in the datalogger).
When connected, data from table data dataloggers is updated based on the
Update Interval. (This is referred to as real time monitoring.) Note that
data can be updated no faster than the data values are being generated by
the datalogger. When not connected, data from table data dataloggers is
updated only as often as data collection is performed. (This is referred to
as passive monitoring.) Therefore, for input locations or public variables
to be updated when not connected, they must be included for scheduled
collection.
NOTE If clock updates and display updates are paused while connected
to a datalogger, the connection may time out and terminate the
connection.
There are three Numeric Display screens and three Graphical Display screens.
Each screen is launched as a separate window that can be moved or resized as
needed. The display screens are not minimized if the Connect Screen is
minimized, but they can be minimized independently.
Table-data Edlog dataloggers (CR10X-TD types) are
limited to a maximum of 500 input locations by packet size and record format.
CR1000X-series, CR6-series, CR300-series, GRANITE 6, GRANITE 9,
GRANITE 10, CR1000, CR3000, CR800-series, CR5000, CR9000, and
CR9000X dataloggers don’t have practical limits for the number of Public
Variables that can be requested or displayed.
Both mixed-array and CRBasic dataloggers will transfer only the requested
input locations or public variables. However, CR10X-TD type dataloggers
must transfer the entire input locations record in one packet. Because of packet
size, if more than 500 input locations are used in the datalogger program the
record cannot be collected.
Selecting a table name or final storage array ID will bring up a list of data
fields in the right hand window. Select the fields to add by clicking the data
field names. Multiple data fields can be selected by holding down the Shift or
Ctrl key while clicking additional names. An entire table or array can be
selected by clicking the table name.
The selected data fields can be added to the display either by clicking the Paste
button to enter them on the display starting at the selected cell, or dragging the
selected fields to the display cells. The Add Selection dialog can be kept in
front of other windows by clicking the Stay on Top check box.
Once the fields have been added to the Numeric Display and the Start button
has been pressed, they will update automatically as new data is collected from
the datalogger. Once Start has been pressed, the name of the button changes to
Stop, and it can then be pressed to stop monitoring the data values.
To display any units that have been assigned to data values, select the Show
Units check box.
To delete data fields from the Numeric Display, select the data fields on the
display and press the Delete button. You can delete all data fields from the
display using the Delete All button. Adding new data fields on top of existing
fields in the display will overwrite the existing fields.
Display Tab
Format – Allows you to specify the format in which the data value is
displayed. You can choose to have it automatically formatted, to specify the
number of decimal places up to a maximum of seven, or to display the data
value as a timestamp.
Boolean Options
True Text/False Text – Allows you to choose the text that will be displayed
for Boolean values (such as ports and flags, or variables declared as Boolean).
By default the strings are True/False, though they could be set to High/Low,
On/Off, etc.
Timestamp Options
Show Dates – When this check box is enabled, the date will be included in
timestamps that are displayed on the numeric monitor. Otherwise, only the
time will be displayed.
Color Options
Cell Color – Defines the color to be used for the background of the data value
name. Press the button to the right of the color square to define a new color.
Text Color – Defines the color to be used for the data value name. Press the
button to the right of the color square to define a new color.
Data Value – Determines whether the data value will be flush with the right or
left side of the cell.
Alarms Tab
Enable Alarms – Alarms can be set to turn the background of a field a
different color depending on the value of a data point. Select the Enable
Alarms check box to turn on the alarm feature. The alarm levels can be set for
one or more selected cells by entering a value in the High Alarm Trigger Value
field and the Low Alarm Trigger Value field. When the data value displayed in
a cell exceeds the limits set, the cell color will change based on the colors
selected for each of the alarm values. If the Enable Sound check box is
selected, the indicated sound file will be executed when an alarm is triggered.
The sound file will be repeated based on the Sound Alarm Interval while the
alarm is active.
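The sketch below illustrates the alarm evaluation just described: a value above the high trigger or below the low trigger changes the cell color. The color names in the sketch are assumptions, not LoggerNet defaults.

```python
def alarm_color(value, low_trigger, high_trigger,
                normal="white", low_color="blue", high_color="red"):
    """Sketch of the alarm behavior described above: return the cell background
    color for a data value given the low/high trigger values (the color names
    here are assumptions, not LoggerNet defaults)."""
    if value > high_trigger:
        return high_color
    if value < low_trigger:
        return low_color
    return normal

print(alarm_color(25.0, low_trigger=0.0, high_trigger=30.0))  # white (no alarm)
print(alarm_color(35.0, low_trigger=0.0, high_trigger=30.0))  # red (high alarm)
```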
Setup Tab
Number of Rows/Columns – Allows you to configure the number of rows or
columns for a numeric display. The maximum number of rows is 100. The
maximum number of columns is 10. The maximum number of total cells is
300. After changing the number of rows or columns, you may need to resize
the numeric display window (by dragging one of the window’s corners) for the
font size and cell size to properly accommodate the values.
Restore Defaults – Select this button to restore the numeric display to the
default configuration of 18 rows and 3 columns.
5.1.7.2.4 Font
The font size on a Numeric Display cannot be changed directly. However, you
can drag a corner of the Numeric Display window to resize it. This will also
cause the font to be resized to correspond to the new window size.
Pressing the Graphs button and selecting one of the numbered graphs will
bring up a graphical display. An example is shown below. If a Graphical
Display screen is already active but hidden behind other windows, selecting it
from the drop-down list will bring it to the front.
Once data fields have been added and the Start button has been pressed, the
graph will start plotting data automatically. If there is historical data for the
selected fields in the data cache, it will be displayed on the graph. Note that
Input Location or Public table data does not have history stored in the data
cache; therefore, no historical data will be displayed.
To delete data fields from the Graphical Display, select the data fields on the
Graphical Display and press the Delete button. Existing data fields can be
replaced by adding new data fields to the same cells.
Once the Start button has been pressed, the name of the button changes to
Stop, and the plotting of data can then be temporarily stopped by pressing this
button. The name of the button then changes back to Start, and it can be
pressed to resume plotting. The contents of the graph can be cleared by
pressing the Clear button.
The Rescale button is used to bring outlying data values back within the
vertical axis of the graph when using one of the Powers of 10 scaling options
(see below for information on Powers of 10 scaling).
The Graph Width field determines the amount of time displayed across the
width of the graph. The default setting of 1 minute will display 1 minute of
data. If larger graph widths are specified, the graph will backfill to plot
whatever data is available in the data cache.
The Drawing Mode determines how data is plotted on the graph. The choices
are Strip Chart or Shift Data. In Strip Chart mode, the data will stream across
the graph. After the graph is filled, the oldest points will fall off the left edge of
the graph as new points are added to the right edge. If Shift Data is chosen, the
data will be positioned in a static location. Once the graph is filled, the data on
the graph will be shifted over. The size of this shift and, therefore, the amount
of data that will be removed from the graph is determined by the percentage
specified in the Shift % field.
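The difference between the two drawing modes can be pictured with the buffer sketch below. It is an illustration of the behavior described above, not RTMC or LoggerNet code; the buffer size and shift percentage are example values.

```python
def add_point(buffer, point, max_points, mode="strip", shift_percent=20):
    """Sketch of the two drawing modes described above. In Strip Chart mode the
    single oldest point falls off as each new point arrives; in Shift Data mode
    a full buffer is shifted by the given percentage before the new point is added."""
    if len(buffer) >= max_points:
        if mode == "strip":
            buffer.pop(0)                                  # drop only the oldest point
        else:
            drop = max(1, int(max_points * shift_percent / 100))
            del buffer[:drop]                              # drop a block of the oldest points
    buffer.append(point)
    return buffer

data = list(range(10))                                     # graph already full
print(add_point(data, 10, max_points=10, mode="shift"))    # oldest 20% removed first
```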
To display any units that have been assigned to data values, select the Show
Units check box.
Scaling Tab
The Scaling options have tabs to set up the scale for the left and right axes.
The axes can be scaled automatically, fixed to a specific range, or set to
Powers of 10: 0 to 10 or Powers of 10: –10 to 10.
The left and right axes are set independently. Data values can be set to graph
relative to the right or left axis by customizing the Trace Options (see below).
The axes scales are only shown on the graph if there are data fields linked with
them.
Automatic Scaling – Adjusts the axis according to the values currently being
displayed on the graph. The maximum and minimum will be set to display the
largest and smallest values of all the fields being graphed. If there are values
that are much larger or much smaller than the others, they can dominate the
scaling and make variations in the other fields harder to see.
Powers of 10: 0 to 10 – Data values will be scaled so that they fit on a graph
ranging from 0 to 10. Negative values will not be displayed. Each trace is
scaled based on its maximum value. If that value is greater than 10, all of the
points in the series are divided by 10 until the maximum value is less than or
equal to 10. If the maximum value is greater than 0 but less than or equal to 1,
all of the points in the series are multiplied by 10 until the maximum value is
greater than 1. Note that scaling occurs when the Apply button is pressed on
the dialog box. Rescaling does not automatically occur if the maximum value
goes out of the range. However, you may press the Rescale button to
recalculate the scale at any time.
Powers of 10: –10 to 10 – Data values will be scaled so that they fit on a graph
ranging from –10 to 10. Each trace is scaled based on its maximum and
minimum values. If the maximum value is greater than 10 or if the minimum
value is less than –10, all of the points in the series are divided by 10 until the
maximum value is less than or equal to 10 and the minimum value is greater
than or equal to –10. If all of the points in the series are greater than or equal to
–1 and less than or equal to 1, all of the points in the series are multiplied by 10
until at least one value is outside that range. Note that scaling occurs when
the Apply button is pressed on the dialog box. Rescaling does not
automatically occur if the values go out of the range. However, you may press
the Rescale button to recalculate the scale at any time.
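The sketch below condenses the two Powers of 10 rules just described into a single function: each trace is repeatedly divided or multiplied by 10 until it fits the target band. It is a minimal illustration of the stated rules, not LoggerNet code, and the example trace values are arbitrary.

```python
def scale_powers_of_ten(points, symmetric=False):
    """Sketch of the Powers of 10 scaling rules described above: 0 to 10, or
    -10 to 10 when symmetric=True. Each trace is repeatedly divided or multiplied
    by 10 until it fits the target band; the factor applied is returned as well."""
    scaled, factor = list(points), 1.0

    def too_big(vals):
        return max(vals) > 10 or (symmetric and min(vals) < -10)

    def too_small(vals):
        if symmetric:                       # every point within -1..1, not all zero
            return any(vals) and all(-1 <= v <= 1 for v in vals)
        return 0 < max(vals) <= 1           # maximum greater than 0 but at most 1

    while too_big(scaled):
        scaled = [v / 10 for v in scaled]
        factor /= 10
    while too_small(scaled):
        scaled = [v * 10 for v in scaled]
        factor *= 10
    return scaled, factor

# A trace peaking at 980 is divided by 100; a trace peaking at 0.09 is multiplied by 100.
print(scale_powers_of_ten([250, 400, 980]))
print(scale_powers_of_ten([0.02, 0.04, 0.09]))
```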
Custom Scaling – The Max Value and Min Value fields are used to specify a
fixed range for the Y axis. When this option is in effect, the Rescale button on
the Graph will be disabled. Any data values that do not fall within the specified
range will not appear on the graph.
Buffer Data – Determines whether data coming into the graph is buffered.
Select the Buffer Data check box to buffer data. Clear the check box to
prevent data from being buffered.
When the Buffer Data option is on, a stopped graph can be scrolled backward
in time for the number of pages specified. The paused graph will have arrows
which appear to facilitate moving backward or forward one page at a time,
moving to the earliest page, or moving to the latest page. You can also
manually navigate through the buffered data by right-clicking and dragging to
move backward or forward in time.
Display Options – These options are used to set the look of the graph itself.
The Background Color selects the color of the graph background. This is white
by default. Be careful to select a background color that does not make any of
the data traces disappear.
The Left / Right Grid Colors select the color of the grid lines that go with the
left or right axis scale. The X-Axis color sets the color of the X-Axis.
Data Direction – This option determines how the data values will populate the
graph – right to left (oldest data on the left and newest data on the right) or left
to right (oldest data on the right and newest data on the left).
Title – Allows you to place a descriptive title over the top edge of the graph
and for each axis. Enter the text of the title in the text box, then click the Show
Title check box to make the title visible. Click the Font button to choose the
font style, size and color for the title.
Ignore NAN – If this option is selected, NAN values are not represented on the
graph. There will be one continuous line in which the data points adjoining the
NAN values are bridged.
Plot NAN as gap – When this option is selected any NAN value in the data
will be represented by a discontinuity. This means that the data points on either
side of the NAN will not be connected by a line. There will be breaks in the
line for each NAN in the data.
Plot NAN as value – When this option is selected, each NAN value in the data
will be represented by the specified value.
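The three NAN options can be illustrated with the sketch below, which applies each treatment to a small list of samples. This is an illustration of the described behavior only, not RTMC or LoggerNet code.

```python
import math

def prepare_trace(values, nan_mode="gap", nan_value=0.0):
    """Sketch of the three NAN options described above, applied to a list of
    samples: "ignore" drops NAN points so neighbours are bridged by a continuous
    line, "gap" leaves a break (None) at each NAN, and "value" substitutes a
    fixed number."""
    out = []
    for v in values:
        if isinstance(v, float) and math.isnan(v):
            if nan_mode == "ignore":
                continue                    # point omitted; the line bridges across it
            out.append(None if nan_mode == "gap" else nan_value)
        else:
            out.append(v)
    return out

samples = [1.0, 2.0, float("nan"), 4.0]
print(prepare_trace(samples, "ignore"))       # [1.0, 2.0, 4.0]
print(prepare_trace(samples, "gap"))          # [1.0, 2.0, None, 4.0]
print(prepare_trace(samples, "value", -99.0)) # [1.0, 2.0, -99.0, 4.0]
```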
Configuration Tab
The Configuration tab of the Graph Display Options dialog box is used to
Save or Load graph configurations.
Save Config – Saves the graph configuration to a file that can be loaded in the
future.
Display Tab
Color – Sets the color of the trace and the data points. The user can choose
from the Windows color palette for the color. The color for this trace is shown
in the color window.
Select Axis – Sets whether the data trace is displayed on the left or right axis.
Line Width – Sets the width of the trace. Wider traces are easier to see but
may obscure other traces or small variations of the data.
Symbol Style – Selects the appearance of the individual data points. Data
points are displayed on the line in the Graph Options, Visual Display tab (see
Graph Options above).
Show Symbol – Determines whether symbols are displayed for this data value.
Select the check box to have symbols displayed. If the check box is cleared, no
symbols will be displayed.
Marks Tab
Marks are the labels for the data points on the graph. The items on this tab
affect how the marks appear on the graph.
Show Marks – When selected, the labels for the data points are displayed on
the graph. When cleared, they are not displayed and all other items on this tab
are disabled.
Draw Every – Determines how often the data points should be labeled. When
set to 1, every data point will be labeled. When set to 2, every second data
point will be labeled, and so on.
Round Frame – When selected, the frame that surrounds the data label will
have rounded corners. When cleared, the frame will have angled corners.
Transparent – When selected, a frame will not be displayed for the data
labels.
Color – Defines the color of the background for framed data labels. Press the
button to the right of the color square to define a new color.
Data – Determines the type of label to be displayed for a data point. Value
displays the numeric value for the data point. Timestamp displays the time for
the data point. Value and Timestamp displays the time and the numeric value
for the data point.
Right Click Within the Graph – Displays a shortcut menu with items for
saving, printing, and formatting the graph:
Rescale – Scales the data values so they are all displayed within the graph
boundaries. This option is available only when Powers of 10 Scaling is
chosen for the graph.
Print Preview – Displays a preview of the printed page with the ability to
set the paper orientation, page margins, and other print properties.
Print – Brings up the standard Windows Print dialog box so that the graph
can be printed.
Right Click on a Table Cell – Displays a shortcut menu with options specific
to traces.
Add – Brings up the Add Selection dialog box from which you can add a
trace to the graph.
Rename – Puts the name of the field into a state in which it can be edited.
Do Not Plot – Stops the trace from being plotted on the graph. A check
mark appears beside the Do Not Plot menu item for a trace that will not be
plotted. Record numbers and timestamps are not plotted.
Delete All – Resets all settings for the traces on the graph. This will
remove all traces from the graph.
Select All – Selects all traces on the graph. This allows options for all
traces to be set at once.
Show Symbol – Determines whether symbols are displayed for this data
value (symbols are configured from the Trace Options dialog box.)
Trace Options – Displays a dialog box that lets you set the color and
appearance of the trace.
Zoom and Scroll – An area of the graph can be zoomed in on by using the
mouse pointer to draw a box around the area to be viewed. (Place the mouse
cursor in the area for the upper left of the box, press the mouse button, and hold
and drag the mouse pointer to the desired bottom right corner of the box.) To
return to normal view, press the Undo Zoom button in the upper right corner
of the graph. (You may also press the mouse button, hold and drag the mouse
pointer up and to the left to return to normal view.)
If you have stopped a graph and zoomed in on a region, you can use the right
mouse button to drag the screen and thus scroll to other locations of the graph
at the current zoom level.
To begin, select the table you wish to monitor from the drop-down list. The
fields of the specified table will be displayed in the Field/Value grid.
When data is being monitored, you can press the Stop button to stop the
monitoring of data. The text on the button will change to Start and it can be
pressed to start monitoring data again.
The Interval determines how often data in the table monitor will be updated.
This interval applies only when you are actively connected to a station (by
pressing the Connect button).
When you are not actively connected to a station, the table monitor will be
updated only when data is collected (on a schedule or by pressing the Collect
Now or Custom buttons).
These dataloggers retain in memory the programs that have been downloaded
to them unless the programs are specifically deleted or the datalogger memory
is completely reset.
File Control is used to manage all the files on these dataloggers. File Control is
opened from a button on the Connect Screen or from the Connect Screen’s
Datalogger | File Control menu item.
The File Control window displays a list of files stored on the datalogger’s
CPU, PC card, USB, or USR drive. The window on the left lists all of the data
storage devices available for the selected datalogger (CPU, CRD, USB, CS9,
or USR). Selecting a device shows a list of the files stored there.
The Run Options for a file indicate whether it is set to running, power up, or
both. The currently executing program is indicated by the running attribute.
The file that will be run when the datalogger is powered up is indicated by
power up. The file size is displayed, as well as the last time the file was
modified and the file attributes which indicate whether the file is Read Only
(R) or Read/Write (RW). Note that the Size, Modified date, and Attributes may
not be available for all dataloggers.
At the bottom of the right-hand side of the window is a summary box that
indicates the Running Program, the Run On Power Up Program, the current
Program State (running, stopped, or no program), and the last compile results.
There are several options to work with the files and directories on the
datalogger.
Send is used to transfer files from the computer to the datalogger. Clicking the
Send button brings up a standard file selection dialog box. A new file can be
chosen to send to the highlighted device.
Datalogger programs, data files, and other ASCII files can be sent to the
datalogger.
Format is used to format the selected device. Just like formatting a disk on
a computer, all of the files on the device are deleted and the device is
initialized.
Refresh will update the list of files for the selected device.
Retrieve will get the selected file from the datalogger and store it on the
computer. A Save As dialog box comes up allowing you to specify the
directory and file name for the saved file.
NOTE File Control should not be used to retrieve data from a CF card
created using the CardOut instruction. Using File Control to
retrieve the data file can result in a corrupted data file.
Run Options brings up a dialog box that is used to control what program will
be run in the datalogger. Highlight a file, and then select the Run Options
button. From the resulting dialog box, select the run options.
Run Now
The Run Now run options are different for the different datalogger
types.
CR1000X-Series/CR6-Series/CR300-Series/GRANITE 6/
GRANITE 9/GRANITE 10/CR1000/CR3000/CR800-Series
Datalogger Run Now Options
When Run Now is checked, the program is compiled and run in the
datalogger. You may choose to preserve existing data tables on the
datalogger’s CPU if there has been no change to the data tables
(Preserve data if no table changed) or to delete data tables on the
CPU that have the same name as tables declared in the new program
(Delete associated data tables).
To summarize, any change in data table structure will delete all tables
on the datalogger’s CPU, regardless of whether or not the Preserve
Data option was chosen. If the Preserve Data option was chosen but
the datalogger was unable to retain the existing data, a message to that effect
will be displayed.
The Run Now options and behavior for the CR9000(X) and CR5000
dataloggers are different from the CR1000X-series, CR6-series,
CR300-series, GRANITE 6, GRANITE 9, GRANITE 10, CR1000,
CR3000, and CR800-series dataloggers. Below is a dialog box for a
CR5000 datalogger.
When Run Now is checked, the program is compiled and run in the
datalogger. All data tables on the CPU are erased. You have the
option of whether or not to erase data files stored on a card.
Run On Power-up
The file will be sent with the Run On Power-up attribute set. The
program will be run if the datalogger loses power and then powers
back up.
Run Always
Run Now and Run On Power-up can both be selected. This sets the
program’s file attribute in the datalogger as Run Always. The
program will be compiled and run immediately and it will also be the
program that runs if the datalogger is powered down and powered
back up.
Delete – Highlight a file and press the Delete button to remove the file from
the datalogger’s memory.
If you select the option to Delete Data, you also have the option of whether or
not to clear the Run On Power-up option for the file. Select the check box to
clear the Run On Power-up option. Clear the check box to leave the Run On
Power-up option of the file unchanged.
When a file name is selected, pressing the right mouse button displays a menu
with the Retrieve File, Delete File, Rename File, View File (retrieves the file
and opens it in the CRBasic Editor), and Run Options choices.
NOTE The View File option can be used to edit a program on your
datalogger. After making the desired edits and saving it to your
computer, you will need to send the edited program to the
datalogger.
To view Station Status press the Station Status button or select Datalogger |
Station Status from the menu on the Connect Screen. A window similar to the
one below will be displayed.
NOTE When you connect to a station, LoggerNet checks for Status Table
errors. If the station has Status Table errors (skipped scans,
skipped records, and so forth), a yellow exclamation point will be
added to the Station Status button. Once you click on the Station
Status button, this indicator will be removed.
The window has three tabs. The Summary tab provides an overview of
important status information in the datalogger, including the information about
the datalogger model and its firmware, program details, battery voltage levels,
and card memory (if one is present).
The Table Fill Times tab lists the tables in the datalogger, along with the
maximum number of records the table can hold and the estimated amount of
time that it will take the table to fill. A data table can be reset from this window
by pressing the Reset Tables button.
NOTE Resetting a table will erase the data in the datalogger and in the
data cache.
The Status Table tab lists all of the status table fields in the datalogger along
with their values. By default, all of the fields in the status table are displayed.
To select only certain status data to be viewed, press the Select Fields button.
This will display a list of the status data available in the datalogger. Select one
or more of the fields and then press OK. The current values will be displayed
in the table. If you select a cell within the status table and right click, a short
cut menu will be displayed. From this menu, you can select fields or
view/modify a value (if it is a writable value).
Press Refresh to prompt LoggerNet to query the datalogger and update the
values again, the Print button to print the information in the current tab, or the
Save button to save the information in the tab being displayed to a file. (Note
that you cannot save or print the information on the Table Fill Times tab.)
Refer to individual datalogger manuals for a list of fields included in the Status
Table for each datalogger and a description of each.
The program running in the datalogger must contain one or more FieldCal or
FieldCalStrain instructions for the variables you wish to calibrate. One function
of this instruction is to write a programname.cal file to datalogger memory
that contains the calibration values.
Project Component List – The panel on the left shows the hierarchy of the
display components and how they are associated with each other. Every
component of the display screen is shown in this list and it provides a shortcut
to get to any graphical component.
Project Workspace – The right panel is the display screen workspace. The
graphic components are placed in the workspace as they should appear on the
final display.
Component Toolbox – The toolbox on the top contains the display screen
components that can be placed in the workspace. Selecting a component and
clicking in the workspace places the component and brings up the Properties
window for that component.
Many images have been included with RTMC. The directory in which these
files are stored is C:\Campbellsci\Lib\RTMCMediaLib. Custom images can be
used as well; these should be placed in the media library directory to make
them available for RTMC’s use.
A project can contain multiple display screens, which appear as
tabs in the project. The size of the workspace (and the resulting run-time
window) can be changed by selecting Project | Configure WorkSpace.
NOTE When a display component is linked to a data value, the value will
be automatically updated on the display when data is collected by
LoggerNet on a schedule. If scheduled data collection is not set up
in LoggerNet or the selected data value is excluded from
scheduled collection, the values will not update and an
exclamation point will appear in the upper right corner of the
component. Input locations, ports and flags for mixed-array
dataloggers are collected at the scheduled collection interval or
any time a manual collection is done.
After a component’s properties have been set, select OK to enable the changes
and close the Properties window. Once the link to the data value has been
applied, if there is data available from LoggerNet for the component, the value
on the display will update.
Available Components
The following is an overview of the display components available. The online
help has detailed information about each of the components and their
properties.
Status Bar depicts the selected data value as a single vertical bar.
Chart displays one or more traces on a line graph. The time stamp
on the X axis reflects the server clock. Note that a difference in the
server clock and the datalogger clock, coupled with a small time
window for the chart, could result in no data being displayed.
Time displays the server time, server time at last data collection,
station time, station time of last record stored, or PC time.
File Menu
New Project starts a new RTMC project. The currently opened project will be
closed. If there are changes that have not been saved the user will be prompted
to save changes.
Open brings up the File Open dialog to open a previously saved project.
Save will save the changes in the current project to the RTMC project file. If
this is the first time the project has been saved, a Save As dialog will open to
select the file name and directory for the project file.
Save As brings up the Save As dialog to save the current project with another
name or in a different directory.
Save and Run Project saves the changes in the current project and displays it
in the run-time window.
Exit closes RTMC. If there are unsaved changes, the user will be prompted to
save changes before exiting.
Edit Menu
Cut/Copy/Paste are standard editing operations to take selected objects to the
Windows clipboard and paste them into RTMC or other applications.
Select All selects all of the components in the workspace. The components can
then be cut, copied, deleted, grouped, etc.
The Preferences menu item is used to change some global settings that affect
all projects in RTMC. The Visual Theme determines the look and feel of the
application (e.g., colors, button appearance). The Working Directory is the
directory in which to store RTMC project files. By default, this is
C:\Campbellsci\RTMC. Press the Change Default Font button to set a new
font for components that have text (numeric value text, graph titles and axes
labels, etc.).
Component summaries are small boxes that are displayed on the screen beside
a component when your mouse cursor hovers over the component for a few
seconds. The box displays information on the type of component, the data
value linked to the component, images used, traces plotted, etc. Select the
Show Component Summaries box to display these hint boxes or clear the box
to turn off the display of the information.
The Grid Options settings allow you to turn on or off the display of a grid
in the project workspace and let you set the size of the grid.
With the Graphics Options settings, you can control the maximum number of
times the RTMC screens will be updated per second, disable animation when a
data value changes, and specify whether high quality or high speed is more
important. (Disabling animation disables the smooth transition between values
on gauges, status bars, etc. When a data value changes, the component will
jump to the new value. This greatly enhances performance when dealing with
fast data or large, complex projects.)
The Customize menu item allows you to customize RTMC’s toolbars and
menus.
View Menu
All of the View menu items are toggles. When a check mark appears to the left
of the menu item, it is enabled. When the check mark is absent, the option is
disabled. If an option is off (unchecked), select it once to turn it on (checked)
and vice versa.
Full Screen Mode expands the RTMC workspace to fill the entire computer
screen. This provides more space to work with in designing your RTMC
project. In this mode, you must use the right-click menus to add components
and perform other functions available from RTMC’s toolbar. Press the Esc key
to exit this mode.
Show Project Tree hides or displays the RTMC Project Tree (left pane of the
default window).
Show Layout Toolbar hides or displays the Layout Toolbar which gives quick
access to the Align, Space Evenly, Make Same Size, Center, and Order menu
items of RTMC’s Component menu.
Show Tabs hides or displays the tabs which allow the user to switch between
screens. When tabs are not displayed, you can switch between screens by
selecting a screen from the Project Tree.
Show Status Bar hides or displays the Status bar at the bottom of the screen.
The Status Bar provides hints on objects, window size, and the server
connection.
Project Menu
Project Menu options work with the whole project or workspace.
Configure Auto Tabbing lets you enable or disable the automatic switching
between project tabs when an RTMC form is run, and set the rate at which a
new tab will be displayed. When RTMC is in AutoTab mode, it will display a
tab for a set amount of time and then display the next tab.
Change Screen Order allows you to change the order in which the screens will
appear, left to right, in the project.
Screen Menu
Screen Menu options work with the tabbed screens in the project. The Screen
Menu is also available by right clicking any blank area of the workspace.
Screen Properties brings up the dialog to choose the background image and
color for the current screen.
Delete Screen removes the current screen from the project. If there are
components on the screen, they will also be removed.
Rename Screen brings up a dialog to change the name of the current screen.
This is the name that appears on the screen tab in run-time mode.
Duplicate Screen creates a duplicate of the active screen. Once the duplicate
screen is created, it can be modified as needed. This allows you to easily create
multiple similar screens, for example, screens that display the same
information for different stations.
Paste places a copy of the Windows clipboard content on the active screen.
(Note that only valid XML code can be pasted into RTMC.)
Insert New brings up a submenu allowing you to select one of the components to
insert on the screen. When the component is added to the screen, the Properties
window for the new component will come up.
Component Menu
The Component Menu is used to set the component properties, placement and
alignment. The Component Menu is also available by right clicking any of the
components in the workspace.
Lock Aspect Ratio allows you to drag the object to a new size without
distorting the look of the component. If the height of a component is changed,
the width will automatically be changed as well.
Rename Component lets you change the name of the component in the list
tree.
Manual Resize lets you set the size and position of the selected component.
Cut deletes the selected component and places a copy on the Windows
clipboard.
Paste places a copy of the Windows clipboard content on the active screen.
(Note that only valid XML code can be pasted into RTMC.)
Align provides some options for lining up a group of components with the last
component selected. Select two or more components by using the cursor to
click and drag a bounding box around the desired components. Components
can also be selected by selecting the first component and then selecting the
other components while holding down the <ctrl> key. With the components
selected choose one of the alignment options. The components will be aligned
based on the last component selected. The last component is identified by dark
blue handles and by dark blue highlighting in the Project Tree. The other
selected components have handles with blue outlines and are highlighted in
light blue in the Project Tree.
NOTE Be careful about the alignment you choose. Selecting Top Align
for a group of components that are arranged vertically will cause
all the components to end up on top of each other.
Make Same Size allows you to set two or more objects to the same overall
size, width, or height as the last object selected. Select two or more components
by using the cursor to click and drag a bounding box around the desired
components. The components can also be selected by selecting the first
component and then selecting the other components while holding down the
<ctrl> key. The last component is identified by dark blue handles and by dark
blue highlighting in the Project Tree. The other selected components have
handles with blue outlines and are highlighted in light blue in the Project Tree.
Group Selection allows you to group components together. They can then be
moved, copied, ordered, etc. as a single object. Select the components to be
grouped by holding the Ctrl key and clicking the components with the primary
mouse button. Then choose the Group Selection item from the Component
menu or the Component right-click menu. You must have at least two
components selected for this menu item to be enabled.
When a component group is selected, the Ungroup Selection menu item will
be enabled. You can undo the component grouping by selecting this menu
item.
When components are grouped, the properties for each of the components will
show up as an item in the Component right-click menu. These menu items can
be used to modify the properties for each component.
If there are multiple screens in the project, the Window Menu will allow you
to change between the screens.
Help Menu provides access to help for all of the features of RTMC.
5.2.1.4 Expressions
The data values that components display, either numerically or graphically, can be
processed using expressions. These expressions can include simple
mathematical expressions, functions to manipulate strings, or more complex
functions that deal with the state of a data value over time. For example, the
following expression converts a temperature data value from degrees Celsius to
degrees Fahrenheit:

"Server:CR5000.TempData.Temp1" * 1.8 + 32
Strings
As shown above, double quotes are used in RTMC to enclose the name of a
data value (or source, datalogger, or table depending on the component).
Therefore, when defining a literal string, a dollar sign is used as a prefix. This
indicates to RTMC that you are defining a literal string rather than a data value.
For example, to search for the position of the sequence abc in the data value
mystring, you would use the following expression:
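A sketch of such an expression follows. It assumes RTMC provides an InStr string
function whose parameters are a starting position, the string to search, and the
pattern to find; the function signature and the data value name shown are
assumptions for illustration, not taken from this manual:

InStr(1,"Server:CR1000.MyTable.mystring",$"abc")

Here the dollar sign marks $"abc" as a literal string, while the double-quoted
name refers to the mystring data value.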
Expressions can also use Statistical Functions, some of which involve the state
of a data value over a period of time. For instance, you can return the
maximum value of a data value over the past 24 hours using the expression:

MaxRunOverTime("Server:CR1000.QtrHour.Temp",Timestamp("Server:CR1000.QtrHour.Temp"),nsecPerDay)

A start option can also be included to control how much historical data is
requested from the LoggerNet server. For example, the following expression
requests the most recent 24 hours of records, in the order they were collected,
before calculating the maximum:

StartRelativeToNewest(nsecPerDay,OrderCollected);MaxRunOverTime("Server:CR1000.QtrHour.Temp",Timestamp("Server:CR1000.QtrHour.Temp"),nsecPerDay)
Aliases

To make long data value names easier to work with in an expression, an alias can
be defined for a data value and then used in its place. The Alias function takes
the form:

Alias(alias_name, data_value)

For example, the expression:

StartAtOffsetFromNewest(5,OrderCollected);IIF(ABS("Server:CR1000.MyTable.Value" - ValueAtTime("Server:CR1000.MyTable.Value",TimeStamp("Server:CR1000.MyTable.Value"),30*nsecPerSec,0)) > 10 AND ABS(ValueAtTime("Server:CR1000.MyTable.Value",TimeStamp("Server:CR1000.MyTable.Value"),30*nsecPerSec,0) - ValueAtTime("Server:CR1000.MyTable.Value",TimeStamp("Server:CR1000.MyTable.Value"),60*nsecPerSec,0)) > 10,1,0)

can be written more compactly by defining the alias X for the data value:

Alias(X,"Server:CR1000.MyTable.Value");StartAtOffsetFromNewest(5,OrderCollected);IIF(ABS(X - ValueAtTime(X,TimeStamp(X),30*nsecPerSec,0)) > 10 AND ABS(ValueAtTime(X,TimeStamp(X),30*nsecPerSec,0) - ValueAtTime(X,TimeStamp(X),60*nsecPerSec,0)) > 10,1,0)

Both expressions return 1 if the data value has changed by more than 10 over the
most recent 30 seconds and also changed by more than 10 between 30 and 60
seconds ago; otherwise they return 0.
Synchronizing Variables
The ValueSynch function can be used to synchronize data values coming from
multiple data sources so that you can display the results of a calculation on
those data values in a single component. The ValueSynch function takes the
form:
ValueSynch(synchronized_name, data_value)
For example, if you wish to display the average air temperature of two stations
on a chart, the following expression can be used to synchronize the timestamps
of the stations and then calculate the average air temperature:
ValueSynch(air_temp_1,"Server:CR1000_1.SECOND.air_temp");ValueSynch(air_temp_2,"Server:CR1000_2.SECOND.air_temp");(air_temp_1 + air_temp_2) / 2
If the timestamps of the stations are not the same (for example, if
one datalogger is a few minutes behind the other), the component
will display the exclamation point indicating no data, until the data
sources have common timestamps and, therefore, can be
synchronized.
All of the functions available in RTMC are described below. For details on a
function, refer to RTMC’s online help.
5.2.1.4.1 Operators
Operator Description
() Prioritizes parts of an expression within the larger expression.
* Multiply by
/ Divide by
^ Raised to the power of
+ Add
– Subtract
= Equal
<> Not equal
> Greater than
< Less than
>= Greater than or equal to
<= Less than or equal to
The order of precedence for these operators, from highest to lowest, is:
• Exponentiation ^
• Negation (unary) –
• Multiplication *, division /
• Addition +, subtraction –
When consecutive operators have the same priority, the expression evaluates
from left to right. This means that an expression such as a-b-c is evaluated as
(a-b)-c.
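For example, based on this precedence, the expression 10 – 2 ^ 2 * 3 is evaluated
as 10 – ((2 ^ 2) * 3), which equals –2, while (10 – 2) ^ 2 * 3 forces the
subtraction to be performed first and evaluates to 192.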
The following predefined constants can also be used in expressions:

Constant Description
e 2.718282
PI 3.141593
True –1
False 0
NOPLOT NAN
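For example, the NOPLOT constant can be used to keep questionable values from
being plotted on a chart. A sketch, assuming the component skips values equal to
NOPLOT and using a hypothetical data value name:

IIF("Server:CR1000.OneMin.WS_ms" >= 0,"Server:CR1000.OneMin.WS_ms",NOPLOT)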
5.2.1.4.5 Functions
The following functions show the use and placement of the numbers the
function operates on. The parentheses are not required unless there are two or
more parameter values. (e.g., ATN2(y,x))
The User Name and Password are only used if you are connecting to a
LoggerNet server that supports security and the network administrator has
implemented security.
Clicking Remember username and password saves the computer address,
username, and password as part of the RTMC file so that the screen can be run
(for example, in RTMC Run-Time) without requiring the user to know this
information.
When the run-time display screen is started, the display components will have
a red exclamation point in the upper right corner until data is received from
LoggerNet. If data is not displayed, check to see that the data is being collected
on a schedule by LoggerNet.
Once a project file has been created, the display screen can be run without
starting the development mode window. Select Data | RTMC Run-Time from
the LoggerNet toolbar. In the Run-Time window select File | Open to select
the RTMC project screen to run.
In Run-time mode, you can print an image of the RTMC display screen by
selecting File | Print Screens. A new form to be run is selected under File |
Open.
A copy of RTMC Run-Time comes with LoggerNet. If you want to run RTMC
projects on remote computers, additional copies of RTMC Run-Time can be
purchased separately. One copy is required for each computer on which RTMC
Run-Time will be used. As noted above, when running RTMC Run-Time on a
remote computer, the host computer must have Remote Connections enabled
(LoggerNet Toolbar, Tools | Options | Allow Remote Connections).
Section 6. Network Status and
Resolving Communication Problems
LoggerNet provides several tools for monitoring the status of a datalogger network and
troubleshooting communication problems within that network.
The Status Monitor screen provides a way to monitor communications statistics. Statistics
are displayed for data collection attempts and communication failures. PakBus Graph
provides a visual representation of the devices in a PakBus network and lets you edit
PakBus device settings. The LogTool utility provides a way to read communication logs
more easily. The Troubleshooter highlights potential problems in a communication network
and provides access to a Communications Test and other troubleshooting tools. The
LoggerNet Server Monitor is used to monitor the communication log for a remote instance
of LoggerNet or when LoggerNet is being run as a service.
Note that a Troubleshooting section is also provided in Section 14, Troubleshooting Guide (p. 14-1).
Above the display area is a row of buttons providing quick access to many of
the functions available in the Status Monitor (these buttons are discussed in
subsequent sections). Below the display area is the current LoggerNet server
time, an indication of the disk space available on the computer, and a check
box for Pause Schedule. Checking Pause Schedule suspends scheduled data
collection for all of the dataloggers in the network. This can be useful when
trying to isolate specific problems.
Height = 25 + 75 * (failures + retries) / attempts

where failures, retries, and attempts are the data collection statistics tracked
for the device.
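For example, a device with 10 collection attempts, 2 failures, and 1 retry
(hypothetical values) would be drawn at a height of 25 + 75 * (2 + 1) / 10 = 47.5.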
Entries in the Available Columns field will not be displayed on the main
screen. Entries in the Selected Columns field will be displayed on the screen.
The arrow buttons are used to move entries between the two columns.
Alternately, an entry can be moved from one column to the other by double
clicking it.
A variety of status information and statistics are available. Note that some
statistics are obtained automatically as part of data collection for some
dataloggers, but can only be obtained with additional communication
commands for other dataloggers. In the latter case, these statistics are not
retrieved by default as users with slow or expensive communication may not
wish to incur the additional cost or time associated with the extra commands.
In cases where the user does want to retrieve the additional statistics, the Poll
for Statistics setting (on the datalogger’s Schedule tab in the Setup Screen)
can be enabled to request that the statistics are retrieved. The statistics will be
retrieved during scheduled or manual data collection. These statistics are
shown in the table below. The table also shows how the LoggerNet server
maps these server statistics to the Status Table of each datalogger.
Primary Retry – A data collection failure has led to the primary retry
collection schedule.
Secondary Retry – The number of primary retries set in the Setup Screen
has been exhausted and the secondary retry schedule is now active.
Schedule Off – Scheduled data collection is not enabled for this device.
Network Paused – Scheduled data collection has been suspended for the
whole network.
• Last Clock Check (Last Clk Chk) – The computer date and time of the
last time the clock was checked for this device.
• Last Clock Difference (Last Clk Diff) – The amount of time the
datalogger clock deviated from the LoggerNet computer’s clock when the
last clock check was performed. If the datalogger clock is slower than the
computer clock, this will be a positive value.
• Last Clock Set (Last Clk Set) – The computer date and time that the
datalogger’s clock was last set to match the LoggerNet computer’s clock.
• Last Collect Attempt (Last Col Attempt) – The computer date and time
when data collection was last attempted for this device.
• Last Data Collection (Last Data Coll) – The date and time that data was
last collected from the device by LoggerNet.
• Lithium Battery Voltage (Lith Batt Volt) – The voltage level of the
datalogger’s lithium SRAM back-up battery.
• Low 5v Battery Detect (Low 5V) – A counter that indicates the number
of times the datalogger’s 5V supply has dropped below 5V. The counter’s
maximum limit is 99. For array-based dataloggers, it can be reset in the
datalogger’s *B mode.
• Low Voltage Stopped (Low Volt Stopped) – The number of times the
datalogger program has been halted because the datalogger’s 12 V power
source has dropped below the minimum power requirement. The counter’s
maximum limit is 99. For array-based dataloggers, it can be reset in the
datalogger’s *B mode.
• Next Data Collection (Next Data Coll) – The date and time of the next
scheduled data collection for the device.
Current – The table definitions from the datalogger match what LoggerNet
has stored as the table definitions for the datalogger.
Invalid Table Defs – The table definitions from the datalogger do not
match what LoggerNet has stored as the table definitions for the
datalogger. Table definitions will need to be updated before data collection
can occur.
• Values In Last Collect (Vals Last Collect) – The number of values that
were collected during the last data collection attempt. Used in combination
with Values to Collect, this gives the user an idea of how much data is left
to collect.
6.1.2.2 Display/Subnet
The Display button can be used to determine what is shown in your network
map. You can choose to view only your dataloggers by selecting Stations
Only. Selecting All Devices will show your entire network including root
devices, communication devices, etc.
In LoggerNet Admin, you can also use the Subnet button to view a subnet of
your network map. Subnets are configured from the Setup Screen’s View |
Configure Subnets menu item.
The Tools | Pool Statistics menu item opens a new window displaying
statistics for all of the pooled devices in the network. For each pooled device
(resource), the following information is given:
• Overall Error Rate – This represents the error rate based on all dialing
attempts this device has made.
• Available – This indicates if the resource is currently available or if it
is in use.
• Percent Used – This gives an indication of how much this resource has
been used. This can assist in organizing pools and devices to minimize
wait time or increase calling rates.
• Current Target – This shows the target device of the current call.
The Tools | Pool Devices menu item opens a new window that offers
information about each pool (root device) and each pooled device that has been
assigned to it. For example, if a modem pool root device
(SerialPortPool or TerminalServerPool) for a particular station has three serial
ports assigned to it, the specific information for each serial port/modem,
based on when it has been used to call that station, can be viewed by selecting
the Modem Pool and, in turn, each Modem Pool Resource assigned to it. The
following information can be displayed:
• Error Rate – This represents the error rate specific to the selected
Modem Pool use of the selected pooled device.
• Skipped Count – The number of times this pooled device has been
skipped when using the selected Modem Pool.
• Available – Indicates if the selected resource is available or is in use.
The Graph can be used to view the history of attempts to use the selected
resource (pooled device) by the selected Modem Pool.
The Records tab is used to view events associated with the use of the pooled
resources.
Column Descriptions
Device Name – Indicates the name of the device associated with the operation.
State – Indicates the current state of an active operation, or the most recent
state of a completed operation. Currently active operations are identified by a
green circle displayed to the left of the Device Name.
Priority – Indicates the priority for this operation as a value between 0 and 4.
0. No priority used
1. Low priority
2. Normal priority
3. High priority
4. Top priority
Transmit Time – Indicates the time the operation last transmitted to the
device.
Receive Time – Indicates the time the operation last received information from
the device.
Timeout Interval – Indicates the time out interval, in milliseconds, for any
datalogger transaction associated with this operation.
Client – If a client application initiated the operation, the name of the client
application is indicated. Otherwise, this field will be empty.
When this check box is selected (default), operations that are no longer active
will be deleted from the displayed list. If this box is not selected, the last state
of the operation before completion will continue to be displayed in the list. The
displayed list is limited to a maximum of one thousand lines. After reaching the
limit, the oldest lines are deleted as new lines are added.
Save to File
When this check box is selected, the information provided by the server for
each listed operation is saved to a comma delimited text file
(C:\Campbellsci\LoggerNet\Operations.log). The information logged provides
a record of when the operation was added, changed (updated), and deleted by
the server. For each line in the file, the information is ordered as follows:
Event, Start Time, Device Name, Description, State, Priority, Transmit Time,
Receive Time, Timeout Interval, Client, Account, and ID#. Entries in the file
are limited to twelve thousand lines. After reaching the limit, the oldest four
thousand lines are deleted.
6.2 LogTool
There are four logs kept by LoggerNet that track the operation of the server,
communications with the dataloggers and data collection. These logs can be
used for troubleshooting communication problems. The LogTool utility allows
you to view the communication packets transferred between the computer
running the datalogger support software and other devices in the network.
LogTool can be launched from the Status Monitor, or from the LoggerNet
Toolbar’s Tools category.
Each of the logs is explained here briefly. Operation of the LogTool utility is
also explained. For additional information on interpreting logs, see
Appendix D, Log Files (p. D-1).
Object State Log (state$.log) – This log is used for troubleshooting an object
in the datalogger network. The information in this log conveys the state of an
object at a given time.
Low Level I/O Log (io$SerialPort_1) – This log displays low level incoming
and outgoing communications for a root device (i.e., serial port).
Toolbars – Toggles the display of an individual tool bar for each of the logs.
You can pause the display of messages for a tool bar by selecting the Pause
check box. You can clear all messages for a log by pressing the Clear
Messages button.
Trans Log, Object State Log, Comms Log – Toggles the display of the
associated Log.
I/O Log – Opens the Low Level I/O log for a specific COM port in a new
window.
TimeStamp Options – Allows you to select the format for the time stamp in
the logs. If none of the options are enabled (an option is enabled if a check
mark appears to the left of the option name), only the time is displayed
(hh:mm:ss AM/PM). If Date is selected, a date (MM/DD/YY) will be added to
the time stamp. If Military is selected, the time stamp will be displayed in 24
hour format instead of 12 hour format. If ms Resolution is chosen, the time
stamp will also include milliseconds.
The following settings are used to save the logs to disk as well as to control the
number and size of the log files.
To Disk – Selecting this check box enables saving the associated logs to files
on the server computer hard disk.
File Count – This setting determines the number of log files to be saved to
disk for this type of log. The server will store up to the number specified before
overwriting the oldest log.
File Size – This setting determines how big the log file is allowed to grow
before being saved to an archived file. The $ sign identifies the active file.
Once a file reaches the specified File Size, it is saved to disk with a sequential
number beginning with 0 (e.g. tran0.log, tran1.log, tran2.log…).
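For example, assuming a File Count of 3 and that the count applies to the
archived files, the active tran$.log would be archived as tran0.log, then
tran1.log, then tran2.log as it fills; after that point, the oldest archived file
is overwritten each time the active file reaches the specified size.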
All of the log files in your log file directory can be deleted by selecting File |
Delete All Log Files from the LogTool menu. All of the log files can be zipped
by selecting File | Zip All Log Files. (When working with a Campbell
Scientific customer support engineer to resolve a problem, you may be asked to
use these options in order to delete the current log files, reproduce the problem,
then zip the new log files and send them to the applications engineer for
analysis.)
The color of the circle to the left of each datalogger station indicates the quality
of communication. The legend displayed on the right side of the window
provides a key to the color codes. The legend can be removed from the window
by clearing the Show Legend check box.
When you first open Comm Test, the state of the devices is unknown, so the
circles for each device will appear grey. To initiate the test, click on one or
more of the datalogger stations to select them (the circles will appear blue), and
press the Test button. The LoggerNet server will attempt to contact the
selected device(s) and perform a simple clock check. While a test is in
progress, the circle for a device will appear yellow. Once the test is performed,
the resulting circle will be green (clock check successful) or red (clock check
failed).
Press Reset Test to clear the test results before running the test again.
By default, the Communication Test window shows all devices in the network.
You can display only the stations by selecting the Show Stations Only check
box.
The window for PakBus graph is divided into three sections: the list of PakBus
devices, a graphical depiction of the PakBus network, and the log messages for
PakBus communication. The list of devices and the log can be toggled off by
clearing the Show List View and Show Log options, respectively.
Software servers are identified in PakBus Graph by the color green. Other
devices remain colorless unless they have been selected with the mouse cursor.
When selected, they are colored cyan.
The default PakBus address for LoggerNet is 4094. Other PakBus devices will
be shown by name and address, if known.
PakBus Graph can be opened from the LoggerNet Toolbar’s Tools category.
The PakBus network to display is selected from the drop-down list. If the PakBus
ports set up in the software have been bridged, the resulting single port will be
named “__global__”.
PakBus Graph also can be opened independently from the software toolbar, by
double-clicking the PakBusGraph.exe in the software’s program files directory
(e.g., C:\Program Files\CampbellSci\PakBusGraph). If opened independently,
the host computer to which PakBus graph should connect can be selected from
File | Select Server on the PakBus Graph menu.
Server Address – The name of the computer with which to connect. This must
be the valid name of an existing computer or a TCP/IP address (in the form
###.###.###.### consisting of the IP network number, ###.###.###, and the
host number, ###).
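For example, LocalHost (for the local computer), a computer name such as
LOGGERNET-PC, or an address such as 192.168.1.50 would all be valid entries; the
name and address shown here are hypothetical.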
The User Name and Password fields are required only if your server
administrator has set up security on your system.
Edit Settings – This option shows the PakBus settings of a device (see above).
Ping Node – This option will send a packet to the selected device to determine
if it is reachable in the PakBus network. The results of the ping will be
displayed in the Log Messages. Each ping message will include the size of the
packet sent, and the time of response from the pinged device. The last message
recorded will include summary information from the ping.
Verify Routing Table – This option will request the routing table from a
PakBus device.
Reset Node – This option will reset the routing table in a PakBus device.
Search for Neighbors (server only) – When this option is selected, the
software server will broadcast a Hello Request every 5 seconds to search for
PakBus neighbors with which it can communicate. During this time, the
PakBus port is kept on-line.
Broadcast Reset (server only) – This option will reset the routing table in the
selected PakBus device, as well as any neighbors of the selected device that are
acting as routers.
Unlock Position – A device becomes locked into position in PakBus Graph when
it is dragged to a new position on the screen. This option unlocks the selected
device. All devices can be unlocked by selecting View | Unlock All Positions
from the menu.
6.5 Troubleshooter
The Troubleshooter is a tool that can be used to help assess communication
problems in a datalogger network. The Troubleshooter can be opened from the
LoggerNet Toolbar’s Tools category.
The information will be different for each device, but may include the type of
device, the device name in the network map, the state of communication with
the device, whether or not scheduled data collection is enabled, whether or not
table definitions are valid, modem type, phone number, device address, and the
error rate.
You can click on a potential problem to bring up a menu that allows you to fix
the potential problem or bring up help on the problem.
6.5.2 Buttons
Subnet – Allows you to choose to view the entire network or a subnet
configured using Setup Screen | View | Configure Subnets. (Available in
LoggerNet Admin Only.)
Comm Test – Pressing the Comm Test button will open the Communication
Test window which is described in Section 6.3, Comm Test (p. 6-15).
TD-RF Test – This option opens a window from which you can perform a
communications test on a table data RF modem link (RFBase-TD, RFRemote-
TD, or RFRemote-PB). This test is not applicable for RF400 radios or non-TD
based RF modems. See Section 6.5.3, TD-RF Test (p. 6-20), for more
information.
Station Status – Highlight a datalogger from the list on the left and press the
Station Status button to display the Station Status information from the
datalogger. For additional information on the Station Status see Section 5.1.11,
Station Status (p. 5-33).
Find PakBus IDs – This option is used to find PakBus devices attached to a
PakBus port within the communication network. Highlight a PakBusPort from
the list on the left and press the Find PakBus IDs button. LoggerNet will
initiate a search for all PakBus devices on that particular PakBus Port. (It may
take a few moments to return a response.) If the PakBus ports are bridged, the
IDs for all PakBus devices found will be returned.
Reset Device – This option is used to reset the Statistics and Collection
Schedule for the selected device.
Log Tool – Pressing the Log Tool button opens the Log Tool application
which is described in Section 6.2, LogTool (p. 6-12).
The left-hand pane of the TD-RF Quality Test window displays the network
map as configured in LoggerNet, and provides a means of selecting a device or
an RF path to be tested. The branches of the network map can be expanded or
collapsed by clicking the small triangle to the left of the parent device.
The right-hand pane of the TD-RF Quality Test window displays the results of
successful RF Link Quality tests as well as events activating or deactivating the
Advanced Features of the TD-RF modems. The most recent entries are added
to the top of the display. A maximum of one thousand entries are retained in
the display, after which the oldest entries are deleted as new entries are added.
Column Descriptions
Normal, Marginal, Critical – The categories used to classify the reported
link quality.
Timestamp – The server time for when the test response or command
acknowledgement was received.
Path (Sender --> Receiver) – Indicates the relative function of the devices
involved in the associated test or command.
Packet Size – Applicable only to an RF Link Quality test, this is the size
in bytes of the test packet received by the RF modem. (See Section 6.5.3.2,
TD-RF Quality Report (p. 6-24).)
Base Codes – The codes sent to the Base modem to activate the Advanced
Features (see Section 6.5.3.3, Advanced Features (p. 6-26)) or define an RF
path for the RF Link Quality test.
The numerical value for Link Quality is derived from the detailed
information contained in the TD-RF Quality Report and has a theoretical
range of 0 to 102, with greater values equating to greater quality. In
practice, the maximum value can only be achieved in a laboratory
environment, and the minimum value will never be returned as it is
indicative of an inability to decode the test pattern; the RF Link Quality
test would time out. In the real world, typical values for a viable link will
fall into mid-range.
Where:
Q = Link Quality
StdDev() = Population standard deviation
F1T = Front 1T
B1T = Back 1T
F2T = Front 2T
B2T = Back 2T
TestPktSize = Number of bytes reported in the test packet
NumRptrs = Number of RF repeaters in the complete RF path
Save to File – When this check box is checked, all entries in the right-hand
pane are also written to a text file in LoggerNet’s working directory
(C:\Campbellsci\LoggerNet\Logs\RFTestResults.log). Entries in the file are
limited to a maximum of ten thousand records. After reaching the limit, the
oldest two thousand records are deleted.
Clear – Clicking this button clears the contents of the right-hand pane.
Start Test – The action taken when this button is activated depends on the
device selected in the network map.
Before executing an RF Link Quality test, one must first specify the RF path to
be tested. This can be done in one of two ways. If all the devices in the RF path
to be tested are configured in the network map, simply select the RFRemote-
TD/PB that represents the end of the path. (Right-clicking the end of path
RFRemote-TD/PB will simultaneously select the RF path and launch a popup
window for selecting Start Test.) If one or more of the RFRemote-TD/PB
devices in the path to be tested is not configured in the network map, one must
specify the RF path by first selecting the RFBase-TD that is configured in the
network map and then manually entering the RFIDs of each TD-RF modem
along the path in the RF Base Codes/RFIDs field. (See RF Base Codes/RFIDs
above.)
Once the RF path has been designated, the RF Link Quality test is initiated by
clicking the Start Test button.
When the test starts, an RF Test Packet is sent across the specified RF path.
When the packet reaches the end of path modem, that modem logs a TD-RF Quality
Report detailing information about the size of the packet and its ability to
decode the data contained in the packet. (For details, see Section 6.5.3.2,
TD-RF Quality Report (p. 6-24).) The end of path modem will then send the RF Test Packet back
across the network toward the RFBase-TD. As the RF Test Packet makes the
return trip across the network, each RFRemote-TD/PB modem in the path as
well as the RFBase-TD will log a TD-RF Quality Report based on the received
packet. The RFBase-TD then sends a command across the network to collect
the TD-RF Quality Report from each of the modems involved in the test. The
RFBase-TD sends the collected reports to LoggerNet in the RF Test response
packet.
The TD-RF Quality Reports are displayed in the right-hand pane of the TD-RF
Quality Test window in the order (top-down) they were collected; starting with
the end of path modem and ending with the RFBase-TD.
In a process known as line coding, the RF modem encodes the binary data from
the wired data stream onto a 3 kHz waveform. The line coding utilized in
CSI’s TD-RF modems is called Miller Encoding. This encoding scheme
employs a method of differentiating the binary “1s” and “0s” of the data stream
based on the timing of transitions in the waveform from one level to another
within the bit period (approximately 333 microseconds for a 3 kHz bit rate). A
binary “1” is represented by a level transition occurring in the middle of the bit
period. A binary “0” is represented by either there being no transition occurring
within the bit period or, in the case of consecutive “0”s, the transition occurs at
the end of the bit period.
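As a worked reading of this description, consider the bit sequence 1 0 0 1: the
first 1 produces a transition in the middle of its bit period; the first 0,
because it is followed by another 0, produces a transition at the end of its bit
period; the second 0 produces no transition of its own; and the final 1 again
produces a mid-bit transition.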
In order to properly decode the encoded data from a received signal, one must
precisely detect when the level transitions are occurring in relation to the
middle and end of the bit period; the T1 and T2 transition points respectively.
The OS in the RF modem does this by establishing a detection window
centered about the time a transition is expected to occur. The detection window
is 204 units wide, so the optimal transition timing would occur in the center of
the window; unit 102. In the real world, the optimal transition timing is not
likely to occur that often so there will always be some deviation to where the
transitions occur within the detection window. The noisier the received
signal, the greater the deviations. If the transition occurs outside the detection
window, the proper bit will not be detected and a data error will occur. The
demodulated data stream and the associated detection windows for the T1 and
T2 transition points are illustrated in the graphic below.
The information recorded in the TD-RF Quality Report includes the location of
the maximum and minimum transition point for the T1 and T2 detection
windows and the size, in bytes, of the received test packet.
The test packet size is significant as an indicator of lost packets. The over the
air (OTA) communications protocol utilized by the RF modems requires that a
modem acknowledge the reception of an RF Test Packet. If the sending modem
does not receive an acknowledgment, it will resend the packet. This is known
as a ‘retry’. After executing a number of unacknowledged retries, the sending
modem will decrease the number of bytes in the packet by approximately half
before attempting additional retries. Therefore a decrease in the size of the test
packet is another indication of less than optimal link quality.
The test packet size, as calculated by the TD-RF Quality Report, includes the
packet header. The packet header contains the RFIDs of each modem in the RF
path being tested. Therefore, the packet size will be a minimum of 237 bytes for an
RFBase-TD and a single RFRemote-TD/PB, and will increase by one for each
additional RFRemote-TD/PB in the path.
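For example, a path consisting of an RFBase-TD and three RFRemote-TD/PB modems
would report a full-size test packet of 239 bytes (237 plus one for each of the
two additional remotes).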
Activation Method 1 – In the TD-RF Quality test window, select the RFBase-
TD in the network map, enter the appropriate code string in the RF Base
Codes/RFIDs field and click the Start Test button.
NOTE Enter the Code string with a ‘space’ separating each value.
Activation Method 2 – In the TD-RF Quality test window, select the RFBase-
TD in the network map and click the Browse button to the right of the RF Base
Codes/RFIDs to open the Advanced RF Commands window. Select the feature
to be activated using the radio buttons, enter/select any ancillary information
that may be required and click the OK button. The RF Base Codes/RFIDs field
will be populated with the appropriate code string and the feature activated.
If multiple devices are selected, information for all of the selected devices is
printed or saved. If only one device is selected, only information for that device
is printed or saved.
The Server Monitor is started from the Windows Start menu, All apps |
Campbell Scientific | LoggerNet Server Monitor. When first opened, a
Login dialog box appears. This dialog is used to specify the name or address of
the computer running the LoggerNet server that you want to monitor. If the
server is running on your local computer, use the default name of LocalHost.
Otherwise, enter the valid name or IP address of the remote computer running
LoggerNet. If security is enabled, you will need to type in a user name and
password.
NOTE The LoggerNet Server Monitor can be run by a user assigned any
one of the five levels of security.
Once you are connected to a LoggerNet server, the main window of the Server
Monitor will be displayed. This window shows messages related to the activity
of the LoggerNet server that are written to the Comm.log. By default, only
Warning and Fault (failure) messages are displayed. However, you can choose
to monitor Status messages as well by selecting the Options | Show All menu
item. When the Show All option is enabled, a check mark will appear to the
left of the menu item. This option is a toggle. Select it once to enable it; select
it a second time to disable it.
If Warnings or Faults have been encountered, you can reset the state of the icon
by right-clicking it and choosing Reset, or by opening the Server Monitor and
choosing Options | Reset. If you select Options | Clear – Reset, the messages
will be cleared from the message window as well.
NOTE To increase or decrease the font of the message display, select the
Font + or Font – option from the Font menu.
Section 7. Creating and Editing
Datalogger Programs
Dataloggers must be programmed before they can make measurements. LoggerNet offers
three options for programming dataloggers: Short Cut, Edlog, and the CRBasic Editor.
Short Cut (also referred to as SCWIN) is an application for generating programs for all of
Campbell Scientific’s dataloggers and preconfigured weather stations except the CR7 and
CR9000. Users do not have to know individual program instructions for each datalogger.
Short Cut not only generates a program for the datalogger, but also a wiring diagram that
can be left with the datalogger for field servicing.
The CRBasic Editor is the full-featured program editor for the CR1000X-series,
CR6-series, CR300-series, GRANITE 6, GRANITE 9, GRANITE 10, CR1000, CR3000,
CR800-series, CR200-series, CR5000, CR9000, and CR9000X dataloggers. It requires the
user to understand the program instructions for the datalogger, but it can be used to
develop more complex programs than can be created using SCWIN.
The CR7, CR10, 21X, CR500, CR510, CR10X, and CR23X dataloggers are programmed
using the Edlog editor. Edlog supports all operating systems for these dataloggers,
including the table-data or “TD” and PakBus or “PB” versions. Like the CRBasic Editor,
it requires that the user have more knowledge of datalogger program instructions than
SCWIN.
In addition to the above programming tools, the Transformer utility is offered in LoggerNet
for those users of CR10X or CR23X dataloggers who need to develop programs for the
CR800-series, CR1000 or CR3000 dataloggers.
Edlog dataloggers, the 21X, CR7, CR10, CR10X, CR500, CR510, and CR23X,
come by default with operating systems that store data in one or two areas of
final storage, with all intervals typically stored “end-to-end” in the same area of
memory as individual arrays, hence the name “mixed-array” operating systems.
Each array (e.g., 15 minute, hourly, daily) will have its own identifier that
appears as an integer in the first position of the array. This is referred to as the
Array ID. The other “elements” of the array store year, Julian day, hour-
minute, seconds, and any of a variety of processing of measurements, such as
average air temperature, total rainfall, minimum battery voltage, etc. To
analyze the data, the user may find it useful to post-process the mixed-array
data to extract the interval array of interest. Split (see Section 8, Working with
Data Files on the PC (p. 8-1)) is ideally suited to do this.
Some of these Edlog dataloggers, specifically the CR510, CR10X, and CR23X,
can alternatively be configured with table-data or PakBus operating systems. In
these table-based configurations (CR510-TD, CR510-PB, CR10X-TD,
CR10X-PB, CR23X-TD, and CR23X-PB), they measure the sensors the same
way, but store the processed data in individual tables instead of arrays. Each
final storage table will contain only data for that interval – e.g., fifteen minute,
hourly, and daily data records will be in different tables. The user can more
closely control the size of these tables (for example to store a “buffer” of
twelve hours of one minute data without taking up all of the available
memory). In addition, the collected data file will have the date/time value in a
single string – e.g., “2004-05-15 13:50:00” – that is more readable in third
party post-processing software. While there are some differences in
communications between the table-data (TD) and PakBus (PB) operating
systems, their measurement and final storage instructions are the same, so
Short Cut treats them identically.
NOTE Those users who are moving from Edlog to the CRBasic
dataloggers and who also need more control over datalogger
programs, may find Short Cut to be an excellent way to learn
CRBasic. You can follow the same steps in Short Cut for a
CRBasic datalogger as you would for an Edlog datalogger, but
then open the program in the CRBasic Editor to see how Short Cut
created the program.
Short Cut was designed to help the beginning datalogger programmer create
datalogger programs quickly and easily. Short Cut effectively insulates the user
from having to know the nuances of datalogger programming and the Edlog
versus CRBasic programming languages. It supports the most commonly sold
sensors from Campbell Scientific, as well as generic measurements (such as
differential voltage, bridge, and pulse), commonly used calculation and control
functions (such as heat index calculation, alarm conditions, and simple
controls), and multiplexer analog channel expansion devices.
Short Cut cannot be used to edit existing Edlog, CRBasic, or Short Cut for
DOS programs. Program editing and more complex datalogger programming
Short Cut was designed with extensive built-in help. Help can be accessed at
any time by pressing the F1 key. There are also Help buttons on most screens.
You can also open the Help by selecting Short Cut Help from Short Cut’s
Help menu. Help for each sensor can be accessed by searching the Help Index
or pressing the Help button from the sensor form.
After generating the program, you can send it to the datalogger from the
Results tab of Short Cut’s Finish screen or from LoggerNet’s Connect Screen
or from PC400 or RTDAQ’s Clock/Program tab.
From this screen, you indicate which CR9000X modules are inserted into
which CR9000X slots. To add a module, select the module by clicking on it in
the Available CR9000X Modules list, select the Slot by clicking on the slot
number, then press the arrow key.
To remove a module, select the slot containing it and then press the Remove
Module button.
NOTE Whenever you are working with a CR9000X program, this dialog
box can be brought up by choosing Datalogger from the Progress
panel and then pressing Next. However, the Remove Module
button is only available when a new program is being created.
Once the Next button on the screen has been pressed, modules can
be added but they cannot be removed.
The next dialog box that is displayed is used to select the type of integration to
apply to the measurements in the program. Integration can be used to filter out
AC signals that might affect the accuracy of your measurements (such as noise
from fluorescent lighting or a generator). Typically 60 Hz rejection is used for
North America and 50 Hz rejection is used for countries following European
standards. Fast (250 µs) integration should be used when you need an
execution speed that cannot be accomplished using one of the other options.
This dialog box will be displayed the very first time you create a program for a
specific datalogger type; it will not be displayed thereafter. With each
subsequent program you create, the integration you chose when the datalogger
was initialized in Short Cut will be used. However, you can change the
integration from the Program menu. If you make this change, the setting will
remain in effect for all programs for that datalogger type (whether they are new
programs or edited programs) until it is changed again.
NOTE For the CR1000X series, CR6 series, CR300 series, and
GRANITE Data Logger Modules, the integration setting is named
first notch frequency (fN1).
The last dialog box displayed is the Sensor Support dialog box. (This dialog
box will not be displayed when creating a CR9000X program.) This is used to
select which group of sensor files will be displayed when creating a program:
Campbell Scientific, Inc. (CSI, USA) or Campbell Scientific, Ltd. (CSL, UK).
The standard set of Short Cut sensor files was created by CSI; however, CSL
has created some additional files that are customized for their client base.
When one option is selected, the sensor files developed specifically for the
other are filtered out.
This setting is similar to the Integration setting in that the dialog box will be
displayed only the first time you create a program for a specific datalogger
type, and the setting will apply to all programs created or edited for that
datalogger, unless it is changed via the Program menu. Note that programs
containing sensor files that are filtered from the list of Available Sensors will
still load and work correctly in Short Cut.
NOTE The Integration and the Sensor Support settings are persistent
settings for each datalogger model. The first time you create a
program for a particular datalogger model, you will be presented
with these two dialog boxes. The state of these settings is saved
between Short Cut sessions. Any subsequent new or edited
programs that are generated after a setting has been changed will
reflect the change as well.
Each time you create the first program for a datalogger model you
will be presented with these dialog boxes (e.g., the first time you
create a CR10X program, you must initialize these settings; the
first time you create a CR1000 program, you must initialize these
settings).
After making your selections, note that the title bar shows the datalogger type.
Once you have saved the file, the filename will replace “untitled.scw”.
The sensor's parameter form provides fields for the measurement name and
measurement units, along with notes specific to the sensor.
Note that this sensor not only offers a custom name field and units, but also
allows you to correct for sea level, a common practice in measuring
atmospheric pressure. In the middle of the screen, look over the notes (or refer
to the Help for this sensor), for this sensor may require other sensors or have
limitations. When you choose OK, Short Cut adds the necessary instructions
with appropriate multipliers and offsets.
In some cases, multiple sensors of the same type can be added at one time.
These sensors will have a How many sensors? parameter as the first parameter
on the form as shown below. The maximum number of sensors that can be
added will be indicated. The maximum will vary, depending upon the sensor
and the number of other sensors already configured in the program. If the
sensor form includes calibration and/or conversion parameters (e.g., multiplier,
offset, gage factor), there will be a Set button next to these parameters.
Pressing this button will allow you to set unique values for each sensor.
Click on the Wiring tab of a sensor’s parameter form to show the wiring for
the sensor (or the first sensor in a sensor group).
Each wire’s caption/color is shown on the left side of the wire. The location
where the wire will be connected to the device is shown on the right side
(under the device). You can change a caption/color by clicking on the
caption/color label. A wiring location can also be changed by clicking on the
wiring location.
NOTE Changes to the wiring location for a sensor group can only be
made when the group is first added. To make changes to a wiring
location at a later time, you will need to change the number of
sensors to one, press OK, reopen the parameter form, make the
desired wiring location changes, and then change the number of
sensors back to the desired number.
NOTE Not all sensors support changes to the wire caption/color and
wiring location. When hovering over a wire caption/color or
wiring location, the mouse cursor will change to indicate that the
property can be changed. Changes are generally supported for
generic sensors and other sensors that do not use special wiring
connections.
At any time, you may choose a measurement label on the right side of the
Sensors screen and edit it or remove it.
Refer to the online help for complete information on creating a User Calculation.
Short Cut provides you with a wiring diagram by clicking on Wiring Diagram
on the left side of the Sensors window. In the example below, Short Cut was
told to measure a CS106 Barometric Pressure sensor, a CS210 enclosure
relative humidity sensor, and an HMP155 Air Temperature and Relative
Humidity sensor. Each sensor was allocated the necessary terminals. Short Cut
will not let you add more sensors than there are terminals on that datalogger or
device. You can print this diagram (or the textual equivalent) by choosing the
Print button. Many users find it handy to leave a printed wiring diagram in the
enclosure with the datalogger in case a sensor has to be replaced.
Short Cut can also create programs for dataloggers using a variety of interface
devices, including multiplexers and special interfaces for sensors. Add these
devices by selecting them from the Devices folder in the Available Sensors
and Devices tree.
Once you’ve added a device, such as the AM16/32 multiplexer, a tab is added
to the screen for that device, and the sensors available for that device are
shown:
You can then add sensors to that device just as you would to the main
datalogger.
Note that, once you add a sensor to a multiplexer, it may limit what kind of
sensors can be added thereafter, as each sensor on the multiplexer must share
the same wiring between the multiplexer and the datalogger.
In the How often should the datalogger measure its sensor(s)? field, specify
how often the datalogger will execute the instructions in its program. This is
known as the measurement or scan interval.
When choosing a scan interval, remember that faster scan intervals will use
more power. For most applications, a 10 to 60 second scan interval is
sufficient. If faster scan intervals are required for your application, make sure
there is sufficient time for the execution of all instructions in the program (refer
to the section in the datalogger manual on Execution Intervals for additional
information).
NOTE By default, data is sent to memory based on time. Data can also be
sent to memory based on one or more of the following conditions:
time, the state of a flag, or the value of a measurement. This is set
up from the Advanced Outputs screen. To use the Advanced
Outputs screen, select the Advanced Outputs (all tables) check
box at the lower left of the Output Setup screen. The Data
Output Storage Interval field will be removed from the Output
Setup screen (and moved to the Advanced Outputs screen). After
completing the fields on the Output Setup screen and pressing
Next, Short Cut will advance to the Advanced Outputs screen.
Two tables are defined by default. Additional tables can be added by pressing
the Add Table button. Short Cut limits the number of output tables to 10. An
output table can be removed by clicking on the table to make it the active table
and pressing the Delete Table button.
Steps for completing the standard table output are given below:
• The Data Output Storage Interval field and the adjacent drop-down
list are used to set the interval at which data will be stored to memory.
The default output intervals are 60 minutes (Table1) and 1440 minutes
(Table2), but they can be changed. (This field is removed from this
screen if the Advanced Outputs (all tables) checkbox is selected. In
this case, it can be set from the Advanced Outputs screen along with
any other conditions to be met for data to be stored.)
In the How many records would you like to store for tablename? field,
enter the maximum number of records that should be stored in the table.
Once the maximum number of records have been stored in the table, the
oldest record will be removed when a new record is added.
Instead of specifying a fixed table size, you can let the datalogger set table
size automatically (autoallocate) by using the default value (0 or –1,
depending upon the datalogger). When table size is autoallocated, the
datalogger will first assign memory to any fixed-size tables and then will
divide its remaining memory among the autoallocated tables so that all
tables are filled at approximately the same time.
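For example, assuming the tables have similar record sizes, if Table1 stores a
record every 60 minutes and Table2 stores a record every 1440 minutes,
autoallocation will give Table1 roughly 24 times as many records as Table2 so
that both tables span about the same period of time.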
If the Memory Card checkbox was selected on the Output Setup screen,
you must also specify how many records you would like to store to the
memory card.
Check the appropriate box for one or more of the output conditions:
Flags – Use the first list box to select the flag and the second list box to
select the state of the flag that will cause data to be stored to memory. If
Flag 8 is selected from the first list, and High is selected from the second,
data will be stored to memory each time Flag 8 is high during program
execution. For table-based dataloggers, when the Flags option is set, the
table size should be set to a fixed value instead of autoallocate (-1). See
note above.
The Records Before field is used to enter the number of records that
should be stored prior to the condition being met (the datalogger will
keep this number of records in memory). The Trigger is the variable
that will be monitored for the specified condition. Use the drop-down
list box to select the trigger from the list of variables in the program.
The two remaining fields for the trigger are used to specify the value
for the variable that will trigger the condition.
An Optional Stop Trigger can also be specified, which will stop the
storage of data to the table. If a stop trigger is not specified, data will
be stored to the table indefinitely. If a stop trigger is specified, you
can also use the Records After field to specify the number of records to
continue storing to the table after the stop trigger condition is true.
Fields below the trigger criteria indicate the total number of records
that will be stored to the table when the trigger condition is met.
When the Data Event option is set, the table size should be set to a fixed value
instead of autoallocate (–1). See note above.
On the left, Short Cut will show the sensors you’ve added to be measured, with
the measurement labels you’ve used. On the right is a multi-tabbed grid that
shows the output tables.
Note that outputs for a sensor don’t have to be added in the same sequence as
the measurement. You can even drag and drop the outputs to rearrange their
order. Note also that multiple outputs can be added for any one sensor. For
example, you may want to store the maximum and minimum air temperature as
well as the average.
If Advanced Outputs was selected on the Output Setup screen, there will
also be a column for Resolution. By default, data is stored in low resolution
(2-byte floating point numbers). You can instead select high resolution to have
data stored as 4-byte floating point numbers.
The Results tab provides information on the files that were created. If a
program was created successfully, a Send Program button will also be
displayed which allows you to send the program to the datalogger.
• ProgramName.DEF is the text file that describes the wiring for the sensors
and devices to the datalogger, measurement labels, flag usage, and the
output expected. You can view the contents of the DEF file by clicking the
Summary button on the Results screen.
The Summary tab displays the information in the DEF file as described above.
The Advanced tab (for CRBasic dataloggers) displays the CRBasic program
that was generated. It includes a CRBasic Editor button which opens the
program for editing in the CRBasic Editor. Note that any changes made to the
generated program in the CRBasic Editor will not be reflected in Short Cut or
future programs generated by Short Cut.
Note that, while Short Cut can generate a program file for the datalogger, you
must use datalogger communication software to transmit that program to the
datalogger. (This is true even when pressing the Send Program button from
Short Cut’s Finish screen. Short Cut relies on the datalogger communication
software to transmit the program.)
However well intentioned, one mistake you can make is to set security and
then forget the values. If you send a program with security set, you will then
need to add that security setting to LoggerNet's Setup Screen, or to RTDAQ or
PC400's EZSetup Wizard, for that datalogger. If you don't, you may find that
you can no longer communicate with the datalogger. Should this happen and
you forget the security code and have lost the Short Cut program file, you may
have to visit the datalogger site and cycle power on the datalogger to be able to
communicate with it. Most dataloggers that offer security will communicate
over their CS I/O port directly with a keyboard/display or PC in the first few
seconds of powering up. See the datalogger manual for a full description of the
security features.
7.2.3.2 Datalogger ID
Mixed-array dataloggers keep a memory location available for a datalogger ID
value. This is typically an integer that you can read from within the program
and store into final storage to keep track of the identity of the datalogger that
created the data. Valid Datalogger IDs are 1 through 12 and 14 through 254.
Use the Datalogger ID instruction in Short Cut (found under Miscellaneous
Sensors) to use the ID in the datalogger program.
Most Campbell Scientific dataloggers are sent an ASCII program file, which
they then compile into machine code. The CR200/205 does not have enough
memory and processing capability to do this compilation, so it’s necessary to
compile the program file into the binary version used by the datalogger itself.
This compilation is done by Short Cut to check for errors in the program before
sending it. It’s done again by LoggerNet, RTDAQ, or PC400 when sending the
program to the datalogger. Compilation is performed using a special executable
that mimics the functions and capability in the datalogger’s operating system.
Therefore, the compiler executable must match the datalogger’s operating
system or the datalogger may fail to run the compiled binary (*.BIN) program.
LoggerNet, RTDAQ, PC400, and Short Cut are installed with precompilers for
all of the released versions of the CR200/205 operating systems. If, at some
time in the future, you acquire a newer CR200/205, or choose to install a later
operating system, you must make sure you also have the compiler executable
that matches. These compiler executables are typically installed in a library
directory. By default, this directory would be installed as:
C:\Campbellsci\Lib\CR200Compilers
If you receive an operating system update, you should copy the compiler
associated with it to this directory. If, for some reason, you put the compiler in
a different directory, this menu item provides a way to choose that compiler
executable.
This dialog box is displayed the very first time you create a program for a
specific datalogger type; it will not be displayed thereafter. With each
subsequent program you create, the group of sensor files that you chose when
the datalogger was initialized in Short Cut will be used. However, you can
change this setting at any time. If you make a change, the setting will remain in
effect for all programs for that datalogger type (whether they are new programs
or edited programs) until it is changed again.
NOTE For the CR1000X series, CR6 series, CR300 series, and
GRANITE Data Logger Modules, the integration setting is named
first notch frequency (fN1).
7.2.3.7 Font
This setting is accessed from the Options menu item of the Tools menu. Use
this setting to change the appearance of the font used by Short Cut. Most
windows other than the wiring descriptions (which require a non-proportional
font to make sure wiring diagrams are aligned) will use this font.
For Edlog dataloggers, the easiest method is to Document the DLD file from
within Edlog (discussed later in this section). Short Cut creates a .DLD file to
send to the datalogger that includes input location and final storage labels.
Documenting a .DLD file causes Edlog to use the same labels and to show you
the individual instructions being used to carry out the program. You can then
add and delete instructions from within Edlog to add functionality to the
program. Short Cut cannot import the files created by Edlog, however. Short
Cut reads only its own SCW-formatted files.
For CRBasic dataloggers, you can use the CRBasic Editor to open the .CR#
files directly. Again, Short Cut will not be able to open the files you’ve edited
with the CRBasic Editor, since they are not SCW files.
Sensor files for additional sensors may be available for download from
www.campbellsci.com/downloads
It is also possible to have custom sensor files created for sensors your
organization uses that are not included with Short Cut. Contact your Campbell
Scientific applications engineer for details.
The resulting dialog box will allow the user to make changes to the chosen
sensor file and then save it with a new name. (See Short Cut’s Online Help for
additional information on changes that can be made.) By default, custom
sensor files will be created in C:\CampbellSci\SCWin\SENSORS, which is a
different location than that of Short Cut’s included sensor files.
Once the custom sensor file has been saved, it will be added to the Available
Sensors list.
As shown below, the CRBasic Editor's main window is divided into three
parts: the Program Entry Window, the Instruction Panel, and the Message area.
The Instruction Panel on the right side is a list that comprises the instructions
for a particular datalogger in the CRBasic language. Instructions can be
selected from this list or entered directly into the Program Entry Window on
the left. The Message area at the bottom becomes visible after a program is
compiled and shows results of the compile and any errors detected.
You can filter the list of instructions available in the Instruction Panel by
clicking the drop-down arrow to the right of the text box above the list. This
will allow you to display only instructions of a specific type such as
Measurement or Program Structure/Control. This provides a smaller list to
select from and makes it easier to find the instruction you want. Switch back to
All to see all of the instructions available. You can create custom instruction
filter lists as described later in this section.
Below is an example of the Parameter dialog box for the differential voltage
instruction (VoltDiff).
The Prev (Previous) and Next buttons can be used to move to the next (or
previous) instruction with the parameter entry box opened.
The variable list is sorted by variable type and then alphabetically by name. In
the list above, the first green A denotes that the variable AIRCOOL is set up as
an Alias.
Constants are listed with a blue C, Dimensioned variables are listed with a red
D, and Public variables are listed with a black P.
At any time you can press F10 to bring up the list of variables, regardless of
the input type for the selected parameter. Also, defined variables can be
selected from the Variables drop-down list box at the upper right of the
Parameter dialog box.
Pressing F9 at any time will also bring up a list of variables. However, when a
variable is chosen from the list brought up by F9, it will simply be inserted at
the cursor without overwriting anything.
Right-clicking or pressing F2 on a parameter that does not fall within the two
categories above will bring up help for that parameter.
Pressing F1 with any parameter selected will bring up help for that parameter
along with a list of possible options where appropriate.
Right-click an instruction name to show the Parameter dialog box to edit the
instruction parameters.
Right-click a block of text that is highlighted to bring up a short cut menu with
the following options:
7.3.3 Toolbar
The toolbar of the CRBasic Editor provides easy access to frequently used
operations.
Print Preview – Opens a Print Preview screen that will show what
the program will look like when printed. You can check and set the
margins and printer options.
Undo – Each time the Undo button is clicked it will step back
through the last changes made to the program.
Redo – Cancels the undo and steps forward restoring the changes.
Cut – Removes the selected part of the program and puts it on the
clipboard to be pasted elsewhere.
Find Next – Finds the next occurrence of the text string specified in
the Find dialog.
Save and Compile – Saves and then compiles the opened file.
Next Error – Moves the cursor to the part of the program where the
next error was identified.
7.3.3.1 Compile
Compile is a function provided by the CRBasic Editor to help the programmer
catch problems with the datalogger program. Compile is available from the
toolbar and the Compile menu.
When the Compile function is used, the CRBasic Editor checks the program
for syntax errors and other inconsistencies. The results of the check will be
displayed in a message window at the bottom of the main window. If an error
can be traced to a specific line in the program, the line number will be listed
before the error. You can double-click an error preceded by a line number and
that line will be highlighted in the program editing window. To move the
highlight to the next error in the program, press the Next Error button or
choose Next Error from the Compile menu. To move the highlight to the
previous error in the program, press the Previous Error button or choose
Previous Error from the Compile menu.
It is important that the compilers used for checking programs match the OS
version loaded in the datalogger; otherwise, errors may be returned when the
program is sent. When a CR200 program is being edited, the Pick CR200
Compiler menu item is available. This item opens a dialog box from which a
compiler can be selected for the CR200 datalogger.
The error window can be closed by selecting the Close Message Window
menu item from the View menu, or by clicking the X in the upper right corner
of the message window.
This function first checks the program for errors using the pre-compiler, then
saves the program (using the current name, or by prompting the user for a
name if the program is new). After the compile and save, this function sends
the program to the selected datalogger.
NOTE When a file is sent to the datalogger using Compile, Save, and
Send and the software is not actively connected to the datalogger,
the software connects to the datalogger, sends the file, retrieves
table definitions, and then disconnects. There will be little
indication in the software that a connection was established.
When this function is chosen a dialog box is displayed. Below is the dialog box
for a CR1000 datalogger:
The Select the destination list shows all dataloggers configured within
LoggerNet, PC400, or RTDAQ that may receive a program matching the
extension of the current CRBasic program to be sent. Assume, for example,
that you have three CR1000s and some other dataloggers in your LoggerNet,
PC400, or RTDAQ network map. When you send a *.CR1 program, this screen
will show only the three CR1000 dataloggers. Any other dataloggers will be
excluded from the list in this case, even when they are defined in the network
map, because those dataloggers are not associated with *.CR1 programs. A
program with the extension of .DLD will be associated with all CRBasic-
programmed datalogger types.
Select the datalogger to send the file to, and then select the Run Options.
Run Now
The Run Now run options are different for the different datalogger types.
CR1000X-Series/CR6-Series/CR300-Series/GRANITE 6/
GRANITE 9/GRANITE 10/CR1000/CR3000/CR800-Series Run Now
Options
When Run Now is checked, the file will be sent with the Run Now attribute
set. With this attribute, the program is compiled and run in the datalogger. You
may choose to preserve existing data tables on the datalogger's CPU if there
has been no change to the data tables (Preserve data if no table changed) or
to delete data tables on the CPU that have the same name as tables declared in
the new program (Delete associated data tables).
When using the Preserve data if no table changed option, existing data and
data table structures are retained unless the structure of one or more data
tables changes.
To summarize, any change in data table structure will delete all tables on the
datalogger's CPU, regardless of whether or not the Preserve Data option was
chosen. If the Preserve Data option was chosen but the datalogger was unable
to retain the existing data, the following message will appear in the Compile
Results: Warning: Internal Data Storage Memory was re-initialized.
When Run Now is checked, the file will be sent with the Run Now attribute
set. With this attribute, the program is compiled and run in the datalogger. All
data tables on the CPU are erased. You can choose whether or not to erase
data files stored on a card.
Run On Power-up
The file will be sent with the Run On Power-up attribute set. The program will
be run if the datalogger loses power and then powers back up.
Run Always
Run Now and Run On Power-up can both be selected. This sets the program's
file attribute in the datalogger as Run Always. The program will be compiled
and run immediately and it will also be the program that runs if the datalogger
is powered down and powered back up.
Compress File
If the Compress File check box is selected, a renamed version of the CRBasic
program, with all unnecessary spaces, indentation, and comments removed to
minimize the file size, will be sent to the datalogger instead of the original
program.
Press Cancel if you do not wish to send the program to the datalogger.
NOTE When sending a program with the Compile, Save, and Send
feature to a CR9000X datalogger while you are connected to the
datalogger, you may get a disconnect message or similar
notification. This is unique to the CR9000X datalogger and does
not indicate any problem with the sending of the program. You
can simply reconnect to the datalogger and continue your work.
7.3.3.4 Templates
The use of templates can be a powerful way to quickly create a set of similar
datalogger programs. All or part of a program can be saved so that it can be
used when creating new programs. These files are called templates. The
Template menu provides access to create and use templates.
Save as Template – Saves the comments and instructions in the active file as a
template. To save part of a program as a template, copy the selected part to a
new program file and then Save as Template.
Save as Default Template – Saves the comments and instructions in the active
file as a template that will be used each time File | New is selected for that type
of datalogger.
NOTE Template files are associated with a specific datalogger type. For
example, templates for a CR5000 cannot be used for CR9000X
programming and vice versa. Each datalogger has its own instruction
set, which may differ from that of other dataloggers.
Save and Encrypt – Encrypts the active file. Encrypted files can be compiled
in the datalogger but cannot be read by a user. (Refer to FileEncrypt in the
CRBasic Editor’s online help for dataloggers that support file encryption.)
Save As CRB – Saves highlighted text to a file with a *.CRB extension. This
file is referred to as a library file. The file can then be reused by inserting it into
another CRBasic program.
Insert File – Inserts a library file (*.CRB) into the current program at the
location of the cursor.
The Editor tab allows the user to toggle the pop-up hints for instruction
parameters on or off, set the amount of time the cursor must hover over an
instruction before the pop-up hint appears, and set the background color of the
pop-up hint. It is also used to choose whether CRBasic automatic instruction
indenting uses tabs or spaces, and to set the number of spaces if that option is
chosen. Other options relating to the use of the tab key, capitalization,
name checking, and line numbers are also available. Press the Help button for
more information.
The Vertical Spacing tab is used to set up the rules for the CRBasic Editor's
Rebuild Indentation function (Edit | Rebuild Indentation). You can control
whether blank lines are inserted before or after certain instructions, and how
the CRBasic Editor will process multiple blank lines in the program. If Do Not
Insert or Remove Any Blank Lines is selected, all other fields on this tab will
be disabled. If either of the other two line options is chosen, the remaining
fields will be available for the user to customize as desired.
The Syntax Highlighting tab sets up the appearance of different text elements
in the program using different font styles and colors. You can customize the
appearance of each of these elements.
Background Color – Displays a color selection dialog to set the color of the
CRBasic program window.
Wrap Text When Printing – When this option is selected, long lines that
extend past the right margin will be wrapped to the next line. This option
affects printing, as well as the Print Preview mode. A check mark will appear
next to the option in the menu when it is selected.
Close Message Window – After you have pre-compiled your program with the
Compile | Compile menu item, or using the toolbar, a message window opens
up at the bottom of the CRBasic Editor main screen. This option will close
down that message window.
View Instruction Panel – Select this option to show or hide the Instruction
Panel, which displays the list of instructions available for your datalogger
program, filtered according to the pre-defined instruction filter selected in the
drop-down list box.
To create a new list, first select the Add New Category button and provide a
name for the user-created category. Next, ensure the category name is selected
and click the Edit Category button to bring up the Select Instructions dialog
(shown below). Instructions that should be included in the new list are
indicated by a check in the box to the left of the instruction name. This feature
allows the user to display a filtered instruction list containing only those
instructions most often used. Press OK to save the list.
To set up Constant Customization, place the cursor on a blank line within the
CRBasic Editor and choose Tools | Set Up Constant Customization Section.
This will insert two comments into the program:
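In recent versions of the CRBasic Editor these delimiting comments look similar
to the following (the exact wording may vary with the editor version):

'Start of Constants Customization Section
'End of Constants Customization Section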
Within these two comments, define the constants. Following each constant, use
the keywords noted below, formatted as comments, to set up edit boxes, spin
boxes, or list boxes for the constant values. The fields are edit boxes by default.
If a minimum and maximum are defined for a constant, the field will be a spin
box. If a discrete list of values is defined for the constant, the field will be a
list box.
Const Reps=1
Const Number=0
'Min=-100
'Max=100
Const TableName="OneSec"
'value="OneMin"
'value="OneHour"
'value="OneDay"
This code will create the following constant customization dialog box:
The constant SUnits has a list box with sec and min; sec is the default.
The constant Reps is defined with a default value of 1. It is an edit box, into
which any value can be entered.
Before compiling the program, open the Customize Constants dialog box,
select the constant values you want to compile into the program, and then
perform the Conditional Compile and Save.
Check one or more boxes for file extension(s) you want to associate and press
the Associate Files button.
Show Keyboard Shortcuts – This option displays a list of the functions of the
CRBasic Editor which are accessible via the keyboard. The list can be copied
to the clipboard for printing or other uses.
Show Tables – This option displays details about the output tables and the
items they store as they are defined in the current CRBasic program. The list
can be copied to the clipboard for printing or other uses.
Set Datalogger Type – This option displays a list of dataloggers so the user
can select the instruction set, compiler, and help files to use when the program
extension is .DLD or .CRB (e.g., myprogram.DLD, or myprogram.CRB).
Insert Symbol – Opens a dialog box that lets you insert Unicode symbols into
your CRBasic program for use in strings and units declarations.
Set DLD Extension – This option selects which datalogger’s pre-compiler will
be used when performing a pre-compile check on a DLD program which uses
conditional compile statements. A CRBasic program must be named with the
DLD extension for this item to be active.
Open Display Settings File – Opens a previously saved display setting file.
Save Display Settings File – The look and feel of the CRBasic Editor can be
changed from the default. The Font and Background can be changed, as well as
the syntax highlighting. These changes can be saved to a file (with an ini
extension) using the Save Display Settings File menu item. The file can be
reloaded on the same or different computer running CRBasic using the Open
Display Settings File.
For I=1 To 10
  TCTemp(I)=TCTemp(I)*1.8+32
Next I
Aliases can also be created that will allow an element of an array or another
data result to be referred to by a different name. To continue the example
above, TCTemp(3) could be renamed using the following syntax:

Alias TCTemp(3)=AirTemp
In the display software, the more descriptive alias, AirTemp, would be used for
the cell name.
TCTempF=TCTemp(1)*1.8+32
Many parameters will allow the entry of expressions. In the following example,
the DataTable will be triggered, and therefore data stored, if TCTemp(1)>100.
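A minimal sketch consistent with that description follows; the table name
(HighTemp), the table size, and the Sample output are assumed here for
illustration:

DataTable (HighTemp,TCTemp(1)>100,1000)
  Sample (1,TCTemp(1),FP2)
EndTable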
The instructions for making measurements and outputting data are not found in
a standard basic language. The instructions Campbell Scientific has created for
these operations are in the form of procedures. The procedure has a keyword
name and a series of parameters that contain the information needed to
complete the procedure. For example, the instruction for measuring the
temperature of the CR5000 input panel is:
PanelTemp(RefTemp, 250)
'CR5000
'VARIABLE DECLARATION
Dim TCTemp(4) 'Dimension TC measurement variable
Alias TCTemp(1)=EngineCoolantT 'Rename variables
Alias TCTemp(2)=BrakeFluidT
Alias TCTemp(3)=ManifoldT
Alias TCTemp(4)=CabinT
In the sample code above, the datalogger compiler will ignore the commented
text.
'Declarations: declare constants, declare public variables,
'dimension the array, and declare units.
Const RevDiff=1
Const Del=0
Const Integ=250
Const Mult=1
Const Offset=0
Public RefTemp
Public TC(6)
Units RefTemp=degC
Units TC=DegC

'Define data table
DataTable (Temp,1,2000)
  DataInterval (0,100,mSec,10)
  Average (1,RefTemp,FP2,0)
  Average (6,TC(),FP2,0)
EndTable

'Scan loop: measure, then call the data table
BeginProg
  Scan (10,mSec,3,0)
    PanelTemp (RefTemp,250)
    TCDiff (TC(),6,mV20C,1,TypeT,RefTemp,RevDiff,Del,Integ,Mult,Offset)
    CallTable Temp
  NextScan
EndProg
The user's program determines the values that are output and their sequence.
The datalogger automatically assigns names to each field in the data table. In
the above table, TIMESTAMP, RECORD, RefTemp_Avg, and TC_Avg(1) are
fieldnames. The fieldnames are a combination of the variable name (or alias if
one exists) and a three letter mnemonic for the processing instruction that
outputs the data. Alternatively, the FieldNames instruction can be used to
override the default names.
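For example, a sketch of renaming the averaged panel temperature field (the
new field name PanelT_Avg is illustrative) places FieldNames immediately after
the output instruction it renames:

Average (1,RefTemp,FP2,0)
FieldNames ("PanelT_Avg")    'overrides the default fieldname RefTemp_Avg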
The data table header may also have a row that lists units for the output values.
The units must be declared for the datalogger to fill this row out (e.g., Units
RefTemp = degC). The units are strictly for the user's documentation; the
datalogger makes no checks on their accuracy.
The above table is the result of the data table description in the example
program:
DataTable (Temp,1,2000)
  DataInterval(0,10,msec,10)
  Average(1,RefTemp,fp2,0)
  Average(6,TC(1),fp2,0)
EndTable
All data table descriptions begin with DataTable and end with EndTable.
Within the description are instructions that tell what to output and the
conditions under which output occurs.
The DataTable instruction has three parameters: a user specified name for the
table, a trigger condition, and the size to make the table in RAM. The trigger
condition may be a variable, expression, or constant. The trigger is true if it is
not equal to 0. Data are output if the trigger is true and there are no other
conditions to be met. No output occurs if the trigger is false (=0). The size is
the number of records to store in the table. You can specify a fixed number, or
enter –1 to have the datalogger auto allocate the number of records. The
example creates a table named Temp, outputs any time other conditions are met,
and retains 2000 records in RAM.
DataInterval is an instruction that modifies the conditions under which data are
stored. The four parameters are the time into the interval, the interval on which
data are stored, the units for time, and the number of lapses or gaps in the
interval to track. The example outputs at 0 time into (on) the interval relative to
real time, the interval is 10 milliseconds, and the table will keep track of 10
lapses. The DataInterval instruction reduces the memory required for the data
table because the time of each record can be calculated from the interval and
the time of the most recent record stored. The DataInterval instruction for the
CR200 does not have lapses.
NOTE Event-driven tables should have a fixed size rather than allowing
them to be allocated automatically. Event-driven tables that are
automatically allocated are assumed to have one record stored per
second in calculating the length. Since the datalogger tries to make
the tables fill up at the same time, these event-driven tables will
take up most of the memory, leaving very little for the other,
longer-interval, automatically allocated data tables.
BeginProg
  Scan(1,MSEC,3,0)
    PanelTemp(RefTemp, 250)
    TCDiff(TC(),6,mV50,4,1,TypeT,RefTemp,RevDiff,Del,Integ,Mult,Offset)
    CallTable Temp
  NextScan
EndProg
The Scan instruction determines how frequently the measurements within the
scan are made:
The Scan instruction has four parameters (the CR200 datalogger’s Scan
instruction has only two). The Interval is the time between scans. Units are the
time units for the interval. The BufferSize is the size (in the number of scans)
of a buffer in RAM that holds the raw results of measurements. Using a buffer
allows the processing in the scan to at times lag behind the measurements
without affecting the measurement timing (see the scan instruction in the
CR5000 help for more details). Count is the number of scans to make before
proceeding to the instruction following NextScan. A count of 0 means to
continue looping forever (or until ExitScan). In the example the scan is 1
millisecond, three scans are buffered, and the measurements and output
continue indefinitely.
The binary format makes it easy to visualize operations where the ones and
zeros translate into specific commands. For example, a block of ports can be
set with a number, the binary form of which represents the status of the ports
(1= high, 0=low). To set ports 1, 3, 4, and 6 high and 2, 5, 7, and 8 low; the
number is &B00101101. The least significant bit is on the right and represents
port 1. This is much easier to visualize than entering 45, the decimal
equivalent.
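As one hedged illustration, on dataloggers that provide the WriteIO instruction
(a CR1000-class logger with control ports C1 through C8 is assumed here), a
mask selects which ports to change and the source supplies the binary pattern
described above:

WriteIO (&B11111111,&B00101101)  'affect all eight ports; set C1, C3, C4, and C6 high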
The datalogger will also evaluate multiple expressions linked with And or Or.
For example:
If X>=5 and Z=2 then Y=0
will set Y=0 only if both X>=5 and Z=2 are true.
If X>=5 or Z=2 then Y=0
will set Y=0 if either X>=5 or Z=2 is true (see And and Or in the help). A
condition can include multiple And and Or links.
The expression evaluator evaluates the expression, X>=5, and returns –1 if the
expression is true and 0 if the expression is false.
W=(X>Y)
will set W equal to –1 if X>Y or will set W equal to 0 if X<=Y.
The datalogger uses –1 rather than some other non-zero number because the
And and Or operators are the same for logical statements and binary bitwise
comparisons. The number –1 is expressed in binary with all bits equal to 1; the
number 0 has all bits equal to 0. When –1 is ANDed with any other number, the
result is the other number, ensuring that if the other number is non-zero (true),
the result will be non-zero.
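A brief sketch of this behavior (W and Z are assumed to be declared As Long):

W = (X > Y)      'W is -1 if X > Y, otherwise 0
Z = W AND 5      'Z = 5 when W = -1 (true); Z = 0 when W = 0 (false)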
7.3.4.12 Flags
Any variable can be used as a flag as far as logical tests in CRBasic are
concerned. If the value of the variable is non-zero the flag is high. If the value
of the variable is 0 the flag is low. LoggerNet, PC400, or RTDAQ looks for the
variable array with the name Flag when the option to display flag status is
selected from the Connect Screen. If a Flag array is found, as many elements of
that array as will fit will be displayed in the Ports and Flags dialog box.
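A minimal sketch of declaring and testing such an array (the table name OneMin
is hypothetical):

Public Flag(8)        'an array named Flag can be displayed and toggled from the Connect screen

'...inside the Scan loop:
If Flag(1) Then       'true whenever Flag(1) is non-zero
  CallTable OneMin    'OneMin is a hypothetical table
EndIf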
Instruction parameters accept several kinds of input: a constant; a variable; a
variable or array; or a constant, variable, or expression.
TABLE 7-4 lists the maximum name length and the allowed characters for
Variables, Arrays, Constants, etc.

Name                  Maximum Length (number of characters)
Variable or Array     39 (17)
Constant              39 (16)
Alias                 39 (17)
Data Table Name       20 (8)
Field Name            39 (16)

Allowed characters are the letters A-Z (upper or lower case), the underscore
"_", and the numbers 0-9. The name must start with a letter. CRBasic is not
case sensitive. Values in parentheses refer to the CR5000, CR9000, and
CR9000X dataloggers.
VoltSE(Dest,Reps,Range,SEChan,Delay,Integ,Mult,Offset)

'Calibration factors:
Mult(1)=0.123 : Offset(1)=0.23
Mult(2)=0.115 : Offset(2)=0.234
Mult(3)=0.114 : Offset(3)=0.224
VoltSE(Pressure(),3,mV1000,6,1,1,100,Mult(),Offset())
Note that one exception to this is when the Multiplier or Offset points to a
specific index into the array; in that case, the instruction will not advance to
the next Multiplier or Offset but will use the same value for each repetition.
For instance, in the above example, if Mult(2) and Offset(2) were used, the
instruction would use 0.115 and 0.234 for the Multiplier and Offset,
respectively, for each repetition. To
force the instruction to advance through the Multiplier and Offset arrays while
still specifying an index into the array, use the syntax Mult(2)() and
Offset(2)().
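Applied to the earlier example, that syntax would look like the following sketch:

VoltSE(Pressure(),3,mV1000,6,1,1,100,Mult(2)(),Offset(2)())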
7.4 Edlog
7.4.1 Overview
Edlog is a tool for creating, editing, and documenting programs for Campbell
Scientific’s mixed-array dataloggers: CR7, CR500, CR510, CR10, CR10X,
21X, CR23X. Edlog also supports these same dataloggers configured with
table-based operating systems, including the table-data or “TD” and PakBus or
“PB” versions. It provides a dialog box from which to select instructions, with
pick-lists and detailed help for completing the instructions’ options (or
parameters). Edlog checks for errors and potential problems in the program
when pre-compiling the program. Some highlights of Edlog’s features are
listed below.
7.4.1.1 Precompiler
Edlog precompiles the program to check for errors and to create the file that is
downloaded to the datalogger. The precompiler will catch most errors. Errors
that the precompiler misses should be caught by the datalogger when the
program is compiled. The download file (*.DLD) is stripped of comments to
make it more compact. During the precompile step, a Program Trace
Information file (*.PTI), which provides an estimate of program execution time,
is also created. For mixed-array dataloggers the precompiler also creates a
Final Storage Label file (*.FSL) to supply labels for final storage values to be
used by other software applications.
For example, the following expression could be used to create a new input
location for temperature in degrees Fahrenheit from an existing input location
for temperatures in degrees Celsius.
TempF=TempC*1.8+32
Select the datalogger you are using from the list and click OK. A blank
program template will come up as shown below for a CR10X.
The first line of text identifies the type of datalogger program to be written.
This is followed by a comment line and the Program Table Headers and
Execution Interval fields. The Program Table Headers and Execution Interval
fields are protected text that cannot be deleted or commented out. (The asterisk
is used to identify the beginning of a program table in the datalogger.) When
the cursor is moved to the Execution Interval line, the field for the execution
interval is highlighted. A numeric value must be entered or the instructions in
the table will never be executed.
Instructions inserted under the Program Table 1 header will be run based on the
execution interval for that table. Likewise, instructions inserted under the
Program Table 2 header will be run based on the execution interval for Program
Table 2. Program Table 3 is reserved for subroutines that are called by either of
the other tables. Most users find they can write the entire program in Program
Table 1, avoiding complications associated with synchronizing two tables.
Program Table 2 is normally used only when portions of the program require a
different execution interval (placed in Program Table 2).
When the program is complete, select File | Save from the Edlog menu. A
standard file dialog box will appear in which to type a file name. Edlog
supports long file names for the datalogger programs. Use descriptive names to
help document the program’s function. After saving the file, you will be
prompted to compile the program. When a program is compiled the code will
be checked for errors. After compiling, the datalogger program can be sent to
the datalogger using the Connect Screen.
Comments – Edlog provides the ability to add comments on any blank line
and to the right of all instructions. Liberal use of descriptive comments makes
the program clearer and will help you remember what you were doing when
you come back to it a year or two later. Especially useful are descriptions of
what sensors are connected and how they are wired to the datalogger.
• Measure Sensors – In this first section put all the instructions that get data
from the sensors attached to the datalogger. The sensor readings are stored
in input locations, ready for the next section.
Descriptive Labels – Use input location and final storage labels that are
meaningful for the data they contain.
• *.CSI – The CSI file is what the user actually edits. When an Edlog
program is saved, Edlog automatically adds a CSI extension to the
program’s name. Existing CSI files can be edited by selecting File | Open.
Although CSI files are ASCII files they require a particular format, so
editing the *.CSI files with some other text editor can corrupt the Edlog
programs so that they no longer load or compile.
• *.PTI – Program Trace Information files show the execution times for each
instruction, block (e.g., subroutine), and program table, as well as the
estimated number of final storage locations used per day. The execution
times are estimates. PTI files do not account for If commands, Else
commands, or repetitions of loops. For some instructions, the execution
times are listed as 0. This occurs when the execution time is unknown
(e.g., P23 – Burst Measurement).
• *.FSL – Final Storage Label files contain the final storage labels for the
data values in the output data records. This file is used by Split to show
labels for data values in reports, and by View for column headings. FSL
files are not created for table-based dataloggers. Table-based datalogger
program files contain the final storage labels.
Other files that are used in Edlog but are generated by other means than
compiling the program include:
• *.LBR – Library files (*.LBR) are parts of a program that can be retrieved
and used in other Edlog programs. If a programmer often uses an
instruction set in his/her datalogger programs, this partial file can be saved
to disk and inserted into a new program.
NOTE Library files that are created for one type of datalogger should not
be used in a different type of datalogger (e.g., do not use an LBR
file created for a CR10X-TD in a CR10X or CR510-TD program).
Instructions differ among dataloggers, and bringing in an invalid
instruction to a datalogger will result in errors.
• *.TXT – Printer output files created by Edlog are saved with a TXT
extension. These files can be sent to a printer or viewed with a text editor.
A TXT file is created by selecting File | Print to File.
• Right click a blank line and select Insert Instruction from the pop-up
menu.
• Type the instruction number onto a blank line and press enter.
The first three options will open the Insert Instruction dialog box.
To insert an instruction into the program, select it and then choose OK, or
double click the entry in the list. If you need more information on an
instruction, select the instruction and click the Help button.
Note that to the right of each instruction name is a code for the instruction type:
I/O for input/output, Process for instructions that calculate new values, Output
for instructions that write to final storage, or Control for instructions that affect
program flow.
• Select the parameter with your mouse and press the right mouse button.
This brings up a dialog box from which to select a value or a pop-up
description of what should be entered.
• With your cursor anywhere within the instruction, press <F1>. This opens
the help system to a detailed description of the instruction and parameters.
Edlog provides hints for each parameter at the very bottom of the Edlog screen.
These hints often display the valid entries for a field.
Edlog has a Data Entry Warning function that is accessed from the Options |
Editor menu item. By default, the Data Entry Warning is enabled. When the
Data Entry Warning is active, a warning is displayed immediately after an
invalid input or potentially invalid input has been entered for an instruction’s
parameter. The warning lists the valid inputs. A valid input must be entered
before advancing to the next parameter.
• Select a block of text, press the right mouse button, and select “comment”
or “uncomment” from the right button pop-up menu.
Edlog will not allow a portion of an instruction or the table execution intervals
to be commented out.
7.4.2.6 Expressions
Algebraic expressions can be used in a program to easily perform processing
on input locations. When a datalogger program that contains an expression is
compiled, the appropriate instructions are automatically incorporated into the
DLD file. As an example, the following expression could be used to convert
temperature in degrees Celsius to temperatures in degrees Fahrenheit:
TempF=TempC*1.8+32
• Expressions must be set equal to the label of the Input Location that will
store the result. The result label must be to the left of the expression.
• Expressions can have both fixed numbers and Input Location labels. Input
Locations can only be referenced by their label; each number in an expression
is assumed to be a constant.
• Floating-point numbers are limited to six digits plus the decimal point and
sign.
• To continue an expression to the next line, end the first line with an
underscore ( _ ), as in the sketch below.
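For instance, a continued expression might look like the following sketch (the
labels are hypothetical input locations):

TempAvg=(Temp1+Temp2+Temp3+ _
Temp4)/4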
Operators
* multiply
/ divide
+ add
– subtract
^ raise to the power of; enclose negative values in parentheses
@ modulo divide
E scientific notation; 6e–1=0.6
Functions
COS cosine; angle in degrees
SIN sine; angle in degrees
TAN tangent; angle in degrees
COTAN cotangent; angle in degrees
ARCTAN arctangent; angle in degrees
ARCSIN arcsine; angle in degrees
ARCCOS arccosine; angle in degrees
ARCCOT arccotangent; angle in degrees
SQRT square root
LN natural logarithm
EXP exponent of e; EXP(2) = e^2
RCP reciprocal; RCP(4) = 1/4 = 0.25
ABS absolute value
FRAC takes the fraction portion; FRAC(2.78)=.78
INT takes the integer portion; INT(2.78)=2
TempF = (TempC*1.8)+32
When this program is compiled, the DLD file contains the following
instructions. The last 5 instructions calculate the expression.
2: Z=X (P31)
1: 2
2: 5
3: Z=F (P30)
1: 1.8
2: 0
3: 3
4: Z=X*Y (P36)
1: 3
2: 5
3: 5
5: Z=F (P30)
1: 32
2: 0
3: 3
6: Z=X+Y (P33)
1: 3
2: 5
3: 6
Some of the error messages that occur when using expressions need no further
explanation; the more common causes are described below.
(1) The expression is not set equal to an Input Location label. The label must
be to the left of the expression and not enclosed in parentheses. An expression
that contains no equal sign causes compiler error 202, “unrecognized text”.
(2) An expression with a + or – operator does not have a number or label after
the operator.
(3) An expression with an @ operator does not have a number after the @;
only a fixed number is allowed immediately after the @ operator.
(4) An expression with an @ operator does not have either a number or label
before the @.
All fixed numbers are limited to five digits not including negative signs and
decimal points.
Function Expected
For example, the following expressions generate this error because
multiplication is implied rather than written explicitly:
zee=(label1)(label2)
ex=(5)(ARCTAN(data))
eee=(em)(see^2)
They must be written with an explicit * operator:
zee=(label1)*(label2)
ex=(5)*(ARCTAN(data))
eee=(em)*(see^2)
For example, the following expressions are also invalid because the
multiplication is implied:
tee=5(2)
mu=(nu)103
bee=7.52(ef/2)
sigma=-17(RCP(alpha))
They must be written as:
tee=5*(2)
mu=(nu)*103
bee=7.52*(ef/2)
sigma=-17*(RCP(alpha))
For example:
result=(ex^2)data
gamma=(10-omega)SIN(psi)
dee=(17)number
These must be written as:
result=(ex^2)*data
gamma=(10-omega)*SIN(psi)
dee=(17)*number
An equal sign MUST immediately follow the label of the Input Location that
stores the results (e.g., label = expression). An expression that contains no
equal sign causes compiler error 202, “unrecognized text”.
For example, the following are invalid:
zee/2=bee
data+number=volt1+volt2
They must be rewritten so the result label stands alone on the left:
bee=zee/2
data=volt1+volt2-number
PgUp                   Page Up
PgDn                   Page Down
Up Arrow               Move Up One Line
Down Arrow             Move Down One Line
Right Arrow            Move One Character Right
Left Arrow             Move One Character Left
<Ctrl> Home            Move Cursor to Beginning of File
<Ctrl> End             Move Cursor to End of File
<Ctrl> PgUp            Move Cursor to Top of Screen
<Ctrl> PgDn            Move Cursor to Bottom of Screen
<Enter>                Move to Next Field or Create New Line
<Shift> <Ins>          Select an Instruction from a Dialog Box
<Ctrl> Right Arrow     Move Instruction 1 Tab Right (Cursor on Parameter)
<Ctrl> Left Arrow      Move Instruction 1 Tab Left (Cursor on Parameter) or
                       Move from Input Location Label to Input Location Number
<Ctrl> N               Comment Out a Line or Instruction
<Shift> <Ctrl> N       Uncomment a Line or Instruction
<End>                  Move to End of Line; Add a Comment if on an Instruction
<Ctrl> C               Copy Selected Text
<Ctrl> X               Cut Selected Text
<Ctrl> V               Paste Clipboard
<Del>                  Delete Character to Right or Selected Text
<Shift> <Del>          Delete the Instruction or Line Under the Cursor
<Esc>                  Close Dialog Box
NOTE You cannot move, copy, delete or comment out protected text
(Tables, Execution Intervals) or partial instructions. To move,
copy or delete an Instruction, the entire instruction, including all
of the parameters, must be selected.
To create a library file, select the text to be stored and then select Edit | Save
To Library File. When the window appears, type in the library file name. To
insert a library file in a program, move the cursor to the desired insertion point
and select Edit | Insert Library File.
NOTE Library files created for one datalogger type should not be
used in programs for a different datalogger type; i.e., a library file
for a CR10X-TD should not be used in a program for a CR10X or
a CR510-TD. Instructions differ among dataloggers, and bringing
in an invalid instruction to a datalogger could result in errors.
Programs created with the DOS versions of Edlog earlier than 6.0 were stored
with the instruction description and comments in a *.DOC file instead of a
*.CSI file. The DLD version of these programs can be imported into current
versions of Edlog by using this Document DLD feature, though any comments
will be lost.
7.4.5.4 Indention
Indention is typically used with If Then/Else sequences and loops to provide a
visual key to program flow. Indention is a visual aid; it has no meaning to the
datalogger. If the programmer chooses to use indention, it can be done
automatically or manually.
The settings for indention are found under Options | Editor. Turn on
Automatic Indention by checking the box next to it. The distance for each
indention (in spaces) is set on the same dialog box. To manually indent an
instruction, place the cursor on one of the instruction’s parameters and press
either <Ctrl>+right arrow or <Ctrl>+left arrow; the instruction is indented the
direction the arrow is pointing.
The Display | Rebuild Indention menu item resets all existing indentions and
rebuilds automatic indentions. Automatic indentions may need to be rebuilt
when editing instructions causes the indentions to misalign.
In an Edlog program, each Input Location has an Input Location number and a
label that appear whenever the Input Location is referenced in the program.
Edlog automatically assigns Input Location numbers as labels are entered.
You may prefer to enter all input locations into the Edlog program before
writing the program. This makes all the labels available from the input location
pick list, and can help reduce programming errors because of typos.
To enter the Input Location number instead of the label, use the mouse or press
<ctrl> left arrow.
7.4.8 Repetitions
Many input/output and output processing instructions have a repetitions
parameter. Repetitions (REPS) allow one programming instruction to measure
several identical sensors or to process data from several Input Locations. When
REPS are greater than 1, the Input Locations are assigned consecutive numbers
(e.g., with REPS of 2 and LOC of 5, the Input Locations are 5 and 6). Each rep
label is the initial label with a "_" and the next consecutive number (e.g., with 3
REPS and a label of "data", the labels for each REP are data_1, data_2, and
data_3).
Only the first input location of an instruction is linked to the instruction. Reps
of input/output instructions and output processing instructions are not linked,
so use care if altering their sequence in the Input Locations Editor.
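For illustration, an Edlog entry similar to the following sketch (the instruction
number is arbitrary, and TempC and BatteryV are assumed to occupy input
locations 1 and 2) samples two consecutive locations with a single instruction:

10: Sample (P70)
 1: 2        Reps
 2: 1        Loc [ TempC ]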
When the program is executed, the datalogger will perform the Sample (P70)
instruction twice. The first time, it will sample the value stored in the TempC
location. The second time, it will sample the value stored in the BatteryV
location.
Editing functions are available from the Input Location Editor’s Edit menu and
a hot key:
Insert (<F2>) – Inserts blank Input Locations. This is used to provide space for
new input labels between existing labels. This automatically changes the Input
Location numbers for all of the labels that are after the inserted location.
Delete (<F3>) – Deletes the Input Location label, flags, number of reads and
writes, and block information for a designated location number. Wherever the
datalogger program references a deleted location label, the Input Location’s
number automatically becomes 0.
Move (<F4>) – Moves the Input Location to a different number. This may
change several Input Location numbers.
Optimize (<F6>) – Deletes Input Locations that aren’t read, written to, or
marked as Manual. Optimize tries to reduce the total number of locations used
by moving existing Input Location labels to fill in unused locations. This might
change several Input Location numbers. Any changes in location number made
by the Optimize command are reflected in the Edlog program.
Insert Block (<F7>) – Inserts and labels a block of Input Locations and marks
them as “Manual”. The locations are labeled in the same manner as reps.
Esc – The escape key closes the Input Location Editor and updates the label
assignments in the program.
There are certain instructions that generate multiple Input Locations for which
Edlog does not automatically allocate Input Locations. The user should
manually allocate these locations in the Input Location Editor.
See Edlog Help for each instruction to get a detailed description of input
location usage. You can also refer to the datalogger user’s manual for more
information on these instructions.
When these instructions are used in a program, the Toggle Manual feature can
be used to manually mark Input Locations for use by the program.
For mixed-array dataloggers the final storage labels are stored in an *.FSL file
when the program is compiled, as well as in the DLD files. For table-based
dataloggers the final storage labels are included as part of the datalogger
program in the *.DLD file; no FSL file is created. LoggerNet gets the final
storage labels as part of the table definitions from the datalogger. Split, the
Graphical and Numeric Displays, and View Pro use the final storage labels.
The user can create a custom label to reflect the meaning of the value that is
being stored. Click the FSL Edit button on the toolbar or press F9 to bring up
the Final Storage Label Editor as shown below.
In this example from a mixed-array datalogger, the final storage output data for
Array ID 112 is shown. Each of the columns indicates the essential
characteristics of the data value being stored.
• Array ID or Table Name identifies the set of output data instructions the
data is associated with. For mixed-array dataloggers the array ID is at the
beginning of each output record. In table-based dataloggers, the table name
shows the name of the table where the data values will be stored.
• Output Instruction lists the output instruction that was used to store the
data value.
• Line Number is the line number in the Edlog program for the output
instruction.
• Final Storage Label is the label that is associated with this final storage
value. Red labels are associated with automatically created data entries
such as time stamps and record numbers. The red labels cannot be changed
with the Final Storage Label Editor. The green labels are associated with
user programmed sensor data. To change the label, click in the box and
type in the new label.
• Resolution shows whether the data will be stored in low or high resolution.
(High resolution stores data as a 4-byte floating point number, Low
resolution uses a 2-byte number)
• Inloc Name is the label of the input location that the final storage data is
based on.
• Inloc Address is the numeric label for the input location used for the final
storage data value.
The final storage labels created by Edlog can be restored by selecting the menu
item Edit | Restore Default Labels from the Final Storage Label Editor menu.
Refer to the datalogger manual or Edlog’s help file for additional information
on Security in the datalogger.
Minimize DLD Size – No input location labels or final storage labels are
saved in the DLD file.
Default – Up to 255 input location labels and all final storage labels are
saved in the DLD file.
All – All input location labels and all final storage labels are saved in the
DLD file.
Include All Input Location Labels – All input location labels are saved
in the DLD file.
Include First X Input Location Labels – Allows you to specify a certain
number of input location labels to be saved in the DLD file.
If you are trying to minimize the size of your DLD file but still want to be able
to monitor input locations in the software, you can put all of the labels that you
want to view at the beginning of your list of input locations, and put the labels
for scratch and less important values at the end. Then, use the second option
above to display only those values of interest.
Port Status – The state of the ports (high/low) the last time the datalogger
was on.
Flag Status – The state of the flags (high/low) the last time the datalogger
was on.
User Timer – Allows you to continue timing events that occurred when
the datalogger was on last.
Input Storage – Allows the values that were stored in the input locations
before you turned the datalogger off to be included in the sample, average,
and total when you turn the datalogger back on.
Intermediate Storage – Allows data processing to continue from when
the datalogger was on last.
NOTE Not all dataloggers have a Compile Settings option. This option
refers only to the CR510, CR10X, and CR23X.
When the Fixed Baud Rate check box has been selected, the datalogger is
forced to communicate at the baud rate selected. When it is not selected, the
datalogger will first try to use the initial baud rate, but will try the other baud
rates if it cannot connect.
The CR23X has an RS232 Power Always On check box. This keeps the
power to the RS232 port on at all times. In some instances, this may be
desirable but it consumes much more power than when the datalogger turns on
the port as needed.
NOTE Not all dataloggers have a Serial Port Settings option. This option
refers only to the CR510, CR10X, and CR23X.
For any of the options, if the check box Do Not Change Current Settings is
enabled, then those settings will not be changed when the program is
downloaded to the datalogger.
7.4.18.1 Network
The Network option is used to set the PakBus address in the datalogger and to
configure the datalogger as a router if required. This option is the same as the
datalogger’s *D15 mode.
The Source File is the CSI or DLD file to be converted. The Program File is
the new CR* file that will be created. By default, the resulting file name for the
CR1000, CR800, or CR3000 program that will be created is the name of the
original program with a CR* extension. This can be changed if desired by
typing in a new path and/or file name directly, or by pressing the Browse
button to the right of the Program File field.
Comments about the conversion are shown in the Action Log (bottom portion
of the window). The Action Log should be reviewed carefully; it provides
useful comments and alerts you to any problems that may exist in the
converted file. To view only the messages related to problems in the file,
enable the Show Only Problem Messages check box.
If an Edlog file previously has been opened in the Transformer, when the file is
opened a second time you will receive a message “This file, <filename>,
already exists. If you overwrite it, the information it contains will be lost. Do
you want to overwrite it?” If you choose Yes, the existing CR1 file will be
overwritten. If you choose No, you will be given the opportunity to provide a
new name for the file. This message can be suppressed by selecting Options |
Suppress “Overwrite File” Warning from the Transformer menu. However,
note that you should strongly consider keeping this message intact to avoid the
possibility of overwriting a file that you transformed and then subsequently
edited in the CRBasic Editor.
7.5.2 Controls
The following buttons are used within the Transformer to move to a different
location in the file, or save or print the file.
Section 8. Working with Data Files on
the PC
After data has been collected from the datalogger, you need a way to analyze that data.
LoggerNet provides two tools to do this.
View Pro is a file viewer that provides a way to look at the collected data. It will open data
files (*.DAT) saved in a variety of formats including files from mixed-array and table-
based dataloggers. It can also be used to view data from a LoggerNet database table
created with LNDB. View Pro can also open other CSI file types (*.DLD, *.CSI, *.PTI, *.FSL,
*.LOG, *.CR2, *.CR5, *.CR1, *.CR1X, *.CR3, *.CR300, *.CR6, *.CR8, *.CR9). Once a
data file or database table is opened, data values can be graphed in several different
formats including Line Graphs, Histograms, XY Plots, FFTs, and Rainflow Histograms.
View Pro is discussed in Appendix G.
Split is a tool that is used to post-process collected data from either mixed-array or table-
based dataloggers. Split can create reports by filtering data based on time or conditions. It
can generate statistics, perform calculations, reformat files, check for data quality (limit
testing), and generate tables with report and column headings. It can also handle the time
synchronization necessary to merge up to eight data files.
Data stored on a compact flash, microSD, or PCMCIA card must be converted prior to
analyzing it on your computer. CardConvert is a utility used to retrieve the binary data from
the card, convert it to an ASCII or binary file, and save it to disk.
8.1 Split
8.1.1 Functional Overview
Split is a tool to analyze data collected from Campbell Scientific dataloggers.
Its name comes from its function of splitting out specific data from a larger
data file. Originally, Split could only process mixed-array files, and it was used
to “split” the different arrays – typically different time intervals – of a file into
separate files (e.g., for hourly versus daily data).
In addition to splitting out mixed-array data, Split can filter output data based
on time or conditions, calculate statistics and new values, reformat files, or
check data quality (limit testing). Split can generate tables with report and
column headings, as well as time synchronize and merge up to eight data files.
Input Files (maximum of eight) are read by Split, specific operations are
performed on the data, and the results are output to a new Output File or a
printer. Split creates a parameter file (filename.PAR) that saves all of your
settings such as which data files are read, what operations are performed on the
data set, and where the final results will be saved. The parameter file may be
saved and used again.
Split can be used to convert a file of one format to a different format. For
example, a Table Oriented ASCII file can be converted to the Comma
Separated ASCII format used in mixed-array datalogger data files. This is
useful for making table-based data files work with applications that were
written for mixed-array files.
Split lends itself to experimentation. The processed data are displayed on the
screen, giving immediate feedback as to the effect of changes or new entries to
the parameter file. Split does not modify the original Input File.
In the following example, hourly data are split from a data set that contains 15
minute, hourly and daily data. The data was collected from BirchCreek, a
CR10X datalogger. The CR10X was loaded with a program created by Edlog
named Birch.dld.
The 15 minute data, array 99, the hourly data, array 60, and the daily data,
array 24, are intermixed in the data file.
When Edlog compiled Birch.dld, it also created the Final Storage Label file,
Birch.fsl that lists the final storage locations for each data element.
When you start Split a blank template similar to the one above is shown. This
template is used to enter the parameters that will define what data from the
input file to include in the output file. The parameters entered on this template
can be saved as a parameter file (*.PAR) and reused for other data.
On the INPUT FILE tab you only need to specify the input file name, copy
condition, and the data to select. Split allows start and stop conditions to be
specified but if they are left blank, the entire file will be read.
The name of the Input Data File can be typed in or the Browse button can be
used to select from available files. In this example BirchCreek.dat will be
selected as the input data file.
Selecting the data to copy is simplified by the use of the Birch.fsl file. From the
toolbar menu, click Labels | Use Data Labels. From the Data File Labels pop-
up, Select File is used to find Birch.fsl. When one of the Output Arrays is
highlighted, the Field Names of the data in that array are displayed.
NOTE In this example, a mixed array data file is processed and the Use
Data Labels feature uses an FSL file. When processing a table-
based datalogger file, change the file type to “Table-based data
file to use for labels” and select the table-based DAT file. Split
will use the header information from this file for its labels.
In this example we want the hourly data (note the Output Interval at the bottom
of the Data File Label window), so click array 60. To paste the desired values
from this array into the Select box, select the field names while holding down
the <ctrl> key. All of the values could also be selected by clicking the first one,
holding the mouse button down, and dragging to the end. Once the values you
want have been selected, click Paste.
Note that the cursor in the INPUT FILE(S) screen must be in a valid paste area
(Copy or Select). If the cursor is in the File name box or in a Start/Stop
Condition field, you will get the error message “Cannot Paste There”.
The Paste operation copied the numbers of each of the fields into the Select
box. Notice also that it pasted the Array ID into the copy condition: 1[60] tells
Split that in order to copy a line of data, the first value in that line must be 60.
Split uses the Array ID to discriminate between the hourly and daily data.
Now specify the Output File name. (Without one specified, Split will run and
display results but no output file will be created.) Click the OUTPUT FILE
tab. Type in “hourly” for the name of the output file. By default, Split will use
the file extension “PRN”, creating the output file: hourly.prn. Depending upon
the option chosen in the “If File Exists then” list box, an existing PRN file may
be overwritten, appended to, or saved under a new name.
The Labels option from the toolbar can also assist in labeling the output values.
Once again, choose LABELS | USE FINAL STORAGE LABELS and select
array 60 and all the field names. This time move the cursor to Line 1 of the first
column of labels on the OUTPUT FILE tab and press Paste. The labels from
the final storage file will be pasted into each of the columns. Split will
automatically break a label name into multiple rows at the “_” in a label name.
Maximum column heading width is one less than the number entered in the
Default Column Width field. However, entering a number in the Width row for
the column will set the column width for an individual column. Any FSL labels
that are too long for Split column headings will be shown in red. They should
be edited before running Split. To edit one of the labels, press the <Enter> key
or use a mouse to copy, cut, and paste. A Report Heading can also be entered
using the same editing technique.
For table based data files the timestamp is normally the first column and is a
quoted text string (“2002-02-26 10:30:00”). To display these timestamps in the
output you will need to change the column width for the first column to at least
24. If the column width is too small to accommodate the value output, the
string will be highlighted in red and preceded by an asterisk, with the words
“Bad Data” in the lower right corner when the file is processed.
To run Split, select RUN | GO. The hourly data will be split out and stored in
hourly.prn. The results are displayed on the screen as shown below.
NOTE When Split is running on large files, the line counters will update
only every 1000 lines.
Close the Run window. If you wish to save this parameter file for future
reports, choose FILE | SAVE. The file will be saved with a .PAR extension.
Files stored in Table Oriented Binary (TOB) format are converted to Table
Oriented ASCII files when Split uses them. The converter runs in the
background when you run Split to create the output file. You cannot use the
Data Label browser to select the columns of data from a binary file. If you
want to use the Data Label browser you can open the file first using View,
which converts the binary file to ASCII and saves it under a new name, prior to
processing it with Split.
Split’s default output file, a field-separated ASCII format with a *.PRN file
extension, can be processed a second time if desired.
COMMA SEPARATED
115,189,1200,89.6,55.3,25.36,270
115,189,1300,91.3,61.5,27.25,255.4
115,189,1400,92.7,67.7,15.15,220.1
115,189,1500,94.1,69,20.35,260.6
FIELD FORMATTED
115 189 1200 89.6 55.3 25.36 270
115 189 1300 91.3 61.5 27.25 255.4
115 189 1400 92.7 67.7 15.15 220.1
115 189 1500 94.1 69 20.35 260.6
PRINTABLE ASCII
01+0115 02+0189 03+1200 04+089.6 05+055.3 06+25.36 07+270.0
01+0115 02+0189 03+1300 04+091.3 05+061.5 06+27.25 07+255.4
01+0115 02+0189 03+1400 04+092.7 05+067.7 06+15.15 07+220.1
01+0115 02+0189 03+1500 04+094.1 05+069.0 06+20.35 07+260.6
Element 1 = Output Array ID# (115)
Element 2 = Julian day (189)
Element 3 = hour, minute
Element 4 = average temperature in deg. F
Element 5 = average soil temperature in deg. F
Element 6 = average wind speed in mph
Element 7 = wind direction in degrees
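As an illustration only (this sketch is not part of Split and is not required to use it; the Python field names are assumptions made for the example), the elements listed above can be read from one of the comma separated records as follows:

# Minimal sketch: label the elements of one comma separated
# mixed-array record such as those shown above.
labels = ["array_id", "julian_day", "hhmm", "air_temp_F",
          "soil_temp_F", "wind_speed_mph", "wind_dir_deg"]
record = "115,189,1200,89.6,55.3,25.36,270"
values = [float(v) for v in record.split(",")]
for name, value in zip(labels, values):
    print(name, "=", value)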
For instance, to process two files named TEST.DAT and TEST_1.DAT the
user would select TEST.DAT and TEST_1.DAT as Input Files. Two blank
input file templates will be generated. To change from one template to the
other, click the appropriate tab on the bottom of the screen. Both templates
must be completed before Split will process the data. To merge different output
arrays from the same input file into one array, open the data file once for each
different array.
None
Select this check box to start reading the input file from the beginning.
Last Count
Each time Split runs a parameter file, it keeps track of the number of bytes
it read from the input file and saves this information in the parameter file.
Split can then start where it last left off. This is done by clicking the
Offsets button and selecting the Last Count option. This feature may be
used to process only the new data from a file to which new data are
periodically appended.
CAUTION When using the Last Count option, if the Start and Stop
Conditions are specified, they must exist in the newly
appended data or Split will never begin execution.
Specific
By selecting the Specific option and entering a number, Split will “seek”
that position in the file. This option saves time by starting (or stopping)
part way through a large data file. The number specifies the number of
bytes into the file to seek before processing data. A positive or negative
number can be entered. If the number is positive, Split will start reading
from the beginning of a file; if the number is negative, Split will start
reading from the end of a file. All characters, including spaces, carriage
returns, and line feeds, are counted.
In the following figure, Split will skip the first 256 bytes of data before it
begins processing the data in Input File.
Align Array
When using a specific start offset, the number of bytes specified may
cause Split to seek to the middle of a row. Selecting the Align Array
check box will cause Split to begin processing at the beginning of the next
row.
Stop Offset
This number specifies the number of bytes from the beginning of the file
at which Split should stop processing the data file.
In the following figure, Split will skip the first 256 bytes of data before
beginning and stop execution on byte 1024.
To break the results into a column for each channel, enter the number of
channels for the Break Arrays value (Output File Tab, Other button).
When processing mixed-array data files using time synchronization, select this
check box if the time stamp is midnight at 2400 of the day just ending. This
will ensure that Split processes the data file correctly.
Time Offset
This field specifies a time offset, in seconds, that should be applied to each
item on the Select line that uses the Date or Edate function to output a date.
The offset can be positive or negative. Each input file can have its own offset
(or no offset) for its Select line.
NOTE The offset will not be applied to Date and Edate functions with
only two parameters. (The two-parameter mode is backwards
compatible with the original Date and Edate functions used in
older versions of Split.)
NOTE The font for Start Condition, Stop Condition, Copy, and Select can
be changed from the Options Menu.
The general syntax for these conditions is ei[vali], where ei is the position of the element within the array and vali is the value of that element.
For example, the data in TABLE 8–1 contains seven elements per Output
Array, representing hourly data. Assume that this data file contains one month
of hourly data. To start processing data at 1500 hours on the first day, the Start
Condition is expressed as 3[1500], where 3 means the third element within the
array and 1500 is the value of that third element.
The element must match this start value exactly to trigger the start condition.
However, when starting based on time, you can enable the “Start-Stop
On/After Time” function to trigger the start of processing when the exact time
is found or at the first instance of data after that time has occurred. This option
is found on the Output tab, Other button.
NOTE Table data files contain the time and date as a single quoted string at the
beginning of each data record. Split handles the dates as long as you
include a colon separator as a placeholder for each of the fields in the
timestamp. 1[Year]:1[Day of Year]:1[Time of Day]:1[Seconds]
Logical “and” and “or” statements can be used when specifying the Start
Condition. A logical “and” statement means that all conditions must be true for
the statement to be true. Up to three conditions can be connected with “and”
statements. If too many “and” statements are used, an error message will be
displayed when you run Split.
The logical “or” statement means that if any of the conditions are true, then the
statement is true. Split allows up to six conditions to be connected with “or”
statements. Additionally, each “or” statement can contain up to three “and”
conditions. As with the “and” statements, if the maximum number of valid
statements is exceeded, an error message will be displayed.
These rules for logical statements also apply to the Stop and Copy Conditions.
2[189]and3[1200]
Element two (the Julian day) must equal 189, and element three (the time in
hours/minutes) must equal 1200.
2[189]and3[1200]and4[92]and5[67]
Element two must equal 189, element three must equal 1200, element four must
equal 92, and element five must equal 67 before processing will begin.
A range can be specified for vali by putting “..” between the lower and upper
limit. For example:
2[189]and7[200..275]
In this example two conditions must be satisfied to start processing data. First,
the day of year must be 189, and second, element 7 must be between 200 and
275 degrees, inclusive.
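As an illustration only (not part of Split), the following Python sketch applies the same element and value test to comma separated mixed-array records; Split's element numbers are 1-based, so element 2 is index 1 in the list, and the records are made up, patterned after TABLE 8-1:

# Minimal sketch of the test 2[189]and7[200..275] applied to
# comma separated mixed-array records.
records = [
    "115,189,1200,89.6,55.3,25.36,270",
    "115,189,1300,91.3,61.5,27.25,255.4",
    "115,190,1400,92.7,67.7,15.15,220.1",
]
for line in records:
    e = [float(v) for v in line.split(",")]
    if e[1] == 189 and 200 <= e[6] <= 275:
        print(line)   # day 189 and wind direction within 200..275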
In this instance, Split will begin processing data when the date for both files is
one day less than the current PC date and the time is 1200 (1:1[–1]:1[1200]:1:).
2[–3]:3[–120,60] tells Split to find the closest 60 minute interval that is less
than the PC time minus 3 days and 2 hours. If the PC time is the day of year
159, hour 0017, Split will start reading on data output at 2200 hours on day
155.
2[–3]:3[–120]:4[20,5] tells Split to find the closest 5 second interval that is less
than the PC time minus 3 days, 2 hours and 20 seconds. If the PC time is 27
seconds after noon on day 30, Split will begin reading on data output at 1000
hours and 05 seconds on day 27.
Split can also begin processing a file on a particular month and day. Use the
syntax :E[Month%Day]::, where E is the element that contains the Julian Day,
and Month and Day are either constants or a value related to PC time. For
example:
:2[–1%1]:: tells Split to begin processing on the first day of the previous
month.
This function can be used in both the Start and Stop conditions. It provides a
simple way to create a monthly report. For additional information, refer also to
Section 8.1.3.1.15.2, Using Time Synchronization While Starting Relative to
PC Time (p. 8-39).
CAUTION Split will not start reading if the exact specified starting time
cannot be found, unless you enable the “Start-Stop On/After
Time” feature. The interval (5 minutes, 60 minutes, and 5
seconds in the examples above) must be evenly divisible
into 60 minutes.
The Stop Condition is expressed with the same syntax as the Start Condition. If
the Stop Condition parameter is left blank, Split will execute until the end of
the file. As with the Start Condition, logical “and” and “or” statements can be
used when specifying the Stop Condition (Section 8.1.3.1.3, Start Condition (p.
8-12)), as well as stopping based on PC time.
The array or record containing the Stop Condition is not included in the output
file. If the stop value is not found, Split will display a dialog box that gives the
option to select a new file and continue processing the data. This feature is
useful when data are contained in more than one data file.
The “Start-Stop On/After Time” function can be used with a Stop Condition.
This will stop processing of the file when the exact time is found or at the first
instance of data after that time has occurred. This option is found on the
Output tab, Other button.
8.1.3.1.4.1 “C” Option: Formatting Event Tests Containing Conditional Output Arrays
The C option is used to combine data from two or more conditional arrays onto
one Split output line. A conditional array is one that is only output when a
defined event occurs.
Assume that two or more conditional Output Arrays with unique Output Array
IDs compose a test period, followed by an unconditional Output Array that
defines the end of a test. The unconditional “end of test” Output Array is at the
end of each test, but the conditional Output Arrays may or may not be present.
The data file is made up of several of these tests.
As an example, let’s look at a vehicle test application. The start of the test is
when the vehicle is turned on, and the end of the test is when the vehicle is
turned off. Each conditional output array could be triggered by a different
event during the test, such as the engine temperature exceeding a defined limit.
The unconditional array data (the stop condition) would be output to a unique
array when the engine is turned off. By processing the data with Split using the
C option, the data collected during each test could be merged on to one line,
with blanks inserted if a set of data didn’t exist (e.g., if the engine temperature
never exceeded the defined limit).
• An Input File must be set up for each array ID in the test. The first Input
File is configured on the Input File tab that appears when you open Split.
Additional Input Files are added by choosing Edit | Add Data File from
the Split menu. The same data file will be used as the Input File for each
array.
• Type in the array ID in the Copy field of the Input File tab for each array.
The array ID is the first element of a data file, so the line should read
1[123], where 123 is the actual array ID you want to process.
• In the Select field, type in the number for each element (data value) you
want to be output in the report.
• In the Stop Condition field, type in a “C,” followed by the ID of your stop
condition array. If your “end of test” array was array ID 200, the Stop
Condition field would read: C,1[200]. This should be typed into the Stop
Condition fields of each array, including the “end of test” array.
Set up the Output File as you would for any Split process. If you are including
column headings, the arrays and elements will appear in the order they are
listed on the Input File tabs. That is, the first column will be Input File number
1, element number 1; the next column is Input File number 1, element number
2… Input File number 2, element number 1 follows in the column immediately
after the last element of Input File number 1.
This table contains four different output arrays: 100, 101, 102, and 200. During
the first test, data was output from all three conditional arrays (100, 101, and
102), with 200 signaling the end of the test. During the second test, data was
output from arrays 100 and 102. During the third test, data was output from
arrays 100 and 101.
To process these files using the C option, the parameter file would be set up as
follows (assuming the name of our data file is Data_1.DAT):
NOTE The :(number) after the data file name is inserted automatically by
Split.
100 12.1 10 32.6 101 92.7 67.7 102 56.1 48.7 98 220.1
100 12.5 9.89 30.1 102 56.2 50 100.5 210.6
100 13.1 10.1 33.1 101 94.1 69
When Split is run, the resulting data file will look similar to TABLE 8-3. Each
line of data represents one test. Notice that blanks were inserted if the data set
(conditional array) did not exist.
If only hourly Output Arrays were contained in the Input File, the Copy line
could be left blank. If other Output Arrays are present which need not be
included in the Time Series processing, a logical Copy condition would be the
Output Array ID of the hourly output.
The Trigger on Stop Condition functions the same for multiple Input files as it
does for a single Input File. If the option is enabled on several Input Files, and
the Stop Conditions do not occur at the same point in each file, when a file’s
Stop Condition is met, its time series data are output and blanks are output for
data selected from the other Input Files.
Say, for example, that you were interested in the average value of the first data
point (element 2) for each test, in the data set listed in TABLE 8-2. The Input
File template would look like that shown in TABLE 8-4.
8.1.3.1.5 Copy
The Copy Condition tells Split which arrays should be used for the output data.
After the Start Condition is satisfied, and before the Stop Condition is met, the
Copy condition must be satisfied before any data will be processed according
to Select line instructions. If the Copy condition is left blank, all arrays are
processed between the Start and Stop values. Syntax for the Copy condition is
similar to the Start and Stop values mentioned above. Logical “and” and “or”
statements (see Section 8.1.3.1.3, Start Condition (p. 8-12)) can be used when
specifying the Copy condition.
For example, referring to TABLE 8–1, suppose that only those hours during day
189 when the temperature was above 90 and the soil temperature was below 62
are desired, or, during day 189, those hours when the average wind speed was
below 21 while the wind direction was between 255 and 265. The Copy
condition would be:
2[189]and4[90..150]and5[0..61.99]or2[189]and6[0..20.99]and7[255..265]
Only the Output Arrays with hours 1300 and 1500 in TABLE 8–1 conform to the
above Copy conditions.
Time Ranges
When specifying a Copy condition, a range of time values can be specified
instead of a single time. If the element being tested falls within the range, the
Copy condition is satisfied and the data is processed. A range is indicated by
entering two periods between the first and last values of the range.
Examples:
Table-based
Array-based
(This assumes 2 is the year element, 3 is the day element, and 4 is the
hour/minute element.)
8.1.3.1.6 Select
The Select line specifies which elements of an Output Array are selected for
processing and/or output to the specified Output File. The Select line becomes
operable only after the Start Condition and Copy condition are met, and before
the Stop Condition is satisfied. If the Select line is left blank, all elements in
output arrays meeting the Start Condition and Copy conditions are output to the
Output File.
Processing is accomplished through arithmetic operators, math functions,
spatial functions, and time series functions.
8.1.3.1.7 Ranges
Element numbers may be entered individually (e.g., 2,3,4,5,6,7), or, in groups
(e.g., 2..7) if sequential. Range limits (lower to upper boundary conditions)
may be placed on elements or groups of elements specified in the Select or
Copy lines. For example, 3[3.7..5],4..7[5..10] implies that element 3 is selected
only if it is between 3.7 and 5, inclusive, and elements 4,5,6, and 7 must be
between 5 and 10, inclusive.
If range limits are used in the Select condition, when Split is run, any data
which are outside of the specified range will be highlighted according to the
options chosen for the output file. TABLE 8-5 summarizes what each option
produces on the screen and in the output file if out of range data are
encountered. This type of range testing is a quick way to identify data
problems.
TABLE 8-5. Effects of Out of Range Values for Given Output Options

Report = None; no other options defined (default)
Screen Display*: bad values displayed in red and preceded by an asterisk; the text “bad data” highlighted in a red box at the bottom right of the screen
PRN File: blanks inserted for bad values
RPT File or Printer Output: N/A

Report = File or Printer; no other options defined
Screen Display*: bad values displayed in red and preceded by an asterisk; the text “bad data” highlighted in a red box at the bottom right of the screen
PRN File: blanks inserted for bad values
RPT File or Printer Output: bad values preceded by an asterisk

Report = None; replacement text (abc) in “Replace bad data with” field
Screen Display*: bad values displayed in red and preceded by an asterisk; the text “bad data” highlighted in a red box at the bottom right of the screen
PRN File: abc inserted in place of bad values
RPT File or Printer Output: N/A

Report = File or Printer; comment in “Replace bad data with” field
Screen Display*: bad values displayed in red and preceded by an asterisk; the text “bad data” highlighted in a red box at the bottom right of the screen
PRN File: comment inserted in place of bad values
RPT File or Printer Output: bad values preceded by an asterisk

Report = None; “Display only bad data” option enabled
Screen Display*: only lines with bad data are displayed; bad values displayed in red and preceded by an asterisk; the text “bad data” highlighted in a red box at the bottom right of the screen
PRN File: only lines with bad data output; blanks inserted for bad values
RPT File or Printer Output: N/A

Report = File or Printer; “Display only bad data” option enabled
Screen Display*: only lines with bad data are displayed; bad values displayed in red and preceded by an asterisk; the text “bad data” highlighted in a red box at the bottom right of the screen
PRN File: only lines with bad data output; blanks inserted for bad values
RPT File or Printer Output: only lines with bad data output; bad values preceded by an asterisk

*The Screen Display box must be checked; if not, no data will be displayed on the Split Run screen.
NOTE In this instance, out of range data refers to data outside of the
specified output range. It is not to be confused with out of range
data generated by the logger.
8.1.3.1.8 Variables
Variables can be assigned names in the Select line. For example, x = 4-5*(6*
3.0) means that x is equal to element 6, times the number 3, times element 5,
subtracted from element 4. A numeric value is distinguished from an array
element by the inclusion of a decimal point. Variables must be declared before
they can be used in the Select line. A variable name must start with an alpha
character, can include numbers, and must not exceed eight characters. Variable
names can start with the same character, but they must not start with another
complete variable name (e.g., the variable XY is not valid if there is also the
variable X). A comma must follow each variable statement, as with all
parameters in the Select line. Once the variables have been declared they can
be used later in the Select line (i.e., x=4-5*(6*3.0), y=6/3,2,3,6,7,7*x,6+y).
NOTE Variables can be defined in the first four Input File’s Select lines
only, but may be used in subsequent Input File’s Select lines.
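As an illustration only, the following Python sketch shows the arithmetic implied by a Select line such as x=4-5*(6*3.0),y=6/3,2,3,6,7,7*x,6+y, applied to the first comma separated record shown earlier in this section; the helper function e() is an assumption of the sketch, not part of Split:

# Minimal sketch of variable use in a Select line.
record = [115.0, 189.0, 1200.0, 89.6, 55.3, 25.36, 270.0]

def e(n):
    # Element n of the record, 1-based as on the Select line.
    return record[n - 1]

# 3.0 contains a decimal point, so it is the constant 3;
# a bare 3 would refer to element 3 instead.
x = e(4) - e(5) * (e(6) * 3.0)
y = e(6) / e(3)
output = [e(2), e(3), e(6), e(7), e(7) * x, e(6) + y]
print(output)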
The following array of ASCII data will be used for all Mathematical function
examples.
0105 0176 1200 –07.89 55.10 12.45 270.5
Time Series results are output in two instances:
1. each time a specified interval count is met
2. at the end of a data file (or within a range specified by Start and Stop
Conditions)
When the Trigger on Stop Condition (or F option) is used, any time series data
defined in the Select line is output each time the Stop Condition is met. Refer
to Section 8.1.3.1.4.2, Trigger on Stop Condition (F Option) Output of Time
Series (p. 8-18), for more information on the Trigger on Stop Condition.
Results which are output at the end of a file or a range of data are referred to as
Final Summaries. A typical select line that would produce a Final Summary is:
1,2,3,4,Avg(4)
This line would output values for elements 1 through 4 each time an array was
output. Additionally, an average value for element 4 would be calculated for
the entire file and output as the last line of data in the output file.
1,2,3,4,Avg(4;24)
This line would output values for elements 1 through 4 each time an array was
output, and an average value for element 4 would be calculated every 24th array
and output as an additional column in the file. An additional summary would
occur for an Interval Count if the count was not evenly divisible into the
number of output arrays present in the Input File. The summary, in this case, is
calculated from an incomplete interval count.
The date( ) function can be used for the interval in a time series function to
produce monthly output. Refer to the Monthly summary example in Section
8.1.3.1.12, Special Functions, Details, and Examples (p. 8-29).
NOTE When Date and Edate are used within other functions they must
be used with the older format Date(doy;y) and Edate(doy;y)
instead of using the extended date functions. For example
AVG(1;Date(2;2002.0)). The decimal is needed to indicate a fixed
number. Numbers without the decimal are interpreted as element
IDs.
The interval count in a Time Series Function is optional and does not require a
decimal point. To determine the interval, Split counts the number of arrays
which meet the specified conditions (Stop, Start, and Copy). If the time
synchronize function is enabled, the Time Series functions remain
synchronized to the starting time even if a complete array is missing from the
input data. When elements are missing, the Time Series calculations are based
on the actual number of elements found.
The following set of weather data from Mt. Logan in northern Utah gives a
total of seven elements each hour. This Field Formatted output, with title and
column headers, was generated by Split. These data are used in the following
examples of Time Series functions.
Avg(x;n) returns the average of element x over a full data set or every
nth value.
Examples:
Avg(3) = 59.898 (average daily temp)
Avg(3;4) = 57.36 (average 4 hour temp)
56.493 (average 4 hour temp)
60.708 (average 4 hour temp)
61.998 (average 4 hour temp)
66.148 (average 4 hour temp)
56.683 (average 4 hour temp)
NOTE Blanks and Count are functions designed for checking the
integrity of the data file. A common use for these two functions is
“100.*BLANKS(x;n)/(BLANKS(x;n)+COUNT(x;n))”, which gives
the percentage of holes (bad data) in the file.
Max(x;n) returns the maximum value of element x over a full data set
or every nth value.
Examples:
Max(5) = 17.12 (max WS for day)
Max(5;12) = 10.41 (max WS for 12 hours)
17.12 (max WS for 12 hours)
Min(x;n) returns the minimum value of element x over a full data set
or every nth value.
Examples:
Min(7) = 4.23 (min std. dev. of WS for day)
Min(3;8) = 55.33 (min temp for 8 hours)
59.79 (min temp for 8 hours)
55.22 (min temp for 8 hours)
RunTotal(x;n) returns a running total of element x for every line in the data
set. If an nth value is specified, a running total will be output
every nth value.
Example: RunTotal(5) =
5.85
14.12
21.87
29.47
39.88
48.87
:
:
:
166.76
182.38
199.50
211.36
211.36
SmplMax(x;y;n)
looks for a maximum value in element x and samples
element y when the maximum is found. If an nth value is
specified then it outputs the sample on a maximum every nth
value, otherwise it outputs the sample on a maximum at the
end of file.
Examples:
SmplMax(5;(3)) = 55.48 (on max wind speed sample
temperature)
SmplMax(5;(3,6);8) = 56.57 307.3
60.93 317.5
55.48 338.7
(on max wind speed sample temperature and wind direction
every 8 hours)
Total(x;n) returns the total of element x over a data set or every nth
value.
Examples: Total(5) = 211.36 (daily wind run)
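As an illustration only (this is not Split's implementation, and the hourly values below are made up rather than taken from the Mt. Logan data), the following Python sketch shows what Avg(x;n) and SmplMax(x;y) compute:

# Minimal sketch: block averages every n records, and a sample of
# element y taken where element x reaches its maximum.
x = [57.0, 58.5, 60.1, 61.0, 59.2, 62.4, 63.0, 61.8]          # e.g. temperature
y = [310.0, 305.0, 290.0, 300.0, 315.0, 320.0, 308.0, 295.0]  # e.g. wind direction

n = 4
block_avgs = [sum(x[i:i + n]) / n for i in range(0, len(x), n)]
print("Avg(x;%d) ->" % n, block_avgs)

print("SmplMax(x;(y)) ->", y[x.index(max(x))])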
The Mt. Logan data set is used for the Special Function examples. These
functions are helpful in converting time fields to formatted timestamps and
formatting the output. Since one of the main differences between mixed-array
data files and table based data files is the time format, these functions can be
used to convert between file types.
NOTE If you are processing the data file in multiple passes including
formatting of the date and time fields, you should put the date
processing in the final pass. Split cannot read all of the timestamp
formats that it can produce. For example, the quoted timestamp in
table based data files has a specific structure. Any changes to the
structure will make the timestamp unreadable for Split.
Crlf returns a carriage return and line feed where the Crlf is
placed in the parameter file.
Examples:
Smpl(“Max Temp”;24),Max(3;24),
Smpl(Crlf;24),Smpl(“Max RH”;24),Max(4;24)
= Max Temp 67.33
Max RH 38.8
The Crlf is placed after the maximum temperature 67.33 so
that the maximum RH is on the next line.
Make sure that the column widths are big enough for the
label to fit. Otherwise the output will indicate Bad Data.
Examples:
“Max Temp” =
Max Temp (outputs Max Temp
Max Temp 24 times)
.
.
.
Max Temp
Line numbers each line written to the report file or printer. This
differs from the Count function in that Count looks at how
many lines were read.
Examples:
Line, 4, 5 =
1 17.42 5.855
2 17.65 8.27
3 17.76 7.75
4 18.89 7.6
5 19.6 10.41
6 23.32 8.99
7 24.79 9.52
. .
. .
. .
19 24.75 7.08
20 26.03 8.76
21 27.45 11.81
22 35.46 15.62
23 38.8 17.12
24 37.13 11.86
Smpl(.PA;n) outputs the data to the printer or .RPT file with n lines per
page.
Examples:
2, 3, Smpl (.PA;12) =
100 58.56
200 57.48
. .
. .
. .
1100 61.34
1200 60.61
1300 61.01
1400 60.93
. .
. .
. .
2300 55.48
0 55.22
NOTE Split will mark the date as Bad Data if the time and date resulting
from the conversion will not fit in the specified column width. The
on-screen display and the report file will precede the date with
asterisks. In the .PRN output file, Split uses the Bad Data string.
When Date and Edate are used within other functions they must
be used with the older format Date(doy;y) and Edate(doy;y)
instead of using the extended date functions as shown in the table.
For example AVG(1;Date(2;2002.0)). The decimal is needed to
indicate a fixed number. Numbers without the decimal are
interpreted as element IDs.
Assume that in a mixed array data file, element 2 is Year, element 3 is Day of
Year, element 4 is Hour/Minute, and element 5 is Seconds.
If a time element is missing from a mixed array data file, use a valid constant
instead.
If processing a table-based data file, use a 1 for all time elements (assuming the
time stamp is the first element in the data file).
NOTE When processing a data file from a mixed array datalogger, if the
time stamp uses midnight as 2400 with “today’s” date, the date
function will convert that time stamp to 0000 hours with
“tomorrow’s” date. The “No Date Advance” function can be used
to stop the date from rolling forward (Other button, No Date
Advance check box).
The Date function can be used to produce a monthly summary of daily time
series data by using Date( ) for the interval in the time series function. This will
trigger time series output for the first day of each month. The syntax is
avg(7;date(3;2)), where you want to take a monthly average of element 7, and
the day of year is contained in element 3 and the year in element 2. If you have
data recorded on a once per minute or once per hour basis, it must first be
processed into a 24 hour summary for this function to produce the output
expected.
NOTE When Date and Edate are used within other functions they must
be used with the older format Date(doy;y) and Edate(doy;y)
instead of using the extended date functions. For example
AVG(1;Date(3;2)). When used with table based data files the
format would be AVG(1;Date(1;1)).
When producing a monthly summary and outputting the month along with the
data, you might want to set up the value for the month as “month –1”, to
correctly reflect the month that the data actually represents.
The following screen shows the output file setup including the column
headings and the units.
This .PAR file produces a wind chill summary of the Mt. Logan Peak data set.
The formula for calculating wind chill is given as follows:
Te = 33 - (h/22.066)
where
Te = wind chill equivalent temperature, degrees C
h = ((100V)^0.5 + 10.45 - V)(33 - T)
where
h = wind chill index, kcal m^-2 hr^-1
V = wind speed in meters/second
T = temperature in degrees C
Note that at wind speeds between 0 and 4 mph (0 to 1.8 m/s), the wind chill
should be ignored because this formula results in wind chill temperatures that
are greater than the ambient temperature. The National Weather Service
includes wind chill in reports only when temperatures drop below 35°F (1.7°
C).1 The formula is for example purposes and is not endorsed by Campbell
Scientific as a standard.
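As a quick cross-check of the formula above only (not part of Split; the function name and the sample inputs are assumptions), the calculation can be written in Python as:

# Minimal sketch of the wind chill formula given above.
def wind_chill_c(temp_c, wind_ms):
    # h = wind chill index; the return value is Te in degrees C
    h = ((100.0 * wind_ms) ** 0.5 + 10.45 - wind_ms) * (33.0 - temp_c)
    return 33.0 - h / 22.066

print(round(wind_chill_c(-5.0, 10.0), 1))   # about -22.2 deg C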
When this .PAR file is executed, the following output is displayed on the
screen.
Reference
1. “Wind Chill Errors”, Edwin Kessler, Bulletin of the American Meteorological
Society, Vol. 74, No. 9, September 1993, pp. 1743–1744.
For mixed-array data files, the time elements used for synchronization are identified with the syntax ei[day]:ei[hrmn]:ei[seconds].
Referring to TABLE 8–1, to identify the day of year for a mixed-array data
file, type:
2[189]::
:3[1200]:
::4[5]
A single colon is assumed to be between day and hrmn (e.g., 2[189]: means
day, :3[1200] means hours, and 2[189]:3[1200] means day and hour-minute).
When the time synchronize function is used, a time interval must be specified
in the Copy line of the first data file. For example, 4[60] in the Copy line will
create a synchronized file containing the data from the input files that occurred
every 60 minutes. If no time interval is specified in the Copy line then the time
specified in the Start Condition becomes simply a starting time with no time
synchronization.
Typically, the starting time specified must actually be found in the input file
before the Start Condition is satisfied (e.g., if the input file starts at 1100 hrs
and 1000 hrs is entered for the starting time, with no day specified, Split will
skip over arrays until it reaches 1000 hrs the next day). However, the Start-
Stop On/After Time function can be enabled (Output tab, Other button) to
trigger the start of processing when the exact time is found or at the first
instance of data after that time has occurred.
Table-based dataloggers
Because the time stamp for a table-based datalogger is all one string, and
therefore read by Split as one element, the syntax is somewhat different. All
elements in the time stamp are specified by a 1 (if the time stamp is the first
item in each row of data).
The 1s in the string identify the position of the time stamp in the line of data.
Each colon represents a portion of the time stamp. The format is
1[year]:1[day]:1[hour/minute]:1[seconds]. The colons in the time stamp must
be present or the function will not work correctly.
NOTE Time synchronization can only be done for data from a single year.
It will not work over a year boundary.
Time elements can be identified without specifying a starting time (e.g., 2:3). If
you are working with only one file, Split will begin processing that file at the
first record in the file. If any gap in the data is found, blank data (or the
“Replace Bad Data With” text) and a carriage return line feed will be inserted
for each line of missing data. Note that Split will also detect a gap in data if, for
instance, you specify a start time of 2[92]:3 (start at Julian day 92) and your
hour/minute for day 92 starts at 9:30 a.m. The time between the start of the day
(0000) and 9:30 a.m. will be considered missing data. Blanks (or the “Replace
bad data with” text) and a carriage return line feed will be inserted at the
beginning of the PRN file for each “missed” output interval.
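As an illustration of the gap-filling behavior described above only (not Split's implementation; the interval and data are made up), a sketch of the idea in Python:

# Walk a fixed output interval and emit a blank line (or the
# "Replace bad data with" text) for each missed interval.
interval_min = 60
have = {0, 60, 180}        # minutes past midnight that have data
replacement = ""           # blank, or the replacement text

for t in range(0, 240, interval_min):
    if t in have:
        print("%04d data..." % (t // 60 * 100 + t % 60))
    else:
        print(replacement)  # inserted for the missed interval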
If you are working with two or more files, once Split starts processing the files
(based on the time of the first record of the first file), if no data exists for the
other file(s), blank data will be inserted.
If multiple input files are given specific starting times, Split starts the output at
the earliest specified starting time. In a PRN file, Blanks or the comment
entered in the “Replace bad data with” field are inserted for values from other
input files until their starting times are reached. In a RPT file only blanks are
used.
NOTE When using time synchronization with a mixed array data file,
with a midnight time stamp of 2400, you will need to select the
Other button, Midnight at 2400 hours check box.
The interval can be given tolerance limits by following the interval with a
comma and the tolerance. For example, if 3 is the hrmn element, and the time
interval is 60 minutes +/–2 minutes, the syntax is 3[60,2].
Table based data files need to use the same time format as described in Section
8.1.3.1.3, Start Condition (p. 8-12). You can specify the interval for time
synchronization on table files as ::1[60]: which will give you an output interval
of 60 minutes.
If the time synchronize function is enabled and data are missing at one or more
of the time intervals specified, then a blank (or the comment entered in the
“Replace bad data with” field) is output to the Output File. See TABLE 8-5.
Start Condition
2[–1]:3[50]:
Copy Condition
1[106]and3[60,10]
Where:
element 1 is the array ID
element 2 is the Julian day
element 3 is the hour/minute
The Start Condition directs Split to begin processing data when the time is one
day prior to the current PC time and when the hour/minute value is equal to 50.
The 1[106] in the Copy Condition specifies the array from which the data
should be copied. The 3[60,10] indicates that the interval for the time stamp is
60 minutes and designates a 10 minute time window on each side of the top of
the hour in which Split should look for the hour/minute data (10 minutes before
the hour, 10 minutes after the hour).
The second file’s Copy Condition should include only the array from which to
copy the data. No interval is necessary.
Split will assign this file an extension of .PRN if an extension is not specified by
the user. Whenever an Output file name is entered, regardless of extension, an
Output file is created only when the RUN | GO menu option is selected.
If the file name you have selected already exists, you can use the “If File Exists
Then” drop-down list box to determine what action Split will take. By default,
each time a PAR file is run the existing output files (PRN, RPT, and HTM) are
overwritten (Overwrite option). When Append is selected, the PRN file will
not be overwritten — the new data will be added to the end of the existing file.
However, the RPT and HTM files will be overwritten. If Create New is
selected, Split will create all new files using the original file name and
appending an _0, _1, and so on to each subsequent run.
In Append mode, if an HTM or RPT file is needed with all the data, you will
need to run the PRN created by Split through the program a second time. If the
Output File name is left blank, Split does not write data to an Output File on
disk; rather, it will display the processed values on the screen if the Screen
Display box is checked. If Screen Display is not enabled, no data will be
displayed on the Split RUN screen.
CAUTION The Output file name cannot be the same as the Input file
name. Split will display an error message if this condition
occurs.
Several output options may be specified to alter the default output to the file.
Some are located on the main OUTPUT FILE screen and some are made
available by pressing the Other button.
The Custom file format uses the regional settings in the Windows operating
system to determine the decimal symbol and the separator used with data values.
In the Regional Settings for Numbers, the decimal symbol uses the character
specified in the Decimal Symbol field; the separator uses the character specified in
the List Separator field. These settings are typically found in Control Panel |
Regional Settings (or Options), Numbers tab. This allows users who are used to
the comma “,” as the decimal and the period “.” as a data separator to see the
output data in that format.
Screen Display
The Screen Display field controls writing the processed data to the screen. To
write to the screen, check the box. For faster execution, clear the box to omit
writing to screen. The data will then be written to the file only.
Report
A report, with page and column headings, can be sent to a file or printer. There
are three report options: File, Printer, HTML. One or more can be selected. A
report sent to a file has the extension of .RPT. If the report is sent to a printer,
the printer must be on-line. In all cases a .PRN output file is created. A basic
HTML file can be created containing the formatted report data. The HTML file
can be used as a display of the formatted data output in a web browser.
NOTE To remove page breaks in the HTML file, enable the “No FF”
option.
Other
The Other button provides access to the dialog box shown below.
Replace bad data with – The text in the field, to the right of this option,
is entered into the .PRN output file data set if data are blank, bad, or out of
range. See TABLE 8-9 for definition of blank or bad data. Whatever text
string the user enters in the field will be entered if a blank or question
mark is in the data or if data are out of range. This option is useful when
the Output file is imported into a spreadsheet program, such as Excel.
Only display lines with bad data – Outputs only those arrays containing
one or more Out of Range elements. If a report is generated, an asterisk
precedes the Out of Range value in the .RPT file.
Start-Stop On/After Time – In most instances, Split will not start or stop
processing a file unless the exact start condition is found. However, when
starting or stopping based on time, you can enable Split’s Start-Stop
On/After Time option. This will trigger the start (or end) of processing
when the exact time is found or at the first instance of data after that time
has occurred (which meets other defined criteria in the PAR file).
Time Sync to First Record – This option is used with the time-sync
function. It allows you to set specific times in the Start Condition, but
have synchronization start at the first record in the file that meets the Start
Condition. This may avoid an output file that starts with blank lines.
For example, you have table-based data file(s) containing 15 minute data.
Your first data file starts on Sept 9th at 12:15 p.m. You want to time sync
the files and output only the data that occurs at midnight.
You need to specify ‘0’ for the hour/minute field in the Start Condition or
the output will contain the data that occurs each day at 12:15. Therefore,
you would use:
Because you have specified a time in the Start Condition, but not the day,
Split assumes the first day of the year. Therefore, by default, you will
have blank lines in your output file for each day from Jan 1st to Sept 9th.
Using the Time Sync to First Record option will avoid these blank lines.
Match files – This option compares two files of the same data. If good
data exists in one and not the other (question marks), then Split will fill
the OUTPUT file with the good data. This is used to get a more complete
record from an error ridden file (e.g., one recorded at freezing
temperatures by reading a tape twice and running both files through Split).
CAUTION For the Match files option to produce a correct Output File,
the differences between the two Input Files can only be
question marks. Both files must have the same Start
Condition or the beginning of both files must be the same.
Transpose file – Transposes the rows and columns of the input file. Only
one Input File can be transposed at a time and no Select options can be
specified. A maximum of 26 arrays are transposed per pass of Split.
No FF – Suppresses form feeds and page breaks in RPT and HTML files.
When this option is selected, a header appears on the first page only. This
option is used for printing reports on continuous feed paper or for
displaying HTM files in a browser.
Break arrays – This option breaks up the Output Array into new arrays
that are #+1 elements in each new array. Split automatically assigns an
array ID number equal to the first element in the first array. Only one
Input File may be specified. Start, Stop, and Copy Conditions may be
specified, but the Select line must be left blank.
NOTE The Break Arrays function works only for mixed array data. It is
typically used when processing data from burst measurements.
summary of the left over values and the Time Series Heading from the
report.
At Julian Day 151 (May 31) 2400 hours, the date function produces an
output of June 1 00:00 hours. The date can be stopped from rolling
forward by using the No Date Advance check box. The output will then
be similar to:
Caution should be used when applying the date function and enabling or
disabling No Date Advance, since it is possible to produce an incorrect
date. For instance, using the above example if you were to enter the
following into your select line:
3,edate(“hh:mm”;4;3;2)
with the No Date Advance enabled, you would get the output:
edate(“mm/dd/yy”;4;3;2),4,6,7
with the No Date Advance disabled, you would get the output:
No Dashes – When the No Dashes check box is selected, the dashed line that
typically appears under the column headings will not be displayed. This option
affects all output types (PRN, RPT, HTM, and printed page).
Heading and Column Headings from being printed at the bottom of the report.
The “left over” summary data will still be printed.
When Time Series functions are used in the Select field without an interval,
they appear as a final summary at the end of the report. They can be labeled by
entering a title into the Time Series Heading field at the bottom of the Output
File page. Time Series interval summaries cannot be assigned individual titles
directly, but you can use special functions such as “Label” and “Crlf” to create
column headings and special formatting.
“PCDATE” within the Report Heading inserts the computer’s current date
(Month-Day-Year). For the European format (Day-Month-Year), enter
“PCEDATE”.
Column headings associated with Time Series outputs are repeated for Final
Summaries if a title for the Final Summary is requested on the headings for
report line.
The number of digits to report to the right of the decimal point is entered in the
Decimal field and can be set independently for each column. The value output
will be rounded to the specified number of digits. Leave this field blank if you
do not want to round the data to a specific number of digits.
Column headings can be entered using Split’s Data Labels Function (Labels |
Use Data Labels).
SPLITR LOGAN/R
The /R switch should follow immediately after the parameter file name with no
space between the two. If a space is used, the following message will be
displayed: “There was a problem opening the input file. File could not be found
or may be in use.”
SPLITR /H LOGAN
The /H switch must be positioned after SPLITR but before the parameter file
name, and a space is required between the executable name and the switch.
NOTE When using the /M switch in a batch file, the behavior may depend
on your Windows version. In some cases, the files will be
processed simultaneously, while in other cases, the files will be
processed sequentially. It may be possible to change this behavior
using the Windows "start" command.
Batch files process each command in succession, without waiting for execution
of a command to be completed before proceeding to the next, unless they are
configured to do so. If multiple parameter files are being processed using Splitr
in a batch file, there are no conflicts, because only one copy of Splitr can be
active at any one time (unless the /M switch is used). However, if other
commands are used along with Splitr (such as opening the file in a spreadsheet,
copying it to an archive directory, or appending it to an existing file), these
commands might be executed before Splitr finishes processing data.
The Windows Start /w (wait) command can be added to a batch file command
line to delay execution of the next command until the first command has
finished. The Start command has different arguments depending upon the
operating system you are using. Refer to your computer’s on-line help for
information on this command.
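For example, a batch file along the following lines (the paths, parameter file, and output file names are illustrative only) uses Start /w so that the copy command does not run until Splitr has finished processing:

start /w "" "C:\Program Files\Campbellsci\LoggerNet\splitr.exe" daily.par/r
copy "C:\Campbellsci\SplitW\daily.prn" "C:\Archive\daily.prn"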
Command line switches can be used to control these options for the output and
input files. The switch is added immediately after the input or output file name.
NOTE In most instances, full path names to the Splitr executable and the
input and output file names must be used. In addition, if long file
names are used in the path, you may need to surround the path and
file name by double quotes.
These switches are entered after the output file name; e.g., Splitr Test.par/r
Input.dat Output.prn/P
/P Sends the output to a printer. This is the same as checking the Printer
box for the Report type on the Output File tab.
/R Creates a formatted RPT file. This is the same as checking the File
box for the Report type on the Output File tab.
/W Creates a simple HTML file. This is the same as checking the HTML
box for the Report type on the Output File tab.
/A Appends the output to the end of an existing file. This is the same as
selecting Append for the If File Exists option on the Output File tab.
/L Creates a new output file with a different name if a file exists. This is
the same as selecting Create New for the If File Exists option on the
Output File tab.
/O Turns the screen display off when Split is processing the PAR file.
This is the same as clearing the Screen Display check box on the
Output File tab.
/6..9 Sets the default width for all the columns in the report. This is the
same as entering a value in the Default Column Width field on the
Output File tab.
/[text] Sets the text that will be used in the place of bad data. This is the
same as the text string used in the Replace Bad Data field that is
found under the Other button of the Output File tab.
/M Compares two input files and creates an output file with a complete
data set composed of both files. This is the same as the Match Files
option that is found under the Other button of the Output File tab.
The two input file names are separated with a comma but no spaces.
Example: Splitr Test.par/r Input1.dat,Input2.dat Output.prn/M
/S Writes the output file without a form feed command after each page.
This is the same as the No FF (form feed) option that is found under
the Other button of the Output File tab.
/G Outputs only the data marked as “bad” to the file. This is the same as
the Only Display Lines with Bad Data check box that is found
under the Other button of the Output File tab.
/D Enables the No Date Advance function, which keeps the date for
midnight from rolling to the next day. This is the same as choosing
the No Date Advance check box that is found under the Other
button of the Output File tab.
/H Removes the dashed lines from the heading of the RPT file. This is
the same as choosing the No Dashes check box that is found under
the Other button of the Output File tab.
/U Removes the record number from TOB files that are processed with
Split. This is the same as choosing the No Record Numbers from
TOB Files check box that is found under the Other button of the
Output File tab.
/E Begins processing the file, or stops processing the file, on or after the
Start or Stop condition when starting or stopping based on time (the
default is to start only if the exact start condition is found). This is the
same as choosing the Start-Stop On/After Time option that is found
under the Other button of the Output File tab.
/Bnnn Breaks a long array into multiple lines, where nnn is the number of
values to place on each line. This is the same as choosing the Break
Arrays check box that is found under the Other button of the
Output File tab.
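Several of these switches can be appended to the output file name one after another, as in the longer example at the end of this section. A hypothetical combination (the file names are placeholders) that creates an RPT file, appends to the output file if it already exists, and suppresses the screen display would be:

Splitr Test.par/r Input.dat Output.prn/R/A/O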
These switches are entered after the input file name; e.g., Splitr Test.par/r
Input.dat/L Output.prn
/nnn Begins processing nnn bytes into the file. If /nnn..mmm is used, then
processing begins at nnn bytes into the file and stops at mmm bytes
into the file. This is the same as setting a specific Start and Stop
offset, which is found under the Offsets/Options button of the Input
File tab.
/L Begins processing the file at the byte value where processing last
stopped. If /L..mmm is used, then processing begins where it left off
and stops at mmm bytes into the file. This is the same as enabling
Last Count, which is found under the Offsets/Options button of the
Input File tab.
/Bnnn Specifies the file type as Burst data. nnn indicates the size of the
arrays. This is the same as selecting Burst Format for the File Info
field on the Input File tab.
/F Specifies the file type as Final Storage (binary) data. This is the same
as selecting Final Storage Format for the File Info field on the Input
File tab.
/M Changes the value for midnight to 2400 instead of 0000. This is the
same as selecting Midnight is 2400 Hours check box found under
the Offsets/Options button of the Input File tab.
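For instance, to process only the portion of the input file between two byte offsets (the offsets and file names here are placeholders):

Splitr Test.par/r Input.dat/1000..5000 Output.prn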
"c:\Program Files\campbellsci\LoggerNet\splitr.exe" c:\Campbellsci\SplitW\switch-test.par input1a.dat Output.prn/E/H/W 4[1200]: , , 1..6
where
PAR file: switch-test.par
Input file: input1a.dat
Output file: output.prn
Other outputs: Output.HTML
Start condition: on or after 1200
Stop condition: end of file
Copy condition: none
Elements: 1 through 6
8.2 CardConvert
CardConvert is a utility that is used to quickly read and convert binary
datalogger data that is retrieved from a compact flash, microSD, or PCMCIA
card. The converted data is saved on the user’s PC.
Press the Select Card Drive button to bring up a dialog box that helps you
browse for the drive assigned to the card reader. Note that you can also select a
directory on your hard drive into which binary data files have been copied. When
a card drive or directory is selected, any binary files found with a *.dat
extension will be displayed in the Source Filename column in CardConvert.
By default, the converted data files will be saved to the same drive or directory
as the source files. To change the destination, press the Change Output Dir
button. Once again you will be provided with a dialog box that helps you to
browse for the desired drive or directory. When the drive or directory is
selected, the path and the filename that will be used for the converted files will
show up in the Destination Filename column.
The default filename for a converted file is comprised of the table name in the
datalogger program, along with a prefix that reflects the file format, and a *.dat
extension. For instance, the default name for a table called MyData stored in
TOA5 format would be TOA5_MyData.dat.
You do not have to convert all files that are found in the selected directory.
Select one or more files for conversion by selecting or clearing the check box
beside the individual file name. If a box is checked the file will be converted; if
a box is cleared the file will not be converted. To quickly select or clear all
check boxes, choose Options | Check All or Clear Check All from the
CardConvert menu.
The list of files displayed for a particular drive or directory can be updated by
selecting Options | Rebuild File Lists from the menu. Any new files that have
been stored since you last selected the drive (or since the last rebuild), will be
added to the list.
Tip: Right-click within the file list to display a shortcut menu containing the
items on the Options menu.
If an array ID is desired, select the Include Array ID check box and enter
a value into the field. The value can range from 1 to 1023. The array ID
will be the first value in the array of data.
The Max and Min Timestamp Options setting is used to determine the type of
timestamp that will be used for Maximum and Minimum outputs that
include a timestamp along with the value. You can choose to output No
Timestamp, a timestamp that includes Hours/Minutes/Seconds (produces
two values, hhmm and seconds), a timestamp that includes Hours/Minutes
only, or a timestamp that includes Seconds only.
The file format is reflected in the default filename by the prefix of TOA5,
TOB1, CSV, or CSIXML added to the table name.
option, the first file stored uses a _1 at the end of the root file name and the
number is incremented by one with each new file saved.
Use Filemarks and Use Removemarks can be selected at the same time, to
create a new file from the data table any time either of the marks is
encountered.
Use Time – This option is used to store the converted data into files based on
the timestamp of the data. When the Use Time check box is selected, the Time
Settings button becomes available. This button opens a dialog box that is used
to set a Start Date and Time, along with an Interval, which are used to
determine the time frame for the data that goes into each file. Note that the
Start Date and Time are not used to specify the actual time and date to begin
processing the file; rather, they are used as a reference for the file interval.
Processing always starts at the beginning of the file.
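For example, with a Start Date and Time of midnight and an Interval of 24 hours (values chosen only for illustration), the converted data would be broken into one file per day, with the file boundaries falling at midnight regardless of when the first record in the source file was written.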
Convert Only New Data – When this option is selected, only data that has
been collected since CardConvert’s last conversion of the specified file(s) will
be converted. The first time CardConvert is used on a file, all data will be
converted. On subsequent conversions, only new data will be converted.
However, if CardConvert cannot tell what data is new (i.e. if data on the card
has wrapped since the last conversion), all data will be converted. This option
can be used with Append to Last File to create a continuous file with no
repetition of data.
Create New Filenames – When the Create New Filenames option is selected,
CardConvert will add a _01 to the filename, if a file of the same name is found
(e.g., TOA5_Mydata_01.dat). If a *_01.dat file is found, the file will be named
with a _02 suffix. If the Create New Filenames check box is cleared and a file
with the same name is found, you will be offered the option to Overwrite the
existing file or Cancel the conversion.
The Create New Filenames option is disabled when the Use Filemarks, Use
Removemarks, or Use Time option is enabled.
Append to Last File – When this option is selected, converted data will be
appended to the end of the destination file. If the destination file does not exist
when a conversion is done, a new file will be created. On subsequent
conversions, converted data will be appended to the end of that file. If the
header of the new data does not match that of the data in the destination file, an
error will be generated. This option is most useful with the Convert Only New
Data option to create a continuous file with no repetition of data.
Store Record Number – By default, the record number for each row of data is
stored in the data file. This record number can be omitted from the converted
file by clearing the Store Record Number check box.
Store Time Stamp – The time stamp can be omitted from the file by clearing
the Store Time Stamp check box.
If a conversion is in progress and you wish to stop it, press the Cancel
Conversion button.
In some instances, data on a card can become corrupted. Corruption can occur
if the card is subjected to electrostatic discharge or if it is removed when data is
being written to the card (e.g., the card is removed from the CFM100 without
pressing the Card Control button to stop data storage to the card). This
corruption can be at the beginning of the data file or anywhere within the
stored data. Using the standard conversion option, CardConvert will stop if it
encounters a corrupted frame of data because it assumes it has come to the end
of the data file. If corrupted frames of data are found at the beginning of the
file, CardConvert will display a message indicating that no data could be found
on the card. If corrupted frames of data are found within the data file, you may
get some, but not all, of the data that you expect in the converted file.
CardConvert offers a repair option, which will attempt to scan the card for
good frames of data and output that data to a new binary file (the original file is
unchanged). To start the repair of a file, highlight the suspected corrupt file in
the list of Source Filenames and right-click to display a floating menu. Select
the Repair File option from the list. The repair process will create a new
TOB3 file (the default name is Repair_existingfilename), which can then be
converted to an ASCII file using the standard CardConvert process.
When CardConvert comes to what it believes is the end of the data file during
the repair process (the end of valid frames), it will stop and display a message.
The message prompts the user either to continue searching the file for more
good data frames or to stop the repair process. CardConvert displays the last
time stamp for data in the repaired file. If you think there should be additional
data on the card, you can continue to run the repair process. If it appears that all
the data has been stored to the new file, you can stop. The option to continue
processing the file allows you to recover all good data on a card with more than
one corrupted frame.
Note that CardConvert can repair only TOB2 or TOB3 files. TOB1 files cannot
be repaired.
When running CardConvert from a command line, you can designate the CCF
file using the command line option runfile. For example,
"C:\Program Files\Campbellsci\CardConvert\CardConvert.exe" runfile="C:\Campbellsci\CardConvert\myfile.ccf"
The above command line will run CardConvert using the settings contained in
myfile.ccf.
NOTE The path to the CCF file should be specified. It will not default to
the CardConvert working directory.
Section 9. Automating Tasks with Task Master
The Task Master is an application that is used to set up a Task that can be run on a defined
schedule or based upon a data collection event from a datalogger. A Task can be data
collection from another datalogger, FTP of a collected file, or anything that can be
executed in a computing environment, such as a command line operation, a program
executable, a batch file, or a script.
The Task Master is often used to post-process data files using Split after data has been
collected from a station. It can also be used to launch third party software utilities such as
a command line FTP client to send data to the Internet or a phone dialer to call a pager or
phone upon an alarm condition.
Also note that when running LoggerNet as a service, tasks being run by the Task Master
cannot interact with the desktop. Therefore, any tasks set up in the Task Master should not
require any user interaction.
The Task Master can be opened from the main category on the LoggerNet
Toolbar.
When the Task Master is opened, all of the dataloggers in the network, and any
tasks that may have already been defined, are displayed on the left side of the
window.
There are two types of Tasks that can be created: a Scheduled task and an Add
After task. If a task is shown attached to a datalogger as Task_1 in the example
below, the task execution will be based on a datalogger collection event.
Task_2 is not linked with a datalogger and will run as a scheduled event.
You can create complex combinations of tasks by linking tasks to other tasks
or multiple tasks to one datalogger. A task linked to another task has no start
options but will execute the specified action following the completion of the
parent task. Multiple tasks linked to a datalogger will execute based on the
conditions specified for the start of the task. This allows one task to be run after
successful data collection and another if data collection fails.
To add a scheduled task, click the Add Scheduled button or select Add
Scheduled from the Edit menu. A new task will be added to the list of tasks
below the list of dataloggers. You can then set up the conditions for the task
with the options on the right side of the window.
To delete a task click to highlight it and click Delete or select Delete from the
Edit menu. The selected task will be deleted. If there are any tasks linked to the
task, they will move up and take the place of the deleted task.
Tasks can be renamed by selecting the task and then clicking again on the task
name. The name will turn into a text edit box and you can create your own task
name.
There is also a right click menu that will allow the same Add Scheduled, Add
After, Rename, and Delete functions as described above.
• After Any Call – After a scheduled collection, after using a Collect Now
button, after a call-back, or after another task that calls the associated
station. The task is triggered regardless of whether or not the data
collection was successful.
• After Some, Not All, Data – After some, but not all data is collected
during a scheduled collection, when using a Collect Now button, or when
another task calls the associated station.
• After Failed Retry – Whenever a retry fails. This can be the failure of a
primary retry, a secondary retry, or a retry using a Collect Now button.
(Scheduled collection failures and “Collect Now” failures both increment
the same retry counter.)
• After One Way Data – After data is received from a datalogger executing
a SendData instruction or when data is collected via Data Advise.
• After File Closed – After any data file being written to is closed. The task
is triggered by a scheduled collection or by using a Collect Now button. A
datalogger call-back, One Way Data, Data Advise, or another task that
calls the associated station, which causes a file to be written to and closed,
will also trigger the task.
The option %f can be used in the command line options to represent the
just closed data file. This condition is especially useful when performing
post-processing on a data file that has been created using Create Unique
File Name as the File Output Option. In this case the user does not know
the file name ahead of time. Therefore, the %f option can be used to insert
the file name in the command line options.
• After Any Data Collected – After data is collected by any means, and the
call is terminated for any reason (success or failure). The task is triggered
by a scheduled collection, by using a Collect Now button, a datalogger
call-back, or another task that calls the associated station which causes any
data to be collected.
When the station’s File Retrieval Mode (on the Setup Screen’s File
Retrieval tab) is set to Follow Scheduled Data Collection, the task is
triggered by a scheduled collection, by using a Collect Now button, by a
datalogger call-back, or by another task that calls the associated station
and causes a file retrieval. An attempt to retrieve the file(s) will be made at
the scheduled time, only if scheduled collection is enabled. However,
when a manual poll/Collect Now is performed, an attempt to retrieve the
file(s) will be made regardless of whether scheduled collection is enabled
or not.
When the station’s File Retrieval Mode is set to New Schedule, only the
new schedule will trigger file retrieval and thus, the task. Attempts to
retrieve the file(s) will be made following the new schedule, whether
scheduled collection is enabled or not.
9.1.1.3.2 Calendar
Set the Hours of the Day, Minutes of the Hour, Days of Month, Days of Week,
and Months on which the task should be executed. The task will run when ALL
of the specified settings are met. If a setting is left blank, it will always apply.
For example:
To execute a task on the first day of every month at 8:00 a.m., set the
Hours of the Day to 8, the Minutes of the Hour to 00, the Days of the
Month to 1, and leave the other settings blank.
To execute a task every Tuesday at 6:15 a.m., set the Hours of the Day to
6, the Minutes of the Hour to 15, the Days of the Week to 3-Tuesday,
and leave the other settings blank.
To execute a task on the first Monday of every month at midnight, set the
Hours of the Day to 00, the Minutes of the Hour to 00, the Days of the
Month to 1, 2, 3, 4, 5, 6, 7, the Days of the Week to 2-Monday, and leave
the other settings blank.
To execute a task on the fifth day of every quarter at midnight, set the
Hours of the Day to 00, the Minutes of the Hour to 00, the Days of the
Month to 5, the Months to 1-January, 4-April, 7-July, 10-October, and
leave the other settings blank.
After specifying the desired schedule, press the View Schedule button to bring
up a calendar that shows the current defined schedule. Verify this is the desired
schedule. (You can zoom in on an area of the schedule by dragging your mouse
from top-left to bottom-right.)
From this tab, select a sub-tab to set up the action(s) that should be performed
by the task. If multiple check boxes (Execute File, Call Station, FTP Settings)
are selected, the actions will be launched at the same time.
Execute File – Select this check box to execute a file or command when a task
event is triggered. Use the Browse button to the right of the File Name field to
select the file, or type in the name and path directly. If command line options
should be passed to the executable, enter those into the Command Line
Options field. A Start In Directory for the executable can be typed in directly,
or you can browse for it. Select the Run Minimized check box to have the file
executed in a minimized state. When minimized, it will appear as a Windows
taskbar item but will not open on your desktop.
The File Name and Start In fields can contain these predefined symbols that
will be expanded by the LoggerNet server:
NOTES Always enter the full path when specifying the file to execute.
Otherwise, the file may not be found or may not run as expected.
Call Station – Select this check box to trigger a call to a station when a task
event is triggered. Data will be collected according to settings in the Setup
Screen for the datalogger. Use the drop-down list box to select the station that
will be called.
FTP Settings – This check box is only available when configuring an Add
After task with the Station Event Type set to After File Closed. Select this
check box to transfer the file to a designated FTP directory with the following
settings:
Host Address
The FTP server to which the file will be sent.
User ID
The username on the FTP server.
Password
The user’s password on the FTP server.
Remote Folder
Selects the folder on the FTP server to which the file will be transferred.
The file will be saved to this folder under the FTP server’s FTP root
directory. Press the button to browse to the desired directory.
FTP Protocol
Use the check box to select the FTP Protocol to use. The options are FTP
(File Transfer Protocol), SFTP (SSH FTP), and FTPS (FTP over TLS).
The FTP Queue Size determines how many files the Task Master will
keep in the queue to attempt to FTP again.
PASV is limited to IPv4, while EPSV works with any network protocol.
However, since EPSV is an extension of PASV, it is not necessarily
supported by all FTP servers. Therefore, the EPSV check box must be checked if
the FTP server is using IPv6; otherwise, it should be checked only if you
are certain the FTP server supports EPSV.
Any data file associated with the designated station will be transferred,
whenever that file is closed. (Therefore, a table’s File Output Option on the
Setup Screen’s Data Files tab must be set to anything but “No Output File” in
order for the table’s collected data to be transferred.) If more than one file is
closed (i.e., multiple tables are collected and written), all of the files are
transferred. If a failure occurs, the failure information will be written to the log
file described below.
NOTE When the Task Master’s Pause All Tasks option is selected, no
tasks will be triggered. Therefore, any files that would have been
FTPed if tasks were not paused will not be added to the FTP
Queue.
The Task Master keeps a log of the FTP transactions that are performed. The
current log file is found in <working directory>\sys\bin\ftplog.txt. Once a log
file reaches about 1 Meg in size, it is baled and the name changed based on the
time it was baled. The format of the new file name is
“ftplog_YYMMDDhhmm.txt”. The Task Master will maintain up to five log
files. At that point, the oldest one will be deleted each time a new one is
written.
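For example, a log file baled at 2:30 p.m. on April 5, 2021 would be renamed to something like ftplog_2104051430.txt.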
Example #1:
The following configuration will run Splitr.exe and process the parameter file
named mendon monthly.par. If your parameter file name includes spaces (as
with the example shown below), you will need to put quotes around the entire
string or an error will be returned.
The Start In directory indicates the directory in which the Split parameter file is
found.
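A sketch of how the fields might be filled in for this example (the directories shown here are assumptions; note the quotes around the parameter file name because it contains a space):

File Name: C:\Program Files\Campbellsci\LoggerNet\splitr.exe
Command Line Options: "mendon monthly.par"
Start In: C:\Campbellsci\SplitW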
Example #2:
If LoggerNet security is enabled, the command line options must also include
the username and password as shown below:
–user=“username” –password=“password”
If you have used a command line argument to change LoggerNet’s default port
number, the command line options must also include the server address and
port number as shown below:
NOTE The files contained in the backup will be based on a saved backup
configuration file. To save a backup configuration, choose
Network | Manual Backup from the Setup Screen’s menu.
Proceed through the Backup wizard. At the last step, choose Save
Configuration. The configuration will be saved to
C:\CampbellSci\LoggerNet\Backup.Configuration.
Example #3:
Example #4:
The following configuration will set up a task to perform a clock check on the
datalogger that is named CR1000_IP in the network map.
Task Name – The name that was given to the task when it was set up.
Action – Indicates whether the task will Call Station, Execute File, FTP File,
or perform multiple actions.
Event Type – This column indicates what type of event will trigger the task. It
is only applicable to Add After tasks. The event types are listed above (Station
Event Types).
Event Trigger – For a Scheduled Interval task, the schedule for the task will
be listed in the format DD HH:MM:SS followed by the word “Interval”. For a
Scheduled Calendar task, the word “Calendar” will be displayed. For an Add
After task, the device which the task is dependent upon will be listed.
Last Time Run – The last time that the task was run by LoggerNet.
Next Time to Run – The next time that the task is scheduled to be run. If the
task is not a scheduled task (interval or calendar), this field is not applicable
and will be left blank. If Pause Tasks is selected, this field will read Paused.
Pending Actions – Specifies the number of actions that are currently pending
for this task. This value will be zero if there are no actions currently pending.
An increasing number may indicate that you are attempting to run the task
faster than is possible.
Last File Run Started – The last time the attempt to execute the file was
started.
Last File Run Finished – The last time the attempt to execute the file was
finished.
Last File Run Outcome – The outcome of the last attempt to execute the file.
This can have one of the following values:
• Failed
• Started
• Does Not Exist
• Timed Out
Last File Run Exit Code – Windows system exit code of the last attempt to
execute the file.
Last Poll Started – The last time that polling of the specified station was
started.
Last Poll Finished – The last time that polling of the specified station was
finished.
Last Poll Outcome – The outcome of the last poll. This can have one of the
following values:
• Not Polled
• Success
• Security Failure
• Communication Failure
• Communications Disabled
• Bad Table Definitions
• Task Disabled
• Datalogger is Locked
• File Write Failure
• Datalogger is not Valid
Last FTP Started – The last time the attempt to FTP a file was started.
Last FTP Finished – The last time the attempt to FTP a file was finished.
Last FTP Outcome – Outcome of the Last FTP attempt. This will have a
value defined under Exit Codes here: http://curl.haxx.se/docs/manpage.html
In the example above, Task_1 will be run after any scheduled call with the
CR1000 station. Task_2 is a scheduled task, run on a 1 hour interval. The next
time the task will run is 05:00 p.m. on April 5, 2013. Task_3 is a calendar task.
The next time the task will run is 05:00 p.m. on April 5, 2013. Task_4 will be
run after a file is retrieved from the CR3000 station.
To temporarily stop all tasks from running, select the Pause All Tasks check
box. This will stop all tasks until the check box is cleared.
A task can be triggered to run even if its trigger condition has not occurred.
Highlight a task from the list, and select the Run Selected Task button. This is
a good way to ensure that your task will run correctly before enabling the Task
Master.
For remote administration of the Task Master, the following conditions must be
met:
Section 10. Utilities Installed with LoggerNet
Along with LoggerNet’s server, clients and program editors, we also install several
utilities. These are launched either from the Utilities category of the LoggerNet toolbar or
from a command line calling the executable itself. These utilities include Device
Configuration Utility, an application that uses a serial or an IP port to configure
Campbell Scientific dataloggers and communications devices; CoraScript, a utility to
configure and run LoggerNet from a command line; File Format Convert, an application
that is used to convert data files from one format to another; and Toa_to_tob1, a command
line utility to convert TOA5 files to the TOB1 format.
The DevConfig window is divided into two main sections: the device selection
panel on the left side and tabs on the right side. After choosing a device on the
left, you will then have a list of the serial ports (COM1, COM2, etc.) installed
on your PC.
If the device is using PakBus encryption, you will need to enter the key in the
PakBus Encryption Key field.
You’ll be offered a choice of baud rates only if the device supports more than
one baud rate in its configuration protocol. The page for each device presents
instructions about how to set up the device to communicate with DevConfig.
Different device types will offer one or more tabs on the right.
When the user presses the Connect button, the device type, serial port, and
baud rate selector controls become disabled and, if DevConfig is able to
connect to the device, the button will change from “Connect” to “Disconnect”.
The tabs on the right side of the window will be replaced with tabs that
represent the various operations that are available for that device in a connected
state. These operations can vary from device to device.
DevConfig can send operating systems from the Send OS tab to all Campbell
Scientific devices with flash replaceable operating systems. An example for the
CR1000X is shown below:
The text at right describes any interface devices or cabling required to connect
the PC to the device. Screens for other devices vary only in the text on the right
side. This screen differs from other screens that are available in DevConfig in
that it can be accessed from either a connected or disconnected state.
When you click the Start button, DevConfig offers a file open dialog box to
prompt you for the operating system file (usually a *.obj file). You may be
required to cycle power to the device or press a special “program” button. When
the device issues the appropriate prompts, DevConfig starts to send the
operating system:
When the operating system has been sent to the device, a message dialog will
appear similar to the one shown below:
The default for the Terminal tab is to only show characters that are returned
from the device. However, if the Echo Input check box is enabled, the screen
will also display the characters actually typed by the user.
The All Caps check box controls whether the keyboard input will be forced to
upper case before the characters are sent to the device. It will be disabled for
some device types that require upper case input.
Clicking Connect puts DevConfig into Terminal emulation mode on the Serial
Port and at the Baud Rate selected.
DevConfig Backup menu. A wizard will appear to guide you through the
backup process.
After downloading the operating system (or any other time you want to restore
the datalogger to its state at the time of the backup), select Restore Datalogger
from the Backup menu. A wizard will again appear to guide you through the
restoration process.
In order for the data recovery procedure to succeed, the following conditions
need to be met:
• The datalogger needs to have the same settings that it had when the
data was logged.
To recover data from a datalogger in this condition, select Data Recovery from
the Backup menu. A wizard will appear to guide you through the recovery
process.
10.2 CoraScript
10.2.1 CoraScript Fundamentals
CoraScript is a command line interpreter that reads its commands as text from
its standard input device and writes the results of those commands as text to its
standard output device. This style of input and output makes it possible to
externally control the LoggerNet server operation using input and output
redirection. It also makes it possible to string together commands in scripts that
can be executed from the command line.
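As an illustration of input and output redirection, a script of CoraScript commands could be fed to the interpreter from a Windows command prompt along these lines (this sketch assumes the CoraScript executable is named cora_cmd.exe and resides in the LoggerNet program directory; adjust the path to match your installation):

"C:\Program Files\Campbellsci\LoggerNet\cora_cmd.exe" < commands.txt > results.txt

Here commands.txt contains CoraScript commands, each terminated with a semicolon, and results.txt receives the responses.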
CoraScript 1,1,1,30
Each command typed into CoraScript must be terminated with a semicolon
(;). The semicolon tells CoraScript that the command is complete and ready to
execute.
There is an extensive on-line help file available for CoraScript. To bring up the
help file, type “help;” on the command line. (Make sure to include the
semicolon ‘;’ at the end and leave off the quotes.) Read through the directions
and try some examples.
• Always end the command with the semicolon (;) character. CoraScript
uses the semicolon to mark the end of the command input and will not
process anything until it is detected.
• A response preceded by a plus sign (+) indicates that the command was
successfully processed.
For a more detailed explanation of the interpretation of the symbols and syntax
refer to the CoraScript help.
To set the configuration setting for any device, use the set-device-setting
command. As with get-device-setting the device is referred to by name and the
setting by number.
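A minimal sketch of the two commands (the station name my_station and the setting number 0 are placeholders; the available setting numbers and the format of each setting value are documented in the CoraScript help):

get-device-setting my_station 0;
set-device-setting my_station 0 "new value";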
create-backup-file;
If executed successfully, you will see something similar to the line below:
+create-backup-file, C:\CampbellSci\Loggernet\2005-3-1_14-15-01.snapshot;
Where the directory is the path in which the backup file is stored, and the file
name reflects the date and time the snapshot was created.
Create-backup-file has three options. You can specify the path and filename
instead of using the default, include additional files in the backup image that
would not otherwise be saved, and specify whether or not the data cache will
be stored with the image. By default, the data cache will not be saved, so it
may be a good idea to include at least this option if your intent is to fully
restore LoggerNet to the exact state it was in when the backup was created. In
this instance, the command would be:
create-backup-file include-tables="true";
restore-snapshot 2005-3-1_14-15-01.snapshot;
By default, when the network is restored, LoggerNet will first delete all files
from the LoggerNet working directory. However, you can override this default
by using the clear=“false” option after the filename.
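For instance, to restore the snapshot from the example above without first clearing the working directory:

restore-snapshot 2005-3-1_14-15-01.snapshot clear="false";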
Refer to the CoraScript on-line help for more information on these two
commands and their associated options.
List-holes
Purge-holes
Delete-holes
10.3 File Format Convert
10.3.1 Overview
File Format Convert is used to convert data files from one format to another. It
can also perform the following functions:
More than one of the above functions can be performed in one pass.
From:
• TOA5
• TOACI1
• TOB2
• TOB3
• TOB1
• CSIXML
To:
• TOA5
• TOACI1
• TOB1
• CSIXML
• CSV
NOTES File Format Convert cannot produce TOB2 or TOB3 files, and it
cannot read CSV files.
Some file headers have less information than other formats. If you
convert from a file with more information in the header to one with
less, information will be lost. If you convert from a format with
less information, some fields will be left blank.
Some formats (e.g., TOB1) store strings in fixed-length fields and
have headers that specify how big that field is. Other formats use
variable-length strings. If you convert from a format that uses
variable lengths to one that uses fixed lengths, the length is set to 64. If
the string is longer than this, it is truncated.
Converting a File
If a conversion is in progress and you wish to stop it, press the Abort button.
Log File
If the Write Log File check box is checked, a “Log.Txt” file will be created in
the same directory as the source data file. The log.txt file will be overwritten if
it exists.
10.3.2 Options
Check
Both can be checked in the same pass. If a file is written, other options are
available.
Files can be baled if missing records are found. See the Bale based on
information below. When checking timestamps, “null” records can be written
to “fill” missing records. See Missing Records information below.
File
Check Write File to cause an output file to be created. The file will be created
in the same directory as the source file. The base name will be the same as the
source name with the new format prepended. For example, test.dat becomes
TOA5_test.dat. Use the drop-down list to select the format of the new file.
For all output options except TOACI1, the Browse button to the right of the
field becomes available and can be pressed to set additional file output options.
File Naming
Date Time Filename – When this option is selected, the date and time of the
first record of data in the file will be appended to the end of the base file name.
The suffix includes a four digit year, a two digit month, a two digit day of
month, and a four digit hour/minute. When this option is selected, Use Day of
Year becomes available. If this option is selected, the Julian day (day of year)
will be used in the suffix instead of the month and day of the month.
Create New Filenames – When the Create New Filenames option is selected,
File Format Convert will add a _1 to the filename, if a file of the same name is
found (e.g., TOA5_Mydata_1.dat). If a *_1.dat file is found, the file will be
named with a _2 suffix. If the Create New Filenames check box is cleared and
a file with the same name is found, you will be offered the option to Overwrite
the existing file or Cancel the conversion. (Note that if any of the baling
options are selected, new filenames will automatically be created as described
below.)
Missing Records
If Timestamps are checked (see Check section above), then missing records
can be filled. These will be Null records. A timestamp and record number will
be added. Values will be “NAN”. Strings will be empty, etc.
Prompt – Shows what records are missing and lets you choose to fill or not. If
you have big gaps (e.g. bad timestamp), filling can be quite slow.
Bale based on
This allows a file to be broken into smaller files. A new file is started based on:
Remove Marks – A new file is created when a remove mark is found in the
data file (TOB3 only).
File Marks – A new file is created when a file mark is found in the data files
(TOB3 and TOB2 only).
Bale Info
Used to specify the Start Time and Interval for baling based on time.
10.4 Toa_to_tob1
This utility is used to convert TOA5 (ASCII Table Data) files to TOB1 (Binary
Table Data) format. By default, it is located in C:\Program
Files\Campbellsci\LoggerNet.
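The utility is run from a command prompt, taking the source TOA5 file and the destination TOB1 file as its arguments. A sketch of the expected usage (the file names are placeholders):

toa_to_tob1 input_filename output_filename

for example:

toa_to_tob1 TOA5_MyData.dat TOB1_MyData.dat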
Note that if the utility does not reside in the same directory as the data files, the
entire directory paths must be used. Also note that the utility will overwrite any
existing file with the same name as output_filename. So, use caution in
specifying the output_filename.
Section 11. Utilities Installed with LoggerNet Admin and LoggerNet Remote
The LoggerNet Admin and LoggerNet Remote packages include several additional utilities or
client applications that can be useful for the management of larger networks. The main
difference between these two packages is that LoggerNet Admin includes the LoggerNet
server and client applications/utilities, while LoggerNet Remote includes only the
clients/utilities. LoggerNet Remote was developed specifically to provide remote management
capabilities for an existing LoggerNet network.
The utilities installed with LoggerNet Admin and LoggerNet Remote are Hole Monitor, Data
Filer, Data Export, LN Server Monitor, LoggerNet Service Manager, and Security Manager.
The LoggerNet Service Manager, which allows LoggerNet to be run as a service, is discussed
in the installation notes (Section 2, Installation, Operation and Backup Procedures (p. 2-1)).
The LN Server Monitor, which monitors the status of a remote LoggerNet server or a
LoggerNet server being run as a service, is discussed in Section 6, Network Status and
Resolving Communication Problems (p. 6-1). The remaining utilities are discussed below.
Enter the IP address or alias for the LoggerNet server (e.g., LocalHost), leave
the User Name and Password fields blank, and press OK. A wizard is launched
to help you set up an Administrator Account, which will be used for managing
the security for the LoggerNet network. Follow the instructions on screen to set
up the account. Once the setup is complete, the Security Manager will display
its main window, and from here, you can begin setting up user accounts.
When setting up new accounts, one of five levels can be assigned to each user.
Multiple accounts with Full Administrator rights can be set up, if desired. Only
users with Full Administrator rights can open and make changes in the Security
Manager (regardless of whether or not security is enabled).
Once the security accounts have been set up, select the Enable Security check
box to turn on security for the LoggerNet server.
Account Name – Enter the name to be used for the account. This name will
be typed in each time the user connects to the LoggerNet
server using a client application.
Password – Enter the password for the account. Passwords are case
sensitive. As you type, each character will be represented
on the screen with an asterisk.
Confirm Password – Enter the password for the account a second time.
Security Level – Use the list box to select one of five security levels for the
user:
Read Only – The user can view data values and status
information but has no other rights.
Operator – The user can view data values, check the
clock, and collect data. He cannot make changes to the
datalogger program, datalogger settings, or the server
settings.
Station Manager – The user can view data values, check
and set the clock, and collect data. The user can send
programs to the datalogger and change datalogger
settings. The user cannot make changes to the server.
Network Administrator – The user has full access rights
in all LoggerNet clients, except the Security Manager.
Full Administrator – The user has full access rights in all
LoggerNet clients, including the Security Manager.
If an option in the LoggerNet user interface is not applicable for the security
level of the user logged in to LoggerNet, that option will be disabled. The
following table provides an overview of the functions available to each level of
security.
TABLE 11-1. Security Manager Access Table
Deleting an Account
To delete an account, highlight it and press the Delete button. When you are
logged in to the Security Manager under the Administrator Account, you
cannot delete that account. To delete it, log in under a different account with
Full Administrator rights.
Editing a Password
To edit the password for an account, highlight that account, press Edit
Password and enter the new information in the resulting dialog box.
Special Access
Users who have a security level of Read Only or Operator can be granted
Station Manager access to selected datalogger stations. To do this, highlight the
user and select Edit | Advanced. From the resulting dialog box, select one or
more stations to grant access to by moving them from the Stations Available field
into the Selected Stations field.
A hole is any discontinuity of data in the LoggerNet server’s data cache for a
datalogger. Holes can occur if the server is unable to collect data from a
datalogger because of communication failure, or if packets sent to the server
from the datalogger are out of order because of a marginal communications
link. If the data can be retrieved from the datalogger, then it is a collectable
hole. If the data has been overwritten by the datalogger, then it is an
uncollectable hole.
The list of stations in the LoggerNet datalogger network is displayed on the left
side of the Hole Monitor Utility’s main window. You can monitor hole
collection for all stations by enabling the Select All Stations check box above
this list, or you can monitor a subset of these stations by clearing this check
box and selecting the check box to the left of each datalogger that you want to
monitor. If a hole is detected in the data for a datalogger that is being
monitored, an informational record for the hole will be displayed on the right
side of the window. The fields in the record are:
Station – The name of the datalogger for which a hole has been detected.
Table – The name of the table in the datalogger that has the hole.
State – The current state of the hole. The state options are:
detected – This state, printed with black text, indicates that the
hole has been detected but attempts have not yet been made to
collect it.
collected – This state, printed with green text, indicates that the
server has succeeded in collecting the hole. Once the hole is
collected, the information will be kept in the grid for a period of
about fifteen seconds and then it will be removed.
lost – This state, printed in red text, indicates that a hole could not
be collected because the data records no longer exist in the datalogger.
This can happen when the datalogger overwrites its oldest records
before the LoggerNet server was able to collect those records.
Begin – The beginning record for the hole that has been identified.
Current – The record currently being collected for the hole that has been
identified.
End – The last record in the hole that has been identified.
Message – Description
hole detected – The server has detected a new range of records that need
to be collected.
hole lost – The server has determined that the range of record
numbers is no longer collectable.
hole collect started – The server has started a data collection operation that
will involve a range of records.
Because Data Filer retrieves data from LoggerNet’s data cache (and not the
datalogger directly), LoggerNet must first collect the data from the datalogger
before it is available for use by the Data Filer. Data collection in LoggerNet
can be performed manually by a user or automatically by setting up a data
collection schedule.
NOTE The Username and Password fields are required only if security
has been set up on the LoggerNet server to which you are trying
to connect.
Each time you start the Data Filer, you will be prompted to enter this
information. However, the Automatically log in to this server check box can
be selected to skip this window and use the information from the last session.
To specify a different LoggerNet server, select the File | Select Server menu
option.
Tip: Quickly choose all tables for the highlighted datalogger by selecting
the Select All check box.
This option is used to specify what data will be retrieved from the LoggerNet
data cache and stored on the remote computer by the Data Filer:
All the Data – Retrieves all records from the selected tables.
Data Since Last Collection – Retrieves all uncollected records from the
selected tables.
Newest Number of Records – Retrieves a specific number of records
from the selected tables by backing up the number of records entered in
the Number of Records field and retrieving all data forward.
Specific Records – Allows you to specify a beginning record number and
the number of records to collect after that record. The range of records to
retrieve is specified by completing the Starting Record # and Number of
Records fields.
Data from Selected Date and Time – Allows you to specify a span of
time for data collection. When this option is selected, the Starting
Date/Time and Ending Date/Time fields will be enabled.
File Mode
This option is used to determine how data will be stored in relation to existing
data files with the same name:
Append to End of File – Adds new data to the end of the existing data
file.
Overwrite Existing File – Replaces the existing file with a newly created
file.
Create New File – Renames the existing file with a *.bak extension, and
stores the new data with the specified file name. Subsequent *.bak files
will be named *.bak1, *.bak2, etc. The most recent *.bak file will have
the highest number.
File Format
This option is used to determine the format in which the data file will be saved:
TOACI1 – Data is stored in a comma separated format. Header
information for each of the columns is included.
TOA5 – Data is stored in a comma separated format. Header information
for each of the columns is included, along with field names, units of
measure (if they are available), and output processing types (average,
sample, total, etc.).
TOB1 (binary) – Data is stored in a binary format. Though this format
saves disk storage space, it must be converted before it is usable in other
programs.
CSV – Data is stored in a comma separated format, without any header
information. This format is easily imported into spreadsheet applications.
Record Information
Select the Include Timestamp check box to have timestamps included in your
data. If the check box is not selected, timestamps will not be included.
Select the Include Record Number check box to have record numbers
included in your data. If the check box is not selected, record numbers will not
be included.
The Starting Date/Time and Ending Date/Time fields are used when the
Collect Mode is “Data from Selected Date and Time”. The two fields are used
to specify a range of records to collect, based on the records’ time stamps.
To complete a date field, type in a date directly or click the arrow to the right
of the field to display a calendar from which to choose a date. To complete a
time field, type in the time directly or use the arrows to the right of the field to
increase or decrease the highlighted time value.
The stored file can be viewed by pressing the View Data File button. The Data
Filer uses LoggerNet’s View Pro utility to display the ASCII file.
NOTE Because the data cache is updated based on data collection from
the datalogger, there could be additional records stored in the
datalogger’s memory which have not yet been retrieved to the data
cache.
Tables in dataloggers are configured as ring memory. Eventually, they will fill
and the oldest records will be overwritten with newer ones. The LoggerNet
data cache, too, is configured as ring memory, but sized to hold twice the
number of records that can be stored in the datalogger (default size). When the
datalogger compiles its program, it starts with record number 0; therefore, if
something causes the datalogger to recompile its program (such as sending a
program to the datalogger or using a keyboard display to alter the program
slightly) all of its tables will start with record number 0 again. Therefore, the
record numbers reflected in the Data Information table may appear to be
incorrect.
Once the data tables to be exported are specified, the user selects an output
socket port and the export utility will begin “listening” for a request from a
remote application to send data. When the connection to the application is
established, data export is initiated.
The options that determine the operation of Data Export are set from the dialog
box opened from Data Export’s Edit | Options menu item. There are five
options as described below:
Listening Port Number – The Listening Port Number is the port number that
Data Export will monitor for a request for data. The default port number is
1200; this can be changed to any valid four-digit port number.
Starting Options – There are two options for choosing what should be the first
record exported when data export is first started. If the Get All Data option is
chosen, Data Export will attempt to export all available data from the data
cache for the specified datalogger tables. When Start with Newest Record is
chosen, export will begin with the most recent record. In this instance, no
historical data will be exported from the data cache. Note that this setting
applies only to the first time data export is initiated for a table. Subsequent data
export sessions will begin exporting after the last known exported record.
Data Format – This is the format in which the data should be exported. If the
RTMS Format option is selected, the data is formatted to be received by an
RTMS compatible computer. RTMS (real-time monitoring software) is a
format developed by CSI for communication between OS/2 operating systems
and table-based dataloggers. If Standard Format is selected, the data is
formatted as an ASCII comma separated record format that includes header
information. The protocols for both formats are described later in this
section.
You can run multiple instances of the Data Export application by specifying a
different initialization directory for each instance. This is done by adding the
directory information to the command line of the shortcut that starts the
application. For example, a shortcut whose "Target" field includes the directory
"c:\Campbellsci\LoggerNet\SD1" on the command line would start Data Export
using the initialization file stored in that directory.
The most typical use for the Data Export functionality is a situation where the
customer has a database or file system that is already integrated with data
management procedures. The custom data retrieval client gets the data from the
socket provided by the Data Export and writes it to the customer’s database or
file.
[Diagram: LoggerNet Communications Server → Socket Data Export Client → Custom Data Retrieval Client → Customer Files/Database]
The LoggerNet server has the responsibility to see that every collectable record
is collected from the network of dataloggers. The collected data is stored in the
data cache of the server. When the Data Export client is first initialized it sets
up the socket and then waits for a data retrieval client to connect. Once the data
retrieval client connects, the Data Export client gets the records for the selected
tables from the server data cache, and sends them one at a time to the custom
data retrieval client.
To ensure that all of these records are transferred to the client, Data Export uses
an acknowledgment scheme. The basic idea behind the protocol is that as each
record is sent to the client, the client will report the Station Name, Table Name,
and Record Number back to the server after it has secured that record. The
server uses the acknowledgment to mark the progress of the transfer. When the
session is broken, or if the Data Export doesn’t receive the acknowledgment,
the unsent records remain in the LoggerNet server’s data cache. The Data
Export maintains transfer progress information on disk so that if the server
goes down or there is another problem with the transfer, it can recover and
continue to transfer all collectable records.
The record acknowledgment allows Data Export to ensure that every record it
intended to send was successfully received by the client. This capability,
coupled with reasonable algorithms that make sure the LoggerNet server
receives every record logged by the datalogger, allows for reliable data
collection.
The custom data retrieval client is a software application that connects to the
socket provided by the Data Export application. It can be programmed to run
on any computer platform that is configured to support TCP/IP as long as there
is a computer network connection available to the host computer where the
Data Export application is running.
When a connection is established, the Data Export will send one data record as
soon as it is available. The first data record sent depends on the Data Export
option settings.
When the data retrieval client receives the record, it must parse the data and
return the acknowledgment message to the Data Export. The acknowledgment
message consists of the name of the datalogger, the name of the table, and the
record number of the record received.
The custom data retrieval client is programmed to write to the database or file
system defined for the user’s data handling process.
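As a rough sketch only (not the Data Export specification), the following Python fragment shows the general shape of such a client, assuming the Standard format described later in this section: ASCII records, comma separated, CRLF terminated, with the station name, table name, timestamp, and record number as the first four fields. The host name and the secure_record placeholder are assumptions for illustration; port 1200 is the default listening port mentioned earlier.

    import socket

    HOST, PORT = "localhost", 1200   # assumed: the machine running Data Export and its default listening port

    def secure_record(fields):
        # Placeholder: write the parsed record to the customer's database or file system.
        print(fields)

    def run_client():
        # Read CRLF-terminated ASCII records and acknowledge each one by
        # station name, table name, and record number (e.g. Lgr,Sec15,123456).
        with socket.create_connection((HOST, PORT)) as sock:
            buffer = b""
            while True:
                chunk = sock.recv(4096)
                if not chunk:                      # Data Export closed the session
                    break
                buffer += chunk
                while b"\r\n" in buffer:
                    line, buffer = buffer.split(b"\r\n", 1)
                    fields = [f.strip('"') for f in line.decode("ascii").split(",")]
                    station, table, record_no = fields[0], fields[1], fields[3]
                    secure_record(fields)
                    # The acknowledgment is itself an ASCII record, assumed CRLF-terminated.
                    sock.sendall(f"{station},{table},{record_no}\r\n".encode("ascii"))

    if __name__ == "__main__":
        run_client()

This sketch ignores quoting rules (values containing commas or escaped quotes) and the watchdog timer described later in this section; it is only meant to show the record and acknowledgment round trip.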
If the Data Export application loses its connection with the LoggerNet
communication server, it will need to be re-connected before any records can
be obtained and sent out.
There are two record formats used to send the record data: the RTMS format and the Standard format.
The following illustrations show the state diagrams for the custom client/Data
Export interface. (The diagramming notation is by Booch[1] who claims to
have adopted it from Harel [2]).
[State diagram: custom data retrieval client – states include Running, Session Open (exit: Close Session), and Wait for Record (entry: Start Timer (RecIntv2), exit: Stop Timer); transitions include Start, Stop, TimeOut, Session Opened, Session Failed, and RecordRdy / Secure Rec. / Send Ack.]
[State diagram: Data Export server – within Running: Report Server Registration (entry: Register Server), Server Registered (exit: Deregister Server), Wait For Client Connection (entry: Open Session), and Session Open (exit: Close Session); transitions include Connect Failed/Report, Reg OK/Report, Session Failed/Report, and Session Opened/Report.]
Key concepts from the state diagrams are shown in the following tables with
key words from the diagrams. In these definitions the “server” refers to the
Data Export data server and the “client” is the custom data retrieval client
application.
Test For Server Rdy – With Socket APIs, usually there will be a function used to
open the socket. In this state, the client program should attempt to open the
socket. If Open Failed, the client should wait 5 seconds and try again.

Wait For Record – In this state the client is waiting for the next data record
from the server. When a record is received it should be "secured" (saved to disk
or database), then an acknowledgment should be sent back to the server. Once the
server has processed the acknowledgment it will not send that record again. The
client should use a watchdog timer while waiting for a data record. If the client
is in the Wait For Record state for longer than expected (RecIntv2) then it
should assume that the server has died and close the session. This watchdog
operation may be difficult to implement, but it seems that some...

Wait For Record Available – In this state the server is waiting for the next
record to become available from the server's data record source.

Wait For Ack – In this state the server is waiting for the client to acknowledge
that it has secured the record. If an acknowledgment for the wrong record comes
in, the server will just continue to wait. After waiting for a minute, the server
will re-issue the data record and wait again.
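As a rough illustration of the client-side behavior in the table above (the 5-second retry and the RecIntv2 watchdog come from the table; the host name and the interval value are placeholders, while port 1200 is the default listening port mentioned earlier):

    import socket
    import time

    REC_INTV2 = 300   # placeholder watchdog interval, in seconds; use your own RecIntv2 value

    def open_session(host="localhost", port=1200):
        # Test For Server Rdy: attempt to open the socket; if Open Failed,
        # wait 5 seconds and try again.
        while True:
            try:
                sock = socket.create_connection((host, port))
                sock.settimeout(REC_INTV2)   # watchdog for the Wait For Record state
                return sock
            except OSError:
                time.sleep(5)

    def wait_for_record(sock):
        # Wait For Record: if nothing arrives within RecIntv2, assume the
        # server has died and close the session.
        try:
            return sock.recv(4096)
        except socket.timeout:
            sock.close()
            return None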
Communications between the client and server are conducted using ASCII
records where each record is terminated by a carriage return – line feed (CRLF)
pair. Record length varies quite a bit. For each datalogger record there is
exactly one ASCII record. Because of the Block Mode Protocol used to
communicate with dataloggers, the maximum size datalogger record is limited
to something less than 1024 field values. Assuming 6 characters per value, 13
characters per field name, and 6 characters per field type designation, a single
ASCII record could come out to be a little longer than 25K characters.
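As a quick check of that estimate, using the per-field figures just given:

    fields = 1024                 # upper limit on field values in one datalogger record
    per_field = 6 + 13 + 6        # characters for the value, the field name, and the type designation
    print(fields * per_field)     # 25600 -- a little longer than 25K characters, before separators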
To express the format of ASCII records used for communications between the
client and server, we will use Extended Backus Naur – Formalism (EBNF), a
notation used to express syntax. This notation was adopted from Wirth [3], and
extended here by adding a repetition count preceding some brackets. EBNF is
summarized in the following table where A, B and C are syntactic entities of
the language being described. Where one of these entities is a literal string it is
enclosed in quotes.
Expression   Means
A B          A followed by B
A | B        either A or B
[A]          A is optional (zero or one occurrence)
{A}          zero or more repetitions of A
n{A}         exactly n repetitions of A (the repetition-count extension noted above)
The acknowledgment records to be sent back to the server for two example
records would be:
Lgr,Sec15,123456
and
PC1,StatMsg,13355
Within a string, quotation marks and backslash characters will be quoted with
a backslash character.
The sample record from the original protocol would have the following format
under this new syntax:
"Lgr","Sec15","1993-12-08 15:02:00","123456","Battery_V","FLOAT",
"13.5","Temp","FLOAT","72.123" CRLF
The acknowledgment message is the same as for the RTMS format. The
acknowledgment for the above record would be:
Lgr,Sec15,123456
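For illustration, the sample record above can be parsed with any CSV-style reader that understands backslash escapes. This Python sketch (standard library only) splits out the four leading fields and rebuilds the acknowledgment; the field layout is the one shown in the sample record:

    import csv
    import io

    line = ('"Lgr","Sec15","1993-12-08 15:02:00","123456",'
            '"Battery_V","FLOAT","13.5","Temp","FLOAT","72.123"')

    # Quotes and backslashes inside strings are backslash-escaped, as described above.
    row = next(csv.reader(io.StringIO(line), escapechar="\\", doublequote=False))

    station, table, timestamp, record_no = row[:4]
    values = {name: (ftype, value)
              for name, ftype, value in zip(row[4::3], row[5::3], row[6::3])}

    print(f"{station},{table},{record_no}")   # Lgr,Sec15,123456
    print(values)   # {'Battery_V': ('FLOAT', '13.5'), 'Temp': ('FLOAT', '72.123')}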
Section 12. Optional Client Applications
Available for LoggerNet
Several client applications are available that are compatible with LoggerNet. Many of
these allow remote access to the data in the LoggerNet data cache, or provide a way to
post process that data.
Client applications include RTMC-RT, RTMC Pro, LNDB, OPC Server, and the LoggerNet
SDKs.
For the client applications that allow remote access to the LoggerNet data cache,
LoggerNet must be configured to allow connection from remote clients.
RTMC Pro is not covered in this manual. RTMC Pro comes with a separate
user’s manual. Product literature can be downloaded from our website at
www.campbellsci.com.
12.4 LNDB
LNDB is an application that enables you to easily move data from a LoggerNet
data cache into a database such as Microsoft SQL Server or MySQL. The two
main components of LNDB are LNDB Manager and LNDB Engine. LNDB
Manager is used to set up a database and select the datalogger data tables that
will be stored in the database. It also provides tools to monitor the LNDB
Engine and to review the database data. LNDB Engine runs as a service and
sends the selected data from the LoggerNet data cache to the database.
Additionally, LNDB includes utilities for importing and exporting data, and
generating simple reports from your database data.
LNDB is not covered in this manual. LNDB comes with a separate user’s
manual. Product literature can be downloaded from our website at
www.campbellsci.com.
The CSIOPC Server is not covered in this manual. The CSIOPC Server comes
with a separate user’s manual. Product literature can be downloaded from our
website at www.campbellsci.com.
The LoggerNet-SDK is not covered in this manual. The SDK comes with a
separate user’s manual. Product literature can be downloaded from our website
at www.campbellsci.com.
Section 13. Implementing Advanced
Communications Links
This section describes the configuration and operation of a variety of communications
links. The communications links included here require special setup or configuration, or
require special consideration in the implementation to work properly.
13.1 Phone to RF
Phone to RF is used in situations where the RF network is far away from where
the LoggerNet server computer is located and phone access is available to the
RF base site. Before implementing this type of network, consideration needs to
be given to the collection intervals and the communication time required
between the computer and the RF base.
LoggerNet will make a call each time that it does data collection for a station.
It will stay on-line until a response is received — either the data, or an error
indicating that the data collection failed. LoggerNet will also initiate a call for
any datalogger operations such as connect in the Connect Screen or get table
definitions in the Setup Screen.
13.1.1 Setup
The device map set up in the Setup Screen for a Phone to RF link would look
similar to the communications network below.
To begin, add a Serial Port to the device map if one does not exist. Add a
Phone Modem to the Serial Port. To this Phone Modem, add a Remote Phone
Modem. Next, add an RF base modem and then an RF remote modem. To
complete the network, add your datalogger to the remote RF modem. Review
all of the settings for each device, and make any changes to customize the
settings for your network configuration.
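Sketched as a tree, the resulting device map would look something like the following (the labels are generic; the exact device names depend on the hardware you select in the Setup Screen):

    Serial Port (ComPort)
        Phone Modem (base)
            Remote Phone Modem
                RF Base Modem
                    RF Remote Modem
                        Datalogger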
13.1.2.3 RF Address
The hardware settings for the address of the RF base must be 255 for phone to
RF operation. Additionally, ensure that the addresses set for the RF remote
hardware match the settings defined in Setup.
NOTE SDC mode cannot be used with 21X or CR7 dataloggers or any of
the table based dataloggers.
The datalogger and the RF base must both be connected to the remote phone
modem on the same 9-pin ribbon cable.
To begin, add a Serial Port to the device map if one does not exist. Add a
Phone Modem to the Serial Port. To this Phone Modem, add a Remote Phone
Modem. Next, add an MD9 base and then a remote MD9. To complete the
network, add your datalogger to the remote MD9. Review all of the settings for
each device, and make any changes to customize the settings for your network
configuration.
13.2.2.5 Grounding
Depending on the configuration and distance of the MD9 network, be sure to
follow the grounding guidelines provided in the MD9 hardware manual.
Grounding issues have been known to prevent reliable communications and
data collection.
13.3 TCP/IP to RF
The development of Serial Server devices that allow serial communications
devices to be connected to TCP/IP networks now allows an RF network to be
connected to the LoggerNet server over the Internet or across a Local Area
Network. A Serial Server has a standard TCP/IP connection on one side and
one or more serial ports, typically RS232, on the other. This type of network
setup is typically used for organizations that have field offices or stations that
are connected together by a TCP/IP network. This allows the LoggerNet server
computer to be located in a central area for administration while providing
communications to remote RF networks.
13.3.1 Setup
The device map set up in the Setup Screen for a TCP/IP to RF link would look
similar to the communications network below.
To begin, add an IPPort to the device map if one does not exist. Add an RF
base modem to the IPPort, and to this, add the remote RF modem. To complete
the network, add your datalogger to the remote RF modem. Review all of the
settings for each device, and make any changes to customize the settings for
your network configuration.
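Sketched as a tree, this device map would look something like the following (again, the labels are generic):

    IPPort
        RF Base Modem
            RF Remote Modem
                Datalogger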
Max Time online – This should be set to zero (disabled) for all of the stations
in the RF network. Otherwise, when the communications link is dropped
because this value is exceeded, communication will be re-attempted
immediately. Forcing the connection offline and back on quickly causes errors
because not enough time is allowed for the serial server to reset the TCP/IP
socket.
When connecting the RF Base to a Serial Server or the RS232 connector of the
NL100 you will need to build or modify a serial cable to either cut or remove
the RTS line. The other standard serial communication lines need to be in
place.
This special cable is needed to allow an RF base to work with the standard RS-
232 Port on other Internet serial devices. The drawing below depicts the cable
needed.
[Cable diagram: Campbell Scientific RF232 RF network base (radio connection) wired to the RS-232 port of a third-party serial server (Ethernet port on the network side). Straight-through connections are shown for pins 2 (TxD), 3 (RxD), 5 (CTS), 6 (DSR), 7 (GND), 8 (DCD), and 20 (DTR); the RTS line (pin 4) is cut.]
The serial server must be configured for this application before operation. The
basic settings that must be configured are listed below. Depending on the
specific serial server, there may be other settings that must be configured for
proper operation.
Subnet Mask – This setting determines which IP addresses are treated as part of
the local subnet. If both the server and the serial server are on the same
low-level subnet, this would typically be set to 255.255.255.0. Consult with the
network administrator for the proper setting.
Default Gateway – This specifies the IP address of the router for the local
computer network. Consult with the computer network administrator for the
proper setting.
Baud Rate – This specifies the baud rate used by the serial server to
communicate with the serial device attached to the COM port. The RF base
communicates at a baud rate of 9600.
IP Port ID – This specifies the port ID used by the serial server to direct serial
communications. This must be set even on devices with only one port. This
number is entered as part of the IP address in Setup for the IPPort device. For
example, if the port ID was specified to be 3201, using the IP address above
the entry in Setup would appear as follows: 198.199.32.45:3201
Inactivity Timeout – This timer resets the TCP/IP socket port if there has not
been any activity on the port for the specified period of time. The time is
usually specified in minutes. This prevents a situation where the socket gets
left open after a call and blocks other incoming calls.
Section 14. Troubleshooting Guide
This section is provided as an aid to solving some of the common problems that might be
encountered using the LoggerNet software. This list is not comprehensive but should
provide some insight and ability to correct simple errors without a call to Campbell
Scientific technical support.
This section also includes descriptions of some of the tools such as Terminal Emulator and
Data Table Monitor that can be useful in troubleshooting LoggerNet problems.
The Windows operating system has limits on the number of socket connections
that can be held open. For most operations this should be more than enough to
cover the open applications that use sockets. One situation that does cause
problems is using the IPPorts to communicate with dataloggers where the
socket is being opened and closed quickly. For example, if you have 20 stations
on IPPorts and you do normal data collection every 5 seconds, 20 new sockets
are created every 5 seconds. The normal lifetime of a created socket is about
4 minutes, leaving about 1,000 active sockets at a time. If there are other
applications that use sockets, it is possible to exceed the allowed number of
sockets.
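The figure of roughly 1,000 sockets follows directly from those numbers:

    stations = 20
    collection_interval = 5      # seconds between collection attempts
    socket_lifetime = 4 * 60     # seconds before Windows releases a closed socket
    print(stations / collection_interval * socket_lifetime)   # 960 -- about 1,000 active sockets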
To work around this problem, either slow down the rate of data collection, or
use the Delay Hangup setting for the IP Port (accessed from the Setup Screen)
to keep the stations online.
When you get an error message that says Socket Error and a number, check the
chart below for the type of error that occurred and what to do about it. Note
that these error messages can show up either in pop up error boxes or as part of
the LoggerNet Communications log.
Socket Error Number and Message – Meaning and User Response

10013 Permission denied. The requested socket connection has been refused. –
This is normally a network type of issue. Check with your computer network
operator.

10024 Too many open files. Too many open sockets for the applications running. –
This can occur when you have many applications that use sockets running at the
same time.

10047 Address family not supported by protocol family. The socket being
addressed does not support the type of connection being attempted. – This
message shows up when the LoggerNet Toolbar comes up but the server did not
come up, because TCP/IP is not installed on the computer. Install TCP/IP and
restart LoggerNet. (Section 1.2, TCP/IP Service (p. 1-1))

10055 No buffer space available. Cannot create more temporary sockets. – The
operating system cannot create any more socket connections. See the text above
about Maximum Number of Sockets Open.

10058 Cannot send after socket shutdown. A message was sent to a socket that
has been closed. – This would be an indication that an application is not
communicating well with the server. Check the application.

10060 Connection timed out. – Either the server has crashed and is not
responding, or the application did not maintain the connection to the server.
Try restarting LoggerNet. This message can also be seen in connection with the
NL100 LAN interface.

10061 Connection refused. The LoggerNet server or an NL100 refused to allow the
socket connection. – This is normally associated with the NL100 and occurs
because the last connection did not have enough time to close before a new
connection is requested. Slow down the low-level polling delay interval.

10065 No route to host. The application is trying to connect to a host address
that isn't in the routing table. – This occurs with remote connections to a
LoggerNet server running on another computer. The requested host name can't be
found.
Invalid Table Defs indicates that LoggerNet does not have a current copy of
the datalogger table definitions. You will need to update the table definitions
from the Setup Screen or the Connect Screen.
Network Paused indicates that data collection for the entire network has been
suspended. You will need to go to the Status Monitor and remove the check
mark from the Pause Schedule check box.
The Terminal Emulator utility is available from the Connect Screen to help
troubleshoot communications problems. When you choose a device with the
Select Device field, the Terminal Emulator will attempt to establish
communications with that device. The Terminal Emulator will use the lowest
baud rate among all of the devices involved in the link. For example if
choosing a COM port, the baud rate will typically be 115,200 baud and
LoggerNet simply opens the port. For a phone modem, the baud rate will be set
to the value in the Setup Screen for that phone modem, the COM port will be
opened and the DTR line will be asserted to enable the phone modem. For a
datalogger on a phone link, the baud rate will be set for the lower of the phone
modem or datalogger baud rates in the Setup Screen, and LoggerNet will try to
dial the phone modem and get a prompt from the datalogger. You can also use
terminal emulation to send commands to the dataloggers.
When the Terminal Emulator screen comes up, click the drop-down arrow to the
right of the Select Device box to choose the device from the list of devices in
the network map. The correct baud rate for the link is automatically set. The
characters you type in the window are sent as ASCII text to the selected device.
The options that are available from this screen depend on the device you select.
Dataloggers
NOTE Use caution while in terminal emulation mode. You can change or
disable operation of the datalogger with these commands.
ComPort
Phone Modems
Selecting a phone modem and clicking Open Terminal will allow you to send
ASCII characters to the phone modem. This can also be used to test
communications to the phone modem and the initialization strings used to set
up and configure it. Get the information for the available commands and
format from the modem manufacturer.
Say that you cannot communicate with a datalogger and you don’t hear the
phone modem dial the number. You could wonder if you are using the correct
COM port. Disconnect the serial cable from the phone modem, select the COM
port, and click Open Terminal. Then, shorting pins 2 and 3 at the end of the
serial cable, type characters and, if you’ve chosen the correct COM port, you
should see those characters echoed to the screen. (Note: the RS232 protocol
allows any pins on a cable to be shorted without damaging the computer. Pins
2 and 3 are the second and third pins from the upper left on the top row when
looking at the male end of the cable, with the long row of pins on top. This is
true for either 9-pin or 25-pin cables.)
Once you have established that you have the correct COM port, you may hear
the modem dialing, but the datalogger doesn’t connect. Perhaps the modem is
not dialing correctly or using an appropriate initialization string. You can work
with the modem by selecting it in Terminal Emulator and clicking Open
Terminal. LoggerNet will open the COM port and raise DTR to enable the
modem. You can then type characters to be sent to the modem. For example,
Type ATH <Enter> – To hang up the modem. You should see an "OK" on the screen
sent by the modem. If you do not, perhaps there is no modem attached to that
COM port or perhaps the modem is not powered on.

Type AT&C1&D2 <Enter> – To cause the modem to use hardware flow control and
report the loss of the carrier when the datalogger hangs up its modem. You
should see "OK" appear on the screen. If you see "ERROR", then the modem
doesn't recognize one or both of the "&C1" or "&D2" commands.

Type AT\J1 <Enter> – To force the modem to follow the serial port rate rather
than try to connect at the fastest rate the remote modem will support. You
should see "OK" appear on the screen. If you see "ERROR", then the modem
doesn't recognize this command.

Type A <Enter> – You should see a string of characters from the datalogger
that report its status, including final storage pointers, memory size, perhaps
lithium battery voltage, and internal error counters. If you do not, then the
phone line may be too...

Type E <Enter> – To hang up the datalogger, which causes it to turn off its
phone modem, which in turn causes loss of the carrier signal between the
modems. After a few seconds you should see "NO CARRIER" reported by your base
modem.
If the modem you select in LoggerNet’s Setup Screen doesn’t work, check to
make sure you’ve selected the correct modem, that it’s powered up and that the
phone line is working. You may have to adjust the baud rate. If you still have
trouble, you may need to consult your modem manual for the appropriate
initialization strings.
Perhaps you can communicate with other dataloggers on this phone line and
with this base modem, but there’s one remote datalogger that’s problematic.
You could try communicating with that datalogger in remote keyboard mode.
Using the example above, choose instead the datalogger in the Terminal
Emulator and click Open Terminal. You should hear the phone modem dial,
followed by screeches as the modems negotiate a connection, followed by
“CONNECT”, etc., and then a response from the datalogger. Pressing <Enter>
for an array-based logger should return an asterisk “ * ”. Typing “A” <Enter>
should return the status line. Type “7H” (“2718H” for 21X or CR7X
dataloggers) to put the datalogger in remote keyboard mode. From there, you
can enter commands much like from a keyboard/display handheld interface.
Pressing “*6” followed by several <Enter> keys should cause the datalogger to
report its input locations. If all you get from some of these commands is
“MODE”, perhaps the datalogger has security set. See your datalogger manual
for other remote keyboard commands. You may also call your Campbell
Scientific application engineer for more help on troubleshooting links.
NOTE Using remote keyboard mode can result in loss of programs, data,
or the ability to further communicate with a datalogger over a
remote link, for example, by altering security settings or changing
a program leading to memory resets or powering down a cellular
phone. Remember that keystrokes entered may not reach the
datalogger intact. That is, the datalogger may not receive what you
send.
1. Check that the RF modem has the correct switch ID set on the DIP switches.
(This is a common problem and should be checked first.)
2. Check the type and brand of the radio. In general, the radios in a network
should be the same type.
3. Check that the radio is set for the right frequency. With a programmable
radio, verify that the correct frequency and other settings are set properly. If the
radio is crystal based there should be a label showing the frequency. If not, you
will have to test the radio with a programmable scanner or frequency analyzer.
5. Check that the antenna is the right type (directional or omnidirectional) and
is designed for the frequency being used. Most antennas will have labels
identifying the frequency range. Make sure the antenna is mounted for a clear
line-of-sight and that directional antennas are properly oriented.
6. Make sure the antenna is the right impedance to match the system. This is
almost always 50 Ω. This should match the cable connecting the antenna to the
radio and the radio connection.
7. Check that the cable connecting the radio to the antenna matches the
impedance of the antenna and the radio. This is almost always 50 Ω.
One simple, but very effective, technique is to swap out components. Use
components from a part of the network that you know is working, and swap
them out one at a time to isolate a faulty hardware component.
NOTE If you are using a data radio that does not have a transmit button
built in, you can easily build a push to transmit button from the
documentation of the radio/RF modem interface connector. There
will be one pin that when pulled high or pulled low will initiate
radio communication. See the radio documentation to identify this
pin. Connect a momentary push-button to either raise or ground
that pin. Always make sure that the antenna is connected to the
radio before attempting to transmit. Serious damage to the
radio can occur if transmitting without an antenna.
Place the watt meter in series between the radio and antenna cable. Set the watt
meter to the 15-Watt range, or the next highest watt meter setting, and point the
directional arrow first toward the antenna cable to measure forward power
(Wf). Initiate radio communication, let the watt meter stabilize, and record the
watt meter reading. Reverse the directional arrow so it is pointing back toward
the radio, initiate radio communication, let the watt meter stabilize, and record
the watt meter reading. This second reading is the reflected power (Wr). Take
the square root of the reflected power divided by the forward power to arrive at
the square root ratio (R). Calculate the Voltage Standing Wave Ratio (VSWR)
with the following equation:
VSWR = (1 + R) / (1 – R), where R = √(Wr / Wf)
For example, if the forward power (Wf) is 5 Watts and the reflected power
(Wr) is 0.2 Watts, the VSWR is 1.5:1.
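The same calculation, written out (this is just the formula above, not a LoggerNet utility):

    def vswr(forward_watts, reflected_watts):
        # Voltage standing wave ratio from forward and reflected power readings.
        r = (reflected_watts / forward_watts) ** 0.5
        return (1 + r) / (1 - r)

    print(vswr(5.0, 0.2))   # ~1.5, i.e. a VSWR of about 1.5:1, as in the example above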
A high VSWR reading can be caused by problems such as:
• The cable is worn, cut, or damaged so that not all of the radio energy can
travel through to the antenna.
• The antenna design frequency does not match the radio frequency.
While at the station, check the voltage on the 12 V port of the datalogger both
with and without the radio transmitting. Regardless of the battery type, the
datalogger requires a minimum of 9.6 Volts.
Testing the radio transmission quality between radios requires the use of a
programmable scanner and a set of attenuators or attenuation pads. You will
need someone at each end of the radio link with a way to talk to each other.
If the carrier detect light is coming on at the RF base station radio, but
communication quality is poor or not being set up properly, there may be a
marginal or low signal power inherent in the RF link. In this case, it is a good
idea to do a signal power check with attenuation pads for each sub-link in a
complete RF link. Every RF link has one or more sub-links. For example, if
there is one repeater in an RF link then there is a sub-link between the base
station and the repeater and a sub-link between the repeater and the field
station. The sub-links should be checked in both directions of communication.
Before proceeding, it is a good idea to calculate the theoretical signal power for
each of the RF links. Appendix C of Campbell Scientific’s RF Telemetry
manual outlines the calculations.
For proper radio communications the signal power must be greater than –95
dBm at the standard transmission rate. However, a signal can be detected on
the radios with a power greater than –115 dBm. Therefore, there is a 20 dBm
range in which the radios are not working, but may “sound” proper.
An attenuation pad inserted into the link increases the power loss of the system.
If a 20 dB attenuation pad, or two 10 dB pads in series, is inserted into the
link and the radio subsequently will not detect the signal, the signal power is
between –95 and –115 dBm, which is below the power limit for good data
transmission.
Similarly, if a 10 dB attenuation pad is inserted in the link and the radio
subsequently will not detect the signal, the actual signal power is between –105
and –115 dBm. In this case, the signal power is far below the power limit.
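The same reasoning can be written as a small helper; the –115 dBm detection threshold and the pad sizes come from the text above, and the function itself is only an illustration:

    def signal_power_bounds(largest_pad_passed_db, smallest_pad_failed_db, threshold_dbm=-115):
        # A pad "passes" if squelch still breaks with that much attenuation inserted,
        # and "fails" if it does not. Detection requires roughly -115 dBm at the radio.
        lower = threshold_dbm + largest_pad_passed_db    # signal must exceed this to have passed
        upper = threshold_dbm + smallest_pad_failed_db   # signal must be below this to have failed
        return lower, upper

    print(signal_power_bounds(0, 20))   # (-115, -95): detectable unpadded, lost with a 20 dB pad
    print(signal_power_bounds(0, 10))   # (-115, -105): detectable unpadded, lost with a 10 dB pad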
[Figure: signal power test setup – the radio disconnected from the antenna, with a programmable scanner connected in its place through attenuation pads (10–20 dB).]
To test the power being received by a radio over an RF link, disconnect the
radio from the antenna and insert the programmable scanner as shown in the
figure above. Program the scanner to the radio frequency and adjust the
squelch control until ambient RF noise is just cut out. This level will normally
be around –110 to –115 dBm. The scanner is now ready to conduct the test.
NOTE If you are using a data radio that does not have a transmit button
built in, you can easily build a push to transmit button from the
documentation of the radio/RF modem interface connector. There
will be one pin that when pulled high or pulled low will initiate
radio communication. See the radio documentation to identify this
pin. Connect a momentary push-button to either raise or ground
that pin. Always make sure that the antenna is connected to the
radio before attempting to transmit. Serious damage to the
radio can occur if transmitting without an antenna.
First, test the sub-link of the base station to the first repeater or field station.
Initially treat the base station as the transmitting station and the first field or
repeater station as the receiving station. Disconnect the radio’s multicolored
cable from the RF modem. To start the test, have the person at the base station
initiate a radio transmission. When the radio transmission is received, if
squelch is broken, you will hear it on the speaker of the scanner. If you don’t
hear the radio transmission, the signal is getting lost in the ambient noise and
will not be picked up. If squelch is not broken, then either the signal power is
less than –115 dBm, or something is wrong with the power supply, antenna
orientation, or cable connections. If squelch is broken on the receiving radio,
the site can be tested with the attenuation pads to determine the approximate
signal power if it is between –115 and –95 dBm.
Insert the attenuation pad(s) (20 dB) between the scanner and antenna of the
receiving station ONLY (most attenuation pads have a limited current
capacity). Initiate radio transmission from the base station transceiver. If
squelch is broken at the receiving station, this sub-link is good in this direction.
If squelch is not broken this sub-link has signal power between –95 and –115
dBm which should be corrected. Corrections can involve shortening the
distance between radios, reorienting antennas, fixing connectors or cables,
providing a better power supply, or shortening coaxial cable lengths.
If it did not break squelch with the 20 dB attenuation pad, it is possible to
decrease the attenuation to 10 dB to determine whether the signal power is
between –95 and –105 dBm, or between –105 and –115 dBm. This will identify
whether the signal power is close to or far away from –95 dBm.
If it did break squelch with the 20 dB attenuation pad, then that sub-link is
good in that direction. The next sub-link can now be tested. Remember to place
the attenuation pads at the receiving station only! If all of the sub-links were
good, the same sub-links can be tested in the opposite direction. If reversing
directions in a sub-link gives bad results while the other direction is good, be
suspicious of the transmitting radio in the bad direction and the radio’s power
supply.
The most important use of Data Table Monitor is to see what records are being
stored in the data cache and to diagnose suspected data cache problems.
Data Table Monitor gets all the data available from the data cache that matches
the export conditions. As the server collects new records from the datalogger,
they are automatically displayed and sent to the data file. This continues until
Data Table Monitor is closed or data export is stopped.
CAUTION One caution about the data file created by Data Table
Monitor—there are no limits to size or longevity. If you plan
to use the export to file feature on a regular basis, make sure
to either restart Data Table Monitor (which overwrites the
exported file) or delete the files periodically. The data export
can easily be restarted by clicking the Start button. This will
delete the old file and start a new one.
To start Data Table Monitor, open Windows Explorer and go to the Program
Files\CampbellSci\LoggerNet directory. Double-click the Tablemon2.exe file.
The utility will start with a screen similar to the one shown below.
Click the Connect button to connect to the LoggerNet server. The dialog box
shown below will be displayed. If you are working on the same computer
where LoggerNet is running leave the default Server Host Address as localhost.
The Server Port number should also be 6789. The Logon Name and Logon
Password are only used with versions of LoggerNet that support security. To
connect to LoggerNet on another computer, enter the computer network name
or IP address as the Server Host Address. When you click OK a list of the
dataloggers in the network will be shown in the upper left window.
Selecting a datalogger will list the names of the data tables or array IDs in the
datalogger. Note that if data collection has not been set up and enabled in the
Setup Screen, no data will be coming into the data cache. Data Table Monitor
can only display and output data from the data cache. Data Table Monitor
displays and outputs all the data points from an array or table.
Click the Start button to bring up the Start Advise Options dialog. This dialog
gives you choices about which records to display and the data file in which to
store them.
Start Option: This selects the starting point for the data to be displayed and
output to the file.
• At Newest: This option will set the starting position to the last record
stored in the data cache. This last record and any future records stored will
be output.
• After Newest: This option will set the starting position to be the next
record stored in the data cache. Output begins with the next record stored
in the data cache. No historical records will be output.
• Relative to Newest: This option starts from the most recent record
collected. The Offset from Newest specifies how much time to go back
from the current write index. For example, an offset of 10 with a setting of
minutes will get the last 10 minutes of data collected.
• At Offset from Newest: This option allows you to specify how many
records back from the current write index to go. A setting of 10 in the Start
Offset box will display the last 10 records collected.
The Start File Mark, Start Record Number, Start Offset, Begin Date, and Offset
from Newest edit boxes are used only with the corresponding start options
above. For each option selected, the appropriate boxes are enabled.
Order Option:
• Collected: Displays and writes the data to the file in the order it was
collected by the server. This setting is useful to look at the actual data
record storage in the data cache.
• Logged With Holes: The output will include only complete data
sequences. If the Data Table Monitor comes to a hole that has not yet been
filled, it will wait for the hole to fill before displaying or writing the next
record to the file.
• Logged Without Holes: The data output will be displayed and written to
file as quickly as it is collected, without waiting for holes to be filled. Any
data in holes will be skipped in the output.
• Real Time: The most recent data is always sent out, starting with the last
record stored. This will not provide a complete data set.
Set the output file directory and name in the Export File box. The Browse
button will bring up a Windows Save As dialog box to select the file name and
directory.
File Format:
Once the start options have been set, click the OK button to start. The records
are displayed in the list box on the bottom of the screen. If you have set up an
output file they are also sent to the output file.
(PC-RF400~~~RF400-CR510PB~~~ RF400-CR510PB~~~CR205)
Remedy 2: Set both radios exactly the same in the above parameters.
(‘base’ radio’s Active Interface is typically Auto Sense, remote radio
is typically CSDC 7)
(PC-RF400~~~RF400-CR10XPB~~~RF400-CR510~~~CR205)
Remedy 3: Set all network radios exactly the same in the above
parameters. Network RF400s’ Active Interfaces may vary from node
to node, however, they will typically be configured for CSDC 7 or 8
except for a ‘base’ radio which is typically AutoSense or M.E..
Dataloggers automatically detect the RF400’s port (Active Interface)
for packet communications, however, the potential neighbor hello
port or beacon port must be configured to match the RF400’s Active
Interface, or no discovery of neighbors will take place.
Possible reason 5: The two routers in the path to the CR205 have
neighbor filters and at least one of the neighbor filters doesn’t list the
other as a potential neighbor.
(PC-RF400~~~RF400-CR10XPB~~~CR205)
Problem: Changed P190 port type and it no longer communicates with remote.
Appendix A. Glossary of Terms
A
Advise – See Data Advise
ASCII File – A computer file containing letters, numbers, and other characters
using the ASCII character encoding.
Asynchronous – The transmission of data between a transmitting and a
receiving device occurs as a series of zeros and ones. For the data to be “read”
correctly, the receiving device must begin reading at the proper point in the
series. In asynchronous communications, this coordination is accomplished by
having each character surrounded by one or more start and stop bits that
designate the beginning and ending points of the information (see
Synchronous). The transfer of information is not otherwise coordinated
between the sender and receiver.
Analog Channel – A terminal on the datalogger’s wiring panel where leads for
analog signals are connected. The analog channels are designated single-ended
(SE) or differential (DIFF) on the wiring panel. Many sensors, such as
thermistor temperature probes and wind vanes, output analog signals.
Array-based Datalogger – See Mixed-array Datalogger.
B
Batch Files – An ASCII text file that contains one or more DOS commands or
executable file commands. When the batch file is run, the commands in the file
are executed sequentially.
Battery – This entry in the status table returns the datalogger battery voltage.
Baud – The rate at which a communication signal travels between two devices.
Binary File – A file based on software defined formatting. A binary file can
only be interpreted by the software programmed to decode the formatting. This
format is used for more efficient data storage than is provided by ASCII.
BMP (Block Mode Protocol) – The communications protocol used by the
server to communicate with table-based dataloggers and RF modems.
Broadcast – Part of the radio (RF) technique of polling remote radio modem
datalogger sites. A single modem sends a message (broadcast) that all affected
remotes hear and respond to.
C
Call-back – When a datalogger is programmed for Call-back, it will
automatically call the host computer when a specified condition is met. The
computer must be set up to look for such an incoming call.
Call-back ID Number – A three-digit number that is used to identify what
datalogger has called the host computer. (Not available for Table-based
dataloggers.)
Cancel – Choosing Cancel from a dialog box will typically ignore any changes
made and close the box.
Carrier – An electrical signal used to convey data or other information. For
example, radio and phone modems use carrier signals. Phone modems attempt
to detect carrier when the call is placed. The red LED on the RF95T lights
when the modem detects a carrier.
Child Node – See Node. A node that is accessed through another device
(parent node). For example, a remote radio frequency (RF) site is accessed
through, and is a child of, the base RF232T. All nodes are child nodes of the PC.
Client – A software application designed to connect to a server. Usually
provides some type of user interface or data acquisition. Email programs
running on individual PCs are typically client applications that connect to an
email server program running on a computer at an Internet Service Provider to
receive and send email messages.
Coaxial cable – Special type of cable with two conductors (center conductor
and outer shield conductor). Classified by size, impedance, and loss
characteristics. Used to connect MD9 modems and to connect radios to
antennas.
Collection – (see Data Collection)
COM Port – A computer’s serial communications port. Cables and other
interface devices are connected between the computer’s COM port and the
datalogger.
Communication Server – The software (typically packaged as a DLL) that
provides the communications functions within other software such as PC400 or
LoggerNet.
Control Port – Dataloggers have digital output ports that can be used to switch
power to sensors such as the HMP35C relative humidity circuit or to control
relays. These digital outputs are called Control Ports and are labeled C1, C2,
etc., on the wiring panel. Control ports on some dataloggers can also be used as
inputs to sense the digital (high or low) state of a signal, monitor pulse signals,
control Synchronous Devices for Measurement (SDM), or used as data
input/output connections for SDI-12 sensors.
CoraScript – A command line interpreter client to the LoggerNet server that
allows the user access to many of the capabilities of the LoggerNet server
using direct commands or programmed script files.
CR10X-TD Family of Dataloggers – Any of the Edlog dataloggers running a
table-data ("TD") operating system, including the CR10T, CR510-TD, CR10X-TD,
and CR23X-TD.
CRBasic – The programming language used for CR1000X-series, CR6-series,
CR300-series, CR1000, CR3000, CR800-series, CR200-series, GRANITE 6,
GRANITE 9, GRANITE 10, CR5000, CR9000, and CR9000X dataloggers.
Short Cut or the CRBasic Editor are used to create program files for these
dataloggers.
CRBasic Datalogger – A CR1000X-series, CR6-series, CR300-series,
CR1000, CR3000, CR800-series, CR200-series, GRANITE 6, GRANITE 9,
GRANITE 10, CR5000, CR9000, or CR9000X datalogger. Sometimes referred
to as “CRx000 dataloggers.”
D
Data Advise (Datalogger) – A mutual agreement between the communication
server and the datalogger about which tables are to be collected every time the
datalogger is contacted. Based on the datalogger's table definitions.
Data Advise (Server) – An agreement between a client application and the
communication server to provide specified data as it is collected by the server.
Data Advise Notification – The packet of data sent by the datalogger based on
the Data Advise agreement.
Data Cache – The storage for data collected from the datalogger by the
communication server. This data is stored in binary files on the hard disk of the
computer where the server is running.
Data Collection – Getting a copy of the data stored in the datalogger and
saving it in the communication server’s data cache (compare to Data
Retrieval).
Data Point – A data value that is sent to Final Storage as the result of an
Output Instruction. A group of data points output at the same time makes up a
record in a data table.
Data Retrieval – Sending a copy of the data from the communication server’s
data cache to a file, network, or data display (compare to Data Collection).
Data Storage Table, Data Table – A portion of the datalogger’s Final Storage
allocated for a particular output. Each time output for a given data table occurs,
a new record is written to the table. The size of the table (in number of records)
and when records are written to the data table are determined by the
datalogger’s Data Table Instruction (P84). The fields (columns) of the table are
determined by the Output Processing Instructions that follow the Data Table
Instruction.
Data Table Instruction – Instruction 84. Used to create a Data Table and to
cause records to be written to the Data Table.
DaysFull – A field in the status table that shows the number of days before any
of the tables using automatic record allocation are filled.
DevConfig – Short for “Device Configuration Utility”, a software application
that provides a graphical user interface to configure settings in dataloggers and
communications peripherals. Available in PC400, LoggerNet, and as a stand-
alone application from the Campbell Scientific website. (Supplants
CSOS.EXE, PakCom, and stand-alone terminal emulators.)
Differential Analog Input – Some sensors have two signal wires and the
measurement is reflected in the voltage difference between them. This type of
sensor requires two analog connections. The channels marked DIFF on the
datalogger wiring panel are used to connect differential sensors.
DLD File – An ASCII file that can be sent to program an Edlog datalogger.
Dataloggers must be programmed to perform measurements, convert data to
final units, and to save data for retrieval. Edlog is used to create these files that
are saved to disk with a DLD file name extension. A program must be sent to
the datalogger before the datalogger will begin to collect data.
E
Edlog – Campbell Scientific’s software application used to create new or edit
existing datalogger programs. Edlog supports all of the programming
capabilities in the dataloggers it supports. (Program generators such as Short
Cut are necessarily more limited in the features they can support.)
Edlog Datalogger – Any of the dataloggers, 21X, CR7, CR10, CR500,
CR10X, CR510, or CR23X. The default operating system for these dataloggers
is a mixed-array configuration. Some of these, specifically the last three, can
have alternative operating systems installed by users. These include mixed-
array, table-data (TD), or PakBus (PB) operating systems.
EEPROM – Electrically erasable programmable read only memory; the
memory CR10X-TD, CR510-TD, and CR23X-TD dataloggers use to store
their operating system. A new operating system can be transferred to the
datalogger using a special software package (see PROM and DevConfig).
Execution Interval – The periodic interval on which the datalogger program is
run. The execution interval is sometimes referred to as the Scan Interval. For
example, when an execution interval of 60 seconds is set, the datalogger will
execute its program table every 60 seconds. Between executions the datalogger
enters a sleep (quiescent) mode. This conserves battery power and creates
predictable measurement intervals. The execution interval is synchronized with
the datalogger’s real-time clock.
Execution Time – The time required to execute an instruction or group of
instructions. If the total execution time of a Program Table exceeds the table’s
Execution Interval, the Program Table will be executed less frequently than
programmed. Each time this occurs, a Table Overrun occurs. Table Overruns
are considered to be “errors” and are reported in the datalogger status
information table.
Excitation Channel – Sensors utilizing electrical bridge circuits require a
precise electrical voltage to be applied. The excitation channels, marked as E1,
E2, etc., on the datalogger wiring panel, provide this required precision
voltage.
F
Fault – Message relating to network activity where repeated problems or errors
have occurred. Repeated faults usually indicate a failure of some kind.
F1 – In most instances, pressing the F1 key will provide context sensitive help
for the highlighted object on the screen.
Final Storage – Final Storage is an area in the datalogger’s memory where
data is stored for collection to a PC. When you collect data from the datalogger
you are collecting data from a Final Storage area or table.
Flag – Memory locations where the program can store a logical high or low
value. These locations, called User Flags, are typically used to signal a state to
another part of the program.
G
Ground Connection – Most sensors require one or more ground connections
in addition to excitation or signal inputs. Ground connections may serve any of
several purposes:
• a reference for a single-ended (SE) analog voltage (use analog ground if
available)
• a power return path (do NOT use analog ground for power return)
• a connection for cable shield wire to help reduce electrical noise (do not
use analog ground for shield wires, also known as drain wires)
H
Highlight – Text or objects can be highlighted by positioning the cursor where
you want the highlight to begin, holding the left mouse button, and dragging it
across the words or group of objects to be highlighted. A single object can be
highlighted by clicking it once with the left mouse button. Highlighted items
can then be edited or activated.
Holes – When using Data Advise, the communications server always gets the
most recent data records, so if there are more records to be returned than can fit
in one packet there can be sequences of older data available from the
datalogger that have not yet been collected to the data cache. The server tracks
and collects these holes only if that option is enabled. This entry in the status
table shows the number of data points in missed records for the data storage
tables in that station.
Hole Collection – The process used by the server to collect data records
missing from the data cache but possibly still in the datalogger. If Hole
Collection is delayed or disabled, the memory in the datalogger can ring
around and overwrite the missing data records resulting in an Uncollectable
Hole.
Host Computer – The machine where the communication server software is
running.
I
INI Files – Configuration files that are used to preserve the last known setups
or states of a program or device.
Initialization String – A string of alphanumeric characters that are sent to a
device, such as a modem, to prepare that device for communications.
InLocs – Abbreviation for “Input Locations”. This entry in the status table
shows the number of input locations allocated for the program.
Input Location Storage – Each time a measurement or calculation is
performed the resultant value is stored in an Input (memory) Location,
sometimes abbreviated as “InLoc.”
L
Link – Communications route between two devices, for example the phone
link between two phone modems.
LDEP – Logger Data Export Protocol, a protocol and client application that
provides for data distribution from the communications server to a third party
application through a standard TCP/IP socket. Installed with LoggerNet
Admin; see the associated PDF file for more information. Requires record-
specific acknowledgements for record flow control. See LDMP.
LDMP – Logger Data Monitoring Protocol, a protocol and client application
that provides for data distribution from the communications server to a third
party application through a standard TCP/IP socket. Installed with LoggerNet
Admin; see the associated PDF file for more information. Requires very simple
acknowledgements for record flow control. See LDEP.
Log Files – Text files that are stored on the computer’s hard drive that record
activity. They contain information about communications between the
communications server and other devices in the datalogger network. Log files
are typically used for troubleshooting purposes. LoggerNet has four types of
log files: Transaction, Communications Status, Object State, and Low Level
I/O. Refer to Appendix D, Log Files (p. D-1), or the help within the LogTool (in
PC400 click the Tools | LogTool menu item) application for information on
these log files.
M
MD9 – An MD9, or multi-drop modem, is a communications device that uses
twisted pair cable for connection. Typically, the system consists of one MD9
base modem that is attached to the user’s computer, with one or more remote
modems at the datalogger field site. One remote modem is needed for each
datalogger at the field site.
Measurements – Values stored by the datalogger in an Input Location after
reading an electronic signal from a sensor and converting the raw signal into
meaningful units.
Mixed-array – Dataloggers with mixed-array operating systems save output in
a common area of the datalogger’s final storage memory. When data is directed
to final storage, a unique array ID number is stored, followed by other values
as determined by the datalogger program. These are called “elements”.
“Mixed-array dataloggers” typically save all information that is directed to
output storage to the same area of datalogger memory (as opposed to table-
based dataloggers that always store different output processing intervals to
separate tables in datalogger memory). Data retrieved by the PC must be
processed by PC software to separate the data based on the array IDs.
N
Net Description – Description of dataloggers and communications devices that
form the datalogger network. Created using the EZWizard in PC400 or Setup
Screen in LoggerNet to communicate with the various dataloggers.
Node – Part of the description of a datalogger network. Each node represents a
device that the communications server will dial through or communicate with
individually. Nodes are organized as a hierarchy with all nodes accessed by the
same device (parent node) entered as child nodes. A node can be both a parent
and a child node.
O
ObjSrlNo – This entry in the status table provides the revision number of the
datalogger PROM.
Output Interval – The output interval is the interval at which the datalogger
writes data to Final Storage. The output interval is defined by Instruction 84 in
Edlog (for table-based dataloggers) or the instructions that set the output flag
high in mixed-array dataloggers.
Output Processing – Writing to final storage memory a sample or summary
statistic of data measurements. Output processing options include sending a
sample, average, maximum, minimum, total, or wind vector of data to Final
Storage. Each Output Processing data value is kept in a separate location
within the datalogger. This allows multiple output processing for each
measurement. For example, you can average air temperature over a 60-second
interval, a one-hour interval, and a 24-hour interval. See the operator’s manual
or programming software for output processing options available for each
datalogger model.
Overrun Errors – Overrun errors occur when the actual program execution
time exceeds the execution interval. This causes program executions to be
skipped. When an overrun error occurs, the Table Overrun parameter in the
datalogger’s status table is incremented by 1.
Overruns – This entry in the status table provides the number of table
overruns that have occurred. A table overrun occurs when the datalogger has
insufficient time between execution intervals to complete one pass through the
program. This counter is incremented with each table overrun.
P
Packet – a unit of information sent between two BMP or PakBus devices that
are communicating. Each packet can contain data, messages, programming,
etc. Usually contains addressing and routing information.
PakBus – A packet-based and packet-switched networking protocol used by
newer dataloggers. PakBus allows for robust transmission of commands and
data, dynamic routing between PakBus devices, and peer-to-peer
communications (such as when one datalogger needs to control another
datalogger without involving the PC).
Parameter – Number or code which helps to specify exactly what a given
datalogger instruction is to do.
Path – The modems, or other devices that make up a link to communicate with
a remote site datalogger.
Polling – Process where a datalogger or other communications device is
periodically checked for any packets it needs to send. The server polls
dataloggers for most communications links. Some communications devices,
such as RF232T radio bases or repeaters can also poll datalogger sites.
Polling Interval – The user-specified interval that determines when to poll a
given device.
PrgmFree – An entry in the status table that shows the amount of remaining
program memory, in bytes.
PrgmSig – An entry in the status table that shows the signature of the
datalogger program. The signature is a unique number derived from the size
and format of the datalogger program.
PromID – An entry in the status table that shows the version number of the
datalogger PROM or OS.
PromSig – An entry in the status table that shows the signature of the
datalogger PROM or OS. As with the PrgmSig, if this signature changes, the
datalogger instruction set has somehow been changed.
Processing Instructions – Datalogger instructions that further process input
location data values and typically return the result to Input Storage where it can
be accessed for output processing. Arithmetic and transcendental functions are
included in these instructions.
Program Control Instructions – Datalogger instructions that modify the
sequence of execution of other instructions in the datalogger program; also
used to set or clear user flags.
Program Signature – A program signature is a unique value calculated by the
datalogger based on program structure. Record this signature in a daily output
to document when the datalogger program is changed.
Program Table – The area where a datalogger program is stored.
Programming in Edlog dataloggers can be separated into two tables, each
having its own execution interval. A third table is available for programming
subroutines that may be called by instructions in Tables 1 or 2. Programming in
CRBasic dataloggers can be separated into different “scans”. The length of the
program tables or scans is constrained only by the total memory available for
programming.
Q
Quiescent Mode – Often referred to as “sleep mode” – a low power state
between program execution intervals.
R
Real-Time Clock – All dataloggers have an internal clock. The date and time
information from this clock are used in the time stamp for stored data. The
datalogger’s execution interval and timer are synchronized with the clock.
Some Edlog dataloggers (CR10X, CR510, and CR23X) and all CRBasic
dataloggers have battery backups that maintain the clock even when 12V
power is not available.
Record – A group of data values output at the same time to the same data
table. Records are written in response to the Data Table Instruction (84) in TD
dataloggers or the DataTable declaration in CRBasic dataloggers. The
individual fields within each record are determined by the Output Processing
instructions following the instruction that created the data table.
RecNbr – An entry in a table that shows the sequential record number in the
table.
Remote Site – Typically where a datalogger is located at the other end of a
communications link. Also can refer to the site where a radio (RF) repeater is
located.
Repeater – a radio (RF) site that relays packets of information to a remote site.
Used to extend the range of radio transmissions. Most remote datalogger sites
with radios can act as repeaters.
Retries – When a transaction or communication between two devices or
programs fails, the transaction or communication can often be triggered to
repeat until it succeeds.
Retrieval – (see Data Retrieval).
RF – Radio Frequency.
RTDM – Real Time Data Monitor software. A very sophisticated graphical
data display application that gets data from either data files or the
communication server’s data cache. RTDM is a stand-alone application.
RTMC – Real Time Monitoring and Control software. A client application to
the communications server that displays data from the server’s data cache
(only) and updates as new data is collected. RTMC is relatively easy to set up,
and ships with LoggerNet.
S
Scan Interval – See Execution Interval.
SDI-12 – SDI-12 stands for Serial Digital Interface at 1200 baud. It is an
electrical interface standard and communications protocol that was originally
developed by Campbell Scientific and other manufacturers for the U.S.
Geological Survey for hydrologic and environmental sensors. SDI-12 was
designed to be a simple interface (ground, 12 volts, and signal) that improves
compatibility between dataloggers and “smart” microprocessor-based sensors.
Other goals of the SDI-12 standard are:
• low power consumption for battery powered operation via the datalogger
• low system cost
• use of multiple sensors on one cable connected to one datalogger
• allow up to 200 feet of cable between a sensor and a datalogger
Security Code – A code entered into the datalogger either directly with a
keypad or via the datalogger’s program to prevent unauthorized access to
datalogger settings, programs, and data.
Server – Also “communication server”, a software application that accepts
connections from client applications and provides data or other information as
requested. The LoggerNet server manages all the communications and data
collection for a network of dataloggers. The collected data is made available
for client applications. PC400 also uses the communication server but in a
more limited configuration.
Short Cut – A program generator application that ships with PC400 and
LoggerNet and is also available as a stand-alone product from the Campbell
Scientific website. Short Cut does not require knowledge of individual program
instructions. Users need only know what kind of datalogger and sensors they’re
using and decide what output they require. Short Cut generates the program for
them. (Contrast a “program generator” with the full-featured “program
editors”, Edlog and CRBasic Editor.)
Signature – Number calculated to verify both sequence and validity of bytes
within a packet or block of memory.
Single-ended Analog Input – Some analog sensors have only one signal wire.
(They will also have another wire that can be grounded and that is used as the
reference for the signal wire.) With this type of sensor, only one analog
connection is required. Hence, it needs a “single-ended” or SE analog input.
The single ended channels are marked as SE on the datalogger wiring panel.
Station – A datalogger site is often referred to as a station.
Station Number – The LoggerNet server assigns and uses station numbers for
routing packets to the dataloggers. These numbers can be modified using
CoraScript. Not to be confused with datalogger serial numbers or PakBus
addresses.
T
Tab Windows – Some screens depict a series of related windows in a multi-
tabbed notebook format. When you click the file folder tab, the information on
the tab you chose will be displayed.
Tables – An entry in the status table that shows the number of user-created
data tables. (See also Data Table.)
Table-based Dataloggers – Table-based dataloggers store each record of data
that follows an output instruction in a table. Each separate occurrence of an
output instruction directs the datalogger to store the data in a separate table.
“Table-based” includes both “TD” table-data and “PB” PakBus versions of the
Edlog dataloggers as well as the CRBasic dataloggers.
Table Definitions – List of data available from a table-based datalogger. The
datalogger supplies this list on request. The tables are determined by the
datalogger program. The LoggerNet server must have a current version of the
table definitions to collect data from the datalogger.
Time Stamp – The date and time when data are stored in the datalogger.
TMStamp – An entry in the status table that shows the date and time the status
information was recorded.
Transaction – The exchange of data or information between two devices or
programs. For example, setting the clock in a datalogger requires a transaction
between the server and the datalogger.
U
Uncollectable Hole – Occurs when a hole in the data cache cannot be collected
from the datalogger before the data table wraps around and the records are
overwritten.
V
Variable Name – Edlog uses variable names in expressions. Variables are
another name for input location labels. For instance, in the equation TempF =
(TempC*1.8) + 32, TempC is an input location label and TempF is a new
location calculated from TempC. CRBasic dataloggers use variables for all
measurements and calculations.
W
Wiring Panel – The set of terminals and underlying circuits that enable
connections of sensors, control and power supply wiring to the datalogger
itself. Some dataloggers such as the CR23X have built-in wiring panels.
Others, such as the CR10X, have removable wiring panels.
Watchdog – An entry in the status table that shows the number of watchdog
errors that have occurred. The watchdog checks the processor state and resets it
if necessary. If an error occurs, the watchdog error counter is incremented.
Appendix B. Campbell Scientific File Formats
Campbell Scientific, Inc. uses different formats for data in datalogger memory, external PC
cards, datalogger communication software, and PC files. The data formats written to PC
files by LoggerNet are written by default as .DAT files. The following sections will focus on
the format of these PC files, discuss the data formats that exist in the datalogger and on PC
cards, and describe methods for converting binary data formats.
• Data from multiple arrays or intervals can be included in the same file.
• Data are formatted in as little space as possible. Values are printed with all
extraneous formatting such as that used in the printable ASCII format
removed.
108,2002,7,1528,58,.17365
112,2002,7,1528,58,.98481
108,2002,7,1528,59,.19081
112,2002,7,1528,59,.98163
108,2002,7,1529,0,.20791
112,2002,7,1529,0,.97815
• The length of each line of text in the file will not exceed 79 characters.
B.1.3 TOACI1
This file format was originally introduced to support data coming from table-
data dataloggers. This format has the following features:
• The file format type, the station name, and the table name. Note
that, by default, the station name is the name given to the
datalogger in the network map. If the Use Reported Station
Name check box is selected, the station name from the Status
Table will be used.
• The field name for each of the data values. (See TABLE B-1 for
field name suffixes.)
• Each record in the file is assigned a timestamp and record number. The
record number is a logged sequence number that is assigned by the
datalogger.
"TOACI1","gold","one_min"
"TMSTAMP","RECNBR","temp_degf_AVG","meas1","meas2"
"2001-12-30 19:16:00",18002,69.05,3000,1500
"2001-12-30 19:17:00",18003,69.06,3001,1499
"2001-12-30 19:18:00",18004,69.06,3002,1498
B.1.4 TOA5
TOA5 is a text-based file format similar to TOACI1 but with additional
information in the header. This format has the following features:
• The file format type, the station name, the datalogger type, the
serial number, the OS version, the DLD name, the DLD
signature, and the table name. Note that, by default, the station
name is the name given to the datalogger in the network map. If
the Use Reported Station Name check box is selected, the
station name from the Status Table will be used.
• The field name for each of the data values. (See TABLE B-1 for
field name suffixes.)
• Data values are formatted as comma separated text suitable for importing
into spreadsheet or database applications.
"TOA5","CR1000","CR1000","1031","CR1000.Std.00.60","CPU:Test.CR1","4062","Test"
"TIMESTAMP","RECORD","batt_volt_Min","PTemp"
"TS","RN","Volts","C"
"","","Min","Smp"
"2004-11-11 15:03:45",0,13.7,24.92
"2004-11-11 15:04:00",1,13.7,24.95
"2004-11-11 15:04:15",2,13.7,24.98
B.1.5 TOB1
TOB1 files can be generated by LoggerNet when outputting data files to the
PC. This binary file format is typically only used when it is essential to
minimize the file size or when other software requires this format. It has the
following structure:
"TOB1","STATION","CR9000","1000","1.00","CPU:BIG.DLD","25871","VALUES"
"SECONDS","NANOSECONDS","RECORD","Array(1)","Array(2)","Fast","my_string"
"","","RN","mVolts","mVolts","mVolts"
"","","","Smp","Smp","Smp"
"ULONG","ULONG","ULONG","IEEE4","IEEE4","FP2","ASCII(25)"
Header line one describes the file environment with the following eight fields:
• Station name (STATION). (Note that, by default, the station name is the
name given to the datalogger in the network map. If the Use Reported
Station Name check box is selected, the station name from the Status
Table will be used.)
Header line three describes the units associated with each field in the record.
Units are optional and are specified in the datalogger program, if included. If
no units are provided in the program, then an empty string placeholder is left in
this line for that specific field.
Header line five describes the data type for each field and supports the
following values: IEEE4, IEEE8, FP2, ULONG, LONG, SecNano, BOOL, and
ASCII(len).
Each data record following the header is a sequence of binary values. The
length of each value is determined by the data type assigned to it in header line
five and the length of the entire record is the sum of the individual data value
lengths. There are no characters that separate records so the application that
reads the TOB1 file must understand the file header so that the record length
can be calculated.
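As a sketch of that calculation only (Python; the byte sizes below are
assumptions for the listed type names, and SecNano and BOOL are omitted
because their sizes are not given here), the record length is simply the sum of
the per-field sizes taken from header line five:

import re

# Assumed byte sizes for some TOB1 data types named in header line five.
TYPE_SIZES = {"IEEE4": 4, "IEEE8": 8, "FP2": 2, "ULONG": 4, "LONG": 4}

def record_length(field_types):
    # Sum the size of each field; ASCII(len) fields use their declared length.
    total = 0
    for t in field_types:
        match = re.fullmatch(r"ASCII\((\d+)\)", t)
        total += int(match.group(1)) if match else TYPE_SIZES[t]
    return total

# Types from the example header above: 4+4+4+4+4+2+25 = 47 bytes per record.
print(record_length(["ULONG", "ULONG", "ULONG", "IEEE4", "IEEE4", "FP2", "ASCII(25)"]))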
The timestamp and record number for each record are an optional output in a
TOB1 file. If these elements are present, a “SECONDS”, “NANOSECONDS”,
and “RECORD” column will be generated as names in the field list of header
line two.
101,2009,105,1051,27,13.39,24.04,23.99
101,2009,105,1052,28,13.39,24.04,23.98
101,2009,105,1053,29,13.39,24.04,23.98
101,2009,105,1054,30,13.39,24.04,24
101,2009,105,1055,31,13.39,24.04,23.98
101,2009,105,1056,32,13.39,24.04,23.98
B.1.7 CSIXML
CSIXML is an XML (eXtensible Markup Language) based file format
designed to provide the following features:
• Data records can be appended without having to reformat the entire file.
• The file meta-data can be verified for appending data without having to
read the entire file.
• Can handle both interval driven and event driven data without significant
structural complexity.
• XSD (XML Schema) files can be generated readily for a specific table file
using XSL transforms.
Most XML files will begin with a sequence that identifies the file as XML and
can also specify the character encoding of the file (if no character encoding is
specified, the file is assumed to use the UTF-8 unicode character encoding).
The following example shows this sequence as it will appear in CSIXML data
files:
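<?xml version="1.0" encoding="UTF-8"?>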
The following example shows how an element with content may appear:
<v n="pi">3.14159</v>
<v n="emptyString"/>
Because XML reserves special characters for its mark-up language, pre-defined
entities are recognized by all XML parsers. These entities include the
following:
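• &amp; for the ampersand character (&)
• &lt; for the less-than character (<)
• &gt; for the greater-than character (>)
• &apos; for the apostrophe character (')
• &quot; for the quotation mark character (")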
For more details regarding XML documents and their contents, you can visit
the W3C consortium web page at www.w3.org/XML/. In addition, an excellent
tutorial is available at www.w3schools.com/xml/default.asp.
<xsd:schema
xmlns:xsd="http://www.w3.org/2001/XMLSchema">
<xsd:complexType name="csixmlType">
<xsd:sequence>
<xsd:element
name="head"
type="headType"
minOccurs="1"
maxOccurs="1"/>
<xsd:element
name="data"
type="dataType"
minOccurs="1"
maxOccurs="1"/>
</xsd:sequence>
<xsd:attribute name="version" fixed="1.0"/>
</xsd:complexType>
<xsd:complexType name="headType">
<xsd:sequence>
<xsd:element
name="environment"
type="environmentType"
minOccurs="1"
maxOccurs="1"/>
<xsd:element
name="fields"
type="fieldsType"
minOccurs="1"
maxOccurs="1"/>
</xsd:sequence>
</xsd:complexType>
<xsd:complexType name="environmentType">
<xsd:sequence>
<xsd:element
name="station-name"
type="xsd:string"
minOccurs="1"
maxOccurs="1"/>
<xsd:element
name="table-name"
type="xsd:string"
minOccurs="1"
maxOccurs="1"/>
<xsd:element
name="model"
type="xsd:string"
minOccurs="0"
maxOccurs="1"/>
<xsd:element
name="serial-no"
type="xsd:unsignedInt"
minOccurs="0"
maxOccurs="1"/>
<xsd:element
name="os-version"
type="xsd:string"
minOccurs="0"
maxOccurs="1"/>
<xsd:element
name="dld-name"
type="xsd:string"
minOccurs="0"/>
<xsd:element
name="dld-sig"
type="xsd:unsignedShort"
minOccurs="0"
maxOccurs="1"/>
</xsd:sequence>
</xsd:complexType>
<xsd:complexType name="fieldsType">
<xsd:element
name="field"
type="fieldType"
minOccurs="1"
maxOccurs="unbounded"/>
</xsd:complexType>
<xsd:simpleType name="fieldDataType">
<xsd:restriction base="xsd:string">
<xsd:enumeration value="xsd:string"/>
<xsd:enumeration value="xsd:long"/>
<xsd:enumeration value="xsd:unsignedLong"/>
<xsd:enumeration value="xsd:int"/>
<xsd:enumeration value="xsd:unsignedInt"/>
<xsd:enumeration value="xsd:short"/>
<xsd:enumeration value="xsd:unsignedShort"/>
<xsd:enumeration value="xsd:byte"/>
<xsd:enumeration value="xsd:unsignedByte"/>
<xsd:enumeration value="xsd:float"/>
<xsd:enumeration value="xsd:double"/>
<xsd:enumeration value="xsd:boolean"/>
<xsd:enumeration value="xsd:dateTime"/>
</xsd:restriction>
</xsd:simpleType>
<xsd:complexType name="fieldType">
<xsd:attribute
name="name"
use="required"
type="xsd:string"/>
<xsd:attribute
name="type"
use="required"
type="fieldDataType"/>
<xsd:attribute
name="units"
use="optional"
type="xsd:string"/>
<xsd:attribute
name="process"
use="optional"
type="xsd:string"/>
</xsd:complexType>
<xsd:complexType name="dataType">
<xsd:element
name="r"
type="recordType"
minOccurs="0"
maxOccurs="unbounded"/>
</xsd:complexType>
<xsd:complexType name="recordType">
<xsd:attribute
name="no"
type="xsd:unsignedInt"
use="optional"/>
<xsd:attribute
name="time"
type="xsd:dateTime"
use="optional"/>
<xsd:element
minOccurs="1"
maxOccurs="unbounded"
type="valueType">
<xsd:annotation>
<xsd:documentation xml:lang="en">
In order to make value elements easily addressable in
transforms as well as describable in table specific XML
Schema documents, value element names will begin and end
with a unique number so that value elements will be named
using the following sequence: { v1, v2, v3, ... vn }.
</xsd:documentation>
</xsd:annotation>
</xsd:element>
</xsd:complexType>
<xsd:complexType name="valueType">
<xsd:attribute name="n" type="xsd:string"
use="optional"/>
<xsd:simpleContent type="anyType"/>
</xsd:complexType>
</xsd:schema>
station-name – Specifies the name of the station that generated the data.
This element must be present.
model – Specifies the model number of the station. This element may be
omitted if the information is not available.
dld-name – Specifies the file name of the program that is running in the
datalogger. This element may be omitted if the information is not available.
name – This attribute is required and specifies the name of the field. If the field
is part of an array, the name will include the array subscripts as a comma
separated list of integers within parentheses.
type – This required attribute specifies the data type for the field. This data type
is a string that corresponds with a subset of XML Schema data types. The
following values will be used within csixml: xsd:string, xsd:long,
xsd:unsignedLong, xsd:int, xsd:unsignedInt, xsd:short, xsd:unsignedShort,
xsd:byte, xsd:unsignedByte, xsd:float, xsd:double, xsd:boolean, and
xsd:dateTime (the set enumerated by fieldDataType in the schema above).
units – This optional attribute will specify the units string provided by the
datalogger program.
process – This optional attribute specifies the process string given by the
datalogger program based upon the processing instruction used to output data
into final storage.
no – Specifies the record number for this record. These values indicate the
logged order of the data and will generally increment by one with each record
logged. Records can appear out of order, however, if one-way or data advise
data is used in conjunction with hole collection. Missed numbers can signify
missed records (holes).
time – Specifies the time stamp for the record. This format will conform to the
standard XSD timestamp format.
This element will contain as many value sub-elements as there are field
elements in the fields header element.
<v14>2006-08-15T22:47:40</v14>
<v15>82.7</v15>
<v16>2006-08-15T05:00:20</v16>
<v17>10.45</v17>
<v18>2006-08-15T17:21:13</v18>
<v19>10.98</v19>
<v20>2006-08-15T00:28:05</v20>
<v21>127.8</v21>
<v22>0</v22>
<v23>2006-08-15T01:39:45</v23>
<v24>0</v24>
<v25>1108</v25>
<v26>2006-08-15T10:13:37</v26>
<v27>24.76</v27>
<v28>766.1</v28>
<v29>2006-08-15T07:15:00</v29>
<v30>761.7</v30>
<v31>2006-08-15T18:30:00</v31>
<v32>0.254</v32>
</r>
<r no="341" time="2006-08-17T00:00:00">
<v1>12.97</v1>
<v2>2006-08-16T16:09:52</v2>
<v3>31.16</v3>
<v4>2006-08-16T15:49:27</v4>
<v5>10.43</v5>
<v6>2006-08-16T06:13:46</v6>
<v7>52.5</v7>
<v8>2006-08-16T22:20:10</v8>
<v9>32.15</v9>
<v10>2006-08-16T10:19:42</v10>
<v11>30.34</v11>
<v12>2006-08-16T15:37:33</v12>
<v13>10.06</v13>
<v14>2006-08-16T05:25:24</v14>
<v15>84.6</v15>
<v16>2006-08-16T04:26:52</v16>
<v17>13.21</v17>
<v18>2006-08-16T17:28:53</v18>
<v19>11.86</v19>
<v20>2006-08-16T14:34:31</v20>
<v21>217.4</v21>
<v22>0</v22>
<v23>2006-08-16T00:02:48</v23>
<v24>0</v24>
<v25>922</v25>
<v26>2006-08-16T12:47:13</v26>
<v27>26.68</v27>
<v28>764.5</v28>
<v29>2006-08-16T10:00:00</v29>
<v30>761.7</v30>
<v31>2006-08-16T18:45:00</v31>
<v32>0</v32>
</r>
</data>
</csixml>
B.1.8 CSIJSON
CSIJSON is a file format that is relatively easy to parse in any language but
more particularly so in JavaScript since it adopts the same syntax rules that are
used for JavaScript object initialization. Its structure is much like CSIXML. It is
very easy to digest in a JavaScript or ActionScript (Flash) environment and is
probably the most efficient means of handling CSI-generated data in a web
browser context.
The CSIJSON file format is available for some CRBasic instructions including
TableFile and the WebPageBegin/WebPageEnd Format command.
{
  "head": {
    "signature": xxxx,
    "transaction": "xxxxyyyy",
    "environment": {
    },
    "fields": [
    ]
  },
  "data": [ ]
}
This declaration defines an object that contains two sub-objects, head and
data. In a JavaScript program, a string in this format can be parsed using the
eval() function or, preferably, the standard JSON.parse() function. Once parsed,
the data contained therein can be accessed using standard JavaScript notation.
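For illustration only, a similar approach works in other languages. The short
Python sketch below (the file name "one_min.json" is hypothetical) pairs each
value in a record's vals array with the corresponding entry in head.fields:

import json

with open("one_min.json") as f:      # hypothetical CSIJSON file
    doc = json.load(f)

field_names = [field["name"] for field in doc["head"]["fields"]]

for record in doc["data"]:
    # Each element of "vals" lines up, by position, with head.fields.
    values = dict(zip(field_names, record["vals"]))
    print(record.get("time"), record.get("no"), values)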
B.1.8.2.1.1 head.signature
This numeric value is the signature calculated on the table definitions. This
value can be used by the web client to determine whether the table definitions
have changed while that client is monitoring or polling for data. If the web
client is using the DataQuery command in the datalogger web services and
specifies a tablesig value that matches this value, the server will not send the
head.environment or head.fields values.
B.1.8.2.1.2 head.transaction
This optional value specifies a transaction identifier that can be sent by a web
client in the DataQuery parameter. This value can help the client route
responses to the correct object.
units – Specifies the units string for this field as assigned by the
datalogger program.
vals – An array of the data values for this record. Each element in this array
must correspond with the equivalent element in the head.fields array.
{
"name": "temp_degf_TMn",
"type": "xsd:dateTime",
"units": "",
"processing": "TMn"
},
{
"name": "temp_degf_Avg",
"type": "xsd:float",
"units": "DegF",
"processing": "Avg"
},
{
"name": "temp_degf_Max",
"type": "xsd:float",
"units": "DegF",
"processing:" "Max"
},
{
"name": "temp_degf_TMx",
"type": "xsd:dateTime",
"units": "",
"processing": "TMx"
}
]
}
"data": [
{
"no": 43,
"time": "2010-01-20T00:00:00",
"vals": [
69.62625, "2010-01-19T07:53:40", 73.69058,
78.82542,
"2010-01-19T17:41:05"
]
},
{
"no": 44,
"time": "2010-01021T00:00:00",
"vals": [
70.85629, "2010-01-20T08:14:40", 74.24667,
77.28296,
"2010-01-20T17:41:51"
]
},
{
"no": 45,
"time": "2010-01-22T00:00:00",
"vals": [
70.90952, "2010-01-21T07:17:08", 74.41795,
78.02577,
"2010-01-21T17:39:01"
]
}
]
}
• Frame headers in TOB3 are twelve bytes long rather than eight bytes long.
The additional four bytes contain an unsigned integer with the least
significant byte written first to identify the record number for the first
record in the frame.
• In the TOB3 format, the offset field in the major frame footer no longer
represents the number of frames back to the last minor frame. This
information is used in TOB2 to help accelerate searching for data but is
not considered to be necessary in TOB3 because of the presence of the
record number in the frame header.
The TOB2 or TOB3 binary file format has the following structure with each
header line terminated with a carriage return and line feed (CRLF):
Header line one describes the file environment with the following fields:
• Station name.
Header line three describes the names for each field in a table record as
determined by the datalogger program.
Header line four describes the units associated with each field in the record.
Units are optional and are specified in the datalogger program, if they are
included. If no units are provided in the program, then an empty string
placeholder is placed in this line for that specific field.
Header line six defines the data types for each field in the record and supports
the following values: IEEE4, FP2, ULONG, LONG, SecNano, and ASCII(len).
TOB2 frame headers are eight bytes long and hold the timestamp for the first
record in the frame. TOB3 frame headers are twelve bytes long and contain the
same timestamp information but also add a four-byte unsigned integer that
represents the beginning record number for that frame.
The frame data begins immediately following the frame header and consists of
zero or more data records. Each record contains one data point for each of the
field names identified in header line three. The data type and implied size of
these data points are identified by the data types list given by header line six.
The frame footer makes up the last four bytes of the frame.
B.3.3 IEEE4
A standard four-byte floating-point number format used for certain values
within a record. This format consists of a single sign bit, an eight-bit binary
exponent, and a 23-bit mantissa.
B.3.4 IEEE8
A standard eight-byte floating-point number format used for certain values
within a record. This format consists of a single sign bit, an 11-bit exponent,
and a 52-bit mantissa.
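As a small illustration (Python's struct module is used here; the little-endian
byte order shown is an assumption for this sketch and should be confirmed
against the file being read), both formats can be decoded directly:

import struct

# Four bytes as an IEEE4 (single-precision) value and eight bytes as an
# IEEE8 (double-precision) value; "<" selects little-endian byte order.
ieee4_value = struct.unpack("<f", b"\x00\x00\x40\x41")[0]                  # 12.0
ieee8_value = struct.unpack("<d", b"\x00\x00\x00\x00\x00\x00\x28\x40")[0]  # 12.0
print(ieee4_value, ieee8_value)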
B.4.1 Split
Split has the capability of reading TOB1, TOB2, and TOB3 files and
displaying data from those files in ASCII format. The output parameters are
user specified and Split generates a file containing the converted ASCII format
values.
B.4.3 CardConvert
The CardConvert program can convert TOB1, TOB2, and TOB3 binary files to
TOA5, Array Compatible CSV, or CSIXML file format. It can also be used to
convert TOB2 or TOB3 binary files to TOB1 file format.
B.4.5 TOB32.EXE
The TOB32.EXE command line utility is installed by default in the LoggerNet
program directory at C:\Program Files\Campbellsci\Loggernet\tob32.exe. The
output is similar to CardConvert. Command line switches are used to determine
the new file format that will be created. Some of the basic switches available
are listed below:
-h or -? | Help
B.4.6 csidft_convert.exe
The csidft_convert.exe command line utility is installed by default in the
LoggerNet program directory at C:\Program Files
(x86)\Campbellsci\LoggerNet\csidft_convert.exe. It takes as input the name of
a data file in one of Campbell Scientific’s standard formats and will create a
second file in another specified format.
where:
All output formats other than toaci1 have an additional optional parameter:
--format-options = format-options (This is an integer value as described
below for the different output formats. In each case, add the values together
for all desired options and pass the total as the --format-options parameter;
a short arithmetic sketch follows the option lists below.)
TOA5
Include Timestamp 1
Include Record Number 2
Midnight is 2400 4
TOB1
Include Timestamp 1
Include Record Number 2
CSIXML
Include Timestamp 1
Include Record Number 2
Include Field Names in each row 4
Midnight is 2400 8
Custom-CSV
Include Seconds 1
Include Hour/Minutes 2
Include Julian Day 4
Include Year 8
Midnight is 2400 16
Include Array ID 256
Array ID Desired array ID (between 1 and 1023) multiplied by 65,536
No-Header
Include Timestamp 1
Include Record Number 2
Surround strings by quotation marks 4
Midnight is 2400 8
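As a sketch of the arithmetic only (the option values are those listed above; the
Custom-CSV choice and the array ID of 101 are arbitrary examples):

# Custom-CSV: hour/minutes (2) + Julian day (4) + year (8) + array ID 101.
format_options = 2 + 4 + 8 + (101 * 65536)
print(format_options)   # 6619150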
Examples
NOTE If the utility does not reside in the same directory as the data files,
the entire directory paths must be used. Also, note that the utility
will overwrite any existing file with the same name as
output_file_name. Therefore, use caution in specifying the
output_file_name.
NOTE TOB1 requires that a size be specified for each field containing a
string. Therefore, when converting a file containing strings to
TOB1, this utility will estimate the length of each field containing
strings. It will do this by taking the length of the string in the first
record and multiplying it by 10 with a minimum string size of 64.
If a string in the field is too large to fit into this estimated string
size, it will be truncated.
Only without the tabs and carriage return in the middle. One with strings might
look like this.
The acknowledgment records to be sent back to the server for the two records
shown above would be:
Lgr,Sec15,123456
and
PC1,StatMsg,13355
Appendix C. Software Organization
C.1 LoggerNet/Client Architecture
The LoggerNet communication server provides the interface to all of the
dataloggers and the support for the different communications mediums. It runs
in the background and provides an attachment for the clients that provide the
user interface. The server handles all communications with the dataloggers.
The LoggerNet server handles connections from all the user interface screens
simultaneously, allowing many different views and ways to access the data
collected from the dataloggers. In addition to running on the same computer
with LoggerNet, some client applications can be run on other computers
connected to the LoggerNet computer over a local area network (LAN).
C.2.1 Organization
The data cache is set up to emulate the way data is stored in the datalogger.
When a new datalogger station is defined for the network and communication
is established with the station, the server requests the table definitions from
table data dataloggers. For array based dataloggers the array definitions are
contained in a final storage label file that is associated with a datalogger. This
table or array information is used to set up equivalent tables and data arrays for
data storage in the data cache. The size of the areas set up in the data cache is
dependent on the size of final storage in the datalogger.
Datalogger tables that hold only one record, such as the Input Locations table
and the Status table, would have only two records assigned in the data cache.
The storage in the data cache is designed to operate with “ring memory” just
like the datalogger. Records are stored in the data cache area for a table until
that area reaches its maximum number of records; after that, each new data
record replaces the oldest record in the storage table, and so on.
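A conceptual sketch of this ring behavior (Python; not LoggerNet's actual
implementation) is shown below:

from collections import deque

# A table area sized for five records: once full, appending a new record
# silently discards the oldest one, as described above.
table_cache = deque(maxlen=5)
for record_number in range(1, 9):
    table_cache.append(record_number)
print(list(table_cache))   # [4, 5, 6, 7, 8] -- records 1 through 3 were overwritten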
C.2.2 Operation
Normal data collection from the datalogger is done with polling based on the
scheduled collection interval set up by the user. This is the most efficient
means of data collection for networks with rapid direct communications links.
When it is time for a scheduled data collection the server sends a data poll
request to the datalogger to get all of the data stored in the selected tables since
the last poll. The tables to be collected are specified by the user in the Setup
Screen.
As each record is written to the data cache, the server adds a filemark number
to the record as it is stored. This filemark number is used to identify
discontinuities in the data. The filemark number starts out as zero when the
table for the data cache is created or re-initialized. This number is incremented
each time a discontinuity is seen in the data records. Such a discontinuity can
occur when there is a gap in the record numbers because the data table filled
and overwrote the requested data. This also can occur if the record number
rolls over from the maximum to start back at zero or an identical program is
loaded into the datalogger without going through the server.
Data can also be collected from the datalogger using a manual poll operation.
This is achieved by selecting Collect Now from the Connect Screen. When a
manual poll is done the data from the datalogger is saved in the output data file
and is also put into the data cache.
There are a number of things that could cause datalogger table definitions to
change. A new program may have been downloaded to the datalogger, or the
keyboard display may have been used to manually make changes to the
datalogger program.
When a change in table definitions is detected, the server stops data collection
and indicates in the Collection State of Status Monitor that the table definitions
have been changed. Data collection cannot be restarted until either a new
datalogger program is loaded into the datalogger by the server, or updated table
definitions are received from the datalogger. Either of these actions causes the
data in the data cache for that datalogger to be removed and new data cache
tables set up based on the new table definitions for that datalogger. LoggerNet
will save the existing output data file with a modified name and create a new
output data file.
The default installation of the LoggerNet software creates folders and installs
software in two directories: the C:\CampbellSci working directory and
C:\Program Files\CampbellSci\LoggerNet program directory.
The client application directories are used for storing system related files for
each application, and they can also be used for storing user files for that
application (such as the *.SCW program file for Short Cut for Windows). The
client applications are each given their own directories so that, if more than one
Campbell Scientific software product is installed on the system, common
applications can be shared among those products. For instance,
you may have PC400 and LoggerNet installed on the same computer. Both of
these applications include the CRBasic Editor. By sharing directories among
CSI applications, you have only one “instance” of CRBasic running on your
machine, and it will look the same regardless of whether it is started from
LoggerNet or PC400.
Most files necessary for running LoggerNet and its applications are stored in
the C:\Program Files\CampbellSci\LoggerNet subdirectory. Applications, such
as RTMC and SCWIN, have their own subdirectories under C:\Program
Files\CampbellSci.
No other files are saved to these program file subdirectories in order to protect
the integrity of the LoggerNet communication server and the clients.
Appendix D. Log Files
D.1 Event Logging
As LoggerNet performs its work, it will create records of various kinds of
events. The logs can be very useful for troubleshooting problems and
monitoring the operation of the datalogger network. You can monitor these
logs using LogTool launched from the Tools category of the LoggerNet
toolbar. They can also be saved to disk and opened in a text editor. Most users
will not need to understand these logs, but if you request technical assistance, a
Campbell Scientific applications engineer may ask you to send them one or
more of the logs.
Transaction Status (TranX.log) – This log file documents the state of the
various transactions that occur between the LoggerNet server and devices in
the datalogger network. This is the most readable of the logs and contains event
messages that are meaningful to most users. Examples of these events are:
• Data collection
The format and type of records in this log are strictly defined to make it
possible for a software program to parse the log records.
Object State (StateX.log) – This log file documents the state of an object.
This is primarily for troubleshooting by software developers and the messages
are relatively free in form.
Low Level I/O (IOXSerial Port_1.log) – A low level log file is associated
with each root device in the datalogger network to record incoming and
outgoing communications. While the entire network can be monitored from a
single messaging session of the transaction, communications status, or object
state logs, monitoring of the low-level log is performed on a session with the
root device for that log.
In these log file names, “X” is “$” for the currently active file and 0, 1, 2, etc.
for archived files.
The server stores the most recent log records in a file that has a $ character in
the place of the version number. When this file grows to the point that it will
exceed the threshold set by the File Size setting for that log, the server will
rename the log file by replacing the dollar sign with a new version number. At
the same time that the server rolls over to a new log file, the File Count
parameter for that log will also be evaluated. If there are more saved files for
that log than are allowed by the File Count parameter, the server will delete the
oldest of these files until the count is less than or equal to the File Count.
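A simplified sketch of that rollover rule is shown below (Python; the base name
"tran", the directory handling, and the keep count are placeholders for the File
Size and File Count settings, not LoggerNet's internal code):

import os
import re

def roll_over(log_dir, base="tran", keep=5):
    # Find existing archived versions such as tran0.log, tran1.log, ...
    pattern = re.compile(re.escape(base) + r"(\d+)\.log$")
    versions = sorted(int(m.group(1)) for f in os.listdir(log_dir)
                      if (m := pattern.fullmatch(f)))
    next_version = versions[-1] + 1 if versions else 0
    # Rename the active "$" file to the next version number.
    os.rename(os.path.join(log_dir, base + "$.log"),
              os.path.join(log_dir, "%s%d.log" % (base, next_version)))
    # Prune the oldest archives so no more than `keep` files remain.
    for v in versions[: max(0, len(versions) + 1 - keep)]:
        os.remove(os.path.join(log_dir, "%s%d.log" % (base, v)))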
Timestamp – The server time when the record was generated. It will have the
following format:
YYYY-MM-DD HH:MM:SS.mmm
where “YYYY” is the 4-digit year, “MM” is the month number, “DD” is the
day of the month, “HH” is the hour in the day (24 hour format), “MM” is the
minutes into the hour, “SS” is the seconds into the minute, and “mmm” is the
milliseconds into the second.
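For example, a timestamp in this format can be parsed with Python's datetime
module (a sketch; the value shown is made up for illustration):

from datetime import datetime

stamp = "2021-03-15 14:05:09.250"   # example value in the documented format
parsed = datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S.%f")
print(parsed.isoformat(timespec="milliseconds"))   # 2021-03-15T14:05:09.250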
Device Name – The name of the device associated with the message. If the
message is associated with the LoggerNet server, this will be an empty string.
Message Type Code – Identifies the type of event that has occurred. This is a
number that corresponds to the description immediately following. If this log is
being read by a software program, a number is very easy to use for comparison
when looking for specific message types.
Message Type Description – Text that describes the message type code.
The following table is a list of the different messages that can appear in the
transaction log, some of the optional parameters and what the message means.
Where appropriate, a suggested response to the message is provided.
Code 1 – Network device added
   Parameters: Device Name
   Meaning: A new device was added to the network map.

Code 2 – Network branch deleted
   Parameters: Device Name
   Meaning: A branch of the network map was deleted (this may consist of a single device).

Code 3 – Network branch moved
   Parameters: Device Name
   Meaning: A branch of the network map was moved from one parent device to another (not supported in LoggerNet 1.1).

Code 5 – Network logon succeeded
   Parameters: Logon Name
   Meaning: A client application successfully attached to the server.

Code 6 – Network logon failed
   Parameters: Logon Name
   Meaning: A client application failed to attach to the server.
   User Response: If unsuccessful logon messages occur frequently, use a network monitor to determine who is trying to connect. If security is enabled this message will appear for someone trying to connect with the wrong user name or password.

Code 7 – Security session opened
   Meaning: The security configuration utility has attached to the server.
Code 8 – Security database read failed
   Meaning: When the server started up it could not read the security settings file.
   User Response: This is a normal message on server startup if security has not been set up. If security should be set, the file needs to be removed and security re-configured.

Code 9 – Modem default database read failed
   Meaning: When the server started up it could not read the default modem file wmodem.ini.
   User Response: This file should exist in the working directory on the server computer (c:\campbellsci\loggernet\sys\bin). May indicate a permissions or configuration problem on the computer.

Code 10 – Modem custom database read failed
   Meaning: When the server started up it could not read the user customized modem settings file wmodem.cust.
   User Response: If the user has not set up custom modem configurations, this file will not exist.

Code 11 – Clock check started
   Meaning: A clock check has been initiated. This clock check is not sent out to the station until the transaction is sent.

Code 12 – Clock set
   Parameters: Device time before set; Server time
   Meaning: The device clock has been set.

Code 13 – Clock checked
   Parameters: Datalogger time
   Meaning: The datalogger clock has been checked.

Code 14 – Clock check failed
   Parameters: Reason code: 3. Communication failure; 4. Invalid datalogger security clearance; 5. Invalid transaction number specified (already in use); 6. Communications are disabled for this device; 7. The transaction was aborted by client request; 8. The device is busy with another transaction
   Meaning: The clock check/set failed for the reason specified in the reason code.
   User Response: Check the connections of the communication path to the datalogger, make sure the datalogger is connected and has power, check the security setting in the datalogger and in Setup, and check that communications are enabled in Setup for all the devices in the path.
Code 15 – Starting BMP data advise transaction
   Meaning: A start data advise operation has been initiated. Data advise is not in place until the datalogger responds.

Code 16 – Stopping BMP data advise transaction
   Meaning: A stop data advise operation has been initiated.

Code 17 – BMP data advise transaction started
   Meaning: The message from the datalogger confirming the start of data advise has been received.

Code 18 – BMP data advise transaction stopped
   Meaning: The message from the datalogger confirming the suspension of data advise has been received.

Code 19 – BMP data advise transaction failed
   Meaning: The attempt to start or stop a data advise with the datalogger has failed or the operation has timed out waiting for a response.
   User Response: Check communications with the datalogger by trying to check the clock. If that fails, follow the steps for message 14.

Code 20 – Hole detected
   Parameters: Table name; Beginning record number; Ending record number
   Meaning: A hole or missed records has been detected in the data coming from the datalogger.
   User Response: The server will automatically try to collect the data if hole collection is enabled.

Code 21 – Hole collected
   Parameters: Table name; Beginning record number; Ending record number
   Meaning: The missing records specified have been collected from the datalogger.

Code 22 – Hole lost
   Parameters: Table name; Beginning record number; Ending record number
   Meaning: The missing records have been overwritten in the datalogger.

Code 23 – Hole collect start
   Parameters: Table name; Beginning record number; Ending record number
   Meaning: The hole collect request has been started. This message won't go to the datalogger until the BMP1 message is sent (see message 104).
Code 24 – Hole collect response received
   Meaning: The datalogger has returned the response to the hole collect request. This will contain either the data or state that the hole is lost.

Code 25 – Hole collect failed
   Meaning: The hole collection request either timed out or a communication failure occurred.
   User Response: Check communications with the datalogger by trying to check the clock. If that fails, follow the steps for message 14.

Code 26 – Data polling started
   Meaning: Data collection by polling started.

Code 27 – Data polling complete
   Meaning: Data collection by polling completed.

Code 28 – Data polling failed
   Meaning: Data collection by polling failed due to communication failure or a timeout.
   User Response: Check communications with the datalogger by trying to check the clock. If that fails, follow the steps for message 14.

Code 29 – Directed data query start
   Meaning: A user initiated query has been started.

Code 30 – Directed data query continue
   Meaning: The requested data in the directed query could not fit in one block and the next part is being requested.

Code 31 – Directed data query complete
   Meaning: The user requested data has been received by the server.

Code 32 – Directed data query failed
   Meaning: The directed query request failed.

Code 33 – Getting logger table definitions
   Meaning: The server is getting the table definitions from the datalogger.
   User Response: Getting the datalogger table definitions will erase any data in the data cache.

Code 34 – Received logger table definitions
   Meaning: The server has received the datalogger table definitions.

Code 35 – Failed to get logger table definitions
   Meaning: The request to get table definitions has failed.
Code 36 – Logger table definitions have changed
   Meaning: The server has detected a change in the table definitions in the datalogger.
   User Response: A change in table definitions indicates that the datalogger program may have changed. Before updating table definitions, make sure the needed data in the data cache has been saved to a file if desired.

Code 37 – Updating BMP1 network description
   Meaning: The network description in the RF base is being updated to reflect changes in collection schedule or stations to collect.

Code 38 – BMP1 network description update complete
   Meaning: The RF base has acknowledged the network description update.

Code 39 – BMP1 network description update failed
   Meaning: The network description update to the RF base has either timed out or communication has failed.
   User Response: Check the connections from the PC to the RF base.

Code 40 – Datalogger message
   Parameters: Severity (S for Status, W for Warning, F for Fault); Message text
   Meaning: This is a message that has been generated by the datalogger (or in some cases the RF base on behalf of the datalogger).
   User Response: Datalogger warning and fault messages should be investigated using the datalogger operator's manual or by contacting an applications engineer at Campbell Scientific.

Code 41 – Records received
   Parameters: Table name; Beginning record number; Ending record number
   Meaning: Datalogger records have been received and stored in the data cache.

Code 42 – A datalogger transaction has timed out
   Parameters: Time out period in milliseconds
   Meaning: The server has waited longer than the allotted time for the expected response to a transaction.
   User Response: Determine the reason for the timeout. This is usually due to a problem with the communications path between the PC and the datalogger.

Code 43 – Terminal emulation transaction started
   Meaning: A terminal emulation message has been sent to the datalogger.
Code 44 – Terminal emulation transaction complete
   Meaning: A terminal emulation response message has been received from the datalogger.

Code 45 – Terminal emulation transaction failed
   Meaning: The expected terminal emulation response from the datalogger was not received.

Code 46 – Set variable started
   Meaning: The message to set an input location, flag or port has been sent to the datalogger.

Code 47 – Set variable complete
   Meaning: The datalogger has acknowledged the set of an input location, flag or port.

Code 48 – Set variable failed
   Meaning: The datalogger failed to acknowledge the set variable message.

Code 49 – Table resized
   Meaning: The size of the table storage area in the data cache has been changed.
   User Response: If the table is made smaller, the oldest data will be lost.

Code 50 – Program file send start
   Meaning: The server is sending a program to the datalogger. The actual program segments will appear as BMP1 message type 4.

Code 51 – Program file send status
   Meaning: The datalogger has received the program segment.

Code 52 – Program file send complete
   Meaning: The datalogger has compiled the program.

Code 53 – Program file send failed
   Meaning: The datalogger did not acknowledge the receipt of the program, the program did not compile, or communications failed with the datalogger.
   User Response: If the program did not compile, check the error messages. Otherwise, check communications with the datalogger by trying to check the clock. If that fails, follow the steps for message 14.
Code 54 – Program file receive start
   Meaning: The server is requesting the datalogger program. The actual program segments will appear as BMP1 message type 5.

Code 55 – Program file receive status
   Meaning: A program segment has been received.

Code 56 – Program file receive complete
   Meaning: The datalogger program has been received from the datalogger.

Code 57 – Program file receive failed
   Meaning: The datalogger failed to send the program or communications with the datalogger failed.
   User Response: Check communications with the datalogger by trying to check the clock. If that fails, follow the steps for message 14.

Code 58 – Collection schedule: normal
   Meaning: This is an advisory message that the normal data collection schedule is active.

Code 59 – Collection schedule: primary retry
   Meaning: A normal data collection has failed and data collection will be attempted at the primary retry interval.
   User Response: Determine the reason for communication failure. Temporary communication problems may cause the collection state to change between normal and primary.

Code 60 – Collection schedule: secondary retry
   Meaning: The number of primary retries specified has passed and data collection will be attempted at the secondary retry interval.

Code 61 – Collection schedule suspended
   Meaning: The scheduled data collection has been turned off or suspended because communication is disabled or table definitions have changed.
Code 62 – Primary retry collection attempt failed
   Meaning: Data collection on the primary data collection interval failed.
   User Response: Check communications with the datalogger by trying to check the clock. If that fails, follow the steps for message 14.

Code 63 – Secondary retry collection attempt failed
   Meaning: Data collection on the secondary data collection interval failed.
   User Response: Check communications with the datalogger by trying to check the clock. If that fails, follow the steps for message 14.

Code 64 – Device restore from file succeeded
   Meaning: On server startup a device previously entered in the network map has been restored.

Code 65 – Device restore from file failed
   Meaning: On server startup a device in the network map could not be restored.
   User Response: This is an indication that the configuration file has been corrupted. Check the network map and the computer file system.

Code 66 – Device save to file succeeded
   Meaning: The update to the device configuration file was successful.

Code 67 – Device save to file failed
   Meaning: The update to the device configuration file failed.
   User Response: This may be due to a problem with directory permissions or a corrupted directory.
Code 68 – Packet delivery failed
   Parameters: Fault code: 1. Incompatible BMP1 device or malformed packet; 2. Routing failure {unrecognized station number}; 3. Temporarily out of resources; 4. Link failure
   Meaning: This is a message from the RF base indicating that a BMP1 message didn't make it to the datalogger.
   User Response: Codes 1 and 3 are rare. If ever seen, contact an application engineer at Campbell Scientific. Code 2 indicates that the RF base has lost the network map and doesn't know how to route the message. The server automatically resends the network map. Code 4 is an indication that the RF base was not able to communicate with the RF modem attached to the datalogger. These will happen occasionally as part of normal operations. Frequent occurrences indicate that the radio, antenna, connectors and RF link should be reviewed.

Code 69 – Unexpected change in datalogger table definitions
   Meaning: As part of data collection the server has detected a change in the datalogger's table definitions.
   User Response: A change in table definitions indicates that the datalogger program may have changed. This will suspend data collection and warnings will be shown in the Status Monitor. Data collection can only be restored by updating table definitions. Before updating table definitions, make sure the needed data in the data cache has been saved to a file if desired.

Code 70 – A device setting value has changed
   Parameters: Setting Identifier; Client's logon name; New value of the setting
   Meaning: A client has changed one of the device configuration settings.

Code 71 – A LgrNet setting value has changed
   Parameters: Setting Identifier; Client's logon name
   Meaning: A client has changed one of the server configuration settings.
72  Client defined message
    Parameters: Client defined message.
    Meaning: These messages are placed in the transaction log by client applications. The message should indicate which client entered the message.

73  Socket listen failed
    Meaning: Indicates an error in the computer system that prevents the server from listening for client connections on a socket.
    Response: This is a rare error and results from a problem with the computer operating system. If rebooting the computer does not clear the error, contact an application engineer.

74  Device renamed
    Meaning: The name of a device in the network was changed.

75  Logger locked
    Meaning: This message indicates the start of a transaction, such as terminal emulation, that will tie up the datalogger preventing other operations.

76  Logger unlocked
    Meaning: The transaction blocking datalogger access has completed.

77  Null program sent
    Meaning: The server has sent a null program to get an older datalogger (CR7X or 21X) out of keyboard emulation mode.

78  Server started
    Parameters: The server version.
    Meaning: The server has been started.

79  Server shut down
    Meaning: The server is being shut down.
    Response: If a new “server started” message is seen without the shut down message before it, this is an indication that the server or the PC crashed without exiting properly.

80  Collect area initialized
    Parameters: Collect area name.
    Meaning: A data cache collect area has been created.

82  Collect area removed
    Meaning: A data cache collect area has been removed.
83  LgrNet restore failed
    Meaning: On server startup the network description file, csilgrnet.dnd, could not be read.
    Response: The network setup and configuration will have to be restored from a backup or re-entered. Try to determine what corrupted or removed the network description file.

84  Security manager restore failed
    Meaning: On server startup the security manager database could not be restored.
    Response: There is a problem with the computer or operating system. If rebooting the machine does not get it working, get help from someone who can troubleshoot computer problems.

85  Data restore failed
    Meaning: On server startup the data broker data storage area could not be created.
    Response: This is a computer problem. The files are either not present or are corrupted. See notes for message 83.

86  Manual poll transaction started
    Parameters: Client logon name.
    Meaning: The listed client is starting a manual poll operation according to the scheduled collection settings. A manual poll is initiated from the Collect Now button on the Connect Screen.

87  Manual poll transaction complete
    Meaning: The manual poll operation has received the data from the datalogger.

88  Manual poll aborted
    Meaning: The manual poll operation was stopped or failed to complete due to communications failure or a timeout.
    Response: Check communications with the datalogger by trying to check the clock. If that fails, follow the steps for message 14.

89  Selective manual poll begun
    Parameters: Collect area name.
    Meaning: A user specified poll has been started for one of the datalogger collect areas.

90  Selective manual poll complete
    Parameters: Collect area name.
    Meaning: The user specified manual poll has completed.
91  Selective manual poll aborted
    Parameters: Collect area name.
    Meaning: The user specified manual poll failed.
    Response: Check communications with the datalogger by trying to check the clock. If that fails, follow the steps for message 14.

92  Polling started on collect area
    Parameters: Collect area name.
    Meaning: Data has been requested for the specified collect area. This message is always associated with another message indicating whether this is scheduled, manual, or selective manual polling.
    Response: Collect areas can be tables for table mode dataloggers, final storage areas, ports and flags, or input locations.

93  Collect area poll data
    Parameters: Collect area name.
    Meaning: Data has been received from an array based datalogger for the specified collect area.

94  Collect area polling complete
    Parameters: Collect area name.
    Meaning: Data collection for the specified collect area has successfully completed.

95  Collect area polling failed
    Parameters: Collect area name.
    Meaning: Data collection for the specified collect area failed.
    Response: Check communications with the datalogger by trying to check the clock. If that fails, follow the steps for message 14.

96  Scheduled polling begun
    Meaning: Scheduled data collection has started.

97  Scheduled polling succeeded
    Meaning: Scheduled data collection has completed.

98  Scheduled polling failed
    Meaning: Scheduled data collection failed.
    Response: Check communications with the datalogger by trying to check the clock. If that fails, follow the steps for message 14.

99  Collect area first poll
    Meaning: This message is posted either the first time data is collected for a collect area, or when holes were lost for the datalogger.
    Response: If this is not the first poll for the collect area, this message indicates that data that had been stored in the datalogger was lost before it could be collected.
100  Table mount failed
     Parameters: Table name; operating system information regarding the failure.
     Meaning: The server was not able to create a data collection area from the stored table configuration file or new table definitions. This could be the result of trying to create table files that are too large for the computer system.
     Response: Check the computer operating system integrity. Verify that the LoggerNet system configuration files exist and the directory has not been corrupted.

101  Add record failed
     Parameters: Table name; beginning record number; end record number; a reason for the failure.
     Meaning: The server was not able to write data records to the data storage area.
     Response: This indicates a problem writing to files on the computer hard disk. Verify write permissions are set and that there is sufficient space left on the disk.

102  Collect area skipped warning
     Parameters: Collect area name.
     Meaning: The specified collect area was skipped because the associated table has not been initialized by the server yet.
     Response: During system startup this is a normal message. If it occurs at other times contact an application engineer.

103  Collect area skipped error
     Parameters: Collect area name.
     Meaning: The specified collect area was skipped because the server could not initialize the associated table.
     Response: See message 100.
104  BMP1 packet sent
     Parameters: The packet message type code:
       0  Packet Delivery Fault Notification
       1  Status/Warning/Fault Notification
       2  Network Description Transaction
       3  Clock Check/Set Transaction
       4  Program Download Transaction
       5  Program Upload Transaction
       7  Data Advise Command Transaction
       8  Data Advise Notification Packet
       9  Hole Collection Command Transaction
       10 Control Command (Set Variable) Transaction
       11 User I/O Transaction (Terminal Mode)
       12 Memory Image Download Transaction
       13 Memory Image Upload Transaction
       14 Get Table Definitions Transaction
       15 RF Test Transaction
       16 Communication Status Notification
     Meaning: The specified BMP1 packet was sent to the serial communication interface. The number specifies the type of message that was sent.
105  BMP1 packet received
     Parameters: The packet message type code:
       0  Packet Delivery Fault Notification
       1  Status/Warning/Fault Notification
       2  Network Description Transaction
       3  Clock Check/Set Transaction
       4  Program Download Transaction
       5  Program Upload Transaction
       7  Data Advise Command Transaction
       8  Data Advise Notification Packet
       9  Hole Collection Command Transaction
       10 Control Command (Set Variable) Transaction
       11 User I/O Transaction (Terminal Mode)
       12 Memory Image Download Transaction
       13 Memory Image Upload Transaction
       14 Get Table Definitions Transaction
       15 RF Test Transaction
       16 Communication Status Notification
     Meaning: The specified BMP1 packet was received over the serial communications link. The number indicates the type of message received.
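When scanning the transaction log with a script, the packet message type codes listed for messages 104 and 105 can be kept in a small lookup table. The following Python fragment is only an illustration that restates the list above; it is not part of LoggerNet:

# BMP1 packet message type codes, as listed for log messages 104 and 105.
BMP1_PACKET_TYPES = {
    0: "Packet Delivery Fault Notification",
    1: "Status/Warning/Fault Notification",
    2: "Network Description Transaction",
    3: "Clock Check/Set Transaction",
    4: "Program Download Transaction",
    5: "Program Upload Transaction",
    7: "Data Advise Command Transaction",
    8: "Data Advise Notification Packet",
    9: "Hole Collection Command Transaction",
    10: "Control Command (Set Variable) Transaction",
    11: "User I/O Transaction (Terminal Mode)",
    12: "Memory Image Download Transaction",
    13: "Memory Image Upload Transaction",
    14: "Get Table Definitions Transaction",
    15: "RF Test Transaction",
    16: "Communication Status Notification",
}

print(BMP1_PACKET_TYPES[5])  # Program Upload Transaction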
106  Data file output failed
     Meaning: Data collected from a datalogger could not be written to the data output file.
     Response: Check that there is space available on the hard disk and that write permissions allow the server to write the data output files.

107  Max time on-line exceeded
     Parameters: The amount of time the device was connected, in milliseconds.
     Meaning: A client kept the communication link on-line longer than the specified max time on-line.
108  Table reset
     Parameters: The name of the table that was reset; the account name of the logged in client.
     Meaning: The name of a table was changed at the request of a client. On CR5000 and CR9000 loggers this is a reset for the table in the datalogger and on the PC.

109  Collect schedule reset
     Parameters: The account name of the logged in client.
     Meaning: The collection schedule was reset by the indicated client.

110  Collect area setting changed
     Parameters: The name of the collection area; the setting identifier for the setting that was changed; the new value of the setting; the account name of the logged in client.
     Meaning: One of the settings for the specified collect area was changed. The identifiers for the setting can be found in CoraScript help.

111  PakBus route added
     Meaning: A new PakBus route has been added to the routing table.

112  PakBus route lost
     Meaning: A PakBus route has been lost and will be removed from the routing table.

113  PakBus station added
     Meaning: A new PakBus station was added to the network.

114  Call-back begin
     Meaning: A device has called in to the server starting the call-back response.

116  Call-back stopped
     Meaning: A datalogger that called in to the server with call-back is hanging up.

117  Client logged off
     Parameters: The login name of the client; the reason the session was closed.
     Meaning: A client application has closed or lost the connection to the server.

118  Table size reduced during creation
     Parameters: The name of the table that was resized; the original specified size of the table; the new size of the table.
     Meaning: The size of the table in the data cache was reduced because there was not enough computer disk space to create it, or the file would have exceeded the 2 Gbyte size limit.
     Response: Reduce the size of the tables in the datalogger program or get more hard disk storage space for the computer.
119  Security enabled
     Parameters: Account name used to enable security.
     Meaning: Security has been enabled on the LoggerNet server.
     Response: Usernames and passwords will now be required for communication with the LoggerNet server.

120  Security disabled
     Parameters: Account name used to disable security.
     Meaning: Security has been disabled on the LoggerNet server.

121  Security account added
     Parameters: Account name used to add new account; name of the account that was added.
     Meaning: A new security account has been added.

122  Security account changed
     Parameters: Account name used to change account; name of the account that was changed.
     Meaning: A change has been made to the attributes of a security account.

123  Security account deleted
     Parameters: Account name used to delete account; name of the account that was deleted.
     Meaning: A security account has been deleted.

124  Security interface locked
     Parameters: Account name used by the client that started the transaction that locked the interface.
     Meaning: The security interface is locked because an account is currently making changes to the interface.

125  Security interface unlocked
     Parameters: Account name used by the client that started the transaction that unlocked the interface.
     Meaning: The security interface is unlocked because pending changes were applied or canceled.

126  Network lock started
     Parameters: Account name used by the client that started the transaction that locked the network; client that started the transaction that locked the network.
     Meaning: The network is locked because a client is currently making changes to the interface.
     Response: Some functionality will be disabled until the network lock is stopped. To unlock, determine why the client transaction locked the network. For instance, there may be unapplied changes in the Setup Screen. Apply or cancel the changes to unlock the network.

127  Network lock stopped
     Meaning: The network is unlocked because pending changes were applied or canceled.

128  Set value command received
     Parameters: Name of the table specified; name of the field specified.
     Meaning: A device has requested to set a value in one of its tables.
129  Column renamed
     Parameters: Name of the table; original column name; new column name; reason why the column was renamed.
     Meaning: The name of a column has been changed due to an incompatibility with a previous field in the table that had the same name.

130  Last primary retry failed
     Parameters: Number of retries that were made.
     Meaning: The last primary retry attempt failed.
     Response: Check the connections of the communication path to the datalogger, make sure the datalogger is connected and has power, check the security setting in the datalogger and in Setup, and check that communications are enabled in Setup for all the devices in the path.

131  Working directory snapshot
     Parameters: Name of the file that was created.
     Meaning: The server created a backup.

132  Working directory snapshot restored
     Parameters: Name of the file from which the network was restored.
     Meaning: The network was restored from a backup file.

133  File receive started
     Parameters: Name of the file being received.
     Meaning: The server has begun a file retrieval from the datalogger.

134  File receive completed
     Parameters: Name of the file received.
     Meaning: The server completed a file retrieval from the datalogger.

135  File receive failed
     Parameters: Name of the file received; reason for the failure.
     Meaning: The server failed to retrieve a file.
     Response: Check the connections of the communication path to the datalogger, make sure the datalogger is connected and has power, check the security setting in the datalogger and in Setup, and check that communications are enabled in Setup for all the devices in the path.

136  File send started
     Parameters: Name of the file being sent.
     Meaning: The server has begun to send a file to the datalogger.

137  File send completed
     Parameters: Name of the file sent.
     Meaning: The server has completed a file send to the datalogger.
138  File send failed
     Parameters: Name of the file sent; reason for the failure.
     Meaning: The server failed to send a file to the datalogger.
     Response: Check the connections of the communication path to the datalogger, make sure the datalogger is connected and has power, check the security setting in the datalogger and in Setup, and check that communications are enabled in Setup for all the devices in the path.

139  Collect area poll stopped due to table interval
     Meaning: Polling on a collect area was aborted because the table interval has not expired.

140  Device setting override
     Parameters: Setting identifier; name of the user’s account overriding the setting; value of the setting.
     Meaning: One of the device settings has been overridden.

141  Device setting override stopped
     Meaning: The device setting override has been stopped.

142  Collect area setting overridden
     Parameters: Name of the collect area; setting identifier; name of the user overriding the setting; value of the setting.
     Meaning: One of the device collect area settings has been overridden.

143  Device collect area setting override stopped
     Meaning: The device collect area setting override has been stopped.

144  Data file opened
     Parameters: Collect area name; file name.
     Meaning: Collect area data file has been opened by the server.

145  Data file closed
     Parameters: Collect area name; file name.
     Meaning: Collect area data file has been closed by the server.

146  Datalogger query started
     Parameters: Table name; query mode; client logon name.
     Meaning: A datalogger query has been started by a client.

147  Datalogger query temp table created
     Parameters: Table name; temporary table name.
     Meaning: A temporary cache table has been created for a datalogger query.

148  Datalogger query records received
     Parameters: Table name; beginning record number; end record number.
     Meaning: Records have been received from the datalogger for a datalogger query transaction.
149  Datalogger query complete
     Parameters: Table name.
     Meaning: All of the data for a datalogger query transaction has been collected from the datalogger.

150  Datalogger query closed
     Parameters: Table name.
     Meaning: Client has closed a datalogger query transaction.

151  Existing data file renamed
     Parameters: Collect area name; file name; reason for renaming.
     Meaning: Server has renamed an existing data file as a result of attempting to append data in an incompatible format.
     Response: The existing data file will be renamed with a .backup extension. New data will be stored to the specified file name.

153  Program/TDF file associate start
     Parameters: User account name.
     Meaning: Client has begun a program file association transaction.

154  Program/TDF file associate complete
     Meaning: Program file associate transaction has successfully concluded.

155  Program/TDF file associate failed
     Parameters: Reason for the failure.
     Meaning: Program file associate transaction has failed.

156  File control started
     Parameters: File control command; first argument (optional); second argument (optional); user name (optional).
     Meaning: A file control operation has begun with a PakBus datalogger.

157  File control complete
     Parameters: File control command; first argument (optional); second argument (optional); user name (optional).
     Meaning: A file control operation with a PakBus datalogger has successfully completed.

158  File control failed
     Parameters: File control command; first argument (optional); second argument (optional); user name (optional).
     Meaning: A file control operation with a PakBus datalogger has failed.
     Response: Check the connections of the communication path to the datalogger, make sure the datalogger is connected and has power, check the security setting in the datalogger and in Setup, and check that communications are enabled in Setup for all the devices in the path.
Severity – A single character code that indicates the type of message. The
following values are legal:
• “S” (Status) Indicates that the identified operation has successfully
completed.
• “W” (Warning) Indicates that the server has attempted to retry the
operation with the identified device.
• “F” (Fault) Indicates that the identified operation has failed and that the
server has stopped retrying.
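If the transaction log is post-processed with a script, these severity characters map naturally onto a small lookup table. The fragment below is illustrative Python that simply restates the three codes above; it is not part of LoggerNet:

# Severity codes used in the transaction log, as listed above.
SEVERITY = {
    "S": "Status: the identified operation has successfully completed",
    "W": "Warning: the server has attempted to retry the operation",
    "F": "Fault: the operation failed and the server has stopped retrying",
}

print(SEVERITY["W"])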
Each time an RF link is shut down, an entry will be written to the CQR Log. The first line in each
entry is the timestamp and the name of the datalogger being communicated with. The remaining
lines are the RF Link Quality Accumulators (RLQA) for each modem in the link. The RLQA are
representative of the active period of the link. The line for each modem will contain three numbers,
as in the following example:

“10/14/2010 12:10:35”,CR10XTD
0002 0128 0055
0000 0129 0063

The first line is the timestamp and name of the datalogger being communicated with. The next line
is the RLQA for the EOL (End of Link) modem. This is the remote modem connected to the
datalogger. The last line is the RLQA for the SOL (Start of Link) modem. This is the base modem.
(This entry is for a link that contains no repeaters. A link with repeaters would show an additional
line for each repeater between the EOL line and the SOL line.) The 0002 indicates that two
interruptions occurred on the EOL modem while the link was active. All noise level indicators are
within acceptable bounds in this example.
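An entry in the form shown above can be parsed with a few lines of code. The following sketch is illustrative Python, not a Campbell Scientific tool; it assumes only the layout described above (a quoted timestamp and station name, followed by one three-number RLQA line per modem, EOL first and SOL last):

import csv

entry = '''"10/14/2010 12:10:35",CR10XTD
0002 0128 0055
0000 0129 0063'''

def parse_cqr_entry(text):
    lines = text.strip().splitlines()
    # First line: quoted timestamp, then the datalogger name.
    timestamp, station = next(csv.reader([lines[0]]))
    # Remaining lines: three RLQA numbers per modem (EOL first, SOL last).
    modems = [[int(n) for n in line.split()] for line in lines[1:]]
    return timestamp, station, modems

ts, station, modems = parse_cqr_entry(entry)
print(station, "EOL interruptions:", modems[0][0])  # CR10XTD EOL interruptions: 2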
Appendix E. Calibration and Zeroing
E.1 Calibration Essentials
E.1.1 Definition of Calibration
Calibration, in general, refers to actions taken on a measurement system to
increase its accuracy. This is usually done by matching the system’s outputs to
known “control” values in order to increase confidence in the measurement of
future unknowns.
Calibration is periodically necessary when there has been sensor drift or other variation in sensor
outputs. When a calibration instruction is part of the datalogger program, a software Wizard can be
used to change the measurement configuration at run time. Changes to multipliers and offsets can
be made quickly and automatically, without rewriting the CRBasic program or interfering with
measurements to obtain calibration constants manually.
FieldCal
    This is the main calibration instruction. The CRBasic program should contain one FieldCal instruction per measurement requiring calibration. This instruction is placed after the measurement instruction to which it applies.

LoadFieldCal
    (optional) This instruction loads values into program variables from the calibration file (*.cal), if it exists. It will also indicate whether the attempt to load those values was successful or not by returning a Boolean (true/false) result.

SampleFieldCal
    (optional) This is a table output instruction. It writes the latest calibration values for all calibrated measurements to a data table (separate from the *.cal file).

NewFieldCal
    (optional) This is a Boolean system value indicating when a calibration has succeeded. During one scan cycle after a calibration has occurred this value will be true. Its value is then set to false until another calibration occurs. The value of this variable cannot be set within a CRBasic program, but only evaluated. The main purpose for this variable is to be used together with the SampleFieldCal instruction to output one table record per calibration to a specified table.
To store calibration values to a data table (in addition to the values stored in the
*.cal file), use the SampleFieldCal table output instruction with the
NewFieldCal system variable as the trigger.
For more information about how to use these instructions, refer to the FieldCal
instruction topic of your datalogger manual, or use the online help topic for
FieldCal within the CRBasic Editor.
E.3.1 Zeroing
Zeroing is the act of placing a sensor into a state where its output is known to be zero and
adjusting the measurement’s offset variable so that the sensor output reads as zero. The sensor is
measured in this known zero condition, and the offset variable is changed so that the zero
condition results in a measurement value of zero. Note that this process only changes the offset
variable that is shared between the measurement instruction and the FieldCal instruction. The
multiplier is unaffected.

A simple example of zeroing would be removing all items from a scale designed to measure the
mass of objects. With nothing on the scale, this is the condition in which the scale should give a
“zero” reading for its output. The calibration is triggered and the offset is adjusted to ensure the
scale gives a zero reading for that condition.
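The offset adjustment itself is simple arithmetic. The sketch below is a generic Python illustration (not CRBasic and not LoggerNet code) of what zeroing does to a linear measurement of the form reading = multiplier * raw + offset; the numbers in the example are invented:

def zero_offset(current_reading: float, offset: float) -> float:
    """Return the new offset after zeroing.

    For a linear measurement (reading = multiplier * raw + offset),
    subtracting the reading taken in the known-zero condition from the
    current offset forces that condition to read exactly zero.  The
    multiplier is left unchanged, as described above.
    """
    return offset - current_reading

# Example: the empty scale currently reads 0.5 kg with an offset of 1.25.
new_offset = zero_offset(current_reading=0.5, offset=1.25)
print(new_offset)  # 0.75 -> the empty scale now reads 0.0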
To perform an offset calibration, use an argument of 1 (the number one) for the
calibration type in the FieldCal instruction of your CRBasic program. The
Calibration Wizard can be used to calculate and apply the proper offset while
the program is running in the datalogger, or code can be configured within the
CRBasic program to trigger the offset event based on flags or other user-
defined conditions that occur while the program runs.
Most values of the mode variable represent the status of the calibration for the affected
measurement. A few values of the mode variable are set by the user of the datalogger to instruct
the program to proceed with calibrations.
The following values of the mode variable give the status of the calibration:
The following values of the mode variable are used to initiate a calibration
process:
1 Start the calibration, OR start the first point of a two point calibration
4 Start the second point of a two point calibration
By properly changing the known value variables and the mode variables in a
calibrating program, a manual calibration can be performed on a sensor. Steps
for doing this are given below.
1. Ensure the status (value of the mode variable) is 0 or 6 before you start.
2. Place the sensor into the known condition.
3. Indicate the known offset value (if applicable) by changing the “known value” variable to that value.
4. Set the mode variable to 1 to initiate the calibration process.
5. Note that the datalogger automatically sets the mode variable to 2 during
the calibration process.
1. Ensure the status (value of the mode variable) is 0 or 6 before you start.
2. Place the sensor into the first known condition.
3. Indicate the known value of the first point by changing the “known value” variable to that value.
4. Set the mode variable to 1 to initiate the first part of the calibration
process.
5. Note that the datalogger automatically sets the mode variable to 2 during
the first point calibration process.
6. Note that the mode variable is automatically set to 3 when the first point is
completed.
a. The datalogger is waiting for the user to place the system into the second point condition.
7. Place the system into the second point condition.
8. Indicate the known value of the second point by changing the “known value” variable to that value.
9. Set the mode variable to 4 to initiate the second part of the calibration.
10. Note that the datalogger sets the mode variable to 5 during the second
point calibration process.
11. Note that the mode variable is set to 6 by the datalogger when the
calibration process completes successfully.
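The steps above can be pictured as a small state machine driven by the mode variable. The following sketch is illustrative Python only; it is not datalogger code and does not use any LoggerNet or CRBasic API. The datalogger side is simulated, and the numeric mode values mirror the steps listed above:

# Illustrative only: a toy model of the mode-variable handshake described above.
class TwoPointCalibration:
    def __init__(self):
        self.mode = 0          # 0 or 6 means "idle / ready to calibrate"
        self.known_value = 0.0

    # --- user actions ------------------------------------------------------
    def start_first_point(self, known):
        assert self.mode in (0, 6), "status must be 0 or 6 before starting"
        self.known_value = known   # step 3: enter the first known value
        self.mode = 1              # step 4: mode 1 starts the first point
        self._logger_measures_first_point()

    def start_second_point(self, known):
        assert self.mode == 3, "datalogger must be waiting for point two"
        self.known_value = known   # step 8: enter the second known value
        self.mode = 4              # step 9: mode 4 starts the second point
        self._logger_measures_second_point()

    # --- simulated datalogger behaviour --------------------------------------
    def _logger_measures_first_point(self):
        self.mode = 2              # step 5: 2 while measuring point one
        self.mode = 3              # step 6: 3 when point one is complete

    def _logger_measures_second_point(self):
        self.mode = 5              # step 10: 5 while measuring point two
        self.mode = 6              # step 11: 6 when calibration succeeds


cal = TwoPointCalibration()
cal.start_first_point(known=0.0)
cal.start_second_point(known=100.0)
print(cal.mode)  # 6 -> calibration complete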
The Introduction screen for the Wizard will appear. Review the instructions
and press Next.
Now select the kind of calibration you wish to perform, which in this case is
Multiplier and Offset, and press Next.
Now select the sensor you wish to calibrate and press Next. You can select an entire array, or any
single element of that array, as well as scalar (single-valued) variables. Any items that have been
aliased (i.e., given an alternate name using the Alias instruction in the CRBasic program) will be
shown by their alias names, including aliased elements of an array.
The currently measured value for the sensor will be displayed in the next
screen. Now place the sensor into the first known condition, and enter that
known value into the First calibrated value box. Press Set First Value. Wait
for the calibration process to measure the first value. The word Calibrating will
be visible in the Current Value box until that process is complete. Now place
the sensor into the second known condition, and then enter the corresponding
known value into the Second calibrated value box. Press Set Second Value.
The calibration process measures the second point value. At that point the
datalogger calculates the new multiplier and offset and applies them within the
running program. These values are also written to the calibration file.
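For reference, the calculation performed at this point is ordinary two-point linear calibration: the multiplier is the slope between the two (measured, known) pairs and the offset makes the line pass through the first point. The sketch below is a generic Python illustration with invented example readings, not LoggerNet’s internal code:

def two_point_calibration(raw1, known1, raw2, known2):
    """Return (multiplier, offset) so that known = multiplier * raw + offset
    passes exactly through both calibration points."""
    multiplier = (known2 - known1) / (raw2 - raw1)
    offset = known1 - multiplier * raw1
    return multiplier, offset

# Example: the sensor read 0.021 and 2.503 (raw) at known values 0 and 250.
mult, off = two_point_calibration(0.021, 0.0, 2.503, 250.0)
print(round(mult, 3), round(off, 3))  # 100.725 -2.115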
After the multiplier and offset have been calculated and set, the ending screen
of the Wizard appears. You can conclude the calibration, or return to the
starting point to perform more calibrations of the same or different sensors.
NOTE The steps for performing a two-point slope only (multiplier only)
calibration in the Wizard are nearly identical to those shown above
for a two-point multiplier and offset calibration.
Now you can monitor the reading on the sensor to be calibrated. Set the sensor
to the zero condition, and press Calibrate.
The Current Value box will be yellow during the calibration process. When it
finishes, you will see the new value of the sensor after the application of the
zeroing offset. Press Finish to end the calibration.
Now you can view the current reading on the sensor to be calibrated. Set the
sensor to the known value (“calibrate to” value). Enter that value into the
Enter Calibrated Value box. Press Calibrate. The current value will show
Calibrating until the process is complete. You will then have the opportunity
to press Finish, or press Previous to return and calibrate more sensors.
Appendix F. Importing Files into Excel
Data files saved by LoggerNet can be imported into a spreadsheet program for
analysis or manipulation. Instructions are given below for importing a comma
separated file into Microsoft Excel.
From the Excel menu, select File | Open. Browse for the *.dat file that you
want to import. Excel will recognize the file as not being in an xls format, and
will invoke the Text Import Wizard. The Text Import Wizard consists of three
steps, each having its own window.
Step 1 of 3
Select the Delimited option from the Original Data Type group box. Using the
arrow buttons to the right of the Start Import at Row field, select the number of
the first row of data to be imported. Select the Next button.
Step 2 of 3
From the Delimiters group box, select Comma and Space. The Comma option
directs Excel to place each data value, which is separated by a comma, into a
separate column. The Space option will separate the Date and the Time into
two columns.
From the Text Qualifiers list box, select None. Select the Next button.
Step 3 of 3
A quick look at the columns of data is provided in the Data Preview group box.
Once the data file has been imported into Excel, the time fields are still
displayed as comma separated numbers such as Year, Day of Year, and
Hours/Minutes in HHMM format.
Split can take array-based data files and convert the year, day of year, and
hours/minutes fields into a standard timestamp format that Excel will read
directly. See Section 8.1.3.1, Input Files (p. 8-7).
You can also enter formulas as described below to convert the timestamp fields
in the array data to the decimal format used by Excel. Microsoft’s database
(MS Access) and spreadsheet (MS Excel) programs store dates and times as
real numbers, where the integer portion of the number represents the number of
days since some base date (usually January 1, 1900), and the fractional portion
represents the time of day. For example, June 1, 2000 at 10:00 a.m. would be
stored as “36678.41667.”
This formula will take the comma separated date and time fields and convert
them to the decimal date used by Excel. The variables shown in brackets [ ]
should be replaced by the cell location for that data. If you don’t have all of the
date or time elements in your data, you can replace that part of the equation
with a number (e.g. [Year] = 2002), or for Hours, Minutes, and Seconds leave
that part of the formula out.
([Year]-1900)*365 + 1 + Int(([Year]-1901)/4) + [Day] + Int([HHMM]/100)/24 + ([HHMM]/100 - Int([HHMM]/100))*100/60/24 + [Sec]/60/60/24
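The same arithmetic can be written as a short function for checking or batch conversion. The Python sketch below is not part of LoggerNet; it simply reproduces the formula above and confirms it against the June 1, 2000 10:00 a.m. example (day of year 153, HHMM = 1000) quoted earlier:

def excel_serial_date(year, day_of_year, hhmm, seconds=0):
    """Convert array-based Year / Day-of-Year / HHMM / Seconds fields to the
    Excel serial date number, using the same arithmetic as the formula above."""
    whole_days = (year - 1900) * 365 + 1 + (year - 1901) // 4 + day_of_year
    hours = hhmm // 100
    minutes = (hhmm / 100 - hours) * 100        # same trick as the formula
    return whole_days + hours / 24 + minutes / 60 / 24 + seconds / 60 / 60 / 24

print(round(excel_serial_date(2000, 153, 1000), 5))  # 36678.41667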
Once you have entered the formula for one cell you can apply it to multiple
cells using Excel’s Fill function. Selecting the cells and using Format Cells can
set the display format of the timestamp.
Step 2 of 3
From the Delimiters group box, select Comma and Space. The Comma option
directs Excel to place each data value, which is separated by a comma, into a
separate column. The Space option will separate the Date and the Time into
two columns.
From the Text Qualifiers list box, select None. Select the Next button.
Step 3 of 3
A quick look at the columns of data is provided in the Data Preview group box.
Highlight the column with the year/month/day and from the Column Data
Format group box, select the Date option. From the drop down list box to the
right of this option select the YMD format.
As imported, the Date and Time fields have a quotation mark in the field.
The quotation marks can be removed by using Excel’s Search and Replace
feature. From the Excel menu, select Edit | Replace. In the Find What field,
type in a quotation mark (“). Leave the Replace With field blank, and select the
Replace All button.
If headers have been imported with the data, the column headings will be off
by one since the date and time have been imported as two separate fields. The
headers can be highlighted and moved one cell to the right to correct this.
Appendix G. View Pro
View Pro is a program that can be used to open data files (*.DAT) or other CSI file types (*.DLD,
*.CSI, *.PTI, *.FSL, *.LOG, *.CRX, etc.). It can also be used to view data from a LoggerNet database
table.
Use the File | Open menu option to view a data file. (TOACI1, TOA5, TOB1, TOB2, TOB3, printable
ASCII, comma separated ASCII, and array-based datalogger binary data files can be opened with
View Pro.) Use the File | View LoggerNet Database Table menu option to view a database table.
This opens a text view of your data. Data values from the text view can then be chosen to display
graphically on a line graph, Histogram, XY Plot, Rainflow Histogram, or FFT as appropriate for the
data type. Multiple instances of each of these graphical displays can be opened. Both numeric data
and graphs can be sent to a printer. Graphs can be saved to disk in a choice of formats.
NOTE: View Pro cannot display FFTs or Histograms from a TOACI1 data file.
Data values from array-based data files cannot be graphed unless an *.FSL file has been
associated with the data file or the View | Array Definitions (p. 9) dialog box has been used to
generate timestamps for the data file.
Use the File | Open As Text menu option to view other CSI file types.
View Pro options are accessed by using the menus or by selecting the toolbar icons. If you move
and hold the mouse over a toolbar icon for a few seconds, a brief description of that icon's function
will appear.
NOTE: If the Keep Data on Top option has been selected from the View menu, you will need to
disable it before pressing the Bring the selected graph to the front button in order to bring the
selected graph to the front.
You will only be able to make selections that correspond to the currently-selected graph. For
example, if the currently-selected graph is a Histogram, you will only be able to select histogram
records in a data file.
In order to begin a new graph, set the Selected Graph field to None. All highlighting in the data
file(s) will be cleared, and you can begin making selections for a new graph. Once you have selected
your data, press the button for the type of graph you wish to create.
With the Selected Graph field set to None, you can add selections to any graph by making the
selections, right-clicking, choosing Add Selections to Graph, and then selecting the desired graph.
NOTE: The Time Span control will not allow a time span less than the interval existing between
the initial record and the next record. For example, for a data file with a 5-minute data interval,
the Time Span cannot be less than 5 minutes.
If the Ending Record is less than or equal to the Initial Record when Apply is pressed, the Ending
Record will automatically be set to one greater than the Initial Record.
Set the color to be used for the selection by pressing the … button and choosing a color.
Format Code
Indicate how you would like the date and time to be formatted, according to the following codes:
Format specifiers may be written in upper case as well as in lower case letters; both produce the
same result.
Decimal Places
Specify the number of digits that will be shown after a decimal place.
Leading Zeroes
Specify the minimum number of digits that will be shown before a decimal place. Leading zeroes
will be added, if necessary.
Scientific Notation
Select the check box to display the data values in scientific notation.
To open a data file, press the icon or select File | Open from the menu. TOACI1, TOA5, TOB1,
TOB2, TOB3, CSIXML, printable ASCII, comma separated ASCII, and array-based datalogger binary
data files can be viewed.
Files with a particular extension can be configured in Windows to be opened by View Pro
automatically when double-clicked in Windows Explorer. Refer to Assigning Data Files to View Pro
for more information.
NOTE: If you are viewing a CSIXML file and it was collected using the Midnight is 2400 option,
midnight will not be displayed as 2400 in View Pro.
Array-based data files do not contain timestamps. If an FSL file is associated with the data file, View
Pro will try to extract timestamps from the appropriate columns. You can select View | Array
Definitions (p. 9) to specify how the timestamps are created. Note that if no timestamps are used,
data cannot be graphed.
When a data file is opened, the columns are autosized to fit the data. Column sizes can be changed
by dragging a column divider bar to the desired location. If column sizes have been changed,
choosing this menu item will return them to their default sizes.
When this option is selected, all new graphs will be started in a maximized state.
This is a toggle menu item. There will be a check mark next to the item, when it is active. Deactivate
it by selecting it again.
Specifies the FSL file to be used and how the timestamps will be created for an array from an array-
based datalogger data file.
NOTE: If a data file is opened that contains multiple arrays, each array will be opened in a
separate window. The array definitions for each array are set individually.
Data values from array-based data files cannot be graphed unless an *.FSL file has been
associated with the data file or the Array Definitions dialog box has been used to generate
timestamps for the data file.
FSL File
Select the FSL (Final Storage Label) file that will be used to provide column headings. The *.FSL file
is created when a datalogger program is compiled in Edlog or ShortCut.
No Timestamp
No timestamps will be used. Note that arrays without timestamps cannot be graphed.
Extract Timestamp using RealTime fields
Keep Data on Top
When data is being graphed and this option is selected, the Data Display will always be at the
forefront of the View Pro program. This is a toggle item. There will be a check mark to the left of
the menu item when it is turned on. Note that if this menu item is enabled, the Keep Graph on Top
item will automatically be disabled.
Keep Graph on Top
When data is being graphed and this option is selected, the currently selected graph will always be
at the forefront of the View Pro program. This is a toggle item. There will be a check mark to the
left of the menu item when it is turned on. Note that if this menu item is enabled, the Keep Data on
Top item will automatically be disabled.
This option allows you to change the background color for the currently selected data file.
View | Font
This option allows you to change the font used on the Data Panel display when viewing data. The
selected font is also used when a file is printed.
The following characteristics can be set:
• Font Type (Courier, Courier New, Fixedsys, Lucida Console, MS LineDraw, Terminal, etc.)
• Font Style (regular, italic, bold, bold italic)
Window | Cascade
This menu option is used for array-based data files where the entire data file and each individual
array are opened in separate windows. It rearranges all open, non-minimized data file windows so
that the title bar of each window is visible. Windows cascade down and to the right starting from
the upper left corner. This function can also be accomplished by pressing the button on the
main toolbar.
File
The name of the file to be imported. Press the File button to bring up a browser to select the
desired file.
Header Line Count
Import
After all of the settings have been specified, press the Import button to import the CSV file into
View Pro.
NOTE: To use a PostgreSQL or Oracle database, ODBC drivers are required. See Installing
PostgreSQL ODBC Drivers (p. 20) or Installing Oracle ODBC Drivers (p. 28).
The information to enter changes depending on the database type as described below:
SQL Server Compact is an embedded database that just requires the selection of a filename. Press
the Browse button to the right of the Data Source field to browse to the desired database.
To configure a connection to SQL Server you must select a SQL Server instance. The list of
published SQL Server instances is shown in the Data Source combo box. You can also type into the
Data Source combo box, because the desired server might not be published. Windows
Authentication or SQL Server Authentication can be selected. Windows Authentication does not
require a username and password, but rather uses Windows user accounts to authenticate valid
users. SQL Server Authentication requires a login ID and Password and is independent of Windows
user accounts. You can select the <default> database or select a specific database from the
Database combo box.
The MySQL connection is an ODBC connection. You must use the Windows ODBC Data Source
Administrator to configure the database connection. Currently only system data sources are
supported and show in the Data Source combo box. The Login ID and Password may be optional.
They will be set to blank in the connection string. It has been found that when set to blank, the
login id and password configured in the system data source are used. You can select the <default>
database (default as configured in the data source) or select a different database.
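For readers unfamiliar with ODBC data sources, the fragment below shows what a DSN-based connection looks like from a generic program, using the pyodbc library. It is only an illustration: the DSN name and credentials are placeholders, and View Pro does not use Python; it builds a similar DSN-style connection string internally, as described above.

import pyodbc  # generic ODBC access, used here only to illustrate the DSN form

# "LoggerNetData" is a placeholder system DSN configured in the Windows ODBC
# Data Source Administrator; UID/PWD may be left blank so that the credentials
# stored in the data source are used, as described above.
conn = pyodbc.connect("DSN=LoggerNetData;UID=;PWD=;")
print(conn.getinfo(pyodbc.SQL_DBMS_NAME))
conn.close()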
PostgreSQL (ODBC)
NOTE: To use a PostgreSQL database, ODBC drivers are required. See Installing PostgreSQL
ODBC Drivers (p. 20).
Oracle (ODBC)
NOTE: To use an Oracle database, ODBC drivers are required. See Installing Oracle ODBC
Drivers (p. 28).
NOTE: View Pro is different from the LNDB Manager (and LNDB in general), which can connect
to the PostgreSQL database without an ODBC driver.
The official ODBC driver is called psqlODBC. The home page for the driver is
https://odbc.postgresql.org. It is very important to install the 32-bit version of the ODBC driver. (It
shouldn’t cause any trouble if the 64-bit driver is installed in addition to the 32-bit driver, but only
the 32-bit version will work with View Pro.)
From the home page:
Your PostgreSQL ODBC data source setup is now complete. See Selecting a Database (p. 14) and
Selecting a Table (p. 19) for how to use the data source in View Pro.
For testing or troubleshooting, a DSN can be configured in the System DSN tab, but this is not
required.
See Selecting a Database (p. 14) and Selecting a Table (p. 19) for information on setting up the
database in View Pro.
NOTE: The highlighted selections in the data files will always indicate the values being graphed
in the currently selected graph.
Edit
Brings up a dialog box to set properties for the selected trace. (A trace is selected by clicking its
name in the list above the Edit and Delete buttons.) This dialog box can be used to set properties
for Display (name, color, line width, line style, and symbol style), Y Axis (scaling, limits, and title),
and Marks.
Delete
Deletes the selected trace from the graph. (A trace is selected by clicking its name in the list above
the Edit and Delete buttons.)
Graph Width
The Graph Width can be set either as a function of time or number of records.
NOTE: If the Graph Width Time or Records field is changed, when the Apply button is pressed,
the other field will automatically be changed correspondingly.
Time
The amount of time that will be shown on the graph in days, hours, minutes, seconds and
milliseconds. Each element (days, hours, minutes, seconds, and milliseconds) can be highlighted
separately. Once an element is highlighted, type in the desired value or use the arrow keys to
increase or decrease. After entering the desired value, press the Apply button to make the change
take effect.
Records
The number of records that will be shown on the graph. Highlight the field and then type in the
desired value or click on the arrow to pick a value from the drop-down list. After entering the
desired value, press the Apply button to make the change take effect.
Options
Brings up the Options dialog box for the graph. This dialog box can be used to set options for chart
colors, margins, title, legend, points, and plotting options (line only, points and line, points only).
Clear
Press this button to clear all traces and data contained in the graph.
Zoom Feature
You can zoom in on a particular area of a Line Graph by holding the left mouse button and
dragging the mouse pointer from top-left to bottom-right (or bottom-left to top-right) over the
area to be zoomed. Pressing the Undo Zoom toolbar icon or dragging the mouse pointer from
bottom-right to top-left (or top-right to bottom-left) will undo the zoom.
Jumps the graph to the position showing a full screen of data ending with the
last record on the right-most part of the graph.
Jumps the graph to the position showing a full screen of data beginning with
the first record on the left-most part of the graph.
Common Y-Axes
When using common y-axes, the same scale will apply to all traces assigned to an axis. Traces can
be assigned to an axis by selecting the trace, right-clicking, and then choosing Assign to Left Axis or
Assign to Right Axis. (A trace is selected by clicking on its name in the list above the Edit and Delete
buttons.) The scale for the common left y-axis will spread from the lowest value of any trace
assigned to the left y-axis to the highest value of any trace assigned to that axis. The scale for the
common right y-axis will spread from the lowest value of any trace assigned to the right y-axis to
the highest value of any trace assigned to that axis. By default, all traces are assigned to the left y-
axis and the scale will spread from the lowest value of any trace being graphed to the highest value
of any trace being graphed.
The graph legend will show an L or R next to the name of each trace to indicate whether that trace
is assigned to the left or right y-axis.
The default y-axis title will be the name of the data file from which data values are being graphed.
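As a concrete illustration of the scaling rule described above, a common axis range is simply the lowest low and the highest high across all traces assigned to that axis. The Python fragment below uses invented trace values and is not View Pro code:

# Illustrative only: how a common y-axis range spreads from the lowest value
# of any assigned trace to the highest value of any assigned trace.
def common_axis_range(traces):
    """traces: iterable of sequences of data values assigned to one axis."""
    lows = [min(t) for t in traces]
    highs = [max(t) for t in traces]
    return min(lows), max(highs)

left_axis_traces = [[1.2, 4.8, 3.1], [-0.5, 2.0]]
print(common_axis_range(left_axis_traces))  # (-0.5, 4.8)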
Independent Y-Axes
When using independent y-axes, the scale shown on each y-axis will apply only to the last selected
trace that is assigned to that axis. (Traces can be assigned to an axis by selecting the trace, right-
clicking, and then choosing Assign to Left Axis or Assign to Right Axis.) The scale will only show the
value spread for the specified trace, but all traces will maintain their relative positions on the graph.
The y-axis title will be the name of the last selected trace.
Chart Color
Determines the color for the back wall of the graph. Select Solid and press the … button to select a
solid color. Select Gradient to give the graph’s back wall a gradient. Press the Edit Gradient button
to display a Gradient Editor that can be used to select a default or customized gradient.
Titles
Show Graph Title
Determines whether a title is shown for the graph. Select the check box to show a title. Type in the
desired title and press the Font button to choose the font, style, size, effects, and color of the title.
Margins
Sets the left, top, bottom, and right margins of the graph as percentages of the available space.
Legend
Show Legend
Check this box to display a legend for the chart.
Position
Determines where the legend appears on the chart. Choose Right or Top.
X Axis
Use record number on X Axis
Select this check box to display record numbers rather than timestamps on the X Axis. This may be
useful when displaying data with gaps in the timestamps. With this option selected, you will see
continuous data without the gaps. It may also be useful when displaying data with duplicate
timestamps.
Plotting Options
Determines how the data points will be drawn. Specify Line Only to simply draw a line between the
data points; Points and Line to draw a line between the data points and draw a symbol at each
point; or Points Only to draw a symbol at each data point without connecting the points with a line.
Note that when switching between Line Only and either of the other two options, the graph will
temporarily be cleared and no traces will be visible until the OK button is pressed. This is because a
different component is used for these options.
Set as Defaults
Sets these settings as the defaults for a line graph. This applies to all of the settings on this page.
For example, if you change the Chart Color to Solid and the color to white and click Set as Defaults,
every new line graph will be created with these settings.
Trace
Line Width
Sets the width of the line that is drawn between each point on the trace. The value must be
between 1 and 10.
Line Style
Determines the style of line for the trace. Choose Solid Line, Dash, Dot, Dash Dot, or Dash Dot Dot.
Symbol Style
Sets the style of the symbol that is used to represent each data point for this trace. The available
styles are Rectangle, Circle, Triangle, Down Triangle, Cross, Diagonal Cross, Star, Diamond, Small
Dot, Nothing, Left Triangle, and Right Triangle.
Note that the symbols will not be shown if the Plotting Options for the graph is set to Line Only.
Custom Limits
Max Value
The maximum value on the Y Axis scale. This field is disabled when the Scaling Option is set to
Automatic Scaling.
Min Value
The minimum value on the Y Axis scale. This field is disabled when the Scaling Option is set to
Automatic Scaling.
Axis Title
Determines whether a title is shown for the Y Axis. Select the check box to show a title. Type in the
desired title and press the Font button to choose the font, style, size, effects, and color of the title.
Round Frame
Sets the shape of the frame around each mark. Select the check box to make the frame round.
Clear the check box to make the frame square.
Draw Every
Determines whether marks are shown for all or only some data points. The default value is one,
which will show marks for every data point. Entering a value of n will cause marks to be shown only
for every nth data point.
Color
Press the button to open a dialog box from which to choose a color for the frames around the
marks.
Style
Determines the data contained in each mark. Choose Y-Value, X-Value, or X and Y Value.
G.5.2 Histogram
From the Histogram screen, you can view histogram data. The Histogram button on the toolbar will
be enabled if there is at least one valid histogram in the currently selected data file.
This dialog box can also be opened from a button on the Histogram toolbar. This allows you
to change the options for the histogram record that is selected in the list on the left side of the
Histogram screen.
Additional histogram records can be added by pressing the New button. (These additional records
can be from either the same histogram or a different histogram in your data file.) You can then
choose which histogram record is being displayed by selecting it in the list.
From the Data Grid
You can also select histogram records directly from a data file to be displayed on a Histogram
screen. Clicking on any data value in a histogram record will select that histogram record.
Histogram records can be selected before the Histogram screen is opened with the Selected Graph
(p. 3) set to None. When the Histogram screen is opened, all selected histogram records will be
listed on the left side of the Histogram screen. A histogram record can then be displayed by clicking
on it in the list. Once the Histogram screen is opened, additional histogram records can be added
to the Histogram screen by selecting them in the data file as described above.
NOTE: All histogram records from the same histogram will have the same default name in the
list. They can be distinguished by the colored boxes next to their names. Each box is the same
color with which that histogram record is highlighted in the data file. It is also the color with
which that histogram record is displayed if the “Use Selection Color” option is chosen in the
Selection Properties dialog box. The color associated with a histogram record can also be
New
Brings up the Histogram Setup dialog box to allow you to add a new histogram record to the
display.
Delete
Deletes the selected histogram record from the Histogram.
Edit
Brings up a dialog box to set properties for the selected histogram record. This dialog box can be
used to set properties for Display (name, color, marks), Y Axis (scaling, limits, and title), and X Axis
(scaling, limits, title).
Options
Determines the graph type. Select Area, Histogram, Line, or Bar from the drop-down list.
Record
Indicates which record of the histogram is being viewed. The arrow buttons can be used to scroll
through records of the histogram.
3D View
Determines whether the histogram is viewed in 2D or 3D mode. Select the checkbox to view the
histogram in 3D. Clear the checkbox to view the histogram in 2D.
Number of Plots
This field is only enabled for 3D View. Sets the number of plots (histogram records) to be viewed.
X-Axis Mode
Determines how the labels on the X-Axis are displayed. Select Show Ranges to have ranges of data
values shown on the X-Axis. Select View Bins to have bin numbers shown on the X-Axis.
Options
Brings up the Options dialog box for the Histogram. This dialog box can be used to set the title,
margins, and chart colors.
Zoom Feature
You can zoom in on a particular area of a Histogram by holding the left mouse button and
dragging the mouse pointer from top-left to bottom-right (or bottom-left to top-right) over the
area to be zoomed. Pressing the Undo Zoom toolbar icon or dragging the mouse pointer from
bottom-right to top-left (or top-right to bottom-left) will undo the zoom.
In 3D View, you can also zoom in and out by using the Page Down and Page Up buttons on your
keyboard.
Right-Click Menus
Right-clicking on the graphical display area will bring up a menu from which you can choose
Export to save the Histogram in a choice of formats, Copy to Clipboard to place the Histogram on
the clipboard, Print to print the Histogram, or Options to bring up the Histogram’s Options dialog
box.
Right-clicking on a histogram record in the list above the New, Edit and Delete buttons brings up a
menu from which you can choose Edit Selection to bring up the Selection Properties dialog box,
Delete Selection to delete the selection from the Histogram, or Selection Summary to see
information about the histogram record, the data file, and the datalogger and program that
generated the data file.
Histogram Options. Opens a dialog box from which you can set properties for the
Histogram including scaling, colors, margins, titles, etc. This dialog box can also be
brought up by pressing the Options button.
Chart Colors
Back Wall Color
Determines the color for the back wall of the graph. Press the … button to select a color.
Use Gradient
Select the check box to give the graph’s plot area a gradient. Clear the check box to clear the
gradient. Press the Edit Gradient button to the right of the field (…) to display a Gradient Editor that
can be used to select a default or customized gradient.
Margins
Sets the left, top, bottom, and right margins of the graph as percentages of the available space.
Marks
Show Marks
Determines whether marks are shown on the graph to display the value of data points. Select the
check box to show marks.
Round Frame
Sets the shape of the frame around each mark. Select the check box to make the frame round.
Clear the check box to make the frame square.
Transparent
Determines whether a frame is shown around each mark. Select the check box to display a frame
around each mark. Clear the check box to display only text.
Draw Every
Determines whether marks are shown for all or only some data points. The default value is one,
which will show marks for every data point. Entering a value of n will cause marks to be shown only
for every nth data point.
Color
Press the button to open a dialog box from which to choose a color for the frames around the
marks.
Style
Determines the data contained in each mark. Choose Y-Value, X-Value, or X and Y Value.
Custom Limits
Max Value
The maximum value on the Y Axis scale. This field is disabled when the Scaling Option is set to
Automatic Scaling.
Min Value
The minimum value on the Y Axis scale. This field is disabled when the Scaling Option is set to
Automatic Scaling.
Scientific Notation
Determines whether values on the Y Axis are shown in scientific notation. Select the check box to
display values in scientific notation.
Axis Title
Determines whether a title is shown for the Y Axis. Select the check box to show a title. Type in the
desired title and press the Font button to choose the font, style, size, effects, and color of the title.
Custom Limits
Max Value
The maximum value on the X Axis scale. This field is disabled when the Scaling Option is set to
Automatic Scaling.
Min Value
The minimum value on the X Axis scale. This field is disabled when the Scaling Option is set to
Automatic Scaling.
Axis Title
Determines whether a title is shown for the X Axis. Select the check box to show a title. Type in the
desired title and press the Font button to choose the font, style, size, effects, and color of the title.
G.5.3 XY Plot
From the XY Plot screen, you can graph a data value on the y-axis against a different data value on
the x-axis. You specify which data value is used for the X axis and which is used for the Y axis. Each
Y axis data value is plotted against the X axis data value that has the identical timestamp.
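In other words, the pairing works like a join on the timestamp: a point is plotted only where an X value and a Y value carry the same timestamp. The Python sketch below is an illustration of that pairing rule only; the timestamps and values are made up, and this is not how View Pro is implemented internally.

    # Illustration only: pair X and Y values by identical timestamp.
    x_series = {"2021-03-01 00:00": 12.1, "2021-03-01 00:10": 12.4,
                "2021-03-01 00:20": 12.9}
    y_series = {"2021-03-01 00:00": 21.5, "2021-03-01 00:10": 21.8,
                "2021-03-01 00:30": 22.0}

    # Only timestamps present in both series produce a plotted point.
    points = [(x_series[t], y_series[t]) for t in sorted(x_series) if t in y_series]
    print(points)   # [(12.1, 21.5), (12.4, 21.8)]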
New
Adds another series to the XY Plot. The X and Y values for the new series can then be selected from
the X and Y drop-down list boxes.
Delete
Deletes the selected series from the XY Plot.
Edit
Brings up a dialog box to set properties for the selected series. This dialog box can be used to set
properties for symbols and marks.
Right-Click Menu
Right-clicking on the graphical display area will bring up a menu from which you can choose
Export to save the XY Plot in a choice of formats, Copy to Clipboard to place the XY Plot on the
clipboard, Print to print the XY Plot, or Options to bring up the XY Plot’s Options dialog box.
Right-clicking on a series in the list above the New, Edit and Delete buttons brings up a menu from
which you can choose Edit Series to bring up the Series options dialog box or Delete Series to
delete the series from the XY Plot.
Margins
Sets the left, top, bottom, and right margins of the graph as percentages of the available space.
Titles
Show Graph Title
Determines whether a title is shown for the graph. Select the check box to show a title. Type in the
desired title and press the Font button to choose the font, style, size, effects, and color of the title.
Custom Limits
Max Value
The maximum value on the Y Axis scale. This field is disabled when the Scaling Option is set to
Automatic Scaling.
Min Value
The minimum value on the Y Axis scale. This field is disabled when the Scaling Option is set to
Automatic Scaling.
Custom Limits
Max Value
The maximum value on the X Axis scale. This field is disabled when the Scaling Option is set to
Automatic Scaling.
Min Value
The minimum value on the X Axis scale. This field is disabled when the Scaling Option is set to
Automatic Scaling.
Axis Title
Determines whether a title is shown for the X Axis. Select the check box to show a title. Type in the
desired title and press the Font button to choose the font, style, size, effects, and color of the title.
Marks
Show Marks
Determines whether marks are shown on the plot to display the value of data points. Select the
check box to show marks.
Round Frame
Sets the shape of the frame around each mark. Select the check box to make the frame round.
Clear the check box to make the frame square.
Transparent
Determines whether a frame is shown around each mark. Select the check box to display only the
text. Clear the check box to display a frame around each mark.
Draw Every
Determines whether marks are shown for all or only some data points. The default value is one,
which will show marks for every data point. Entering a value of n will cause marks to be shown only
for every nth data point.
Color
Press the button to open a dialog box from which to choose a color for the frames around the
marks.
Style
Determines the data contained in each mark. Choose Y-Value, X-Value, or X and Y Value.
G.5.4 Rainflow Histogram
NOTE: View Pro does not create rainflow histogram data from time series information. It only
displays rainflow histogram data contained in a *.DAT file. Rainflow histogram data in a *.DAT
file is created by using the CRBasic Rainflow instruction in a data table in a CRBasic program.
NOTE: All rainflow histogram records from the same rainflow histogram will have the same
default name in the list. They can be distinguished by the colored boxes next to their names.
Each box is the same color with which that rainflow histogram record is highlighted in the data
file. It is also the color with which that rainflow histogram record is displayed. The color
associated with a histogram record can be changed from the Selection Properties dialog box.
(The Selection Properties dialog box is opened by clicking on the rainflow histogram record in
the list and then pressing the Edit button.)
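Because View Pro only displays what is already stored in the data file, the rainflow histogram bins appear as ordinary columns of counts in the *.DAT file. The Python sketch below is a hedged illustration of inspecting such a file outside of View Pro; it assumes the file is in Campbell Scientific’s TOA5 ASCII format (four header lines), and the file name and column-name pattern are hypothetical examples, not names produced by any particular program.

    # Illustration only: read a TOA5 *.DAT file and list its columns.
    # Assumes TOA5 layout: environment line, field names, units, processing,
    # then data rows. File and column names below are hypothetical.
    import pandas as pd

    df = pd.read_csv(
        "RainflowTable.dat",        # hypothetical file name
        skiprows=[0, 2, 3],         # keep the field-name line as the header
        parse_dates=["TIMESTAMP"],
        na_values=["NAN"],
    )

    # Rainflow bin counts typically occupy a block of columns per record.
    bin_columns = [c for c in df.columns if "RainFlow" in c]   # hypothetical naming
    print(df[["TIMESTAMP"] + bin_columns].head())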
New
Brings up the Rainflow Histogram Setup dialog box to allow you to add a new rainflow histogram
record to the display.
Delete
Deletes the selected rainflow histogram record from the Rainflow Histogram.
Edit
Brings up a dialog box to set properties for the selected rainflow histogram record. This dialog box
can be used to set properties for Display (name, color, marks) and Axes (scaling, limits, and title).
Options
Brings up the Options dialog box for the Rainflow Histogram. This dialog box can be used to set
options for chart colors, margins, and the chart title.
Clear
Press this button to clear all rainflow histogram records contained in the Rainflow Histogram.
Zoom Feature
You can zoom in on a particular area of a Rainflow Histogram by holding the left mouse button
and dragging the mouse pointer from top-left to bottom-right (or bottom-left to top-right) over
the area to be zoomed. Pressing the Undo Zoom toolbar icon or dragging the mouse pointer
from bottom-right to top-left (or top-right to bottom-left) will undo the zoom.
You can also zoom in and out by using the Page Down and Page Up buttons on your keyboard.
Right-Click Menus
Right-clicking on the graphical display area will bring up a menu from which you can choose
Export to save the Rainflow Histogram in a choice of formats, Copy to Clipboard to place the
Rainflow Histogram on the clipboard, Print to print the Rainflow Histogram, or Options to bring up
the Rainflow Histogram’s Options dialog box.
Right-clicking on a rainflow histogram record in the list above the New, Edit and Delete buttons
brings up a menu from which you can choose Edit Selection to bring up the Selection Properties
dialog box, Delete Selection to delete the selection from the graph, or Selection Summary to see
information about the rainflow histogram record, the data file, and the datalogger and program
that generated the data file.
Margins
Sets the left, top, bottom, and right margins of the graph as percentages of the available space.
Titles
Show Graph Title
Determines whether a title is shown for the graph. Select the check box to show a title. Type in the
desired title and press the Font button to choose the font, style, size, effects, and color of the title.
Marks
Show Marks
Determines whether marks are shown on the graph to display the value of data points. Select the
check box to show marks.
Round Frame
Sets the shape of the frame around each mark. Select the check box to make the frame round.
Clear the check box to make the frame square.
Transparent
Determines whether a frame is shown around each mark. Select the check box to display only the
text. Clear the check box to display a frame around each mark.
Draw Every
Determines whether marks are shown for all or only some data points. The default value is one,
which will show marks for every data point. Entering a value of n will cause marks to be shown only
for every nth data point.
Color
Press the button to open a dialog box from which to choose a color for the frames around the
marks.
Style
Determines the data contained in each mark. Choose Z-Value, Y-Value, or Y and Z Value.
Z Axis
Determines whether a title is shown for the Z Axis. Select the check box to show a title. Type in the
desired title and press the Font button to choose the font, style, size, effects, and color of the title.
Y Axis
Determines whether a title is shown for the Y Axis. Select the check box to show a title. Type in the
desired title and press the Font button to choose the font, style, size, effects, and color of the title.
G.5.5 FFT
From the FFT screen, you can view FFT data. The FFT button on the toolbar will be enabled if there
is at least one valid FFT in the currently selected data file.
This dialog box can also be opened from a button on the FFT toolbar. This allows you to
change the options for the FFT record that is selected in the list on the left side of the FFT screen.
NOTE: All FFT records from the same FFT will have the same default name in the list. They can
be distinguished by the colored boxes next to their names. Each box is the same color with
which that FFT record is highlighted in the data file. It is also the color with which that FFT
record is displayed if the “Use Selection Color” option is chosen in the Selection Properties
dialog box. The color associated with an FFT record can also be changed from this dialog box.
(The Selection Properties dialog box is opened by clicking on the FFT record in the list and then
pressing the Edit button.)
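For background only: the FFT records shown here were computed and stored by the datalogger program; View Pro does not compute them. The NumPy sketch below merely illustrates what the bins of one FFT record represent (a magnitude per frequency bin). The sample rate and test signal are assumptions invented for the example, not values taken from any data file.

    # Illustration only: bins of an FFT record are frequency bins.
    import numpy as np

    sample_rate = 100.0                          # Hz, assumed
    t = np.arange(0, 10, 1.0 / sample_rate)      # 10 s of samples
    signal = np.sin(2 * np.pi * 5.0 * t)         # 5 Hz test tone, assumed

    spectrum = np.abs(np.fft.rfft(signal))       # magnitude per bin
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / sample_rate)

    peak_bin = int(np.argmax(spectrum))
    print(f"Bin {peak_bin} corresponds to {freqs[peak_bin]:.1f} Hz")   # ~5.0 Hz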
New
Brings up the Fast Fourier Transform Setup dialog box to allow you to add a new FFT record to the
display.
Delete
Deletes the selected FFT record from the FFT.
Edit
Brings up a dialog box to set properties for the selected FFT record. This dialog box can be used to
set properties for Display (name, color, marks) , Y Axis (scaling, limits, and title), and X Axis (scaling,
limits, title).
Options
Determines the graph type. Select Area, Histogram, Line, or Bar from the drop-down list.
Record
Indicates which record of the FFT is being viewed. The arrow buttons can be used to scroll through
records of the FFT.
X-Axis Mode
Determines how the labels on the X-Axis are displayed. Select Show Ranges to have ranges of data
values shown on the X-Axis. Select View Bins to have bin numbers shown on the X-Axis.
Options
Brings up the Options dialog box for the FFT. This dialog box can be used to set the title, margins,
and chart colors.
Clear
Press this button to clear all FFT records contained in the FFT.
Zoom Feature
You can zoom in on a particular area of an FFT by holding the left mouse button and dragging the
mouse pointer from top-left to bottom-right (or bottom-left to top-right) over the area to be
zoomed. Pressing the Undo Zoom toolbar icon or dragging the mouse pointer from bottom-
right to top-left (or top-right to bottom-left) will undo the zoom.
In 3D View, you can also zoom in and out by using the Page Down and Page Up buttons on your
keyboard.
Right-Click Menus
Right-clicking on the graphical display area will bring up a menu from which you can choose
Export to save the FFT in a choice of formats, Copy to Clipboard to place the FFT on the clipboard,
Print to print the FFT, or Options to bring up the FFT’s Options dialog box.
Right-clicking on an FFT record in the list above the New, Edit and Delete buttons brings up a
menu from which you can choose Edit Selection to bring up the Selection Properties dialog box,
Delete Selection to delete the selection from the graph, or Selection Summary to see information
about the FFT record, the data file, and the datalogger and program that generated the data file.
FFT Options. Opens a dialog box from which you can set properties for the FFT
including scaling, colors, margins, titles, etc. This dialog box can also be brought up by
pressing the Options button.
Show Table. Brings the main View Pro window in front of other windows, making the
data file(s) visible.
Show/Hide Gradient. A toggle button that turns on and off the gradient background
of the FFT. It may be useful to hide the gradient when printing the FFT.
Modify Selection. Brings up the Fast Fourier Transform Setup dialog box from which
you can change the options for the selection.
Undo Zoom. Returns the FFT to its original state after zooming.
Chart Colors
Back Wall Color
Determines the color for the back wall of the graph. Press the … button to select a color.
Use Gradient
Select the check box to give the graph’s plot area a gradient. Clear the check box to remove the
gradient. Press the Edit Gradient button to the right of the field (…) to display a Gradient Editor that
can be used to select a default or customized gradient.
Margins
Sets the left, top, bottom, and right margins of the graph as percentages of the available space.
Marks
Show Marks
Determines whether marks are shown on the graph to display the value of data points. Select the
check box to show marks.
Round Frame
Sets the shape of the frame around each mark. Select the check box to make the frame round.
Clear the check box to make the frame square.
Transparent
Determines whether a frame is shown around each mark. Select the check box to display only the
text. Clear the check box to display a frame around each mark.
Draw Every
Determines whether marks are shown for all or only some data points. The default value is one,
which will show marks for every data point. Entering a value of n will cause marks to be shown only
for every nth data point.
Custom Limits
Max Value
The maximum value on the Y Axis scale. This field is disabled when the Scaling Option is set to
Automatic Scaling.
Min Value
The minimum value on the Y Axis scale. This field is disabled when the Scaling Option is set to
Automatic Scaling.
Scientific Notation
Determines whether values on the Y Axis are shown in scientific notation. Select the check box to
display values in scientific notation.
Axis Title
Determines whether a title is shown for the Y Axis. Select the check box to show a title. Type in the
desired title and press the Font button to choose the font, style, size, effects, and color of the title.
Custom Limits
Max Value
The maximum value on the X Axis scale. This field is disabled when the Scaling Option is set to
Automatic Scaling.
Min Value
The minimum value on the X Axis scale. This field is disabled when the Scaling Option is set to
Automatic Scaling.
Axis Title
Determines whether a title is shown for the X Axis. Select the check box to show a title. Type in the
desired title and press the Font button to choose the font, style, size, effects, and color of the title.
G.6 Printing
G.6.1 Printing Options
Printing Text
To print numerical data, press the print button or select File | Print from the menu. A dialog box
will appear allowing you to choose the printer, print range, number of copies, etc. After setting the
properties, press OK to print the data.
Printing Graphs
With a graph window opened, click the print button to preview the printed page and set various
printing options. Then select the Print button to print the graph. You can also right-click the graph
to bring up a menu from which you can select Print.
Adjusts the magnification of the page so that its entire width is visible
in the preview.
Adjusts the magnification of the page so that the entire page is visible
in the preview.
Sets the printing orientation of the page to portrait (8.5” wide by 11”
tall).
Sets the printing orientation of the page to landscape (11” wide by 8.5”
tall).
Brings up this Help topic.
• The help file can be opened by choosing Help | View Pro Help from the View Pro menu.
• When the help file is open, pressing the Contents tab will open the Table of Contents.
• When the help file is open, choosing the Index tab will bring up an index. Keywords can be
typed in to search for a topic. An in-depth search can be performed by choosing the Search
tab and typing in a word.
• For information on a graphical window, open the window and press the ? button in the upper
right corner or press F1.
• For help on a dialog box, press the Help button at the bottom of the dialog box or press F1
with the dialog box open.
• If a highlighted link takes you to another topic, you can return to the original topic by
selecting the Back button from the help system’s toolbar.
Also See
Campbell Scientific Technical Support (p. 68)
Brazil
Location: São Paulo, SP Brazil
Phone: 11.3732.3399
Email: [email protected]
Website: www.campbellsci.com.br

Germany
Location: Bremen, Germany
Phone: 49.0.421.460974.0
Email: [email protected]
Website: www.campbellsci.de

UK
Location: Shepshed, Loughborough, UK
Phone: 44.0.1509.601141
Email: [email protected]
Website: www.campbellsci.co.uk