OPTIMA 8.0 Operations and Maintenance Guide
Confidentiality, Copyright Notice & Disclaimer
Due to a policy of continuous product development and refinement, TEOCO Ltd. (and its affiliates,
together “TEOCO”) reserves the right to alter the specifications, representation, descriptions and all
other matters outlined in this publication without prior notice. No part of this document, taken as a
whole or separately, shall be deemed to be part of any contract for a product or commitment of any
kind. Furthermore, this document is provided “As Is” and without any warranty.
This document is the property of TEOCO, which owns the sole and full rights including copyright.
TEOCO retains the sole property rights to all information contained in this document, and without
the written consent of TEOCO given by contract or otherwise in writing, the document must not be
copied, reprinted or reproduced in any manner or form, nor transmitted in any form or by any
means: electronic, mechanical, magnetic or otherwise, either wholly or in part.
The information herein is designated highly confidential and is subject to all restrictions in any law
regarding such matters and the relevant confidentiality and non-disclosure clauses or agreements
issued with TEOCO prior to or after the disclosure. All the information in this document is to be
safeguarded and all steps must be taken to prevent it from being disclosed to any person or entity
other than the direct entity that received it directly from TEOCO.
All other company, brand or product names are trademarks or service marks of their respective
holders.
This is a legal notice and may not be removed or altered in any way.
Your feedback is important to us: the TEOCO Documentation team takes many measures to ensure that our work is of the highest quality.
If you found errors or feel that information is missing, please send your Documentation-related
feedback to [email protected]
Thank you.
Table of Contents
1 Introduction 15
The Data Loading Process 15
The Gather Stats Process 16
Creating a Daily Gather Stats Job 17
Creating a Gather Stats Job per Interface 18
Disabling Oracle's Auto Gather Stats Jobs 19
Gathering Dictionary Statistics 19
Monitoring the Gather Stats Job 19
OPTIMA Default Network Communication Requirements 20
Installation and Upgrade 20
Prerequisites for Installing the OPTIMA Backend 20
Installing the OPTIMA Combined Backend Package 21
Installing Executables 22
Preparing Windows Application Servers 22
Preparing for Mediation 22
Preparing Windows Mediation Servers 23
Preparing Unix Mediation Servers 24
Upgrading OPTIMA Backend 26
Patching OPTIMA Backend 26
System Components 27
About PRIDs 29
About File Locations and Naming 30
Scheduling Programs 31
The Monitoring Process 32
Configuring Programs 32
About Versioning 33
About Log Files 33
About Environment Variables 34
Setting Up Environment Variables on a Windows OS 35
Setting Up Environment Variables on a UNIX OS 37
Encrypting Passwords for OPTIMA Backend Applications 38
Which OPTIMA Backend Applications are Affected by Password Security? 39
Starting and Stopping the Data Loading Process 40
Checking Log Files 41
External Programs 41
Database Programs 41
Maintenance 42
Troubleshooting 43
About the Log File Analyzer (Opxlog Utility) 44
Prerequisites for Using the Log File Analyzer 44
Configuring the Log File Analyzer 45
Example Uses for the Opxlog Utility 46
Example Log File Analyzer Configuration (INI) File 47
Troubleshooting 241
Loader Error Codes 242
Example Loader Configuration (INI) File 245
About the Loader Configuration (INI) File Parameters 247
About Direct Path Loading 251
Migrating to Direct Path Loading 252
Configuring for Direct Path Loading 252
Tuning Direct Path Loading 254
Error Handling for Direct Path Loading 255
Configuration (INI) File Parameters for Direct Path Loading 256
Index 483
1 Introduction
The data loading process collects network performance data, typically from telecommunication
networks OMC platforms, and stores the data in the OPTIMA performance management database.
The data extraction, transformation and loading processes are carried out by the OPTIMA backend
programs, which are run continuously to ensure that network data is collected and presented in
near real-time.
Note: The extraction, transformation and loading processes are often referred to as ETL.
This guide contains the operation and maintenance (O&M) procedures for the data loading
processes handled by OPTIMA. For more information on SNMP data handled by Netrac mediation,
see the Netrac Server Installation Guide.
Note: This guide does not provide the specific configurations for each customer installation. Please
refer to the implementation plan for customer-specific information such as interfaces deployed,
machine IDs and administrator login details.
After data has been loaded, there are a number of database-related programs that are used to further analyze or manipulate the data:

Component: OPTIMA Summary
Process: Runs all summary processes, for example Busy Hour analysis or daily summaries. This program can also be used to load data from external Oracle databases. Summaries are organised as reports, and individual reports (or groups of reports) are scheduled by Oracle jobs.

Component: Data Quality
Process: Used for Data Quality reports.
The Gather Stats Process

The required information is obtained by the Gather Stats process, which collects estimates of,
among other things:
• the table/index size (in blocks)
• the number of rows
• the low and high values of columns
• distinct values
The configuration of the Gather Stats package depends on four metadata tables:
1. STATS_CONTROL - Controls full execution of the package, and comes with the default settings that will be used to populate the metadata tables.
4. STATS_EXCLUDE_TABLES - Controls which tables will be excluded from the gather stats process.
OPTIMA includes the Gather Stats process in a package that is auto configured during the
Installation/Upgrade process. The only subsequent intervention normally necessary is to schedule
Oracle jobs.
Note: The default configuration suits most installations, but you may need to adjust some of the
parameters for one or more tables. If this is the case, contact TEOCO Support.
Important: If you do not use the OIT (OPTIMA Installation Tool) to create vendors but instead create them manually, you must also insert them manually in the Gather Stats process.

Creating a Daily Gather Stats Job

To create the daily Gather Stats job, run:
begin
  dbms_scheduler.create_job(
    job_name         => 'AIRCOM.GATHER_STATS_OPTIMA'
    ,job_type        => 'PLSQL_BLOCK'
    ,job_action      => 'BEGIN GATHER_STATS.collect_schema_stats; END;'
    ,start_date      => SYSDATE+1
    ,repeat_interval => 'FREQ=DAILY;BYHOUR=03'
    ,enabled         => TRUE
    ,comments        => 'Gather Stats');
end;
/
Warning: The scheduled job should be created in the schema in which the Gather Stats package has been installed. Failure to do so may prevent the job from running successfully.
To see if the last run for the scheduled job was successful:
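The query itself is not reproduced here; one way to check, assuming the job was created as above and default scheduler logging is enabled, is to query Oracle's scheduler job log:

SELECT log_date, status, additional_info
FROM dba_scheduler_job_log
WHERE job_name = 'GATHER_STATS_OPTIMA'
ORDER BY log_date DESC;

The most recent row should show a STATUS of SUCCEEDED.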
To run the scheduled job manually (the FALSE argument runs it asynchronously rather than in the current session):

begin
  dbms_scheduler.run_job('GATHER_STATS_OPTIMA', false);
end;
/
To stop a running Gather Stats job:

begin
  dbms_scheduler.stop_job('GATHER_STATS_OPTIMA');
end;
/
Creating a Gather Stats Job per Interface

To create a separate Gather Stats job for each schema, with start times staggered across four hours:

DECLARE
  vhour varchar2(50);
BEGIN
  -- rownum staggers the jobs across start hours 00-03
  FOR i IN (SELECT schema_name, rownum AS num FROM stats_schema_metadata) LOOP
    vhour := 'FREQ=DAILY;BYHOUR=0' || mod(i.num, 4);
    BEGIN
      -- Drop the gather_stats job if it already exists
      dbms_scheduler.drop_job('GT_' || i.schema_name);
    EXCEPTION
      WHEN OTHERS THEN
        NULL;
    END;
    dbms_scheduler.create_job(
      job_name         => 'GT_' || i.schema_name
      ,job_type        => 'PLSQL_BLOCK'
      ,job_action      => 'BEGIN GATHER_STATS.collect_schema_stats(pschema=>' || q'[']' || i.schema_name || q'[']' || '); END;'
      ,start_date      => SYSDATE
      ,repeat_interval => vhour
      ,enabled         => TRUE
      ,comments        => 'Gather Stats');
  END LOOP;
END;
/
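To confirm that the per-schema jobs have been created (a simple check, run while connected as the owning schema):

SELECT job_name, repeat_interval, enabled
FROM user_scheduler_jobs
WHERE job_name LIKE 'GT\_%' ESCAPE '\';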
Disabling Oracle's Auto Gather Stats Jobs

To disable Oracle's automatic optimizer statistics collection:

BEGIN
DBMS_AUTO_TASK_ADMIN.DISABLE(
client_name => 'auto optimizer stats collection',
operation => NULL,
window_name => NULL);
END;
/
To verify that the Autotask Background Job has been disabled successfully:
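For example, on Oracle 11g and later you can query the autotask dictionary view:

SELECT client_name, status
FROM dba_autotask_client
WHERE client_name = 'auto optimizer stats collection';

The STATUS column should show DISABLED.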
If you are unsure of the Oracle Database version that you have installed:
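For example, run:

SELECT banner FROM v$version;

This returns the database and component version strings.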
Gathering Dictionary Statistics

To schedule a monthly job that gathers dictionary and fixed statistics:

begin
dbms_scheduler.create_job(
job_name => 'GATHER_DICTIONARY_FIXED_STATS'
,job_type => 'PLSQL_BLOCK'
,job_action => 'BEGIN
gather_stats.collect_dictionary_stats; END;'
,start_date => SYSDATE
,repeat_interval => 'FREQ=MONTHLY;BYMONTHDAY=01;BYHOUR=01'
,enabled => TRUE
,comments => 'Gather Stats');
end;
/
Monitoring the Gather Stats Job

The Gather Stats process records its activity in the following log tables:

STATS_EVENT_LOG
STATS_TABLES_LOG
Purpose: Logs table-level information for each table/interface, with execution times and CPU/IO metrics.
Installation and Upgrade

Important: You must install the backend components using the OPTIMA combined backend
package, which will install the required components quickly and easily. For more information on
how to use this, see Installing the OPTIMA Combined Backend Package on page 21.
Prerequisites for Installing the OPTIMA Backend

Before installing or upgrading, ensure that:

• When applying a backend patch, the latest compatible database patch has also been applied (for details, see the patch Release Notes).
• You have installed and patched the OPTIMA Installation Tool (OIT); for more information, see the OPTIMA Installation Tool User Reference Guide.
• Supported Oracle clients are installed on mediation servers, application servers and clients.
• The OPTIMA database has been upgraded to version 8.0; for more information, see the Netrac Server Installation Guide.
Installing the OPTIMA Combined Backend Package

To do this:
2. In the dialog box that appears, type the password and then click OK.
3. On the License Agreement page, read the license agreement, and if you accept its terms, scroll to the bottom of the agreement, select 'I accept the terms in the license agreement' and then click Next.
Note: To print out the license agreement for your own reference, click the Print button.
4. In the User Information dialog box, enter your name and the name of your organisation,
then click Next.
5. On the Setup Type page, select the type of installation that you require:
o Complete - This installs all components of the backend features
- or -
o Custom - This enables you to select which components of the backend you want to
install
6. Click Next.
If you have selected a Custom installation, the Custom Setup page appears. Ensure that the components to be installed are selected and those to be omitted are not selected, and then click Next.
Tip: If you wish to change the folder where the software is installed, you can click the
Change button. However, it is recommended that you use the default folder.
Installing Executables
Install all of the required application server executables from the backend build, for example:
• Alarm Service
• Alarm Notifier
• Alarm SNMP Agent
• Report Scheduler
Preparing Windows Application Servers

1. Create an OPTIMA user with Administration permissions (needed for task scheduling).
Note: Microsoft Excel is required for the Report Scheduler to produce Excel reports.
6. Copy the programs from the Windows Mediation directory of the OPTIMA Backend Program Files directory to the OPTDIR/bin directory, and ensure that they are executable (run each with the -v option).
Preparing Windows Mediation Servers

1. Using the root or administrator user, create an oracle user for the Oracle client.
2. Using the oracle user, install the latest supported Oracle client.
5. Enable all required tasks and job scheduling for the OPTIMA user.
7. Install ActivePerl and the extra modules required for SFTP and SSH.
Note: Perl is required for FTP and Combiner processing. However, SFTP without SSH is
not an option for Windows.
8. Ensure that the PATH environment variable includes the Oracle library directories.
9. Ensure that the PERL5LIB environment variable points to the correct directories.
10. Define the base directory for the OPTIMA installation (for example, C:\Optima).
11. Ensure that the OPTIMA user has access / permissions to read Oracle libraries.
12. Using the OPTIMA user, define the following environment variables:
o OPTDIR (The OPTIMA base directory)
o HOSTNAME (server name)
o ORACLE_HOME
o LD_LIBRARY_PATH
13. Verify that the OPTIMA user has a TNS connection to the database (sqlplus aircom/<password>@<tnsname>).
14. Using the OPTIMA user, create the following central location directories under the OPTDIR
directory that you have specified:
o bin (for binaries/executables)
o AILIB (for OS specific libraries)
o tmp
o log (for log files produced by each process)
o prids (for process monitor PID files)
o run (for shell or batch scripts)
o maintenance (for common OPTIMA processes, such as MNT, MON and so on)
o extdir (for loader external files)
At a later time you will need to set up and schedule the Alarm Service (opx_ALM_GEN_817) and
Alarm SNMP Agent (opx_ALM_GEN_820).
Preparing Unix Mediation Servers

1. Using the root or administrator user, create an oracle user for the Oracle client.
2. Using the oracle user, install the latest supported Oracle client.
5. Using the OPTIMA user, create the library directory (AILIB) under the defined OPTDIR
directory.
6. Using the OPTIMA user, copy and install the relevant OS-specific library files from the backend distribution to the AILIB directory.
Note: You may need to install different library file sets for backend programs and parsers.
8. With the OPTIMA user, copy the programs from the appropriate OS Mediation directory of the Backend Program Files directory to the OPTDIR/bin directory and ensure that they are executable (run each with the -v option).
9. Define the base directory for the OPTIMA installation (for example, /opt/optima).
10. Ensure that the OPTIMA user has access / permissions to read Oracle libraries.
11. Using the root user, install Perl (to check whether Perl is already installed on a UNIX platform, run 'perl -v').
12. (Optional). If you want to use the FTP in secured mode (SFTP), install the Net::SFTP
CPAN module. You can only use this mode on UNIX.
13. (Optional) If you want to use the FTP in secured mode (SFTP) with SSH Key Authentication
(which is the strongly recommended SFTP option), install the Net::SFTP CPAN module and
the IO::Pty and Expect modules.
14. (Optional) If you also want to use SFTP compression, then you must additionally install the
following CPAN Perl modules:
o Compress::Zlib
o Compress::Raw::Bzip2
o Compress::Raw::Zlib
o Crypt::Blowfish
o Crypt::CBC
o Crypt::DES
o Crypt::DH
o Crypt::DSA
o Crypt::Primes
o Crypt::RSA
o Crypt::Random
o Digest::BubbleBabble
o Digest::HMAC
o Digest::SHA
o IO::Compress
o IO::Zlib
o Net::SSH
o Net::SSH::Perl
15. Verify that the OPTIMA user has a TNS connection to the database (sqlplus aircom/<password>@<tnsname>).
16. Using the OPTIMA user, create the following central location directories under the OPTDIR directory that you have specified (see the example after this procedure):
o bin (for binaries/executables)
o AILIB (for OS specific libraries)
o tmp
o log (for log files produced by each process)
o prids (for process monitor PID files)
o run (for shell or batch scripts)
o maintenance (for common OPTIMA processes, such as MNT, MON and so on)
o extdir (for loader external files)
17. At a later time you will need to set up and schedule the Alarm Service
(opx_ALM_GEN_817) and Alarm SNMP Agent (opx_ALM_GEN_820).
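As an illustration of step 16 (assuming OPTDIR is already set for the OPTIMA user), the directories can be created in a single command:

cd $OPTDIR && mkdir -p bin AILIB tmp log prids run maintenance extdir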
Upgrading OPTIMA Backend

3. Replace all of the application server executables with new version executables from the
backend distribution package (for example, Alarms, Alarm Notifier, Alarm SNMP Agent and
Report Scheduler).
4. Confirm that the required parsers are compatible by running them with -v initially and then
running them with data files.
5. Test and verify that all of the processes are running, executing the OPTIMA upgrade
acceptance tests.
Patching OPTIMA Backend

3. Click Next.
4. Read the License Agreement, and then select 'I accept the terms in the license agreement'.
5. Click Next.
6. Select the Complete option to install all of the required scripts and templates, and then click
Next.
7. Click Install.
9. Copy relevant new binaries from backend distribution to the mediation and application
server directories.
10. Copy relevant new libraries from backend distribution to the mediation server library
directory.
11. For UNIX operating systems, copy the relevant OS version of the AILIB directory from the backend installation program directory to the directory defined by OPTDIR.
12. Replace all of the application server executables with new version executables from the
backend distribution package (for example, Alarms, Alarm Notifier, Alarm SNMP Agent and
Report Scheduler).
System Components
The following table summarizes all the components provided as part of the file data loading
architecture. For detailed configuration options for each component, see the following chapters.
Component: File Combiner
Process: Merges the CSV files output by certain parsers into new combined CSV files.
Type: External
Executables: opx_CMB_GEN_903 (multiple input), opx_CMB_GEN_900 (single input)
Note: Database programs are run in Oracle. External programs are run outside the Oracle database, under UNIX on the server. The executable name may be suffixed with additional identifiers for a particular installation. For example, a Nortel Parser may be renamed opx_PAR_NOR_711.
About PRIDs
All OPTIMA components are given a unique identifier, known as a PRID, which is used to identify
all processes involved with the backend. The PRID is used extensively in configuration, error
logging and process monitoring.
A PRID is a nine-character identifier made up of the Interface ID, Program ID and Instance ID, in that order.
For example, 000110001 is instance 001 of a program of type 110 (a loader) on interface 000.
Tip: The Interface ID is entirely numeric, but the Program ID and Instance ID are alphanumeric
(using uppercase characters). This enables them to support a larger number of programs and
interfaces.
The Program ID is set when the program is originally created, but the Instance ID is calculated by
the package 'AIRCOM.OPTIMA_PACKAGE', which generates the Instance IDs in the following
order - 000, 001, …, 009, 00A, 00B, …, 00Z, 010, 011, …, 019, 01A, …, 01Z, …, 09Z, 0A0, ..,
0ZZ, 100, …, ZZZ .
If you are upgrading from a version older than 6.2, any existing Instance IDs will not be updated but
any new Instance IDs will be calculated by using the next available alphanumeric ID.
Important: It is critical for the correct operation of the backend that all processes have a unique
PRID allocated.
For external programs, the PRID will be read from the associated configuration file.
The database contains a table (INSTANCES) that provides a master list of PRIDs for all installed
elements. If any new components are configured, record the PRID in this table.
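For example, to review the PRIDs already allocated (assuming the table resides in the AIRCOM schema, like the other database objects in this guide):

SELECT * FROM aircom.instances;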
Note: This table is automatically populated when adding interfaces using the OPTIMA Installation
Tool or adding a report in the Loader/Summary GUI.
About File Locations and Naming

Directory type: Input
File name format: Input file name format unique to the particular interface
Locations:
$OPTIMA/.../<type>/parsers/in
$OPTIMA/.../<type>/validate/in
$OPTIMA/.../<type>/loaders/in
Description: All input files into the parser, validate and loader processes are placed in the input directory.

Directory type: Output
File name format: Program specific
Locations:
$OPTIMA/.../<type>/parsers/out
$OPTIMA/.../<type>/validate/out
Description: If multiple files are output, these should be placed in the output sub-directories.

Directory type: Temporary
File name format: <hostname>_<exename>_<PRIDn>.tmp
Locations:
$OPTIMA/<interface name>/<vendor>/<type>/ftp/tmp
$OPTIMA/.../<type>/parsers/tmp
$OPTIMA/.../<type>/validate/tmp
$OPTIMA/.../<type>/loaders/tmp
Description: Lock files and intermediate files stored locally are placed in the temporary directory.
Note: As the backend uses chained processes, the directories specified in the table above need
not be physical directories but symbolic links to the previous or next directory in the data flow chain.
Scheduling Programs
All OPTIMA external programs are run in scheduled mode using the UNIX scheduler, crontab. For example, a parser may be scheduled to run on a periodic basis of every five minutes, in which case, every five minutes the parser will:
• Be started by crontab
• Process any input files that are available at that instance
• Exit
If a program instance does not complete before the next time it is scheduled, multiple instances of that program would run simultaneously. This is avoided by the use of a monitor file.
Before a program starts an instance, it checks if an associated monitor file exists. If one does exist,
then this indicates that an instance is already running and so the program immediately exits. If a
monitor file does not exist, the program starts and creates a monitor file. This file is uniquely
associated to the program instance using the PRID and the hostname environment variable in a
common directory. When the program has run, it removes the monitor file.
The Process Monitor ensures that monitor files are removed if programs crash or hang.
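As a sketch of this convention (the backend programs implement it internally; the program name, PRID and paths here are illustrative only):

#!/bin/sh
# Illustrative sketch of the monitor-file convention.
PRID=001711001
MONFILE="$OPTDIR/prids/${HOSTNAME}_opx_PAR_NOR_711_${PRID}.pid"

# If a monitor file exists, an instance is already running: exit immediately.
[ -f "$MONFILE" ] && exit 0

# Create the monitor file containing this process identifier (PID).
echo $$ > "$MONFILE"

# ... process each input file, touching the monitor file as a heartbeat ...
touch "$MONFILE"

# Remove the monitor file on normal completion.
rm -f "$MONFILE"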
Multiple programs may be scheduled from a single cron entry by using a batch file. The programs may be scheduled to run sequentially or concurrently, the latter achieved by running the program in background mode (&) in the batch file, as in the example below.
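For example, crontab entries such as the following (the paths, program name and INI file are illustrative, and the */5 step syntax assumes a cron implementation that supports it) run a parser, and a batch script of several programs, every five minutes:

*/5 * * * * /opt/optima/bin/opxstart.sh opx_PAR_NOR_711 /opt/optima/etc/opx_PAR_NOR_711_001711001.ini
*/5 * * * * /opt/optima/run/run_parsers.sh

Within run_parsers.sh, appending & to each program invocation runs the programs concurrently in background mode.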
The Monitoring Process

Monitor files:
• Are created by all programs when each program starts running.
• Uniquely identify the OPTIMA program instance via the PRID contained in its filename, and
the hostname environment variable (that identifies on which machine it is running). The
program will also write the process identifier (PID) in the file.
• Provide a heartbeat: the backend program regularly updates the timestamp of the monitor file using a touch function. For example, a parser will touch the file after parsing each file in the input directory.
The Process Monitor regularly scans all monitor files in the monitor directory to check:
• The PID in each file is still in the current OS process list. If it is not in the list, the associated
program has crashed and so the Process Monitor will remove the monitor file.
• The timestamp of the monitor file is not older than the user-specified grace period. If the grace period has expired, the associated process is stopped (removed from the current OS process list). For example, a parser may have a three-hour grace period: if the parser monitor file has not been touched in the last three hours, the process is stopped and the monitor file is removed.
Note: As all processes are scheduled, the parser will start again at the next schedule period.
For more information, see About the Process Monitor on page 257.
Configuring Programs
All external OPTIMA programs read their configuration settings on program startup from a local
configuration (INI) file. Database programs read their configuration from configuration tables in the
database such as the OPTIMA_Common table.
For external programs, the configuration file is named using the convention ProgramName_PRID.ini - for example, opx_PAR_NOR_711_001711001.ini.
The configuration file is specified on the command line of the program being run, usually in the crontab entry or batch script.
Configuration changes are made by editing the parameters in the configuration (INI) file with a
suitable text editor.
About Versioning
All backend programs have a unique program ID and a descriptive code and version. The descriptive code and version should be quoted when reporting bugs and problems to TEOCO support.
You can either obtain the version details of the currently installed program from the log file or you
can print the information by typing the following command at the command prompt (in either
Windows or UNIX):
programname -v
About Log Files

Each log message contains the following fields:

Host Name: The name of the machine on which the program/database resides. Format: Text (256 characters).

PRID: The automatically-assigned PRID uniquely identifies each instance of the application. It is composed of a 9-character identifier, made up of Interface ID, Program ID and Instance ID. The Interface ID is made up of 3 numbers, but the Program ID and Instance ID are both made up of 3 characters, which can be a combination of numbers and uppercase letters. For more information, see About PRIDs on page 29. Format: nnncccccc.

Date: Date of the message. Format: YYYY-MM-DD.

Time: Time of the message. Format: HH:MM:SS.

Severity: Severity classification. Allocated at design time.

Message Type: Unique identifier within the program for this particular type of message. Format: Integer. Allocated at design time.

Message Text: Explanation of the message. Format: Text (256 characters).

Patch Number: The patch release number of the version of the software that produced the error. Format: Pn.n.
Critical (6): An error condition where there is a major service-affecting fault and immediate corrective action is required.
You can assign the severity level of messages that are logged for each program individually. For example, if a particular parser is assigned Severity Level 3 (Warning), only messages of severity Warning or above are logged.
For external programs, messages are recorded in a separate log file for each program instance.
The files have the following characteristics:
• All log files are stored in a common directory.
• A new log file is created periodically - by default daily, but this can be set to another time period. For more information, see Configuring the Process Monitor on page 260.
• The filename of the log file identifies the program, for example,
<hostname>_<exename>_PRID_<date>.log.
Log files may be archived using the Directory Maintenance program. For example, this program can maintain a directory containing only today's log files.
A log file utility is provided to allow the quick analysis of log messages, and to filter messages for loading into the database.
A log file with a title including "WrongQuery" is generated when an error occurs during the execution of an SQL script; it captures the text of the last failing script.
Notes:
• Log granularity is normally set to one day so that log entries are added to a file for the day,
then at midnight a new file is created for the next day’s log messages.
• The BAFC log files are first put through a log parser which converts tabs to commas, and
log severities from text, such as "critical", to a number, such as 6.
• The log files are then loaded into the LOGS.COMMON_LOGS table of the database by configuring a standard 110 loader; this configuration can also be created by the OIT.
• The log parser is a Perl application called opx_SCR_GEN_813, which is available under the mediation folder in the backend installer.
About Environment Variables

The method for setting up environment variables is slightly different depending on whether you are on a Windows OS or a UNIX OS.
Setting Up Environment Variables on a Windows OS

3. To set the value for one of the environment variables for the current session, use the command:

set <ENVIRONMENT VARIABLE>=<VALUE>

Where:
<ENVIRONMENT VARIABLE> is the name of the environment variable
<VALUE> is the value to which you want to set the environment variable

An example might be:

set HOSTNAME=UKDC247DT
Tip: It is recommended that HOSTNAME is set to the same value as COMPUTERNAME; the
value of this environment variable should be listed when you run the 'SET' command.
To set up an environment variable on a Windows OS for the session that you are running
and all future sessions:
6. Type the name and value of the new environment variable, and then click OK:
7. Click OK.
8. Click OK.
Setting Up Environment Variables on a UNIX OS

Note: Before setting up an environment variable, you should check to see if it already exists - for example, HOSTNAME is usually set up during the OS installation. To check, run:

echo $HOSTNAME
To set up an environment variable on a UNIX OS just for the session that you are running:

1. Run:

export <ENVIRONMENT VARIABLE>=<VALUE>

Where <VALUE> is the value to which you want to set the environment variable. An example might be:

export HOSTNAME=server1

2. To check that the variable has been set, run:

echo $HOSTNAME
To set up an environment variable on a UNIX OS for the session that you are running and all
future sessions:
1. Open the '.profile' and/or '.bash_profile' file, which can be found in the OPTIMA user's home directory, and add the following two lines:

<ENVIRONMENT VARIABLE>=<VALUE>
export <ENVIRONMENT VARIABLE>

Where <VALUE> is the value to which you want to set the environment variable. An example might be:

HOSTNAME=server1
export HOSTNAME
3. Log off and then log in again to allow the new environment variable to take effect.
4. To check that the variable has been set, run:

echo $HOSTNAME
Encrypting Passwords for OPTIMA Backend Applications

For existing/legacy *.ini files, or those that are created manually, you can use the command line
executable 'opxcrypt' to encrypt the passwords. For more information on the components for which
the *.ini file is created manually, see Which OPTIMA Backend Applications are Affected by
Password Security? on page 39.
To use opxcrypt, run it from the command line with the following options:
o -f File is a single, defined file containing the password to be encrypted. The
filename defined can be a wildcard (for Windows) or a regular expression (for UNIX) -
for example, *.ini.
o -d Path is the directory containing the *.ini files containing the passwords to be
encrypted. If the directory is not defined, then the current directory will be used by
default.
o -r Recursive is an optional tag, indicating that opxcrypt should also look for *.ini
files in all sub-directories of the defined directory.
o -t Tag is an optional parameter naming the INI key that holds the password. If this is not used, the default is "Password" - however, this will not find passwords where 'Password' is a substring, for example, SNMPPassword.
o -v prints the version string. This overrides all other command line parameters.
The password in the selected *.ini file is encrypted. It is placed in parentheses (brackets)
and also prefixed and suffixed by 'ENC' in the INI file - for example, password =
ENC(uvwxyz)ENC.
Note: The syntax supports comma-separated values, for cases where there are multiple IP
addresses/passwords. During encryption, the comma is only ever used as a separator - it is
excluded from the character set available for encoding purposes in order to avoid erroneously
splitting whole passwords.
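For example (the directory and INI entry are illustrative), running:

opxcrypt -d /opt/optima/etc -r

against an INI file containing:

Password=secret

replaces the value with its encrypted form:

Password=ENC(uvwxyz)ENC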
Important: For the LDAP single sign-on to work, the user AI_PROXY is required. The password
must never be changed by an engineer except as part of a patch upgrade.
Which OPTIMA Backend Applications are Affected by Password Security?

The 'Manual Encryption Required' column indicates whether the user needs to use opxcrypt to
encrypt the appropriate element, or whether a configuration UI exists that will do this automatically.
If one does exist, this is shown in the 'Configuration UI Name' column.
For more information on using opxcrypt, see Encrypting Passwords for OPTIMA Backend
Applications on page 38.
The 'Encryption dll Needed in App Directory' column indicates whether or not the encryption .dll
(crypter.dll) must be present in the same directory as the application.
Important: OPTIMA performs internal decryption at the latest point possible prior to connection, in
order to maximise security and ensure that the decrypted password is not available for any longer
than it needs to be.
Note: The OIT encrypts the Oracle connection to the OSS_DICTIONARY and AIRCOM in the
project file. If you update the two passwords in the Database Connection section of the Project
Parameters form, they will be encrypted automatically when Loader ini files are created.
Starting and Stopping the Data Loading Process

To start the data loading process, run:

$OPTDIR/bin/opxstart.sh opxstartsystem.sh
Command Function
opxstart.sh Ensures the backend process is run with the correct environment.
$OPTDIR Sets the root location of the directory tree, that is the location under which
(Environment Variable) ./bin, ./etc, ./parsers, ./loaders, ./validate, and so on can be found.
opxstartsystem.sh Places all of the backend job entries into the cron configuration. Each
process should then start at their next scheduled time.
The cron entries are stored in $OPTDIR/etc/optima.crontab.
To stop the data loading process, run:

$OPTDIR/bin/opxstart.sh opxstopsystem.sh
The opxstopsystem.sh command removes the crontab configuration and stops all backend
processes based on a pattern match for filenames beginning with the string opx.
Checking Log Files

External Programs
Status messages for all external programs are located in a common directory. A new log file is
created every day. You can choose the information level required in the log file by selecting a
severity level. The available levels are: Debug, Information, Warning, Minor, Major and Critical. This
restricts low level severity logging in required cases.
The Log file Utility combines log messages from all logs into a common file for analysis or loading
into the database.
Log messages for external programs can be viewed by a number of different methods. The method
used will depend on the specific issue that is being investigated.
One method is to identify the specific log file associated with a program and display messages in a terminal using the UNIX tail command. For example, the following command will cause the terminal screen to update with new messages as they are appended to the bottom of the log file:

tail -f <hostname>_<exename>_PRID_<date>.log

This is useful when monitoring, in real time, the operation of a particular program, for example a loader.
Use the Log File Analyzer (opxlog utility) to search all external program logs and retrieve particular messages for specific programs or time periods. This is useful when diagnosing problems across all external programs or searching for historical messages.
During initial installation the system will be configured to load specific log messages from external
programs into the database. In general, messages with a severity of "Warning" and above are
loaded every hour into the following table:
LOGS.COMMON_LOGS
Use the Data Explorer, standard reports and modules to display these messages.
Database Programs
All database programs log messages to Oracle tables. These are detailed for each program in the
following chapters.
Use the Data Explorer, standard reports and modules to display these messages.
Maintenance
In usual operation, the data loading programs should not need any special maintenance.
However, it is recommended that the following basic maintenance checks are carried out for the
OPTIMA backend:
Check: Log tables for any messages of Warning level or above for the previous day.
Frequency: Daily.
Purpose: This will identify any major problems that have occurred with the external programs.
Note: An administration report will be provided for this. For more information, see the implementation plan.

Check: For broken Oracle jobs.
Frequency: Daily.
Purpose: This will indicate potential problems in the Oracle processes.

Check: All application input directories for a backlog of files.
Frequency: Weekly.
Purpose: Files older than the scheduling interval should not be in the input directories. If there is a backlog, this indicates a problem with a particular program.

Check: The error directories for files.
Frequency: Weekly.
Purpose: In normal operation, files should not be located in error directories. Check any files in these directories.

Check: System resources on the server and loading workstations.
Frequency: Weekly.
Purpose: If resources, for example disk space, are limited then backup files or data may need to be archived.
Note: An administration report will be provided for this. For more information, see the implementation plan.
Troubleshooting
Troubleshooting information is also provided for each application in the following chapters of this
guide.
Problem: Data not loaded for an interface, or periods of missing data.
Possible causes: Raw data has not been received from the network; invalid or corrupt data has been received from the network; an application in the data loading process has failed; a backend application has not been scheduled.
Actions:
• Check error logs for any warning messages for this period. Investigate any messages of Warning or above severity.
• Check the Process Monitor to ensure that all processes are scheduled for that interface.
For a file-based interface:
• Check all external directories for file backlogs or error files.
• If files are in an error directory for a particular application, move these into the input directories and see if they are processed. If necessary, the message severity level of the application can be lowered to output more debugging information. Investigate any errors output.
• Check the FTP backup directories to see if files have been received for this time period. If files have not been received for this time period, investigate whether files have been produced from the network.
For a database-based interface or summary data:
• Check that data exists in the source database or raw tables for this period.
• Check that the relevant summary report has run.

Problem: No data is being loaded.
Possible causes: The backend has been stopped; database connection problem; network problems.
Actions:
• Check cron entries for the OPTIMA user.
• Check for any broken Oracle jobs.
• Check the Process Monitor to ensure that all processes for that interface are scheduled and are running.
• Check for files in the input directories of the FTP application. If these exist, then check the input directories of the following applications to find at what point the process is failing: FTP, Parser, Data Validation, Loader.

Problem: An application cannot be started or scheduled.
Possible cause: A monitor file exists that has not been removed by the Process Monitor.
Actions:
• Remove the monitor file.
• Check the Process Monitor settings.
About the Log File Analyzer (Opxlog Utility)

You run the Log File Analyzer from the command prompt. You specify various options in a separate
INI file. For more information about these options, see Configuring the Log File Analyzer on page
45.
Important: Before running the Log Files Analyzer, you should ensure that you have set up all of the
prerequisites. For more information, see Prerequisites for Using the Log File Analyzer on page 44.
To start the Log File Analyzer, type the script name and a configuration file name into the command
prompt. If you are creating a new configuration file, this is when you choose the file name.
In Windows type:
opx_SCR_GEN_813.exe opx_SCR_GEN_813.ini
In Unix type:
opx_SCR_GEN_813 opx_SCR_GEN_813.ini
For more information about the configuration (INI) file, see Example Log File Analyzer
Configuration (INI) File on page 47.
Prerequisites for Using the Log File Analyzer

2. If PERL is not installed, install 5.8.9 build 825 - for Windows, use the 'ActivePerl-5.8.9.825-
MSWin32-x86-288577.msi' file available on the TEOCO Intranet. Ensure that you leave
everything as default.
For Windows, the PATH and PERL5LIB environment variables should look similar to the following:
• PATH=C:\Perl\site\bin;C:\Perl\bin;...oracle_path ...
• PERL5LIB=C:\Perl\lib;C:\Perl\site\lib
For more information on setting the environment variables, please see About Environment
Variables on page 34.
Configuring the Log File Analyzer

You specify the following parameters in the Log File Analyzer configuration (INI) file:

Parameter Description
fileMask By default, the Log File Analyzer will analyze any file with the extension .log.
However, you can use this parameter to filter on log file names containing a
particular string.
For example, if you use the fileMask '.FTP.*' then the application will return
opx_FTP_GEN_302_111302111_20100108.log.
log_severity Sets the level of information required in the log file. The available options are:
1 - Debug
2 - Information (Default)
3 - Warning
4 - Minor
5 - Major
6 - Critical
StandAlone 0 – Run the application without a monitor file. Do not select this option if the
application is scheduled or the OPTIMA Process Monitor is used.
1 – Run the application with a monitor file.
myprid Specify the Program ID of the Log File Analyzer (in order to handle multiple
start-ups).
HeartBeatDeferTime Use this option to defer the HeartBeat function if it was run recently.
Specify a number of seconds - the default is 5.
newf Set the base name for the output file (use with -outdir). (Alternative to 'outf'.).
outf Set the output file name for appending. (Alternative to 'newf'.)
historyfile Set history file name. Match input files (and input lines) only with timestamps
greater than the previous one, as read in from the history file (if available). The
file is updated with the latest timestamp at the end. (In this way, one can
generate a sequence of logs that cover the input files without overlap.)
Parameter Description
ds, de, hs, he Use this parameter instead of the historyfile parameter to set the start and end
times in terms of days, hours or minutes. Only one pair of values (ds/de,
hs/he) can be specified.
For example you can define 'hs' and 'he' to match input files/lines only with
timestamps from 'hs' hours ago to 'he' hours ago.
The default is ds=1, de=0.
sev You can choose to match to a minimum severity level. Possible values are:
1 : Match 1 (DEBUG) and above
2 : Match 2 (INFORMATION) and above (the default)
3 : Match 3 (WARNING) and above
4 : Match 4 (MINOR) and above
5 : Match 5 (MAJOR) and above
6 : Match 6 (CRITICAL) and above
rec 1 - (Default) Recurse into sub-directories.
0 - Do not recurse into sub-directories.
prid By default, the Log File Analyzer will analyze any file with the extension .log.
However, you can use this parameter to filter on log file names matching a
specified Program ID.
pridf By default, the Log File Analyzer will analyze any file with the extension .log.
However, you can use this parameter to filter on log file names matching any
of the IDs contained in the specified file.
quote 1 - Quote the log message in each output line (with "").
0 - (Default) Do not quote the log message.
txtsev 0 - Transform the severity strings into their numeric codes (for example
'INFORMATION' becomes '2').
The numeric codes are described in 'log_severity'.
1 - (Default) Do not transform.
header 1 - (Default) Insert a header with column names as the first line of the output
file.
0 - Do not insert header.
mdgno Filter lines by Message Number (in addition to all other rules).
If the message number field of the input line matches the mdgno string (regular expression), then proceed with normal filtering rules (including by severity).
Example Uses for the Opxlog Utility

opxlog Print to screen all log messages, for today, from files under /var/aircom optima/log.
opxlog -source=/aircom optima/archive/log Print all log messages under /aircom optima/archive/log.
opxlog -ds=1 Print all log messages from midnight yesterday to now.
opxlog -ds=7 Print all log messages for the past week.
opxlog -ds=1 -de=1 Print all log messages from yesterday only (midnight to midnight).
opxlog -ds=1 -de=1 -sev=3 Print all messages of severity Warning and above from yesterday.
opxlog -hs=1 Print all messages from the previous hour to now. Based on whole hours. For
example, if now is 12:30 then it would print all messages from 11.00 to 12:30.
opxlog -hs=1 -he=1 Print all messages from the previous hour. For example, if now is 12:30 then it
would print all messages from 11.00 to 12:00.
opxlog -hs=1 -he=1 -prid=000010001 Print all messages from the previous hour for PRID 000010001 only.
opxlog -hs=1 -he=1 -outf=my.log Append all messages from the previous hour to file my.log in the local directory.
opxlog | grep mytext Print to screen all log messages for today containing the text mytext. This is useful for finding messages with a particular error code or string - Unix only.
opxlog | grep mytext > myfile.log Create file myfile.log containing all log messages containing the text mytext - Unix only.
opxlog | sort -k2 Print all log messages sorted in date/time order to screen – Unix only.
Example Log File Analyzer Configuration (INI) File

[DIR]
source=/OPTIMA_DIR/<application_name>/in
outdir=/OPTIMA_DIR/<application_name>/out
mondir=/OPTIMA_DIR/<application_name>/pid
tempdir=/OPTIMA_DIR/<application_name>/temp
logdir=/OPTIMA_DIR/<application_name>/log
[OPTIONS]
fileMask=.PAR.*
fileMask=.FTP.*
fileMask=*.*
log_severity=1
StandAlone=0
myprid=001813001
HeartBeatDeferTime=5
newf=commonlog
outf=test\output.log
historyfile=test\history
ds=2
de=1
hs=3
he=0
sev=1
rec=1
prid=12345678
pridf=PRIDfile
quote=0
txtsev=1
header=1
mdgno=500
Important: In the [DIR] section (also called [Directory Parameters] in some INI files), you should
replace:
• OPTIMA_DIR with the OPTIMA home directory, which is set as an environment variable,
and will be different depending on whether you are using Windows or UNIX
• <application_name> with the name of the application (minus any file extension)
• Forward slashes with back slashes, if you are using Windows
1. Ensure all users are logged out as you do not want them to perform transactions on the
database.
3. Stop the loaders/cron. Using Edit crontab, comment out the FTP process; this will prevent any further collection of data.
Important: The other applications should finish processing collected files before you
continue shutting-down the server.
Tip: For a faster shutdown process, export the crontab using crontab -l > $OPTDIR/scripts/crontab.txt, and then delete it using crontab -r.
4. Shutdown all databases, including Citrix and test databases, using the shutdown
immediate command.
This may take some time as the command waits until all transactions are complete before
committing the database to disk and stopping the RDBMS.
The Oracle Listener and databases (Citrix, Production, Test) will start up automatically.
4. If you backed up and removed the crontab, restore it with the command crontab $OPTDIR/scripts/crontab.txt.
5. If you commented out the FTP process, make the process active again.
For example, you may be running a daily network summary that covers a network across multiple
time zones. If the last hour of data from the farthest part of the network is 5 hours behind the rest of
the network, there will be a delay of 5 hours on the summary. This in turn will affect the schedule.
If time zone support is not used and the client and database machines are in different time zones,
there could be ambiguity in scheduled time.
You may also have network elements that have child nodes that span time zones - for example,
MSCs with BSCs in regions that have different time zones. If time zone support is not used, this
could cause problems because there would be data from two different time zones coming in - for
example, 9am ET (Eastern Time) is 8am CT (Central Time). This means that if the BH is
summarized at 9am, it would not be truly representative of the elements in both time zones.
To manage time zone support, there are a number of different time definitions used in OPTIMA:

Local Time: Date and time of data, stored as the date and time of the data. Also known as Consistent Time.

Natural Time: Date and time of data, driven by the local time zone.

Universal Time: Date and time of data, driven by the universal time zone. Also known as the System Time.

Selected Time: Date and time of data, driven by the selected time zone. By default, this is the same as the Universal Time.

User Time Zone: The time zone that the connected user/process is within. This is displayed in the OPTIMA Message Log.
Note: If the client is run over Citrix, the User Time Zone is still regarded as where the client is located, not where the Citrix server is located.

Universal Time Zone: The time zone in which the database is located. Also known as the System, Global or Database Time Zone.
Important: Currently, time zone support for alarm forwarding is not available.
For most users, however, this is just read-only access - for example, they cannot edit or delete any data or tables.
Important: A special 'power user', the DBACCESS user, can access the database to create new
objects.
In Oracle, when the OPTIMA database is created, all users are assigned the Oracle role
'OPTIMA_DEFAULTS', which does not require a password.
A separate, dedicated role exists for each OPTIMA backend application, which are described in the
following table:
Application Role
These application roles are session-based, and only activated when the user logs into the
appropriate application - if the same user tries to use an application outside OPTIMA to access the
data and configuration tables (for example, SQLPLUS or TOAD) they will only have read-only
access again.
Important: Because grants are assigned through roles, users cannot grant themselves other rights.
Therefore, to extend the privileges for a user, the database administrator must:
• Grant the appropriate application role
• Grant the 'OPTIMA_DEFAULTS' role, and make this one the default role for the user
OPTIMA_ADMINISTRATOR OPTIMA_ADMINISTRATORS_CG
OPTIMA_ADVANCED_USER OPTIMA_ADVANCED_USERS_CG
OPTIMA_USER OPTIMA_USERS_CG
OPTIMA_USER_ADMINISTRATOR OPTIMA_USER_ADMINISTRATORS_CG
OPTIMA_ALARM_ADMINISTRATOR OPTIMA_ALARM_ADMINISTRATORS_CG
This means that when a new user is created and assigned to a particular user type, they
will be assigned to the corresponding consumer group at the same time.
• For the OPTIMA back end, the consumer groups are based on the backend application:
This means that when a user logs into a particular application, they will be assigned to the
corresponding consumer group at the same time. For example, when
OPTIMA_LOADER_PROCS logs into the Loader, they will automatically be assigned to the
OPTIMA_LOADER_PROCS_CG consumer group, and receive the specified allocation of
database memory for a member of that group.
These resource groups must be used in conjunction with resource plans, which define how
resources are balanced across the system (in terms of % share) according to business rules.
Note: This percentage share will only be enforced when resource consumption has reached
capacity (in other words, 100%).
As a simple example, a 'DAYTIME' plan may distribute the resources in one way, while another
'NIGHTTIME' plan distributes them in another way:
Consumer Group DAYTIME (%) NIGHTTIME (%)
OPTIMA_ADMINISTRATORS_CG 20 50
OPTIMA_ADVANCED_USERS_CG 10 10
OPTIMA_USERS_CG 60 20
OPTIMA_USER_ADMINISTRATORS_CG 5 10
OPTIMA_ALARM_ADMINISTRATORS_CG 5 10
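As an illustration only (this is not the OPTIMA default plan), a plan such as DAYTIME could be defined using Oracle's Resource Manager:

BEGIN
  dbms_resource_manager.create_pending_area();
  dbms_resource_manager.create_plan(
    plan    => 'DAYTIME',
    comment => 'Example daytime resource plan');
  dbms_resource_manager.create_plan_directive(
    plan             => 'DAYTIME',
    group_or_subplan => 'OPTIMA_USERS_CG',
    comment          => 'Interactive users get the largest daytime share',
    mgmt_p1          => 60);
  -- ... one directive per consumer group, plus the OTHER_GROUPS
  -- directive that Oracle requires in every plan ...
  dbms_resource_manager.submit_pending_area();
END;
/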
OPTIMA has a default resource plan, which is assigned at the start of the deployment of OPTIMA.
This contains a number of subplans associated to the consumer groups for the different
components of OPTIMA - for example, Loader, SNMP, Summary and so on.
666 Started logging for Nokia 3G XML Parser Version - <ProgramCode>. INFORMATION
1003 Error Accessing Input Directory. CRITICAL
1004 Error Accessing Output Directory. CRITICAL
1005 Error Accessing Backup Directory. CRITICAL
1006 Error Accessing Error Directory. CRITICAL
1008 Error CRITICAL
1009 Started Parsing Input File : <fileName>. INFORMATION
1011 Error Accessing PID Directory. CRITICAL
1012 Ended Parsing Input File : <fileName>. DEBUG
1020 Error Accessing Temp Directory. CRITICAL
1021 File : <FileName> Successfully Copied To Archive Dir <ArchiveDir> INFORMATION
Refreshing list of files in Input Directory. DEBUG
Unable to Move temporary combiner Log File <fileName> to Combiner Log Directory <CombinerDir>. WARNING
Unable to delete File : <fileName>. WARNING
Deleting temp new counter file because no new counters were found DEBUG
7013 Error Found when processing file, deleting input file : <filename>. INFORMATION
Closing new line file stream <NewCounterFileName>. DEBUG
7701 Input file header column <ColumnName> is not in any reports for new counters. DEBUG
Creating output stream for report <FileName>. DEBUG
7702 Input file header column <ColumnName> is not in any reports. WARNING
Failed to create output stream. DEBUG
About Data Acquisition Tools
As well as using these tools, an alternative method of data acquisition is to use the OPTIMA
Summary application to load data from one database directly into another. For more information,
see Using the Summary for Direct Database Loading on page 329.
Important: These acquisition methods apply to file based data acquisition only. Network data
acquisition carried out by OPTIMA using Simple Network Management Protocol (SNMP) is
described in the next chapter. SNMP data acquisition carried out using the new Netrac interface is
handled by Netrac as described in the SNMP Agent User Guide for Netrac.
FTP Process
When scheduled, the FTP application regularly monitors a remote directory for new files. When
new files are detected, they are transferred to the local machine. Transfer takes place using a local
temporary file to ensure that the Parser does not start to parse the file before transfer is complete.
Status and progress messages are recorded in a log file.
A local list file ensures that files are not transferred twice. The list file keeps a record of all files on the remote server that have already been downloaded, and is refreshed every time the application is run.
The script can be configured to only look for new files on the remote server for a given number of
previous days. For example, if configured for three days then only directories for the latest three
days are searched. This facility is based on all files on the remote server being located in a new
directory each day.
FTP Functionality
Function Action
Logging Status and error messages are recorded in a daily log file.
Error Files If the application detects an error in the input file that prevents processing of that
file, then the file is moved to an error directory and processing continues with the
next file.
Monitor Files The application runs in a scheduled mode. A monitor (PID) file, created each time
the application is started, ensures that multiple instances of the application cannot
be run. The PID file is also used by the OPTIMA Process Monitor to ensure that the
application is operating normally.
PRID The automatically-assigned PRID uniquely identifies each instance of the
application. It is composed of a 9-character identifier, made up of Interface ID,
Program ID and Instance ID.
The Interface ID is made up of 3 numbers but the Program ID and Instance ID are
both made up of 3 characters, which can be a combination of numbers and
uppercase letters.
For more information, see About PRIDs on page 29.
Backup The application can store a copy of each input file in a backup directory.
As well as providing basic transfer of data files from a remote server, the FTP application also
provides the following functions:
Item Description
Archive Storage You can backup any files that are transferred by the FTP, as well as store any
historical error files.
Overload Storage If the FTP input folder reaches its defined limit (in terms of percentage of disk space
usage), to avoid overloading the disc, any files over the limit can be moved to a
separate disk and processed from there after the input folder has been emptied.
Note: When moving files to the archive and/or overload folders, the FTP tars these files to keep the
number of files stored to a minimum.
To use these functions, you must ensure that both your folder mounts and FTP INI file parameters
are configured correctly.
Note: The Parser, Data Validation and Combiner applications also support these functions. For
more information, see the respective chapters for these applications.
FTP: The standard FTP application, which transfers data files from a remote server using File Transfer Protocol.

FTP PUT: The FTP application that transfers data files in the opposite direction - that is, from the local client to the remote server - using File Transfer Protocol.
Important: If you are running the FTP application on Windows, you can only use this mode if you are using the 24-hour time format.
SFTP with SSH Key Authentication (Windows or UNIX): The FTP application in secured mode, using SSH Key Authentication.
This mode uses passwordless key authentication, rather than the password parameter stored in the INI file, and so is particularly useful for systems where the security policy requires account passwords to be changed periodically.
Important: This method is strongly recommended, as it is faster and more reliable than the regular FTP method (described below).
Important: If you want to use the FTP in either secured mode or secured mode with SSH Key
Exchange Authentication, then you must install a number of additional modules. For more
information, see:
• Prerequisites for Using the SFTP with SSH Key Exchange Authentication (for Windows
Servers) on page 62
• Prerequisites for Using the SFTP on page 61
To install the FTP application, install the following files in the backend binary directory:
• opx_FTP_GEN_302.exe (Windows)
• opx_FTP_GEN_302 (Unix)
Important: You must install the backend components using the OPTIMA combined backend
package, which will install the required components quickly and easily. For more information on
how to use this, see Installing the OPTIMA Combined Backend Package on page 21.
Note: For Windows, a Perl Interpreter must also be installed. TEOCO recommends using
ActivePerl. You can read more about ActivePerl at this location:
http://www.activestate.com/activeperl/
To start the FTP, type the script name and a configuration file name into the command prompt. If
you are creating a new configuration file, this is when you choose the file name.
In Windows type:
opx_FTP_GEN_302.exe opx_FTP_GEN_302.ini
In Unix type:
opx_FTP_GEN_302 opx_FTP_GEN_302.ini
In usual operation within the data loading process, all applications are scheduled, so you should
not need to start the FTP manually.
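For reference, a crontab entry that runs the FTP application every 15 minutes might look like this
(an illustrative sketch only; the binary and INI paths will differ per installation):
*/15 * * * * /optima/bin/opx_FTP_GEN_302 /optima/ini/opx_FTP_GEN_302.ini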
Important: If you want to use the FTP in either secured mode or secured mode with SSH Key
Exchange Authentication, then you must install a number of additional modules. For more
information, see:
• Prerequisites for Using the SFTP on page 61
• Prerequisites for Using the SFTP with SSH Key Authentication (for Unix Servers) on page
65
Important: You cannot use this method with Windows operating systems. Instead, you must follow
the instructions in Prerequisites for Using the SFTP with SSH Key Exchange Authentication (for
Windows Servers) on page 62. SSH key authentication is strongly recommended for both Windows
and non-Windows operating systems, as it is faster and more reliable.
You can check whether these prerequisites are already installed; the method depends on your OS.
The options are:
• To check whether Perl has been installed, and at which version:
o Sun Solaris: pkginfo | grep -i perl, then perl -v
o HP-UX: swlist | grep -i perl, then perl -v
o RedHat Linux: rpm -qa | grep -i perl, then perl -v
• To check whether openssl and GMP have been installed:
o Sun Solaris: pkginfo | grep -i gmp and pkginfo | grep -i openssl
2. If you are installing the Net::SFTP CPAN module on a server that has HTTP network
access to the Internet, then run:
- or -
If you are installing the Net::SFTP CPAN module on a server that does not have HTTP
network access to the Internet, then:
o Copy the SFTP CPAN library distribution (provided by TEOCO) for your platform to the
server where the OPTIMA FTP program will run.
This consists of a gzip compressed TAR file containing all of the required perl modules.
o Decompress the file and extract it to the 'perl5' library directory on the server.
3. Follow the operating system-specific instructions that are included with the distribution.
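The installation command itself is not reproduced above. For a server with HTTP access to the
Internet, a typical CPAN installation command (an assumption, not the original text) would be:
perl -MCPAN -e "install Net::SFTP"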
If you also want to use SFTP compression, you must additionally install the following CPAN Perl
modules:
• Compress::Zlib
• Compress::Raw::Bzip2
• Compress::Raw::Zlib
• Crypt::Blowfish
• Crypt::CBC
• Crypt::DES
• Crypt::DH
• Crypt::DSA
• Crypt::Primes
• Crypt::RSA
• Crypt::Random
• Digest::BubbleBabble
• Digest::HMAC
• Digest::MD2
• Digest::SHA
• IO::Compress
• IO::Zlib
• Net::SSH
• Net::SSH::Perl
Prerequisites for Using the SFTP with SSH Key Exchange Authentication (for
Windows Servers)
To use the FTP in secured mode (SFTP) with SSH Key Authentication (which is the strongly
recommended SFTP option) on a Windows server, you must complete the following prerequisites:
Note: If you are a TEOCO installation engineer, all of these files are available on the intranet.
Otherwise, please contact Product Support.
3. Reboot the machine if required, to ensure that the PATH and PERL5LIB environment
variables are updated.
In the dialog box that appears, click the Advanced tab, and then click Environment
Variables.
In the System Variables pane, add or amend the PATH and the PERL5LIB
environment variables.
To see the new or updated environment variables in a command prompt, you must
open a new command prompt window.
5. Check that the correct Perl version has been installed, by typing perl -v in the command
prompt.
6. On the 'C:\' drive, browse to the Perl location, and delete the Perl folder and all of its
contents.
7. Run the 'Perl_5.8.9.825_with_packages.exe' file, which will re-create and re-populate the
original Perl folder on the C:\ drive with all necessary packages.
Notes:
o To add users from a domain that is not the primary domain of the machine, add the
domain name after the user name.
o Omitting the username switch adds ALL users from the machine or domain, including
service accounts and the Guest account.
9. Still using the command prompt, create the SSH authentication private key:
o Start the OpenSSH server, by typing: net start opensshd.
o Create the private and public key by typing: ssh-keygen -t rsa.
o When prompted to enter the file in which to save the key, type id_rsa.
o When prompted to enter the passphrase, press <ENTER>.
o When prompted to enter it again, press <ENTER>.
o The private key, in the 'id_rsa' file, should be created in 'C:\Program
Files\OpenSSH\bin'. Ensure that a copy also exists in 'C:\Documents and
Settings\<user_name>\.ssh'.
Tip: If the '.ssh' folder does not exist and Windows does not allow you to create one
manually, then you should make an SSH connection (for example, by typing ssh
user@IP_address) and, when prompted to continue connecting, choose Yes. The .ssh
folder is then created for you under 'C:\Documents and Settings\<user_name>\'.
o Make a copy of the public key file (id_rsa.pub), which should be in the same location
as the private key (for example, 'C:\Program Files\OpenSSH\bin'), and name the copy
id_rsa.pub_username.
10. FTP the 'id_rsa.pub_username' public key/file into the home/.ssh folder on the server where
you will connect to download the files using the FTP application:
o The .ssh folder on the server may need to have its mode set, using: chmod 700 .ssh
o Run cat id_rsa.pub_<username> >> authorized_keys
o The 'authorized_keys' and 'id_rsa' files in the .ssh folder may need to have their
mode set, using: chmod 600 authorized_keys id_rsa
11. Check the IP Address and Host Name of your Windows machine, by typing ipconfig
/all in the command prompt.
12. Log in as root on the server, to be able to add your IP address and host name in the
/etc/hosts file.
13. Ensure that you can connect to the server machine by typing ssh user@IP_address in
the command prompt.
Important: If you have followed all of the steps, you should NOT be prompted for a
password. If you are, double-check that the private key is in your home directory and
that the public key is in place on the server machine.
14. You can now run the FTP application in secure mode.
Prerequisites for Using the SFTP with SSH Key Authentication (for Unix Servers)
To use the FTP in secured mode (SFTP) with SSH Key Authentication (which is the strongly
recommended SFTP option) on a Unix server, you must complete the following prerequisites:
Note: If you are a TEOCO installation engineer, all of these files are available on the intranet.
Otherwise, please contact Product Support.
3. On each host where the FTP Application is installed, as the 'optima' user generate an SSH
key pair using the following command:
ssh-keygen -t rsa
The RSA public key is written to the ~/.ssh/id_rsa.pub file and the private key to the
~/.ssh/id_rsa file.
Then make a copy of the public key, named opt_authorized_keys:
cp id_rsa.pub opt_authorized_keys
6. Copy the opt_authorized_keys file to the remote SFTP server, for example, using SFTP,
and append the public key in the opt_authorized_keys file into the authorized_keys file on
the remote server:
cd ~/.ssh
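The append command itself is not shown above; a minimal sketch, assuming the
opt_authorized_keys file was copied into the remote user's home directory, would be:
cat ~/opt_authorized_keys >> authorized_keys
chmod 600 authorized_keys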
Using Commenting
The FTP configuration (INI) file supports the following types of commenting:
• Windows, using this symbol (;).
• UNIX, using this symbol (#).
Lines are parsed for the first occurrence of a comment symbol. Once a comment symbol is found,
the rest of the line is ignored. Lines using the [Grouping] notation are also ignored but only if this
symbol ([) is found at the beginning of the line.
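For example (an illustrative sketch; the parameter and value are arbitrary):
; on Windows, this whole line is ignored
# on UNIX, this whole line is ignored
[Processing parameters]
verbose=1 ; everything from the first comment symbol onwards is ignored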
In Windows type:
SET ENV_VAR=xyz
In UNIX type:
export ENV_VAR=xyz
Note: If you are batching the program, then the environment may not inherit the user environment.
In this case, it is safer to reset environment variables before running the FTP application.
For more information about the Perl regular expression syntax, see:
http://search.cpan.org/dist/perl/pod/perlre.pod
This table gives examples of some regular expressions that you might use in the FTP configuration
(INI) file:
Regular Expression Description
^ Match the beginning of a string. For example, the expression ^CSV will match
CSV at the beginning of a string.
$ Match the end of a string. For example, the expression CSV$ will match CSV at
the end of a string.
. Match any character except newline. For example, the expression C.V will
match a C followed by any single character (except newline) followed by a V.
* Match 0 or more times. For example, the expression CS*V will match a C
followed by zero or more S's followed by a V.
+ Match 1 or more times. For example, the expression CS+V will match a C
followed by one or more S's followed by a V.
? Match 1 or 0 times. For example, the expression CS?V will match a C followed
by an optional S followed by a V.
| Alternation. For example, the expression C|V will match either C or V.
() Grouping. For example, the expression CSV(04|05) will match CSV04 and
CSV05.
[] Set of characters. For example, the expression [CSV] will match any one of C,
S, and V.
{} Repetition modifier. For example, the expression CS{2,4}V will match a C
followed by 2, 3 or 4 S's followed by a V.
\ Quote (escape) the next character. For example, the expression C\.V will match
C.V exactly.
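For instance, a fileMask restricting collection to gzipped CSV files might be (an illustrative
sketch, not from the original):
fileMask=^CSV.*\.gz$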
2. If it successfully connects, it starts downloading files, and exits after finishing.
Note: If it does not connect to the first IP address, it tries the next IP address in the list and
it logs a message for the failed connection. If all IP addresses fail to connect, then it logs a
message to indicate this.
The FTP downloads from the first host with the following settings in the configuration (INI) file:
remoteUser = optima,optima,optima
remotePass = optima,optima,optima
remoteDir = /export/home/optima
If the first host is not valid, the FTP downloads from the second host, with the following settings in
the configuration (INI) file:
remoteUser = xxx,optima,optima,optima
remotePass = yyy,optima,optima,optima
remoteDir = /export/home/optima
Processing Specifies how the FTP program will run. For example, it includes the parameters for
setting when the FTP process should start and for how many days it should collect data.
For more information, see Processing Parameters on page 69.
Directory Specifies the directories that are used by the FTP program. For example, it includes the
parameters for setting the directories for processing files and outputting logs. For more
information, see Directory Parameters on page 73.
Filename Specifies how FTP files and directories are matched. For example, it includes the
parameters for setting the file and directory masks and for prepending directory names to
filenames. For more information, see Filename Parameters on page 74.
FTP Specifies FTP connection details. For example, it includes the parameters for setting the
location and login details of the remote host. For more information, see FTP Parameters on
page 76.
Important: If you are using the FTP PUT option (see the sketch after this list), you should read all
references in this chapter (except where this parameter is specifically mentioned) as follows:
• Local/client machine becomes the source
• Server/remote host becomes the destination
• 'Download' becomes 'upload'
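As a minimal sketch (the upload parameter is referenced in the Directory and FTP parameter
descriptions later in this chapter), PUT mode is enabled in the configuration (INI) file with:
upload=1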
Processing Parameters
The following table describes the Processing parameters for the FTP script:
PRID (no default; required in all INI files)
Uniquely identifies each instance of the application. It is composed of a 9-character identifier,
made up of Interface ID, Program ID and Instance ID.
The Interface ID is made up of 3 numbers, but the Program ID and Instance ID are both made up
of 3 characters, which can be a combination of numbers and uppercase letters.
For more information, see About PRIDs on page 29.
startOnDay (default: 0; optional)
Number of days back from today to start searching for files to download.
If you set this to a negative value (for example, startOnDay=-1), it will collect files with a date in
the future. You should do this if you are working across multiple timezones.
numberOfDays (no default; required in all INI files)
Number of days, counting back from startOnDay, to search for new files on the remote host.
datedDir (no default; required in all INI files)
0 - Directories are not dated. However, files can be dated if dateFormat is not set to 0.
n - The directory n levels down from remoteDir is a dated directory of format dateFormat.
dateFormat (no default; required in all INI files)
Date format to use for datedDir, where the available options are: M-D-YYYY, M_D_YYYY,
MDYYYY, M-D-YY, M_D_YY
noFilesInList (no default; required if datedDir=dateFormat=0)
The number of downloaded files that will be maintained in a list file if datedDir and dateFormat
are both set to 0. For example, if noFilesInList=1000, then 1000 files will be maintained in the
list file.
MaxFilesInDownloadDir (default: 10000; optional)
The maximum number of files that should be in the FTP Download Directory at any time. If the
number of files in the output directory is greater than this, then no more files will be downloaded.
verbose (default: 0; optional)
0 - Run silently.
1 - Print status messages to the screen.
backup (default: 1; optional)
Indicates whether a copy of the input file will be copied to the backup directory (1) or not (0). If
you do not choose to back up, the input file is deleted after it has been processed.
useMonitorFile (default: 1; optional)
0 - Script does not use a monitor file.
1 - Script uses a monitor file.
Directory Parameters
The following table describes the Directory parameters for the FTP script:
Important: If you are using the FTP PUT functionality (in other words, the 'upload' parameter is set
to 1) then all of these parameters except for the FTPOutDirectory will still be used. This means that
all of the log directories, PID file directories and so on will remain on the client machine. However,
instead of the FTPOutDirectory, you should specify the FTPInDirectory, which represents the
source directory for the files that you want to upload. For more information, see FTP Parameters on
page 76.
Filename Parameters
The following table describes the Filename parameters for the FTP script:
dirMask (required in all INI files)
Regular expression mask used for directories. Use ^$ to prevent directory recursion.
fileMask (required in all INI files)
Regular expression mask used for files.
FTP Parameters
The following table describes the FTP parameters for the FTP script:
remoteHost (required in all INI files)
Remote hostname (or IP address) from which to download files.
If you are using SFTP, you can also use this parameter to specify an alternative port for the FTP
traffic, if you do not want to use the default. To do this, add the port number after the hostname/IP
address, separating the two with a colon (':'). For example, remoteHost=100.200.30.1:4567,
where 100.200.30.1 is the IP address and 4567 is the port number.
remoteUser (required in all INI files)
Username for login to the remote host.
remotePass (required in all INI files)
Password for login to the remote host.
remoteDir (required in all INI files)
Parent directory on the remote host from which to download files, in the format {dir1}[,{dir2}].
Tip: You can specify multiple remote directories by listing directories separated by commas.
Important: If you are using FTP PUT (in other words, the 'upload' parameter is set to 1), then this
will be the destination folder on the remote host.
remoteArchiveDir (optional)
Archive directory (flat structure) on remoteHost to which files, when downloaded, are moved
using the same final filename as used by OPTIMA.
removeOnDownload (default: 0; required in all INI files)
0 - Files are not deleted from remoteHost.
1 - Files are deleted from remoteHost when downloading is complete.
FTPType (default: ASCII; required in all INI files)
The mode of FTP: ASCII or BINARY.
FTPActive (default: 0; optional)
Indicates whether the FTP is in passive mode (0) or active mode (1).
In passive mode, the client initiates both connections to the server, solving the problem of
firewalls filtering the incoming data port connection to the client from the server.
In active mode, the client connects from a random unprivileged port (N > 1023) to the FTP
server's command port, port 21. Then, the client starts listening on port N+1 and sends the FTP
command PORT N+1 to the FTP server. The server will then connect back to the client's
specified data port from its local data port, which is port 20.
FTPSafetyPeriod (optional)
Safety period, in minutes, for files still being written on the local machine.
FTPStyle (required in all INI files)
The style of FTP: stdUNIX, stdWINDOWS, or a custom listing format such as:
DIR,X,X,X,SIZE,DATE,TIME,NAME,DATE,TIME,SIZEORDIR,NAME
FTPDateFormat (required if FTPSafetyPeriod or PrependTimestamp is used and a standard
FTPStyle is not used)
The FTP date format.
FTPTimeFormat (required if FTPSafetyPeriod or PrependTimestamp is used and a standard
FTPStyle is not used)
The FTP time format.
If you are using the archiving or overload functionality of the FTP application, then you can use a
regular expression in the archivePeriodMask parameter to indicate the file name and which files
should be included in the tar file, subject to the limitations specified by the archiveMaxFiles and
archiveMaxSize parameters.
The default value for archivePeriodMask is .*, which means that all files are matched and nothing is
extracted. In this case, all of the files will go into the same tar file. However, by using bracketed
sections, you can choose to match and group according to a narrower definition.
Consider a simple example, where you have the following files that will be tarred:
• File1: '1320_abc_20100322_1152_qwexoiqwe'
• File2: '2344_abc_20100322_1156_crwercwerw'
• File3: '4234_abc_20100322_1242_wcrwercwer'
Where
• 20100322 represents the creation date
• 1152, 1156 and 1242 all represent the creation time
You may want to group these files in tar files according to the date and hour they were created, and
use this for the file name. To do this, you could use the regular expression abc_(\d{8}_\d{2})\d{2}_.
The bracketed section matches '20100322_11' for File1 and File2, and '20100322_12' for File3, so
the first two files are grouped into one tar file and the third into another.
[Processing parameters]
SFTP=0
SFTPcompression=0
localFile=0
localFileList=dir
PRID=001302011
startOnDay=0
numberOfDays=20
datedDir=0
dateFormat=0
noFilesInList=10000000
MAXFilesInDownloadDir=10000
verbose=1
backup=1
useMonitorFile=1
unzipCommand=C:\Programs\gunzip.exe
zipExtension=.zip
unzipAfterTar=0
untarCommand=C:\Programs\gtar.exe
tarExtension=.gtar
LogSeverity=1
HeartbeatDeferTime=5
archiveBackup=0
archivePeriodMask=.*
archiveMaxFiles=100
archiveMaxSize=10000
archiveCommand= /bin/gtar
maxOutputFilesystemPercent=90
alternativeOutputFilesystem=0
maxAlternativeFilesystemPercent=90
[Directory parameters]
optimaBase=/OPTIMA_DIR/<application_name>
LogDirectory=/OPTIMA_DIR/<application_name>/log
ProcDirectory=/OPTIMA_DIR/<application_name>/pid
FTPOutDirectory=/OPTIMA_DIR/<application_name>/out
FTPDownloadDir=/OPTIMA_DIR/<application_name>/download
FTPErrorDir=/OPTIMA_DIR/<application_name>/error
FTPFileListDir=/OPTIMA_DIR/<application_name>/list
FTPBackupDir=/OPTIMA_DIR/<application_name>/backup
FTPAltDirectory=/opt/AIoptima/opt_perf/ftp/
[Filename parameters]
dirMask=.*
fileMask=.*
excludeMask=^$
MAXFileSize=2000000000
MINFileSize=0
UseFolderFileLimit=1
FolderFileLimit=100
PrependSubDir=1,2
PrependSubStr=[0-9]{4}.*
PrependString=_BSC1_
PrependTimestamp=0
PrependHostname=1
PrependCollectDateTime=1
ReplaceColonWith=_
ReplacePoundWith=|
RemoveFromFileName=[0-9]{12}.
PrependSeparator=_
AppendSubDir=1
AppendSeparator=_
AppendString=_BSC2_
AppendBefore=.xls
AppendSubStr=[0-9]{4}.*
removeZipExtBeforeMatch=0
[FTP parameters]
remoteHost=192.168.3.35
remoteUser=optima
remotePass=ENC(kknbeX)ENC
remoteDir=/data02/home/optima/upender/ftp/input_files/pound
remoteArchiveDir=<...path...>
removeOnDownload=0
FTPType=ASCII
FTPActive=1
FTPSafetyPeriod=10
FTPStyle=stdWINDOWS
FTPDateFormat=MM-DD-YY
FTPTimeFormat=HH:MIAM
FTPDirMatch=<DIR>
Important: In the [DIR] section (also called [Directory Parameters] in some INI files), you should
replace:
• OPTIMA_DIR with the OPTIMA home directory, which is set as an environment variable,
and will be different depending on whether you are using Windows or UNIX
• <application_name> with the name of the application (minus any file extension)
• Forward slashes with back slashes, if you are using Windows
TEOCO recommends that the following basic maintenance checks are carried out for the FTP
application:
• Check the backup directory weekly, to ensure that files have been transferred. A file not
transferring indicates a problem with the application.
• Check the log messages weekly for errors. In particular, any Warning, Minor, Major and
Critical messages should be investigated.
A new log file is created every day. The information level required in the log file is defined in the
General Settings dialog box and will be one of the following:
• Debug
• Information
• Warning
• Minor
• Major
• Critical
These levels help the user to restrict low-severity logging if required. For example, if Minor is
selected, then only Minor, Major and Critical messages will be logged.
Note: Utilities are provided for searching and filtering log messages and also for loading the
messages into the database. For more information, see Checking Log Files on page 41.
The log file contains information about any error files found in the relevant directory. For more
information about the log file, see Checking a Log File Message on page 82.
You can either obtain the version details from the log file or you can print the information by typing
the following command at the command prompt:
In Windows:
opx_FTP_GEN_302.exe -v
In Unix:
opx_FTP_GEN_302 -v
For more information about obtaining version details, see About Versioning on page 33.
Troubleshooting
FTP Application
Problem: Application not transferring files.
Possible causes:
• The application has not been scheduled.
• The crontab entry has been removed.
• The application has crashed and the Process Monitor is not configured.
• Incorrect configuration settings.
• The server is not accessible (network problems).
Solutions:
• Use Process Monitor to check the last run status.
• Check the crontab settings.
• Check the configuration settings.
• Check the process list and monitor file. If there is a monitor file and no corresponding
process with that PID, then remove the monitor file. Note: The Process Monitor will do this
automatically.
• Check the log for error messages that may indicate the network problem.
Problem: Application exits immediately.
Possible cause: Another instance is running.
Solution: Use Process Monitor to check the instances running.
Problem: Collection of current data cannot be prioritised over older days.
Possible cause: Each individual FTP client has no control over the order in which it collects its
data. It scans a set of directories and subdirectories in the order returned by the FTP server
(normally alphabetical), and INI file settings do not change the order.
Solutions:
• If the date and time can be read out of the file name, or if data from different days is sent to
different directories, configure instances to collect for particular days. The INI file options
fileMask, excludeMask and dirMask may be useful for this purpose.
• If you have access to the machines from which the files are collected, you can temporarily
move all files older than a specified number of days to another directory.
Note: The database client libraries for these different databases should be on the system path in
order for the Database Acquisition Tool to connect to them.
Important: If you want to run the Database Acquisition Tool on a UNIX (Sun Solaris) machine to
retrieve data from an SQL Server database, there are a number of additional prerequisites:
1. Install the unixODBC library from http://www.unixodbc.org/, following the online instructions.
2. Install the FreeTDS library from http://www.freetds.org/, following the online instructions.
For more information, see Configuring the Database Acquisition Tool for Sun Solaris
Machines and SQL Server Databases on page 93.
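As an illustrative sketch only (the authoritative steps are each library's own online instructions),
a typical source build of these libraries looks like:
./configure
make
make install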
Method Description
Query Mode The tool queries the database using a static SQL statement and the entire result set
of the query is output to a CSV file. In this mode, the tool does not store a history of
rows that have already been parsed from the database. Hence, no .lst file is created.
Date Query Mode The query is filtered by the date and time field in the query. The tool maintains a
history of rows previously parsed from database in .lst files. This means that the tool
outputs only new rows to the output file.
The following is an example of an INI file in Query Mode:
INI Example:
[MAIN]
InterfaceID=001
ProgramID=333
InstanceID=001
[DIR]
LogDir=/OPTIMA_DIR/<application_name>/log
TempDir=/OPTIMA_DIR/<application_name>/temp
PIDFileDir=/OPTIMA_DIR/<application_name>/prid
DirTo=/OPTIMA_DIR/<application_name>/out
[DBConfiguration]
DBString=OPT70
UserID=aircom
Password=ENC(Krw'jdep)ENC
DBClient=Oracle
[OPTIONS]
QueryMode=0
[QUERYMODE]
Name=MyQuery2
DateTimeFormat=YYYY/MM/DD HH24:MI:SS
Note: The Program ID, Interface ID and Instance ID make up the PRID. For more information, see
About PRIDs on page 29.
Important: In the [DIR] section (also called [Directory Parameters] in some INI files), you should
replace:
• OPTIMA_DIR with the OPTIMA home directory, which is set as an environment variable,
and will be different depending on whether you are using Windows or UNIX
• <application_name> with the name of the application (minus any file extension)
• Forward slashes with back slashes, if you are using Windows
KeyField1 Value, ….,KeyFieldN Value, Minute and second portion of the DateTimeField.
• The tool will adjust the DateTimeField values in the output file if AdjustForDST is set.
• The DateTimeFormat is used for formatting any DateTime fields in the output file.
The following is an example of an INI file in Date Query Mode:
INI example:
[MAIN]
InterfaceID=001
ProgramID=333
InstanceID=001
LogSeverity=2
Verbose=1
[DIR]
LogDir=/OPTIMA_DIR/<application_name>/log
TempDir=/OPTIMA_DIR/<application_name>/temp
PIDFileDir=/OPTIMA_DIR/<application_name>/prid
DirTo=/OPTIMA_DIR/<application_name>/out
DirLst=/OPTIMA_DIR/<application_name>/lst
[DBConfiguration]
DBString=OPT70
UserID=aircom
Password=ENC(ZqoT'h/r)ENC
DBClient=Oracle
[OPTIONS]
QueryMode=1
[DATEQUERYMODE]
Name=MyQuery
Granularity=3
LookBackPeriod=100
AdjustForDST=0
OffsetWhenDSTActive=0
OffsetWhenDSTInactive=0
DateTimeField=DATENTIME
DateTimeFormat=YYYY/MM/DD HH24:MI:SS
NumberKeyFields=2
KeyField1=BSC
KeyField2=CELL
Note: The Program ID, Interface ID and Instance ID make up the PRID. For more information, see
About PRIDs on page 29.
Important: In the [DIR] section (also called [Directory Parameters] in some INI files), you should
replace:
• OPTIMA_DIR with the OPTIMA home directory, which is set as an environment variable,
and will be different depending on whether you are using Windows or UNIX
• <application_name> with the name of the application (minus any file extension)
• Forward slashes with back slashes, if you are using Windows
Important: If you want to run the Database Acquisition Tool on a UNIX (Sun Solaris) machine to
retrieve data from an SQL Server database, then you must set these parameters in a specific way.
For more information, see Configuring the Database Acquisition Tool for Sun Solaris Machines and
SQL Server Databases on page 93.
Parameter Description
QueryMode 0 - The Database Acquisition Tool will read remaining parameters from the
[QUERYMODE] section.
1 - The Database Acquisition Tool will read remaining parameters from the
[DATEQUERYMODE] section.
Important: If you want to run the Database Acquisition Tool on a UNIX (Sun Solaris) machine to
retrieve data from an SQL Server database, then you must also configure a separate .odbc.ini file
as well. For more information, see Configuring the Database Acquisition Tool for Sun Solaris
Machines and SQL Server Databases on page 93.
You should modify the [DBConfiguration] section of the main Database Acquisition Tool INI file as
follows:
Parameter Description
DBClient ODBC
DBString The name of the database that the Database Acquisition Tool will connect to.
Important: This must match the name of the [DBName] section of the .odbc.ini
file.
For more information, see Configuring the Database Acquisition Tool INI File on page 89.
It is recommended that you give the main section of the configuration (INI) file the same name as
the database from which you want to retrieve data. This will make it easier to refer to in the future.
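For example, a [DBConfiguration] section matching the [database1] example shown below might
look like this (a sketch only; the encrypted password value is illustrative):
[DBConfiguration]
DBString=database1
UserID=admin
Password=ENC(...)ENC
DBClient=ODBC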
Parameter Description
Description An optional description explaining, for example, what the INI file contains and
does.
Driver The network path for the driver or client library.
Trace To create logging files for the data retrieval process, specify 'Yes', otherwise set
as 'No'.
Server The IP address used to connect to the database.
Port The port used to connect to the database.
UID The user or schema name for the database.
PWD (Optional) The password for the user/schema name corresponding to the defined
UID.
Database The name of the database.
[database1]
Description=ODBC connection to DB name
Driver=/usr/local/freetds/lib/libtdsodbc.so
Trace=No
Server=127.0.0.1
Port=162
UID=admin
PWD=password
Database=database1
In Windows type:
opx_DAP_GEN_333.exe opx_DAP_GEN_333.ini
In Unix type:
opx_DAP_GEN_333 opx_DAP_GEN_333.ini
<timePeriodMsg>. DEBUG
8002 Empty result set or no new rows found. Deleted temp file. INFORMATION
8003 Empty result set. Deleted temp file. INFORMATION
8800 Failed to open file <fileName>. WARNING
8810 Reading lst file <fileName>. DEBUG
8811 First time seen data for this date and hour. File does not exist. DEBUG
The OPTIMA CORBA Client connects to the CORBA interface for PM data acquisition using the
CORBA Naming Service running on the host or data source, which is usually an Element
Management System (EMS) or a Network Management System (NMS).
This picture shows the process flow for the OPTIMA CORBA Client:
Note: getHistoryPMData(…) is an asynchronous method, so the CORBA client does not wait for
the server to generate the PM file. It simply requests that the server generate the file; the server
then generates the PM file in the location passed in the getHistoryPMData method.
Equipment vendors publish the details of their specific CORBA interface as Interface Definition
Language (IDL) files. These IDL files are used to create the CORBA client and server applications.
Because each vendor's IDL files define a vendor-specific interface, a specific CORBA client is
required for each CORBA interface. The requested data will be output as CSV files, either directly
on the OPTIMA Mediation Device (MD) or on the NMS or EMS and then downloaded to the MD via
the OPTIMA FTP application.
You should refer to specific interface documents for the OPTIMA CORBA client deployed on a
particular network.
Function Action
Logging Status and error messages are recorded in a daily log file.
Error Files If the application detects an error in the input file that prevents processing of
that file, then the file is moved to an error directory and processing continues
with the next file.
Monitor Files The application runs in a scheduled mode. A monitor (PID) file, created each
time the application is started, ensures that multiple instances of the application
cannot be run. The PID file is also used by the OPTIMA Process Monitor to
ensure that the application is operating normally.
PRID The automatically-assigned PRID uniquely identifies each instance of the
application. It is composed of a 9-character identifier, made up of Interface ID,
Program ID and Instance ID.
The Interface ID is made up of 3 numbers but the Program ID and Instance ID
are both made up of 3 characters, which can be a combination of numbers and
uppercase letters.
For more information, see About PRIDs on page 29.
Backup The application can store a copy of each input file in a backup directory.
For more details on these common functions, see Introduction on page 15.
Parameter Description
MAX_TIME_MINUTES The maximum period of time in minutes (up to a limit of 30) for which the
data can be requested.
Based on this value, the START_TIME and END_TIME are modified
accordingly.
FTP_ADDR The full link (either the FTP hyperlink or the local drive path) of the file in
which the data from the server is to be stored.
The filename contains a placeholder for END_TIME, which is replaced by
the value specified for the END_TIME parameter.
FTP_USER The username for FTP data.
FTP_PASS The password for FTP data.
TP_SELECT_LIST A server parameter for the gethistoryPMdata() method, which must be
one of these types:
• A request for data from a single managed element (ME), for
example: 'TP_SELECT_LIST=
EMS:Huawei/T2000,ManagedElement:4063236'
• A request for data from all managed elements of the network. For
each ME, one file will be produced at FTP_ADDR. This can be done
by using the parameter value 'ManagedElementFromServer' or (on
certain supported servers) by leaving this parameter blank.
WAIT_FOR_UPLOAD A server parameter for the gethistoryPMdata() method.
PM_PARAMETERS Specifies the list of parameter types for which you want to query the
performance.
If this list is null, the performance of all the parameter types is queried.
MANAGED_ELEMENT_FILENAME The name of the file (including the directory path) which
contains the list of managed elements.
If this parameter is not specified, a file called "ManagedElement.txt" will
be created in the current working directory if
TP_SELECT_LIST=ManagedElementFromServer.
ME_COUNT This parameter is used by the getAllManagedElementNames method.
It fetches the specified number (the default is 20) of managed elements in
one iteration.
GRANULARITY The granularity of the data to be fetched from server.
The default value is 15 minutes (15min).
DATA_SAFETY_PERIOD The time in minutes that the client will wait before making a request to
fetch data.
WAIT_BEFORE_REQUESTING_ANOTHER_ME The time in milliseconds that the client will wait
before requesting data for the next Managed Element.
Values can start from 0, and a recommended value would be between 0
and 999.
This is particularly useful for avoiding putting excessive demands on the
server when it is not able to respond quickly.
EMS_SESSION_FACTORY=TMF_MTNM.Class/Ericsson.Vendor/Ericsson/SOO-
TMFv1_3_anknms.EmsInstance/3.0.Version/Ericsson/SOO-
TMFv1_3_anknms.EmsSessionFactory_I
AIRCOM_TEST=0
EMS_USER=corbausr
EMS_PASS=huawei1
STATUS_DIR=/home/optima/TestCorba/execution/status
START_TIME=20100820040000.0Z
END_TIME=20100102011203.0Z
MAX_TIME_MINUTES=20
FTP_ADDR=10.6.104.44:/home/optima/TestCorba/execution/out/data-
END_TIME.csv
FTP_USER=optima
FTP_PASS=optima
TP_SELECT_LIST=ManagedElementFromServer
WAIT_FOR_UPLOAD=0
PM_PARAMETERS=
MANAGED_ELEMENT_FILENAME=/home/optima/TestCorba/execution/status/ME.txt
ME_Count=30
GRANULARITY=15min
DATA_SAFETY_PERIOD=30
WAIT_BEFORE_REQUESTING_ANOTHER_ME=1000
This section describes the message log codes for the CORBA server:
About SNMP Data Acquisition
OPTIMA can use SNMP (Simple Network Management Protocol) to detect SNMP devices such as
routers, servers and switches on the network and to collect reports from them. It can be configured
to use SNMP auto-collection or to use collection criteria that are manually provided.
Important: SNMP data acquisition carried out using the new Netrac interface is handled by Netrac
as described in the SNMP Agent User Guide for Netrac.
This table describes which of the SNMP data acquisition components are used under what
circumstances by OPTIMA. There are three possible approaches to data acquisition. These are
represented by following the actions described in rows:
• ABC
• DBC
• E
• Set up the:
Mediation Agent
(see About the Mediation Agent on page 157)
and
Web Services
(see Setting up the Web Server on page 158)
Component Description
Network Management Uses different applications to monitor and control managed devices, and
System (NMS) provide the bulk of processing and memory resources required for network
management.
One or more NMSs can exist on any managed network.
MIBs use the notation defined by ASN.1, and describe the structure of the management data of a
device subsystem, using a hierarchical namespace containing object identifiers (OIDs). An example
OID could be 1.3.6.1.4.1.XXXX.1.2.102.
The MIB hierarchy can be depicted as a tree with a nameless root, the levels of which are assigned
by different organizations:
• The top-level MIB OIDs belong to different standards organizations
• Lower-level OIDs are allocated by associated organizations
The original MIB for managing a TCP/IP Internet was called MIB-I. MIB-II, published later, added a
number of useful variables missing from MIB-I.
Each OID identifies a variable that can be read or set using SNMP. The OIDs describe a tree
structure, where each number separated by a decimal point represents a branch on that tree. Each
OID begins at the root level of the OID domain and gradually becomes more specific.
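For example, in the standard MIB-II tree, the OID 1.3.6.1.2.1.1.5 (sysName) breaks down as
1 (iso), 3 (org), 6 (dod), 1 (internet), 2 (mgmt), 1 (mib-2), 1 (system), 5 (sysName).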
Typically, SNMP uses UDP ports 161 for the Agent and 162 for the Manager.
The Manager may send requests from any available ports (source port) to port 161 in the agent
(destination port). The agent response will be given back to the source port. The Manager will
receive traps on port 162. The agent may generate traps from any available port.
The SNMP Poller Configuration Interface enables you to establish configuration details for SNMP
Auto-Collection on the OPTIMA database:
3. You have run the 'create_SNMP_tables.sql' script on the database to which you want to
connect.
4. You have installed the SNMP Poller Interface, using the setup.exe provided with the
installation package.
5. You have a pre-defined set of MIBs (Management Information Bases), stored as CSV files.
These contain data for the managed objects (that is, the characteristics of the managed
device that you want to manage).
Note: The user must have been granted the OPTIMA_SNMPPOLLER_USERS role.
OPTIMA_SNMPPOLLER_USER is the default user provided.
In this dialog box, you can configure the SNMP Poller. The current configuration details are
loaded and kept in memory, until you save them.
1. On the Reports-Managed Objects tab, load the managed objects and create reports in
report groups.
For more information, see Loading MIBs and Creating Reports on page 107.
2. On the Devices tab, define the devices on which you want to run the reports.
For more information, see Defining Devices (Agents) to be Polled on page 115.
3. On the Reports-Devices tab, set which reports you want to run on which devices.
For more information, see Assigning Reports to Device Types on page 129.
4. On the Pollers-Devices tab, define the poller machines, and set which devices they will
poll.
For more information, see Assigning Devices (Agents) to Machines on page 135.
5. On the Summary tab, view the details that you have configured.
For more information, see Viewing a Summary of the SNMP Poller Configuration on page
140.
7. Generate an INI file containing the settings that you have configured.
For more information, see Generating an INI File of SNMP Poller Settings.
8. Manually tune the INI file with a number of additional parameters, if required.
For more information, see Manually Tuning the SNMP Poller Settings INI File on page 144.
Important: As you complete the details on each tab, it is recommended that you save your
configuration using the 'Save to database' button.
Important: To do this, you must have a pre-defined set of MIBs (Management Information Bases),
stored as CSV files. If they are not stored as CSV files, then you can convert them using the MIB to
CSV option. For more information, see Converting MIB Files to CSV Files on page 111.
1. From the MIBs menu, click Load Managed Objects from CSV file.
2. In the dialog box that appears, locate the CSV file containing the MIBs that you want to
load, and then click Open.
The required managed objects are loaded into the Managed Objects pane:
Tip: You can remove a managed object (or an entire branch of managed objects) from the
Managed Objects pane by right-clicking it, and then clicking Remove Managed Object
(or Remove Managed Object Branch) from the menu that appears.
To create a report:
1. Right-click in the Reports pane, and from the menu that appears, click Add Report:
2. In the Report Details dialog box, type the name of the report and a description of what it
contains:
3. Click OK.
4. To add a managed object to the report, in the Managed Objects pane, select the required
managed object and then either:
o Drag it into the Reports pane, and drop it onto the report name
- or -
o Drag it into the Report Managed Objects pane, and drop it into the white space
Tip: You can assign an entire group of managed objects to a report, by dragging and
dropping the folder that contains them.
This picture shows an example report, which will return management data on the 'System' group of
managed objects (for example, sysName and sysLocation):
SNMP Report
After you have created a report, you must add it to a report group, so that it can be assigned to a
device later on.
2. Right-click in the Report Groups pane, and from the menu that appears, click Add Report
Group:
3. In the Report Groups dialog box, type the name of the report group and a description of
what it contains:
4. Click OK.
5. To add a report to the report group, in the Reports pane, select the required report and
then either:
o Drag the report into the Report Groups pane, and then drop it onto the report name
- or -
o Drag the report into the Reports in group pane, and then drop it into the white space
Tip: You can select multiple reports by clicking each one while holding down the Ctrl
key.
6. Click the 'Save to database' button to save the report groups and reports.
Important: To use this conversion option, you must have the Java Runtime Environment (JRE)
installed, and the Java command should be set on the system PATH. The conversion option uses a
number of jar files (MibToCSV.jar and MIB parser library jar files), which are installed automatically
the first time that you open the MIB to CSV dialog box, and are stored in C:\Program Files
(x86)\Aircom International\Optima Backend 8.0\Bin\MIBToCSV.
To do this:
5. If any errors have occurred during the conversion process, you can save the error log as a
separate file to be, for example, distributed to relevant groups. To do this:
o Click the Save Log to file button
o In the dialog box that appears, choose a suitable location and type an appropriate
filename
o Click Save
To do this:
1. In the SNMP Poller Configuration dialog box, from the MIBs menu, click Import.
2. In the dialog box that appears, locate the required *.smc file and then click Open.
You can also export all of the loaded MIB files in the same format, and merge the MIBs within
different systems.
To do this:
1. In the SNMP Poller Configuration dialog box, from the MIBs menu, click Export.
2. In the dialog box that appears, locate the required folder, type a name for the *.smc file and
then click Save.
Important: Before deleting a report or report group, ensure that it is not in use, otherwise you may
affect the rest of your configuration.
1. In the Reports pane, right-click the report that you want to edit.
The Report Details dialog box appears, in which you can edit the name and description of
the report.
3. Click OK.
1. In the Report Managed Objects pane, right-click the managed object that you want to
remove from the report.
Tip: To remove all managed objects from a report, right-click in the Report Managed
Objects pane, and from the menu that appears, click Remove All.
To delete a report:
1. In the Reports pane, right-click the report that you want to delete.
1. In the Report Groups pane, right-click the report group that you want to edit.
2. Right-click, and from the menu that appears, click Edit Report Group.
The Report Group Details dialog box appears, in which you can edit the name and
description of the report group.
3. Click OK.
1. In the Reports in group pane, right-click the report that you want to remove from the report
group.
2. From the menu that appears, click Remove report from group.
Tip: To remove all reports from a report group, right-click in the Reports in group pane,
and from the menu that appears, click Remove All.
1. In the Report Groups pane, right-click the report group that you want to delete.
1. Select the required view to determine whether devices are organised by vendor or by
function.
2. Right-click in the left pane, and from the menu that appears, click Add Vendor or Add
Function as appropriate.
3. In the dialog box that appears, type the vendor or function name and then click OK, for
example:
4. Right-click the vendor name or function name, and from the menu that appears, click Add
Type Group.
5. In the dialog box that appears, type the name of the type group and then click OK:
6. Right-click the type group name, and from the menu that appears, click Add Type.
7. In the dialog box that appears, type the device type and then click OK:
You now have the correct structure for organising your individual devices and you can
define the individual devices to be polled in two ways:
o Find and load existing devices, either automatically or manually
- or -
o Add devices manually
When you have defined your devices, click the 'Save to database' button to save them.
The first rule identifies any devices that include a description beginning with HP.
The second rule uses a regular expression to find any device that includes Cisco anywhere in the
description.
The third rule uses a regular expression to find any device that includes SCE or NZR anywhere in
the description.
For any function defined in the Devices tab of the SNMP Poller Configuration Interface GUI, it is
possible to combine multiple rules to implement the AND operation. However, a single rule is
required to implement the OR operation, using the example Regex (third) rule format.
To add a recognition rule for a function that you have selected in the left-hand pane:
1. From the Rule Type drop-down list, select the required rule type from:
o Begins with
o Ends with
o Contains
o Does not contain
o Equals
o Does not equal
o Regex (Regular Expression)
Note: If, in the left-hand pane, you select All device type groups/All device types/Any
functions, the Rule Type drop-down list is unavailable.
2. If applicable, select from the OID drop-down list the required object identifier.
5. Press the Return key. The rule is listed below the entry row, and you can add further rules if
required.
2. From the menu that appears, click Delete. The rule is removed.
Note: If you want to find and load devices manually, using the SNMP Poller Configuration Interface,
see Finding and Loading Devices Manually on page 125.
1. Configure one or more scan definitions, which specify the search criteria for the devices.
2. Open the SNMP Poller Configuration Interface. The SNMP Discoverer scans for devices
based on the parameters defined in the Scan Definition file. Any devices that are found are
saved in the database, and displayed on the Already Discovered subtab of the Devices
tab:
Devices that are found automatically are also assigned, where possible, to a device type
according to the recognition rules specified in the Rules pane on the Devices tab.
3. Choose the Devices assigned for a selected item option and then an item in the left-hand
pane to see the devices assigned for that item.
- or -
Select the Unassigned devices option to see a list of devices that have been found
automatically but could not be assigned automatically because they do not comply with any
of the recognition rules.
4. Click on each of the unassigned devices and drag it onto the required item in the left-hand
pane. The devices are removed from the unassigned devices list and appear when the
Devices assigned for a selected item option is chosen.
Tips: To automate the search for devices even further, you can:
• Schedule the automatic device scan via Discoverer as a service using Cron.
• Configure a system alarm to be raised whenever a new device is discovered. For more
information on how to do this, see Creating Alarms for Discovered Devices on page 124.
1. In the left pane of the Devices tab, select the item to which the device is assigned.
2. In the SNMP Devices pane, on the Already discovered tab, select the Devices assigned
for a selected item option.
3. Right-click on the required device and from the menu that appears, select Edit Device. The
Device Details dialog box appears:
To do this:
1. In the SNMP Poller Configuration Interface, from the Actions menu, select Run Scan
Definition Editor. The SNMP Scan Definition Editor appears:
2. Click New. A new name in the format Scan000 is generated and can be selected from the
Scan Definitions drop-down list.
3. Select the newly generated name. It appears in the Name field and you can change it if you
wish.
4. Specify the port number of the port with which the scanning will be conducted, the
community string that identifies the logical group of devices to be included in the scan, and
the SNMP version to be used.
Tip: If you want to delete an IP address, click the Delete button that appears when you
hover to the right of the required address:
Tip: The number of addresses that fall within this range is displayed in brackets to the right
of the range.
If you want to delete the range, click the Delete button that appears when you hover to the
right of the end address:
7. If you have defined an IP address range, you can add an exclusion, which will ignore a
particular IP address (or addresses) within this range. To do this:
o Click the Add Exclusion button to the right of the range:
o In the Exclusions pane that appears, if you want to exclude a range of addresses,
then type the addresses that you want to exclude:
- or -
If you want to exclude a single address, then click the green left arrow button to
change to a single address, and then type the address that you want to exclude:
8. Repeat steps 5-7 to add all of the IP addresses, ranges and exclusions that you require.
9. Click Save.
You can also export the scan definition to an XML file using the Export button, and you can import
scan definition XML files that you have previously exported or have created manually, by using the
Import button. For more information about the XML file structure, see About Scan Definition Files
on page 121.
If you want to create or edit a scan definition file manually, then this topic describes the structure for
the scan definition XML file.
All XML files containing scan definitions begin with the header:
<?xml version="1.0" encoding="utf-8" ?>
More than one scan definition can be included in the XML file, which is delimited by
<ScanDefinitions></ScanDefinitions> tags.
Individual scan definitions are enclosed by a pair of <ScanDefinition> tags, and made up of:
• Four variables - Name, Port, Version, Community
• If required, a series of IP addresses, IP address ranges and IP address range exclusions,
all enclosed within their own tags
Scan Definition File Including Particular IP Addresses, Address Ranges and Ranges with
Exclusions
In this example:
• Each individual IP address is specified in its own pair of <StartIp> tags
• A range of IP addresses is specified using <StartIp> and <EndIp> tags
• A range of exclusions is specified using <ScanExclusions><ScanExclusion> and
</ScanExclusion></ScanExclusions> tags
<?xml version="1.0" encoding="utf-8"?>
<ScanDefinitions
xmlns="http://schemas.datacontract.org/2004/07/Aircom.Optima.SNMP.ScanDef
initionEditor" xmlns:i="http://www.w3.org/2001/XMLSchema-instance">
<ScanDefinition>
<Name>192.168.13.20-28</Name>
<Port>161</Port>
<Version>2</Version>
<Community>public</Community>
<ScanAddresses>
<ScanAddress>
<StartIp>192.168.13.21</StartIp>
<EndIp i:nil="true" />
</ScanAddress>
<ScanAddress>
<StartIp>192.168.13.22</StartIp>
<EndIp>192.168.13.31</EndIp>
<ScanExclusions>
<ScanExclusion>
<StartIp>192.168.13.30</StartIp>
<EndIp>192.168.13.31</EndIp>
</ScanExclusion>
<ScanExclusion>
<StartIp>192.168.13.29</StartIp>
<EndIp i:nil="true" />
</ScanExclusion>
</ScanExclusions>
</ScanAddress>
<ScanAddress>
<StartIp>192.168.13.20</StartIp>
<EndIp i:nil="true" />
</ScanAddress>
</ScanAddresses>
</ScanDefinition>
</ScanDefinitions>
In this example, all the IP addresses within each Name are included:
1. In the SNMP Poller Configuration Interface, from the Actions menu, select Run Scan
Definition Editor. The SNMP Scan Definition Editor appears.
2. To export all scan definitions to a file, click Export.
- or -
To export an individual scan definition to a file, select the required scan definition from the
drop-down list in the Scan Definitions field and click Export.
If you have selected an individual file in the Scan Definitions field, a message appears.
Click No to save an individual file or Yes to save all files.
3. In the Save As dialog box, select the folder in which your exported file is to be stored, type
a file name for the file and click Save.
1. In the SNMP Poller Configuration Interface, from the Actions menu, select Run Scan
Definition Editor. The SNMP Scan Definition Editor appears.
2. Click Import.
3. In the Open dialog box, select the required file and click Open. If scan definitions with the
same names already exist, you will be asked if you wish to overwrite them.
4. An import summary appears showing how many scan definitions have been added and
how many replaced. Click OK.
To do this, create a system alarm using the Alarms Editor, in which the Set Alarm SQL and Clear
Alarm SQL query the DISCOVERED_DATETIME parameter from the AIRCOM.SNMP_DEVICE
table.
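A minimal sketch of such a Set Alarm SQL query (an assumption, not from the original; the exact
columns and threshold depend on your alarm definition) might be:
-- raise the alarm if any device was discovered in the last hour
SELECT COUNT(*)
FROM AIRCOM.SNMP_DEVICE
WHERE DISCOVERED_DATETIME > SYSDATE - 1/24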
To do this:
1. In the SNMP Devices pane, on the Discover now tab, click the Find Devices button.
Tip: The Discover Devices dialog box can be used to scan the network for any existing
SNMP devices in the network, and identify any SNMP ALIVE devices.
2. Define the criteria that you want to use to search for existing devices. The criteria are
described in this table:
Item Description
IP Address Begin Range and IP Address End Range  Set the start and finish range for the IP addresses of the required devices. If you know the IP address of the device that you are looking for, type the same value in the start and finish range.
Exclude IP Begin and Exclude IP End  If required, exclude certain IP addresses. This is particularly useful if you want to eliminate a portion of the IP addresses within the range because you already know that they are not applicable (for example, they may be assigned to servers).
For example, rather than scan a whole network from (begin range) 192.168.0.0 to (end range) 192.168.255.255, you could choose to exclude a group of servers in between, from (exclude IP begin) 192.168.10.10 to (exclude IP end) 192.168.120.150.
Port Choose the IP address port from which the device transmits the
information.
Read Community Define the Read Community, which is the community string to use in all
poller requests.
3. When you have set all of the criteria, click the Run Discovery button. The Discover
Devices dialog box appears.
4. In the Discover Devices dialog box, select the folder to which you want to save the results,
and then click OK.
All of the found devices that meet the chosen criteria and are SNMP-compliant are
displayed in the Discovered Devices pane:
The results are saved automatically in a time-stamped CSV file in the folder that you
specified, for example:
Warning: The details of the devices identified are not yet saved to the database even if
you click the Save to database button.
5. Click OK.
The discovered devices appear on the Discover now sub-tab on the Devices tab of the
SNMP Poller Configuration Interface:
6. To assign them to their corresponding type, drag and drop them onto the type name in the
left-hand pane. The selected devices appear on the Already discovered sub-tab when the
Devices assigned for a selected item option is selected.
7. Click the Save to database button. The device details are saved to the SNMP_DEVICE,
SNMP_DEVICE_TUPLES and SNMP_TUPLES database tables.
Tip: If you have run a device discovery before, rather than scanning the network again (and
creating superfluous data in the network), you can load the previous results. To do this:
o In the Discover Devices dialog box, click the 'Load discovered devices' button
o Locate the required file
o Click Open
The devices discovered at that time and date are loaded into the Discovered Devices
pane.
3. In the SNMP Devices pane, on the Already discovered sub-tab, select the Unassigned
devices option.
4. Right-click in the SNMP Devices pane and from the menu that appears, click Add Device.
5. In the Device Details dialog box that appears, define the details of the device. This table describes them:
Item Description
IP Address The IP address of the device that the SNMP Poller will connect to.
Port The number of the port from which the device transmits the information.
Read Community The community string used in all poller requests.
Note: The community string is encrypted when it appears in the Selected Type
Devices pane, but is decrypted if you choose to edit the device details.
Hostname The host name of the device.
Time Out The period of time after which the device will be considered as 'timed out'.
The unit of measure used for this value is 10 milliseconds. For example, if you
set the value to 100, the timeout will be 1 second (100*10 milliseconds = 1000
milliseconds).
Retry The number of retries that the SNMP Poller will attempt after a 'timeout' when
polling data from this device.
SNMP Version The version of SNMP to use.
6. Click OK.
The device is listed in the SNMP Devices pane when the Unassigned devices option is
selected.
Note: If you create a device using the Devices Details dialog box and click the Save to database
button, the device details are saved to the appropriate table in the database (SNMP_DEVICE
table). The device only qualifies as a partially discovered device when you drag-and-drop it to a
device type in the Device Types frame of the Devices tab and you click the Save to database
button (updating the SNMP_DEVICE_TUPLES and SNMP_TUPLES database tables).
Once you have dragged and dropped a device to assign it to a device type, it is removed from
the Unassigned devices list and appears under the Devices assigned for a selected item list
when the associated device type is selected:
Tip: Before deleting a report or report group, ensure that it is not in use, otherwise you may affect
the rest of your configuration.
1. In the left-hand pane, right-click the vendor, device type group or device type that you want to rename.
2. From the menu that appears, click Edit Vendor, Type Group or Type as required.
3. In the dialog box that appears, rename the item, and then click OK.
To edit a device:
1. In the Selected Type Devices pane, right-click the device that you want to edit.
2. From the menu that appears, click Edit Device.
3. In the Device Details dialog box, edit the device parameters as required.
4. Click OK.
To delete a device:
1. In the Selected Type Devices pane, right-click the device that you want to delete.
2. From the menu that appears, click Delete Device, and then click Yes to confirm the deletion.
You can map reports and device types together, assigning the required reports to the device types that will run them.
To do this:
1. In the All Report Groups pane, select the required report group:
2. Select the required view to determine whether devices are organised by vendor or by
function.
3. Drag the report group into the left pane, and then drop it onto the required device.
Depending on the tree level onto which you drop it, the report group will be assigned at a
different level. For example:
o If you drop it onto the vendor name, the report group will be assigned to all device
types associated with this vendor:
o If you drop it onto the device type group, the report group will be assigned to all device
types associated with this group:
o If you drop it onto the device type, the report group will be assigned to all devices of
that device type:
Tip: To undo an assignment, right-click the report group name, and from the menu that
appears, click Remove Report Group. Then click Yes to confirm the deletion.
4. Click the 'Save to database' button to save the assignments that you have configured.
Tip: You can import and export the vendors, device type groups, device types, report
groups, reports and associated OIDs that are saved on this tab, in order to share them
across different systems. For more information, see Importing and Exporting Device Types
and Reports on page 131.
To do this:
3. In the dialog box that appears, locate the required file, and then click Open.
4. If there are potential conflicts between the report groups/reports that already exist and
those that are being imported, then you will be asked if you want to overwrite the
duplicates:
Important:
o Duplicate vendors and device type groups will always be merged
o Identical report groups (those with the same name and the same reports as a report
group already present in the SNMP Poller) and/or identical reports (those with the
same name and same OIDs as a report already present in the SNMP Poller) are
ignored, and are not included in the import messages
As device types are unique per vendor, if you import a device type group which includes
device types that already exist in another device type group under the same vendor, the
duplicate device types are ignored:
5. Click OK.
You can also export device types and reports in the same format, and merge them within different
systems. To do this:
2. In the left-hand pane, right-click the appropriate element, depending on what you want to
export:
o Right-click in white space to export all details of all vendors:
o Right-click a vendor to export all of its subitems (device type groups, device types,
reports and so on):
o Right-click a device type group to export the vendor, device type group, device types,
report groups, reports and OIDs for that device type group only:
o Right-click a device type to export the vendor, device type group, device type, report
groups, reports and OIDs for that device type only:
3. In the dialog box that appears, locate the required folder, type a name for the *.asd file and
then click Save.
Use an HTTP POST to send an XML file containing details of devices to the database.
The command must include the path to the public add function. Typically the URL will end
with:
.../aircom/optima/discovery-api/2011/02/Add
<SnmpDevice>
<IpAddress>12.1.1.0</IpAddress>
<Port>777</Port>
<ReadCommunity>comm</ReadCommunity>
<Hostname>hosta</Hostname>
<Vendor>Motorola</Vendor>
<Type></Type>
<Functions>
<Function>RNC</Function>
</Functions>
<Timeout>100</Timeout>
<Retry>1</Retry>
<SnmpVersion>2</SnmpVersion>
<Discovered>04-05-11 13:49:31</Discovered>
<CollectionDeviceProximity>
<CollectionDevice MachineId="101" roundtrip="29"/>
<CollectionDevice MachineId="102" roundtrip="25"/>
<CollectionDevice MachineId="103" roundtrip="55"/>
<CollectionDevice MachineId="104" roundtrip="150"/>
</CollectionDeviceProximity>
<PollerInstance></PollerInstance>
<OidWeight>10</OidWeight>
</SnmpDevice>
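As a hedged sketch, such a POST could be issued with curl (the host name, XML file name and credentials here are assumptions; the digest authentication matches the Apache setup described later in this chapter):

curl --digest -u optima:gooptimago \
  -H "Content-Type: text/xml" \
  --data-binary @device.xml \
  http://mediationserver/cgi-bin/aircom/optima/discovery-api/2011/02/Add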
This table describes the conditions that apply to the parameters used:
To do this:
2. Select the device type containing the device(s) that you want to assign. The associated
PRIDs are listed in the right-hand pane. Click on the plus symbol beside a PRID to see
the PRID details:
Note: If a device has not yet been assigned to a machine, it will not have a PRID.
3. To assign a particular device, drag it into the Poller Instances pane and drop it either onto
the required machine or onto an existing instance.
If you choose an existing instance, the PRID is added to the list of PRIDs already existing
for that instance in the Devices assigned to the selected Discoverer pane.
If you choose a machine so that a new instance is created, the Instance Parameters
dialog box appears:
Complete the details for the polling device on this machine. This table describes the
editable parameters:
Item Description
LOG The folder for the log files created by the SNMP Poller.
TEMP The folder for the temporary files created by the SNMP Poller.
PRID The folder for the monitor (PRID) file.
OUT The folder for the generated reports.
CONFIG The folder for a copy of the SNMP Poller configuration file
PollerConfig_[PRID].xml. This is normally retrieved from the web
service. This copy is used if the web service is unable to retrieve the file.
ERROR The folder for incomplete reports.
LOG SEVERITY The extent of logging required. Choose from: Debug (default),
Information, Warning, Minor, Major and Critical.
LOG GRANULARITY The frequency of logging required. Choose from Continuous, Daily,
Weekly and Monthly.
ITERATION GAP The delay before polling the next device.
THREADS The number of processes to be run in parallel.
FOLDER FILE LIMIT The maximum number of output files that can be created in the output
directory.
NO. OF ITERATIONS The number of times that the process is to be run.
VERBOSE Determines whether or not the log file information is displayed on screen.
STANDALONE Determines whether or not a monitor file is run with this process.
EnableDebugCSV Determines whether or not CSV file debugging is performed.
RemoveSubFolder Indicates whether the subfolder level identifying the device host name
should be removed from the output folder structure (1) or not (0).
AddHostName Indicates whether the host name of the device should be added as the
first column in the output CSV files (1) or not (0).
DeviceCheckThreads Defines the number of devices that will be checked for availability in
parallel.
MaxRepetition Indicates the number of instances of each column object that the SNMP
Poller will try to get in each report.
OutputType Indicates the format used to report the octet string OIDs.
TerminatingASCIINumber If the OutputType parameter is set to 2, this indicates the number of
the character with which to terminate the string.
NonPrintableCharacter If the OutputType parameter is set to 2 or 3, this indicates the character
with which to replace non-printable characters.
Tip: For fields requiring a path, you can use the Browse button to locate the required
destination.
You can increase or decrease the number of threads (simultaneous running processes).
4. Click OK.
The device appears in the Poller Instances pane and you can click on it to see the details
in the Devices assigned to the selected Discoverer pane:
It is also shaded in the top pane, and has a PRID assigned to it:
To remove a device from an instance:
o In the Devices assigned to the selected Discoverer pane, right-click the required
device name, and from the menu that appears, click Remove Device. Then click Yes
to confirm the removal.
- or -
o Right-click in the Devices assigned to the selected Discoverer pane, and from the
menu that appears, click Remove All Devices. Then click Yes to confirm the removal.
5. Click the 'Save to database' button to save the assignments that you have configured.
To do this:
2. To ensure that the details shown on the tab are up to date (if for example you have been
using the SNMP Scan Definition Editor), click the Reload data button.
4. To assign a particular scan definition, drag it into the Discoverer Instances pane and drop
it either onto the required machine or onto an existing instance.
If you choose an existing instance, the PRID is added to the list of PRIDs already existing
for that instance in the Scan definitions assigned to the selected Discoverer pane.
If you choose a machine so that a new instance is created, the Instance Parameters
dialog box appears:
Complete the details for the polling device on this machine, by either typing or clicking the
Browse button to find the appropriate location:
Item Description
LOG The folder for the log files created by the SNMP Poller.
TEMP The folder for the temporary files created by the SNMP Poller.
PRID The folder for the monitor (PRID) file.
CONFIG The folder where the snmpdiscoveryrules.xml and scanndef.xml files are
stored for local use if they are not to be retrieved by the web service.
LOG SEVERITY The extent of logging required. Choose from: Debug, Information (default),
Warning, Minor, Major and Critical.
LOG GRANULARITY The frequency of logging required. Choose from Continuous, Daily, Weekly
and Monthly.
ITERATION GAP The delay before polling the next device.
THREADS The number of processes to be run in parallel.
FOLDER FILE LIMIT The maximum number of output files that can be created in the output
directory.
NO. OF ITERATIONS The number of times that the process is to be run.
VERBOSE Determines whether or not the log file information is displayed on screen.
STANDALONE Determines whether or not a monitor file is run with this process.
Timeout The time in seconds that the Discoverer will spend attempting to identify a
device.
Retry The number of times that the Discoverer will try to identify a device again
after failing once.
Tip: For fields requiring a path, you can use the Browse button to locate the required
destination.
5. Click OK.
The scan definition is added to the Discoverer Instances pane and you can click on it to
see the details in the Scan Definitions assigned to the selected Discoverer pane:
- or -
o Right-click in the Scan Definitions assigned to the selected Discoverer pane, and
from the menu that appears, click Remove All Scan definitions. Then click Yes to
confirm the removal.
6. Click the 'Save to database' button to save the assignments that you have configured.
If you select an item in the Pollers pane, you can view more information in the Details pane.
You can choose whether polling instances retrieve their configuration:
• Centrally, via the web service
- or -
• Locally (for more information, see About the SNMP Agent on page 433).
This decision determines the content of the INI files generated at the next step.
1. In the SNMP Poller Configuration Interface, from the Actions menu, select WebService
settings. The WebService Settings dialog box appears:
3. Type the required web address into the WebService URL field.
4. Click OK.
2. In the Browse For Folder dialog box that appears, select the required location for the new
INI file:
Tip: You can create a new folder by selecting the required folder level and then clicking the
Make New Folder button.
3. Click OK.
[DIR]
LogDir=log3
TempDir=tmp
PIDFileDir=pid
DirTo=out
DirError=err
[MAIN]
MachineID=105
ProgramID=301
InstanceID=m01
LogGranularity=3
LogSeverity=1
PollingTime=300
StandAlone=0
RunContinuous=0
Iterations=1
Verbose=0
UseFolderFileLimit=1
FolderFileLimit=10000
[OPTIONS]
Threads=10
EnableDebugCSV=0
RemoveSubFolder=1
AddHostName=1
DeviceCheckThreads=20
MaxRepetition=5
OutputType=1
TerminatingASCIINumber=0
NonPrintableCharacter=.
OutputIncompleteReports=1
[SNMP_DEVICES]
NumberOfDevices=1
Device1=1
[1]
IPAddress=111.111.111.111
Port=100
Hostname=none
SNMPVersion=1
CommunityRead=ENC(lp\eaZ)ENC
RetryNo=2
Timeout=3
NumberOfReportsUsed=2
Report1=R1
Report2=R2
[REPORTS]
NumberOfReports=2
Report1=R1
Report2=R2
[R1]
Name=R1
NumberOfManagedObjects=4
mo1=sysDescr,.1.3.6.1.2.1.1.1.0,0
mo2=sysObjectID,.1.3.6.1.2.1.1.2.0,0
mo3=sysUpTime,.1.3.6.1.2.1.1.3.0,0
mo4=syslocation,.1.3.6.1.2.1.1.6.0,0
[R2]
Name=R2
NumberOfManagedObjects=2
mo1=sysName,.1.3.6.1.2.1.1.5.0,0
mo2=sysServices,.1.3.6.1.2.1.1.7.0,0
Important: In the [DIR] section (also called [Directory Parameters] in some INI files), you should
replace:
• OPTIMA_DIR with the OPTIMA home directory, which is set as an environment variable,
and will be different depending on whether you are using Windows or UNIX
• <application_name> with the name of the application (minus any file extension)
• Forward slashes with back slashes, if you are using Windows
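For example, a shipped line such as the following (the exact line is hypothetical):

LogDir=OPTIMA_DIR/log/<application_name>

might become, on a Windows server whose OPTIMA home directory is D:\OPTIMA (an assumption):

LogDir=D:\OPTIMA\log\opx_DAP_GEN_301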
[DIR]
LogDir=log3
ConfigDir=asd
[MAIN]
MachineID=105
ProgramID=301
InstanceID=m01
WebServiceUrl=http://centralurl:12345
[DIR]
LogDir=a
TempDir=a
PIDFileDir=a
ConfigDir=a
[MAIN]
MachineID=000
ProgramID=311
InstanceID=001
LogGranularity=3
LogSeverity=1
PollingTime=300
StandAlone=0
RunContinuous=0
Iterations=1
Verbose=0
WebServiceUrl=http://centralurl:12345
[OPTIONS]
Threads=10
TimeOut=100
RetryNo=1
You can define the following details in the generated INI file, by adding the appropriate
parameter(s):
Typically, the OutputType parameter defined in the OPTIONS section of the INI file influences the
default output type in use for all managed objects defined in the INI file. However, you can override
the output type for any managed object by manually defining a CUSTOM_OUTPUT section in the
INI file with the appropriate managed object(s) and output type parameter.
For example, you can add a CUSTOM_OUTPUT section to the example Poller INI file as shown
here:
[CUSTOM_OUTPUT]
Number=2
Output1=.1.3.6.1.2.1.1.1.0,3
Output2=.1.3.6.1.2.1.1.6.0,2
This means that managed object .1.3.6.1.2.1.1.1.0 will override the OutputType value 1 (plain text,
with non-printable characters replaced by the value of the NonPrintableCharacter parameter) and
use OutputType value 3 (HEX string).
Also, managed object .1.3.6.1.2.1.1.6.0 will override the OutputType value 1 and use OutputType
value 2 (plain text, up to the TerminatingASCIINumber value, with non-printable characters replaced
by the NonPrintableCharacter value). All other managed objects will use the OutputType value 1.
This feature is particularly useful in overcoming issues related to the SNMP Poller returning
required strings alongside junk strings because the default output type (for example, plain text) is
different from the polled string output type (for example, Hex string).
Parameter Indicates
ping_run Whether ping test is run (1) or not (0). The default value is 1.
ping_timeout The time in milliseconds to wait for each reply. Values in the range 1 to 10000 are
valid. The default value is 5000.
ping_requests The number of echo requests to send. Values in the range 1 to 5 are valid. The
default value is 5.
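As a sketch, using the defaults from the table above, the ping parameters could be set in the Poller INI file as follows (their placement in the [OPTIONS] section is an assumption):

[OPTIONS]
ping_run=1
ping_timeout=5000
ping_requests=5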
When configured to do so, the SNMP poller will use ICMP echo requests to ping each device in the
device availability threads.
The results of the ping tests are written to a csv file named with the convention
HOST_PORT_PINGSTATS_DATETIME.csv. For example:
Windows2000432_8010_PINGSTATS_20111222153633134.csv
The PINGSTATS file contains the following columns:
• Packets_Transmitted
• Packets_Received
• Min_Resp_Time
• Avg_Resp_Time
• Max_Resp_Time
Whether or not the ping test was successful is shown in the PingStatus column of the performance
CSV file. This table describes the file:
Column Indicates
NextSuccessMinTime The minimum time for a successful SNMP GETNEXT (OIDS) request.
NextSuccessMaxTime The maximum time for a successful SNMP GETNEXT (OIDS) request.
NextSuccessMeanTime The mean time for a successful SNMP GETNEXT (OIDS) request.
NextSuccessMedianTime The median time for a successful SNMP GETNEXT (OIDS) request.
NextFailed The number of timed out SNMP GETNEXT (OIDS) requests.
NextFailedTotalTime The total time taken by timed out SNMP GETNEXT (OIDS) requests.
To perform a ping test, the Windows version of the SNMP Poller uses the Windows system ping
program. UNIX versions use the appropriate optimaping program (for HP Itanium, Linux Redhat or
Sun Solaris). The optimaping programs are installed in the same directory as the SNMP Poller
program (opx_DAP_GEN_301).
To give the optimaping program the permissions it needs to send ICMP packets, set its owner to root and set the setuid bit, for example:
su
chown root optimaping
chmod a+x optimaping
chmod u+s optimaping
If the optimaping program is not in the system path or working folder, you must specify the path to it
using the parameter OptimaPingPath in the MAIN section of the INI file.
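For example, a hedged [MAIN] fragment (the path itself is hypothetical):

[MAIN]
OptimaPingPath=/opt/optima/bin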
Parameter Description
traceroute_run Indicates whether a traceroute test is run when a ping test fails (1) or not
(0). The default value is 1.
traceroute_timeout Indicates the time in milliseconds to wait for each reply. Values in the range
1 to 10000 are valid. The default value is 5000.
traceroute_maximum_hops This value, also known as Time to Live (TTL), indicates the number of
routers through which Internet Control Message Packets will pass before
returning an ICMP Time Exceeded error message. Values in the range 1 to
31 are valid. The default value is 31.
traceroute_numberOfPackets The number of packets to be sent. Values in the range 1 to 5 are valid. The
default value is 5.
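As with the ping parameters, a sketch using the defaults from the table above (the section placement is again an assumption):

[OPTIONS]
traceroute_run=1
traceroute_timeout=5000
traceroute_maximum_hops=31
traceroute_numberOfPackets=5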
When configured to do so, the SNMP poller will generate a text file with the traceroute output in a
sub folder of [DirTo] called TRACEROUTE. The filename of this file will be in this format:
x_x_x_x_port_YYYYMMDDHHMMSSmmm_traceroute_txt.csv
To do this, type the executable name and a configuration file name into the command prompt.
In Windows type:
opx_DAP_GEN_301.exe opx_DAP_GEN_301_001301001.ini
In Unix type:
opx_DAP_GEN_301 opx_DAP_GEN_301_001301001.ini
Configuration file names should contain the PRID name, to make them unique.
After the SNMP Poller has collected information, the data is loaded into the database using the
Loader.
Tip: You can view the performance details for the SNMP Poller in a csv file, which can help with
troubleshooting problems.
In usual operation, the SNMP Poller should not need any special maintenance. However, TEOCO recommends that the following basic maintenance checks are carried out for the SNMP Poller:
Check  Frequency  Comments
Backup directory, to ensure CSV files have been transferred.  Weekly  Files not transferring indicates a problem with the application.
Log file for error messages.  Weekly  In particular, any Warning, Minor, Major and Critical messages should be investigated.
You can either obtain the version details from the log file or you can print the information by typing
the following command at the command prompt:
In Windows:
opx_DAP_GEN_301.exe -v
In Unix:
opx_DAP_GEN_301 -v
For more information about obtaining version details, see About Versioning on page 33.
This table shows some of the log messages generated by the SNMP Poller:
ID  Message  Severity
1003  Requesting Report Output workers to stop when file group queue is empty.  DEBUG
3001  ThreadID: <threadID> OID: <oid> Value: <VBValueString>.  DEBUG
3002  Start Verify Device Availability for device IP <ipAddress> Port <port>.  DEBUG
      End Verify Device Availability for device IP <ipAddress> Port <port>.  DEBUG
Troubleshooting
The following table shows troubleshooting tips for the SNMP Poller:
Problem  Possible Cause  Solution
  The file is read only or is being used by another application  Close the Parser to release the configuration (INI) file
Application exits immediately  Another instance is running. Invalid or corrupt (INI) file.  Use Process Monitor to check the instances running
SNMP session not created  Network problem  Report to system administrator
Report not created  Error in the MIBReportINI file  Check that the OIDs are valid
The SNMP Discoverer reports SNMP devices to the OPTIMA database via a mediation agent:
Once devices have been discovered they become visible on the Devices tab of the SNMP Poller
Configuration Interface. For more information, see Finding and Loading Devices Automatically on
page 117.
You can configure the SNMP Discoverer by editing the parameters in the Discoverer INI file. This
file is created if you have opted to control polling instances centrally and you click the Create INI
files button in the SNMP Poller Configuration Interface. For more information see Selecting Web
Service Settings on page 140 and Generating an INI File of SNMP Poller Settings on page 141. To
see what the INI file looks like, see Example Discoverer INI File for Central Control on page 143.
- or -
• Fully Discovered, such that it is available in the SNMP Poller Configuration Interface (the IP
address and community string are known), the ping process is successful (the round trip
time (RTT) is assigned), it is assigned to a device type, and details are saved in the
appropriate database tables (SNMP_DEVICE, SNMP_DEVICE_TUPLES &
SNMP_TUPLES, SNMP_DEVICE_MACHINE)
The SNMP Assigner does not automatically assign a device without a corresponding RTT attribute
in the SNMP_DEVICE_MACHINE table.
The manual discover devices process fulfils only the partial discovery definition. You can only
assign a manually discovered device to a poller instance by manually dragging and dropping it on
the Pollers-Devices tab of the SNMP Poller Configuration Interface. For more information, see
Finding and Loading Devices Manually on page 125. Even at this stage, the device is not fully
discovered. However, it is expected that the device can be polled when the SNMP Poller runs as
scheduled.
To ensure that a device is fully discovered and ready for the SNMP Assigner process, the SNMP
Discoverer must be run as scheduled or triggered in an ad hoc manner.
Important: You must install the backend components using the OPTIMA combined backend
package, which will install the required components quickly and easily. For more information on
how to use this, see Installing the OPTIMA Combined Backend Package on page 21.
There are three stages to the processing carried out by the SNMP Assigner. These are:
• Gathering information already found by the SNMP Discoverer
• Performing calculations based on configuration data and performance information
• Updating the information for poller assignment
Note: The SNMP Assigner does not assign devices that have been manually created or assigned.
You can configure the SNMP Assigner by editing its INI file. This table describes the editable
parameters in the file:
Parameter Description
WebServiceUrl The full http reference for the web service, for example:
WebServiceUrl=http://www.foo.com/cgi-bin/aircom/optima
Rebuild  Indicates whether the assigner ignores existing device assignments and recalculates them (1) or not (0).
Usually, without a change of algorithm, a rebuild will produce the same results as before, and since it only updates devices that have changed, it will do nothing. If the algorithm has changed, however, the results may change significantly.
[MAIN]
MachineID=123
ProgramID=312
InstanceID=001
Iterations=1
Verbose=1
LogSeverity=1
WebServiceUrl=http://www.foo.com/cgi-bin/aircom/optima
[DIR]
TempDir=./tmp
LogDir=./log
PIDFileDir=./pid
[ASSIGNER]
Rebuild=1
Important: You must install the backend components using the OPTIMA combined backend
package, which will install the required components quickly and easily. For more information on
how to use this, see Installing the OPTIMA Combined Backend Package on page 21.
Together, these configurable components manage the communication between the SNMP
Discoverer, the SNMP Assigner, the SNMP Poller and the OPTIMA database:
The configuration service provided by the Mediation Agent standalone application allows the SNMP
Discoverer, the SNMP Assigner, and the SNMP Poller to retrieve their configuration details.
The data service provided by the Mediation Agent standalone application allows Create, Read,
Update and Delete type operations to be carried out on entities in the OPTIMA database.
You can configure the Mediation Agent standalone application by editing its INI file. This table
describes the editable parameters in the file:
Parameter Description
[MAIN]
MachineID=123
ProgramID=309
InstanceID=001
Iterations=1
Verbose=1
LogSeverity=1
[DIR]
TempDir=${OPTDIR}/tmp
LogDir=${OPTDIR}/log
PIDFileDir=${OPTDIR}/pid
[DBConfiguration]
DBString=VM148DB1
UserID=aircom
Password=ENC(l\mlofhY)ENC
DBClient=oracle
ReconnectDelay=2
[Other]
Agent=MATTHEW
Important: You must install the backend components using the OPTIMA combined backend
package, which will install the required components quickly and easily. For more information on
how to use this, see Installing the OPTIMA Combined Backend Package on page 21.
1. Create the data service and configuration service directories:
mkdir -p /var/www/cgi-bin/aircom/optima/dataservice/
mkdir -p /var/www/cgi-bin/aircom/optima/configservice/
2. From the webservice executables, copy Create, Add, Fetch, Update, Save and Delete to:
/var/www/cgi-bin/aircom/optima/dataservice/
1. Open /etc/sysconfig/httpd in a text editor, for example:
vi /etc/sysconfig/httpd
2. Add the following line at the end of the file:
umask 0000
3. Restart Apache:
/etc/init.d/httpd restart
When running opx_DAP_GEN_309, type umask 0000 before starting it, to set the permissions on
the message queues that it creates.
2. When prompted to add a new password, type the password gooptimago, and then re-type
it when prompted.
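For reference, a digest password file of this kind is typically created with Apache's htdigest utility; a hedged example, assuming the user name optima (the file path and the realm "Optima Restricted" match the settings shown below):

htdigest -c /etc/httpd/passwords/optima_digest "Optima Restricted" optima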
Open the Apache configuration file for editing:
vi /etc/httpd/conf/httpd.conf
5. Ensure that the mod_auth_digest.so Apache module exists in the Apache modules
directory.
Failure to do this will result in an invalid user error being returned when a request is made
to the web service.
<Directory "/var/www/cgi-bin">
AllowOverride All
Options None
Order allow,deny
Allow from all
</Directory>
This will protect all files in that directory and its subdirectories:
AuthType Digest
AuthName "Optima Restricted"
AuthUserFile /etc/httpd/passwords/optima_digest
Require valid-user
Important: This procedure works on Apache 2.2. Other versions of Apache require the
AuthUserFile to be replaced with an AuthDigestFile, both in httpd.conf and .htaccess.
<Files "printenv">
Require valid-user
</Files>
/etc/init.d/httpd restart
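You can then verify that the protection is in place; a hedged check, assuming the user created above and a web server running locally:

curl --digest -u optima:gooptimago http://localhost/cgi-bin/printenv

A request without credentials should be rejected with an HTTP 401 response.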
Create an environment variable named OptimaTMPDIR and set it to a directory suitable for
containing temporary files (for example C:\temp).
You must also add a PassEnv line for this variable to the Apache configuration (httpd.conf) file:
<Directory "/var/www/cgi-bin">
...
PassEnv OptimaTMPDIR
</Directory>
About the OPTIMA Parser
The OPTIMA Parser converts the raw network files from proprietary file format into comma
separated values (CSV) format.
Specific parsers are provided for each interface. This section describes settings that are common to
all parsers. You should refer to specific interface documents for the parsers deployed on a
particular network.
The parser's interface enables you to configure the required information for the CSV files, for
example directory settings, common settings and specific reports settings. This means that only the
required data is loaded into the database.
When the CSV files are created, the files are sent to the input directory for data validation. Once
validated, the CSV files are moved to the input directory for the loader. The data is then loaded into
the database table by the loader application.
Function Action
Logging Status and error messages are recorded in a daily log file.
Error Files If the application detects an error in the input file that prevents processing of that file, then
the file is moved to an error directory and processing continues with the next file.
Monitor Files The application runs in a scheduled mode. A monitor (PID) file, created each time the
application is started, ensures that multiple instances of the application cannot be run. The
PID file is also used by the OPTIMA Process Monitor to ensure that the application is
operating normally.
PRID The automatically-assigned PRID uniquely identifies each instance of the application. It is
composed of a 9-character identifier, made up of Interface ID, Program ID and Instance ID.
The Interface ID is made up of 3 numbers but the Program ID and Instance ID are both
made up of 3 characters, which can be a combination of numbers and uppercase letters.
For more information, see About PRIDs on page 29.
Backup The application can store a copy of each input file in a backup directory.
For more details on these common functions, see Introduction on page 15.
If raw data input files are already available in CSV format, OPTIMA can be run without the Parser,
but in order to use the Parser, you must install at least one Parser configuration file and an
associated executable file. A Parser configuration file contains parameters that determine how a
proprietary file format is converted into CSV format. There is a Parser configuration file for each
proprietary file format that you want OPTIMA to interface with. You can install many, but this
section describes how to install one.
1. Choose which proprietary file format you want OPTIMA to interface with.
2. If you have access to the vendor interfaces on Plone, locate your chosen format from the
list shown at:
http://plone:9080/intranet/projects/vendor-interfaces/vendor-interfaces-optima-interface-repository
If you do not have access to Plone and you are a TEOCO employee, you can contact the
VI team, who will supply you with the required file. If you are not a TEOCO employee, you
can obtain the file from Product Support.
3. Extract the contents of the downloaded zip file and any further zip files within it. Among the
folders and files extracted, find the Parser configuration file (.ini) and the executable file
(.exe for Windows, no suffix for Unix) required.
4. Move the executable file to the OPTIMA Backend Bin folder on the mediation server, for
example in Windows:
OPTIMA Backend\Bin
5. Move the Parser configuration file (.ini) to the appropriate sub folder under the OPTIMA
Backend Interface folder on the mediation server, for example:
OPTIMA Backend\Interface\ERI\UTRAN\Parser
6. Start the Parser by typing the executable file name followed by the configuration file name
at the command prompt. For example, in Windows:
<parser_executable>.exe <parser_configuration>.ini
- or -
In Unix:
<parser_executable> <parser_configuration>.ini
Note: For more information on .ini files, see Example Parser Configuration (INI) File on page 177.
The Parser checks for any file(s) in the input folder that match the file mask. The Parser opens the
file and starts processing the data.
By default, the Parser extracts from the input raw file all available measurement objects. The output
file will be in comma separated value (CSV) format. While a file is being processed, it is stored in a
temporary folder. This is to prevent incomplete files being sent to the Data Validation application.
When the parsing process has finished successfully, the processed file is moved from the
temporary folder to the output folder. If any problem is encountered, the file will be moved to the
error directory and a message will be added to the log file.
This file is parser-specific, for example, the Nortel XML Parser requires the following file:
• opx_PAR_NOR_711.exe (Windows)
• opx_PAR_NOR_711 (Unix)
Tip: A full list of the latest parser files for each vendor is available from Product Support.
Type the executable file name and a configuration file name into the command prompt.
For example, the Nortel XML Parser requires the following to be typed in:
In Windows:
opx_PAR_NOR_711.exe opx_PAR_NOR_711.ini
In Unix:
opx_PAR_NOR_711 opx_PAR_NOR_711.ini
Note: In usual operation within the data loading architecture, all applications are scheduled. In
normal circumstances, you should not need to start the program manually. For more information,
see Starting and Stopping the Data Loading Process on page 40.
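For reference, in scheduled operation a parser is typically launched from cron; a hedged example entry that runs the Nortel XML Parser every five minutes (the installation paths are assumptions):

*/5 * * * * cd /opt/optima/interface/NOR/Parser && /opt/optima/bin/opx_PAR_NOR_711 opx_PAR_NOR_711.ini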
The following table describes the common parameters in the [DIR] section:
Parameter Description
The following table describes the common parameters in the [MAIN] section:
Parameter Description
CheckOutputFileSystemUsage  Indicates whether you want to monitor the output file system usage (1) or not (0).
If this option is selected, then if the usage exceeds the threshold that you have defined in the MaxOutputFilesystemPercent parameter, the parser will stop.
ColumnsCaseSensitive  0 (default) - The parser will convert output header columns and validation report column names into upper case when comparing.
1 - The parser will not convert output header columns and validation report column names into upper case when comparing.
DiskUsageExe If you are using an overload directory, then this parameter specifies the
command that will report the disk usage level (free disk) in the file system,
returning the percentage.
The default value is '/bin/df -k'.
Important: This is used for Sun Solaris and other UNIX OS.
EnableBackup Indicates whether a copy of the input file will be copied to the backup
directory (1) or not (0).
If you do not choose to backup, the input file is deleted after it has been
processed.
EnableCombiner 1 – Create combiner history files.
0 – Do not create combiner history files.
EnableValidation 1 - The parser will perform validation, and read the other validation
parameters contained in the INI file - ColumnsCaseSensitive, MissingValue,
RemoveHeader, SafeMode, SeparatorOut, TrimData and TrimHeader.
The parser also looks for a 'CounterGroups' section and all of its related
measurements with the counters lists.
0 (default) - The parser will not perform validation.
FolderFileLimit The maximum number of output files that can be created in each output (sub)
folder.
This must be in the range of 100-100,000 for Windows, or 100-500,000 on
Sun/UNIX, otherwise the application will not run.
Warning: Depending on the number of files that you are processing, the
lower the file limit, the more output sub-folders that will be created. This can
have a significant impact on performance, so you should ensure that if you
do need to change the default, you do not set the number too low.
IncludeSubDirectories Used to process input files in the directory specified in the 'DirFrom'
parameter, and all of its sub-directories:
0 (default) - Do not search sub-directories for input files
1 - Search sub-directories for input files
InputFileMask Filter for input file to process, for example, *C*.*
InputFileNameAsColumn  Option for including the file name (excluding the path) as the first column in the output CSV file.
0 - False. Will not create the first column as that of the input file name.
1 - True. This is the value normally set by the VI team. It creates the first column as that of the input file name.
If the value is other than 0 or 1, or the option is commented out, the program default value of 0 is applied.
InputFileSortOrder Specifies the order in which input files are processed:
0 (default) - no sort order
1 - Ascending order
2 - Descending order
InstanceID The three-character program instance identifier (mandatory).
InterfaceID The three-digit interface identifier (mandatory).
Iterations This parameter is used when the application does not run in continuous
mode so that it will be able to check for input files in the input folder for the
number of required iterations before an exit. Integer values are allowed, like
1,2,3,4 and so on.
LogGranularity Defines the frequency of logging, the options are:
0 - Continuous
1 - Monthly
2 - Weekly
3 - Daily
LogLevel (or LogSeverity)  Sets the level of information required in the log file. The available options are:
1 - Debug
2 - Information (Default)
3 - Warning
4 - Minor
5 - Major
6 - Critical
MaxOutputFilesystemPercent  Specifies the parser output directory usage threshold (in % used), beyond which the parser should stop.
The default is 96%, which means that when usage reaches 97% the parser will stop.
MissingValue This string value will be used for counters missing from the input file.
The default is an empty string.
NumberOfInputFilesPerBatch  If you have configured the parser to perform validation, then use this parameter to batch the input files. The batch size indicates the number of files in each batch.
NumberOfMoveInputFileQueueWorkers  The number of threads used to move input files to the error or backup directories.
The default (and minimum) value is 1.
NumberOfParserObjects  The total number of parser object instances, which perform the parsing, generate reports, and post-process report temp files and input files.
The parser objects are recycled from the Main thread -> Parsing threads -> Report Generation threads -> Main thread.
The default value is NumberOfParsingInputFileQueueWorkers + NumberOfParserOutputFileQueueWorkers instances, but for small input files it is recommended that you set a value greater than this.
The minimum value for this parameter is 1.
NumberOfParsingInputFileQueueWorkers  The number of threads used to parse input files into memory.
The default (and minimum) value is 1.
NumberOfParserOutputFileQueueWorkers  The number of threads used to generate reports and post-process the report temp files to the report output folders.
The default (and minimum) value is 1.
OffsetWhenDSTActive  Define the time adjustment in minutes whenever DST is active, for example, OffsetWhenDSTActive=+60.
OffsetWhenDSTInactive  Define the time adjustment in minutes whenever DST is inactive, for example, OffsetWhenDSTInactive=-120.
PollingTime  If you have selected to run the parser continuously, type the number of seconds that must pass between each check for input files. If the option is commented out, the default value is 5.
ParserOutputToInstanceSubdirectory  0 (default) - Do not allow parsers to share output directories when Folder File Limit is in use.
1 - Allow an additional folder to be appended to a subfolder so that multiple parsers can share output directories without loss of data.
ProgramID The three-character program identifier (mandatory).
RefreshTime The pause (in seconds) between executions of the main loop when running
continuously.
RemoveHeader 0 (default) - The parser will write the header to the counter group output file.
1 - The parser will not write the header to the counter group output file.
ReportGenWorkers  The number of threads that each parser object instance will create in order to validate/output their reports.
These threads enable each parser object instance to validate/output more than one report at a time.
The default value is 1, but a value of 3 is recommended.
The number of actual threads that this will generate can be calculated as follows:
• Total number of ReportGenWorkers threads created = NumberOfParserObjects * ReportGenWorkers
• Total active number of ReportGenWorkers threads = NumberOfParserOutputFileQueueWorkers * ReportGenWorkers
• Total idle number of ReportGenWorkers threads = NumberOfParsingInputFileQueueWorkers * ReportGenWorkers
RunContinuous 0 - Have the Parser run once.
1 - Have the Parser continuously monitor for input files.
SafeMode 1 - Run the parser in Safe Mode. The parser will log any new counters found
in the input file that are not in the validation reports. These will be logged in
warning messages.
0 (default) - Do not run the parser in Safe Mode. The parser will not log any
new counters found in the input file that are not in the validation reports.
SeparatorOut This specifies the separator that is used in the output file.
For a space-separated file, use SeparatorOut=SPACE.
For a tab-separated file, use SeparatorOut=TAB.
The default is a comma (,).
StandAlone 0 – Run the application without a monitor file. Do not select this option if the
application is scheduled or the OPTIMA Process Monitor is used.
1 – Run the application with a monitor file.
StatsLogSeverity The severity level for log messages related to statistics.
If this value is equal to or greater than LogSeverity, then statistics will be
logged.
The minimum (and default) value is 1 - DEBUG.
The maximum value is 2 - INFORMATION.
TrimData 1 - The parser will trim the white space from the beginning and end of a
column data value after the data line has been split.
0 (default) - The parser will not trim the column data value.
TrimHeader 1 - The parser will trim the white space from the beginning and end of a
header column after the data line has been split.
0 (default) - The parser will not trim the header column.
TruncateHeader 0 - Do not truncate header column names.
1 - Truncate any header column name which is more than 30 characters
long.
UseFolderFileLimit  Indicates whether the folder file limit should be used (1) or not (0).
The default value is 0 ('OFF').
Verbose 0 - Run silently. No log messages are displayed on the screen.
1 - Display log messages on the screen.
Parameter Description
TrimStringFields 1 (default) - The parser will trim the white space from the beginning and end of a
string of data.
0 - The parser will not trim the string.
QuoteFields 1 - The quote character is added before and after the data.
0 (default) - The quote character is not added.
Parameter Description
IndexCheck  1 (default) - Checks the index file to see if the database row has been written to CSV in an older instance of the parser.
0 - Does not check the index file to see if the database row has been written to CSV in an older instance of the parser.
FormatDateTime 1 (default) - Uses the format in the INI parameter DateTimeFormat to format any
date time fields read from the database.
0 - Does not use the format in the INI parameter DateTimeFormat to format any
date time fields read from the database.
If you are using validation, you should also define a [CounterGroups] section, using the following
parameters:
Parameter Description
For each counter group mentioned there must be a corresponding section in the INI file that shows
the associated column mapping.
If you do not want to generate reports for any measurement objects that are currently inactive, then
you should define a [SUPPRESS_REPORTS] section, using the following parameters:
Parameter Description
Maintenance
In usual operation, the Parser should not need any special maintenance. During installation, the
OPTIMA Directory Maintenance application will be configured to maintain the backup and log
directories automatically.
Check  Frequency  Comments
Input directory for a backlog of files  Weekly  Files older than the scheduling interval should not be in the input directory. A backlog indicates a problem with the program.
Error directory for files  Weekly  Files should not be rejected. If there are files in the error directory, analyze them to identify why they have been rejected.
Log messages for error messages  Weekly  In particular, any Warning, Minor, Major and Critical messages should be investigated.
The log file contains information related to any error files found in the error directory. For more
information about the log file, see Checking a Log File Message on page 172.
A new log file is created every day. The information level required in the log file is defined in the
General Settings dialog box and will be one of the following:
• Debug
• Information
• Warning
• Minor
• Major
• Critical
These levels help the user to restrict low level severity logging if required. For example, if Minor is
selected then only Minor, Major and Critical logging will occur.
Note: Utilities are provided for searching and filtering log messages and also for loading the
messages into the database. For more information, see Checking Log Files on page 41.
If the Parser is run continuously, then the input directory is monitored continuously. In this case, the
Parser can be terminated. For more information, see Starting and Stopping the Data Loading
Process on page 40.
You can either obtain the version details from the log file or you can type in the print command at
the command prompt.
For example, the Nortel XML Parser requires the following to be typed in:
In Windows:
opx_PAR_NOR_711.exe -v
In Unix:
opx_PAR_NOR_711 -v
For more information about obtaining version details, see About Versioning on page 33.
Troubleshooting
The following table shows troubleshooting tips for the OPTIMA Parser:
Problem  Possible Cause  Solution
Application not processing input files.  Application has not been scheduled.  Use Process Monitor to check the last run status.
  Crontab entry removed.  Check crontab settings.
  Application has crashed and Process Monitor is not configured.  Check the process list and monitor file. If there is a monitor file and no corresponding process with that PID, then remove the monitor file. Note: The process monitor will do this automatically.
  Incorrect configuration settings.  Check configuration settings.
  File(s) do not match the input mask(s).  Change the input masks.
Application exits immediately.  Another instance is running. Invalid or corrupt (INI) file.  Use Process Monitor to check the instances running.
Files in Error Directory.  Incorrect configuration settings. Invalid input files.  Check the log file for more information on the problems. Check the error file format.
[MAIN]
InterfaceID=002
ProgramID=722
InstanceID=001
LogGranularity=3
LogSeverity=1
RunContinuously=0
PollingTime=1
StandAlone=0
InputFileNameAsColumn=1
TruncateHeader=0
InputFileSortOrder=0
IncludeSubDirectories=0
AdjustForDST=0
OffsetWhenDSTActive=+60
OffsetWhenDSTInactive=-60
InputFileMask=*
EnableBackup=1
EnableCombiner=0
NumberOfParsingInputFileQueueWorkers=1
NumberOfParserOutputFileQueueWorkers=1
NumberOfMoveInputFileQueueWorkers=1
NumberOfParserObjects=1
UseFolderFileLimit=1
FolderFileLimit=1000
Verbose=0
ParserOutputToInstanceSubdirectory=1
EnableValidation=0
TrimHeader=1
TrimData=0
separatorOut=,
MissingValue=
RemoveHeader=0
ColumnsCaseSensitive=0
SafeMode=0
[CounterGroups]
NumberOfCounterGroups=24
CounterGroup1=IubDataStreams
CounterGroup2=NbapCommon
CounterGroup3=PlugInUnit
...
CounterGroup23=DownlinkBaseBandPool
CounterGroup24=Sccpch
[IubDataStreams]
ColumNumber=105
Column1=DATETIME
Column2=DATETIMEZONE
Column3=DATETIMEUTC
Column4=EMS_NAME
Column5=NE_VERSION
...
Column100=pmCapAllocIubHsLimitingRatioSpi07
Column101=pmCapAllocIubHsLimitingRatioSpi06
Column102=pmCapAllocIubHsLimitingRatioSpi05
Column103=pmCapAllocIubHsLimitingRatioSpi04
Column104=pmHsDataFramesReceivedSpi15
Column105=pmHsDataFramesReceivedSpi14
[NbapCommon]
ColumNumber=28
Column1=DATETIME
Column2=DATETIMEZONE
Column3=DATETIMEUTC
Column4=EMS_NAME
...
Column25=NodeBFunction
Column26=Iub
Column27=NbapCommon
Column28=pmNoOfDiscardedMsg
[PlugInUnit]
ColumNumber=29
Column1=DATETIME
Column2=DATETIMEZONE
Column3=DATETIMEUTC
Column4=EMS_NAME
...
Column26=Subrack
Column27=Slot
Column28=PlugInUnit
Column29=pmProcessorLoad
[DownlinkBaseBandPool]
ColumNumber=67
Column1=DATETIME
Column2=DATETIMEZONE
Column3=DATETIMEUTC
Column4=EMS_NAME
Column5=NE_VERSION
...
Column63=pmCapacityDlCe
Column64=pmSamplesCapacityDlCe
Column65=pmSumCapacityDlCe
Column66=pmSumSqrCapacityDlCe
Column67=pmUsedADch
[Sccpch]
ColumNumber=32
Column1=DATETIME
Column2=DATETIMEZONE
Column3=DATETIMEUTC
Column4=EMS_NAME
...
Column29=pmNoOfTfc1OnFach1
Column30=pmNoOfTfc2OnFach1
Column31=pmNoOfTfc3OnFach2
Column32=pmMbmsSccpchTransmittedTfc
Important: In the [DIR] section (also called [Directory Parameters] in some INI files), you should
replace:
• OPTIMA_DIR with the OPTIMA home directory, which is set as an environment variable,
and will be different depending on whether you are using Windows or UNIX
• <application_name> with the name of the application (minus any file extension)
• Forward slashes with back slashes, if you are using Windows
About Data Validation
Data validation checks the CSV files created by the parser, ensuring column order, defaulting
missing data values and splitting files if required. Once validated, the files are loaded into the
database.
Important: The loader contains a number of validation options, which you can use instead of the
separate Data Validation application. For more information, see About Loading in OPTIMA on page
213.
The data validation application uses a configuration file (INI) to store information about processing
the files. The configuration file can be edited using a suitable text editor. For more information, see
Configuring the Data Validation Application on page 183.
(Diagram: the Data Validation application reads its configuration (INI) file and passes validated CSV files to the OPTIMA Loader, which loads the database.)
The content of each output file from the data validation application is defined in a report, which is
stored in the configuration file. Within a report, you can specify which columns of data from the
input file will be included in the output file and the order in which they are required. You can define
multiple reports to create multiple output files from a single input file.
If the data validation application is running when you make changes to the configuration (INI) file,
you must restart the application for the changes to be effective.
Function Action
Logging Status and error messages are recorded in a daily log file.
Error Files If the application detects an error in the input file that prevents processing of that file, then
the file is moved to an error directory and processing continues with the next file.
Monitor Files The application runs in a scheduled mode. A monitor (PID) file, created each time the
application is started, ensures that multiple instances of the application cannot be run. The
PID file is also used by the OPTIMA Process Monitor to ensure that the application is
operating normally.
PRID The automatically-assigned PRID uniquely identifies each instance of the application. It is
composed of a 9-character identifier, made up of Interface ID, Program ID and Instance ID.
The Interface ID is made up of 3 numbers but the Program ID and Instance ID are both
made up of 3 characters, which can be a combination of numbers and uppercase letters.
For more information, see About PRIDs on page 29.
Backup The application can store a copy of each input file in a backup directory.
For more details on these common functions, see Introduction on page 15.
The application checks for any file(s) in the input folder that match the file mask for the report and
opens these files. The first row (header) of a file is split into different columns to get the actual order
that will be compared with the column order listed in the report(s). If the column order comparison
matches successfully, the file is validated and moved into the correct output folder. The output file
may also be renamed, based on the report settings. If the column order does not match, then the file
needs to be processed further and the column order for the whole file is updated.
While a file is being processed by the data validation application, it is stored in a temporary folder.
This is to prevent incomplete files being sent to the loader. When the validation process has
finished successfully, the processed file is moved from the temporary folder to the output folder.
The output filename may also have a text value attached depending on the settings in the report.
For more information about report settings, see Defining Reports on page 186.
Important: You must install the backend components using the OPTIMA combined backend
package, which will install the required components quickly and easily. For more information on
how to use this, see Installing the OPTIMA Combined Backend Package on page 21.
Parameter Description
DoCpyToErr Indicates whether the input file will be moved to the error folder when the
validation fails (1) or not (0).
EnableBackup Indicates whether a copy of the input file will be copied to the backup
directory (1) or not (0).
If you do not choose to back up, the input file is deleted after it has been processed.
The default value is 0.
FolderFileLimit The maximum number of output files that can be created in each output
(sub) folder.
This must be in the range of 100-100,000 for Windows, or 100-500,000 on
Sun/UNIX, otherwise the application will not run.
Warning: Depending on the number of files that you are processing, the
lower the file limit, the more output sub-folders that will be created. This
can have a significant impact on performance, so you should ensure that if
you do need to change the default, you do not set the number too low.
The default is 10,000.
InputFileMask Filter for input file to process, for example, *C*.*
InputFileNameAsColumn If this is set to 1, it adds an INPUT_FILE_NAME_CMB column (with its
data values underneath) to the output file.
By default (0), this is not done.
InstanceID The three-character program instance identifier (mandatory).
InterfaceID The three-digit interface identifier (mandatory).
Iterations When the application is not running in continuous mode, this parameter sets the number of times that the application checks the input folder for input files before exiting. Integer values are allowed, for example 1, 2, 3, 4 and so on.
The default is 5.
Note: If, during an iteration, no input files are found in the input folder, the
program will exit.
UseFolderFileLimit Indicates whether the folder file limit should be used (1) or not (0).
The default value is 0 ('OFF').
Verbose 0 - Run silently. No log messages are displayed on the screen.
1 - Display log messages on the screen.
Note: These settings are optional.
Parameter Description
MissingValue This string value will be used for counters missing from the input file.
The default is an empty string.
OutputFileMask Type the extension to give to output file names, for example, .csv.
RemoveHeader 0 (default) - The Data Validation Application will write the header to the
counter group output file.
1 - The Data Validation Application will not write the header to the counter
group output file.
SeparatorIn Separator character for input files.
The possible characters are:
Comma ","
Pipe "|"
Tab "TAB"
Spaces "SPACE"
SeparatorOut Separator character for output files.
The possible characters are:
Comma ","
Semicolon ";"
Pipe "|"
Tab "TAB"
Spaces "SPACE"
TrimData Remove any spaces found around the data values: 1 (on) or 0 (off).
TrimHeader Remove any spaces found around the header columns: 1 (on) or 0 (off).
UseETLLoader 0 - The line endings of output files depend on the operating system: \r\n on Windows (WIN32), \n on UNIX.
1 - The line endings of output files are always in UNIX format (\n).
WindowsInputFiles This parameter should be used when the input files are in the Windows
format (the lines end with \r\n).
0 (Default) - Input files are not Windows
1 - Input files are Windows
Important: If this parameter is not set correctly for the input files that are used, the data is still processed, but because of the extra character added while transferring, the last column is ignored and its value is filled using the MissingValue parameter.
To run the data validation application, type the executable file name and the configuration (INI) file name at the command prompt.
In Windows, type:
opx_DVL_GEN_411.exe opx_DVL_GEN_411.ini
In Unix, type:
opx_DVL_GEN_411 opx_DVL_GEN_411.ini
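To schedule the application instead of running it manually, a crontab entry such as the following could be used (the /optima paths are illustrative only and depend on your installation):

# Run the data validation application every 15 minutes
*/15 * * * * /optima/bin/opx_DVL_GEN_411 /optima/ini/opx_DVL_GEN_411.ini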
Defining Reports
Reports specify what information will be validated by the data validation application. You define
reports by editing parameters in the configuration (INI) file with a suitable text editor.
The following example shows the definitions for two reports called UTRANCELL_A and
UTRANCELL_B:
[REPORTS]
Number=2
Report1=UTRANCELL_A
Report2=UTRANCELL_B
[UTRANCELL_A]
ReportActive=1
ColumNumber=8
Column1=subNetwork
Column2=subNetwork_1
Column3=ManagedElement
Column4=Start_Date
Column5=End_Date
Column6=RncFunction
Column7=UtranCell
Column8=VS.RadioLinkDeletionUnsuccess
[UTRANCELL_B]
ReportActive=1
ColumNumber=9
Column1=subNetwork
Column2=subNetwork_1
Column3=ManagedElement
Column4=Start_Date
Column5=End_Date
Column6=RncFunction
Column7=UtranCell
Column8=VS.3gto2gHoDetectionFromFddcell.RescueCs
Column9=VS.3gto2gHoDetectionFromFddcell.RescuePs
For more information, see Example Data Validation Configuration (INI) File on page 190.
Maintenance
In usual operation, the data validation application should not need any special maintenance. During
installation, the OPTIMA Directory Maintenance application will be configured to maintain the
backup and log directories automatically.
However, TEOCO recommends the following basic maintenance checks are carried out for the data
validation application:
• Input directory for a backlog of files (weekly): Files older than the scheduling interval should not be in the input directory. A backlog indicates a problem with the program.
• Error directory for files (weekly): Files should not be rejected. If there are files in the error directory, analyze them to identify why they have been rejected.
• Log messages for error messages (weekly): In particular, any Warning, Minor, Major and Critical messages should be investigated.
The log file contains information about any files found in the error directory. For more information about the log file, see Checking a Log File Message on page 187.
A new log file is created every day. The information level required in the log file is defined in the
General Settings dialog box and will be one of the following:
• Debug
• Information
• Warning
• Minor
• Major
• Critical
These levels enable the user to filter out lower-severity logging if required. For example, if Minor is selected, then only Minor, Major and Critical logging will occur.
Note: Utilities are provided for searching and filtering log messages and also for loading the
messages into the database. For more information, see Checking Log Files on page 41.
If the application is run continuously, then the input directory is monitored continuously. In this case, stop the application by terminating it. For more information, see Starting and Stopping the Data Loading Process on page 40.
You can either obtain the version details from the log file or you can print the information by typing
the following command at the command prompt:
In Windows:
opx_DVL_GEN_411.exe -v
In Unix:
opx_DVL_GEN_411 -v
For more information about obtaining version details, see About Versioning on page 33.
Troubleshooting
The following table shows troubleshooting tips for the data validation application:
Problem: Application not processing input files.
Possible causes: Application has not been scheduled; crontab entry removed; application has crashed and Process Monitor is not configured; incorrect configuration settings.
Actions: Use Process Monitor to check the last run status. Check crontab settings. Check configuration settings. Check the process list and monitor file; if there is a monitor file but no corresponding process with that PID, then remove the monitor file.
Note: The process monitor will do this automatically.

Problem: Application exits immediately.
Possible causes: Another instance is running; invalid or corrupt configuration (INI) file.
Action: Use Process Monitor to check the instances running.

Problem: Files in the error directory.
Possible causes: Incorrect configuration settings; invalid input files.
Actions: Check the log file for more information on the problems. Check the error file format.
1000 Validation instance started. Creating list of files in the Input Directory. DEBUG
Validation instance finished processing the Input Directory. DEBUG
Example Data Validation Configuration (INI) File

[MAIN]
EnableBackup=1
LogGranularity=3
LogSeverity=2
PollingTime=10
RunContinuous=0
StandAlone=1
InterfaceID=000
ProgramID=222
InstanceID=ABC
Iterations=1
InputFileMask=*.csv
InputFileNameAsColumn=1
verbose=1
DoCpyToErr=1
UseFolderFileLimit=1
FolderFileLimit=100
[OPTIONS]
SeparatorIn=,
SeparatorOut=,
HeaderLineNumber=1
AvoidLineWithSubStrings=ignore
TrimHeader=1
TrimData=1
ColumnsCaseSensitive=0
MissingValue=NULL
RemoveHeader=0
WindowsInputFiles=1
[REPORTS]
Number=2
Report1=CL
Report2=CL2
[CL]
ColumNumber=12
Column1=Header_1
Column2=HEADER_2
Column3=HEADER_5
Column4=HEADER_4
Column5=HEADER_3
Column6=HEADER_6
Column7=HEADER_13
Column8=HEADER_8
Column9=HEADER_9
Column10=HEADER_10
Column11=HEADER_11
Column12=Header_222
[CL2]
ColumNumber=2
Column1=HEADER_26
Column2=HEADER_27
Important: In the [DIR] section (also called [Directory Parameters] in some INI files), you should
replace:
• OPTIMA_DIR with the OPTIMA home directory, which is set as an environment variable,
and will be different depending on whether you are using Windows or UNIX
• <application_name> with the name of the application (minus any file extension)
• Forward slashes with back slashes, if you are using Windows
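For example, assuming a hypothetical OPTIMA home directory of /optima and the application opx_DVL_GEN_411, a UNIX entry such as:

LogDir=/OPTIMA_DIR/<application_name>/log

would become:

LogDir=/optima/opx_DVL_GEN_411/log

and on Windows (assuming a hypothetical home of D:\optima):

LogDir=D:\optima\opx_DVL_GEN_411\log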
About the File Combiners
The File Combiners enable you to merge the CSV files output by certain parsers into new
combined CSV files. CSV files can be combined before data validation or as part of the data
validation process. For more information on data validation, see About Data Validation on page
181.
Note: The File Combiners only work with specific parsers. For more information contact TEOCO
support.
The File Combiners use a configuration file (INI) to store information about combining the files. The
configuration file can be edited using a suitable text editor.
The content of each combined file is defined in a report, which is stored in the configuration file.
Within a report, you specify which type of input files will be combined and which common columns
of data from the input files will be included in the output file. For more information, see Defining
Reports on page 197.
Function Action
Logging Status and error messages are recorded in a daily log file.
Error Files If the application detects an error in the input file that prevents processing of that file, then
the file is moved to an error directory and processing continues with the next file.
Monitor Files The application runs in a scheduled mode. A monitor (PID) file, created each time the
application is started, ensures that multiple instances of the application cannot be run. The
PID file is also used by the OPTIMA Process Monitor to ensure that the application is
operating normally.
PRID The automatically-assigned PRID uniquely identifies each instance of the application. It is
composed of a 9-character identifier, made up of Interface ID, Program ID and Instance ID.
The Interface ID is made up of 3 numbers but the Program ID and Instance ID are both
made up of 3 characters, which can be a combination of numbers and uppercase letters.
For more information, see About PRIDs.
Backup The application can store a copy of each input file in a backup directory.
For more details on these common functions, see Introduction on page 15.
OPTIMA provides a single input File Combiner and a Multiple input File Combiner. This section
describes how to set up the multiple input File Combiner, which is recommended. In order to use
the multiple input File Combiner, you must install a configuration file and an associated executable
file.
NAME=CELLHANDOVERS_CMB
PRID=001903001
IN_DIR=/OPTDIR/Interfaces/ERI/UTRAN/CMB/in
OUT_DIR=/OPTDIR/Interfaces/ERI/UTRAN/CMB/out
ERROR_DIR=/OPTDIR/Interfaces/ERI/UTRAN/CMB/error
BACKUP_DIR=/OPTDIR/Interfaces/ERI/UTRAN/CMB/backup
TEMP_DIR=/OPTDIR/Interfaces/ERI/UTRAN/CMB/tmp
LOG_DIR=/OPTDIR/log/
PRID_DIR=/OPTDIR/prid
NUM_REPORTS=2
REMOVEFROMFILENAME=_1_[_\dA-z]+
FILEFORMAT=.*
DATEFORMAT=CYYYYMMDD
KEYHEADERS=SENDERNAME,MEASTIMESTAMP,GRANULARITYPERIOD,MEASOBJINSTID
KEEPUNIQUEHEADERS=HOVERCNT,HOVERSUC,HORTTOCH,HOASBCL,HOASWCL,HOSUCBCL,HOSUCWCL,HOTOLCL,HOTOKCL,HOUPLQA,HODWNQA,HOEXCTA,HODUPFT,HOTOHCS,HOATTLSS,HOATTHSS,HOATTHR,HOSUCHR
[REPORTS]
REPORT1=NCELLREL
REPORT2=NECELASS
2. Move the configuration file (.ini) to the appropriate sub folder under the OPTIMA Backend
Interface folder on the mediation server, for example:
OPTIMA Backend\Interface\ERI\UTRAN\CMB
4. Add some input data in CSV file format to the input folder.
5. Copy the appropriate opx_CMB_GEN_903 executable file for your chosen platform from
the folder to which it has been extracted by the OPTIMA backend installation, normally
under:
to the OPTIMA Backend Bin folder on the mediation server, for example in Windows:
- or -
What is Combining?
The following process describes how files are combined by the File Combiners:
1. During parsing, the File Combiner-specific parser extracts data from raw input files and
stores it in CSV files. The file name of each CSV file contains the object type that the File
Combiner will use in the combining process, in the format:
2. On start up, the File Combiner loads the report(s) from the configuration file into memory.
The report(s) contain the types of the files that are to be combined and their expected
common columns. If there are multiple reports, the application will process them one at a
time.
3. The File Combiner application loads the CSV files from the combined log file(s) and checks
that the CSV files contain the types specified in the report.
4. The File Combiner opens the first CSV file and stores the number of rows it has in memory.
This row count is used as a reference value to ensure that all files to be combined have the
same number of rows. Files cannot be combined if they have different numbers of rows.
5. The File Combiner checks that the header columns of the CSV file match the common
columns specified in the report. If the column comparison matches successfully, the other
columns of the CSV file are stored in memory and the next CSV file is processed. When all
CSV files have been processed, the File Combiner combines all of the stored columns into
a new CSV file with a single common header. The new combined file is moved into the
correct output folder.
6. While a file is being processed by the File Combiner, it is stored in a temporary folder. This
is to prevent incomplete files being sent to the Loader input directory. When the combining
process has finished successfully, the processed file is moved from the temporary folder to
the Loader input directory.
If you are using the single input File Combiner, once a CSV file has been successfully
created and saved, its file name and directory path are logged by the parser in a combined
log file. For more information on parsing, see The Parsing Process on page 166.
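As a simplified illustration of this process (the file names, key column and counters are hypothetical), suppose two CSV files produced by the parser share the same key column and row count:

pmCounterA CSV:
MEASOBJINSTID,COUNTER_A
CELL1,10

pmCounterB CSV:
MEASOBJINSTID,COUNTER_B
CELL1,7

The File Combiner checks the common (key) columns, stores the remaining columns in memory, and writes a single combined file:

MEASOBJINSTID,COUNTER_A,COUNTER_B
CELL1,10,7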
For this File Combiner For this Operating System Install this file
Multiple input File Combiner Unix opx_CMB_GEN_903
Important: You must install the backend components using the OPTIMA combined backend
package, which will install the required components quickly and easily. For more information on
how to use this, see Installing the OPTIMA Combined Backend Package on page 21.
For this File Combiner, for this Operating System, type the following into the command prompt:

Parameter Description
EnableBackup Indicates whether a copy of the input file will be copied to the backup
directory (1) or not (0).
If you do not choose to back up, the input file is deleted after it has been processed.
InputFileMask Filter for input file to process, for example, *C*.*
InstanceID The three-character program instance identifier (mandatory).
MoveUncombinedFiles Specifies what the File Combiner should do with uncombined files. The
available options are:
0 - Do not move file.
1 - Move file to directory specified in DirIncomplete parameter.
2 - Delete file.
3 - Move file to directory specified in DirTo parameter plus last sub-path of
original path.
For more information, see Example Single Input File Combiner Configuration (INI) File on page
198.
Defining Reports
Reports specify which input files will be combined and which common columns of data from the
input files will be included in the output file. You define reports by editing parameters in the
configuration (INI) file with a suitable text editor.
Parameter Description
ExcludeFields Type the columns you want to exclude from the report.
KeyFields Type the common columns that you want to use in the report.
Number Type the number of reports to be combined.
Reportn Type the unique name of the report, where n is the execution order position of the
report, for example, Report1 will be executed before Report2.
Types Type the names of the files you want to combine.
The following example shows the definitions for two reports called UtranCell and UeRc:
[REPORTS]
Number=2
Report1=UtranCell
Report2=UeRc
[UtranCell]
Types=pmSamplesCs12RabEstablish,pmNoDirRetrySuccess,pmUlTrafficVolumePsStr64Ps8
KeyFields=ffv,SubNetwork,SubNetwork1,MeContext,st,vn,cbt,mff,NewVnNode,neun,nedn_SubNetwork,nedn_SubNetwork1,nedn_MeContext,mts,gp,ManagedElement,RncFunction,UtranCell
ExcludeFields=DUMP
[UeRc]
Types=pmTransportBlocksAcUl,pmUlRachTrafficVolume
KeyFields=ffv,SubNetwork,SubNetwork1,MeContext,st,vn,cbt,mff,NewVnNode,neun,nedn_SubNetwork,nedn_SubNetwork1,nedn_MeContext,mts,gp,ManagedElement,RncFunction,UeRc
ExcludeFields=DUMP
For more information, see Example Single Input File Combiner Configuration (INI) File on page
198.
Example Single Input File Combiner Configuration (INI) File

[COMMON]
DirFrom=/OPTIMA_DIR/<parser_name>/combine_log
DirTo=/OPTIMA_DIR/<application_name>/out
DirBackup=/OPTIMA_DIR/<application_name>/backup
ErrorDir=/OPTIMA_DIR/<application_name>/error
TempDir=/OPTIMA_DIR/<application_name>/temp
PIDFileDir=/OPTIMA_DIR/<application_name>/pid
LogDir=/OPTIMA_DIR/<application_name>/log
DirIncomplete=/OPTIMA_DIR/<application_name>/incomplete
[MAIN]
LogGranularity=3
LogLevel=1
StandAlone=0
InterfaceID=001
ProgramID=900
InstanceID=050
InputFileMask=*.log
[REPORTS]
Number=5
Report1=RNC_STATS
Report2=CELLRRCRABCONNSTATS
Report3=CELLTRANSCODES
Report4=CELLSHOSTATS
Report5=CPSTATS
[RNC_STATS]
Types=rnc_paging1UraUtran,rnc_dhtAllocAtt
KeyFields=nEUserName,ElementFromFileName,fileFormatVersion,senderName,senderTypePadded,senderType,vendorName,collectionBeginTime,measFileFooter,measTimeStamp,granularityPeriod,measObjInstId,measObjInstIdPadded,measObjInstIdSenderName,suspectFlag
ReportActive=1
ExcludeFields=DUMP
[CELLRRCRABCONNSTATS]
Types=rrcEstabAtt,pchUsageRate
KeyFields=nEUserName,ElementFromFileName,fileFormatVersion,senderName,senderTypePadded,senderType,vendorName,collectionBeginTime,measFileFooter,measTimeStamp,granularityPeriod,measObjInstId,measObjInstIdPadded,measObjInstIdSenderName,suspectFlag
ReportActive=1
ExcludeFields=DUMP
[CELLTRANSCODES]
Types=transFromCellDchAtt,rabsPerQosClass
KeyFields=nEUserName,ElementFromFileName,fileFormatVersion,senderName,senderTypePadded,senderType,vendorName,collectionBeginTime,measFileFooter,measTimeStamp,granularityPeriod,measObjInstId,measObjInstIdPadded,measObjInstIdSenderName,suspectFlag
ReportActive=1
ExcludeFields=DUMP
[CELLSHOSTATS]
Types=hhoAllOutAtt,hhoAllOutAtt
KeyFields=nEUserName,ElementFromFileName,fileFormatVersion,senderName,senderTypePadded,senderType,vendorName,collectionBeginTime,measFileFooter,measTimeStamp,granularityPeriod,measObjInstId,measObjInstIdPadded,measObjInstIdSenderName,suspectFlag
ReportActive=1
ExcludeFields=DUMP
[CPSTATS]
Types=rncUsageRatio,rncUsageRatio
KeyFields=nEUserName,ElementFromFileName,fileFormatVersion,senderName,senderTypePadded,senderType,vendorName,collectionBeginTime,measFileFooter,measTimeStamp,granularityPeriod,measObjInstId,measObjInstIdPadded,measObjInstIdSenderName,suspectFlag
ReportActive=1
ExcludeFields=DUMP
Important: In the [DIR] section (also called [Directory Parameters] in some INI files), you should
replace:
• OPTIMA_DIR with the OPTIMA home directory, which is set as an environment variable,
and will be different depending on whether you are using Windows or UNIX
• <application_name> with the name of the application (minus any file extension)
• Forward slashes with back slashes, if you are using Windows
The following table describes the parameters in the configuration (INI) file:
For more information, see Example Multiple Input File Combiner Configuration (INI) File on page
205.
If you have chosen to perform validation on the output file, then the INI file should also contain a [VALIDATION] section with the following entries:
Parameter Description
Column n The name of each column (new counter), where n is the column
number.
ColumnNumber The total number of columns (new counters) in the report.
If you have chosen to use aliases when combining, then the INI file should contain an [ALIASES] section, with a row for each column name for which you want to specify aliases. Each row should follow this format:

columnname=aliasname1,aliasname2,...

Where:
• columnname is the column name with which you want to replace any aliases.
• aliasname1, 2 and so on are the aliases for the column name. The Combiner will find and
replace these with the specified column name.
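A hypothetical [ALIASES] section following this format (the column and alias names are illustrative only):

[ALIASES]
; Replace the aliases CELL_ID and CELLID with the column name OBJECTID
OBJECTID=CELL_ID,CELLID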
For example, consider this input filename:

A20051023_1100_20051023_1400_xmlpd_total_number_of_successful_account_checks_20060329111650.csv

This means that the DATEFORMAT was set to AYYYYMMDD in the INI file.
Therefore, based on the rules of the Perl File Combiner, the DATEFORMAT is converted as
follows:
1. The input filename is compared with the DATEFORMAT regular expression, to ensure that
the DATEFORMAT regular expression matches part of the filename.
2. The input filename is compared with the FILEFORMAT regular expression, to find which file
group the input file belongs to.
If the filename does not match any part of the FILEFORMAT regular expression, it does not
belong to any current file group and is ignored.
3. After the file group has been found, the File Combiner removes the list of REMOVEFROMFILENAME patterns from the file group name, leaving in this example:

A20051023_1100_20051023_1400_xmlpd_
Example Multiple Input File Combiner Configuration (INI) File

NAME=CELLGPRS_CMB
PRID=001411028
IN_DIR=/OPTIMA_DIR/<application_name>/in
OUT_DIR=/OPTIMA_DIR/<application_name>/out
ERROR_DIR=/OPTIMA_DIR/<application_name>/error
BACKUP_DIR=/OPTIMA_DIR/<application_name>/backup
LOG_DIR=/OPTIMA_DIR/<application_name>/log
LIST_DIR=/OPTIMA_DIR/<application_name>/tmp
PRID_DIR=/OPTIMA_DIR/<application_name>/pid
DO_BACKUP=1
REMOVEFROMFILENAME=GPR[S]*[0-9]
FILEFORMAT=[A-Z]+[0-9]-[A-Z]{2}[0-9]{6}
DATEFORMAT=MTH2DD
STALEAFTER=360
KEYHEADERS=SDATE,STARTTIME,OBJECTID
EXCLUDEHEADERS=INPUT_FILE_NAME,EXID2,EXID3,EXID4,ELEMENT
SAFETYPERIOD=5
SINGLE_DIRECTORY=0
VERBOSE=1
LOG_SEVERITY=0
NUM_REPORTS=3
UseFolderFileLimit=0
FolderFileLimit=10000
VALIDATE_COLUMNS=0
CASESENSITIVE=1
[REPORTS]
REPORT1=CELLGPRS
REPORT2=TRAFDLGPRS
REPORT3=TRAFULGPRS
Important: In the [DIR] section (also called [Directory Parameters] in some INI files), you should
replace:
• OPTIMA_DIR with the OPTIMA home directory, which is set as an environment variable,
and will be different depending on whether you are using Windows or UNIX
• <application_name> with the name of the application (minus any file extension)
• Forward slashes with back slashes, if you are using Windows
In normal operation, the File Combiners should not need any special maintenance. However, TEOCO recommends the following basic maintenance checks are carried out for the File Combiners:
• Input directory for a backlog of files (weekly): Files older than the scheduling interval should not be in the input directory. A backlog indicates a problem with the program.
• Error directory for files (weekly): Files should not be rejected. If there are files in the error directory, analyze them to identify why they have been rejected.
• Log messages for error messages (weekly): In particular, any Warning, Minor, Major and Critical messages should be investigated.
The log file contains information about any files found in the error directory. For more information about the log file, see Checking a Log File Message on page 207.
A new log file is created every day. The information level required in the log file is defined in the
General Settings dialog box and will be one of the following:
• Debug
• Information
• Warning
• Minor
• Major
• Critical
These levels enable the user to filter out lower-severity logging if required. For example, if Minor is selected, then only Minor, Major and Critical logging will occur.
Note: Utilities are provided for searching and filtering log messages and also for loading the
messages into the database. For more information, see Checking Log Files on page 41.
However, if the File Combiner is run continuously, then the input directory is monitored continuously. In this case, stop the File Combiner by terminating it.
You can either obtain the version details from the log file or you can print the information by typing
the following command at the command prompt:
For this File Combiner, for this Operating System, type the following into the command prompt:
For more information about obtaining version details, see About Versioning on page 33.
This section describes the Perl message log codes common to the File Combiner:
About Loading in OPTIMA
The ETL (Extract/Transform/Load) Loader Package primarily loads transformed performance data
into the database.
The Loader Package is configured via a Windows-based configuration utility with database
connectivity. Multiple loaders can be configured and the necessary configuration information is
written to both a configuration file and the loader configuration tables stored within the database.
This diagram shows the basic loader process using an external table:
In this case the ETL Loader reads the current Loader report (configuration) information and then
sends the data to a temporary Oracle external table. The data in this temporary table is then
mapped to the destination table as specified in the loader configuration. The mapping of the raw
data to the temporary table and the mapping of the temporary table to the destination table are
defined using the ETL Loader Configuration window, identified in the diagram as the Loader GUI.
The loader can also be configured to use direct path loading rather than an external table. For more
information see About Direct Path Loading on page 251.
The Loader is invoked manually on the command line or automatically via a scheduler program,
such as the Unix Crontab functionality.
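For example, a crontab entry such as the following could be used to run the Loader every five minutes (the /optima paths are illustrative only and depend on your installation):

# Run the ETL Loader every 5 minutes
*/5 * * * * /optima/bin/opxLoad /optima/ini/opxLoad_000000001.ini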
As well as loading data, the Loader also contains validation options, which enable you to check the
CSV files created by the parser, ensuring column order, defaulting missing data values and splitting
files if required.
Important: If you use the validation options, you do not need to use the separate Data Validation
application. However, for more useful information on the data validation process, see About Data
Validation on page 181.
Prerequisites
To run the OPTIMA ETL Loader Application you will need to have:
• Created an OPTIMA database
• Run the OPTIMA Backend Installer
If you use the OPTIMA Installation Tool you do not need to add grants as this is done automatically.
For more information see the OPTIMA Installation Tool User Reference Guide.
Add Grants
Make these grants to the AIRCOM user:
• SELECT, INSERT and UPDATE on the destination table:
GRANT SELECT, INSERT, UPDATE ON <SCHEMA>.<DST_TABLE> TO AIRCOM;
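For example, assuming a hypothetical schema ERICSSON and destination table UTRANCELL_RAW:

GRANT SELECT, INSERT, UPDATE ON ERICSSON.UTRANCELL_RAW TO AIRCOM;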
1. From the Start menu, select All Programs, Aircom International, AIRCOM OPTIMA
Backend 8.0, Loader.
2. In the Connect to Optima database dialog box, type the required log on details and click
OK.
3. In the Machine Filter dialog box, select the machine on which the ETL Loader client will be
run and click OK.
4. In the ETL Loader Configuration window, click Add. The Configure Report window
appears.
7. On the DB and Processing tab, complete the DB connectivity details, selecting External
Table as the Staging Option.
9. Click the Configure aliases button. The Configure loader file mappings dialog box
appears.
10. Click the Load headers from file button and browse to the file from which the first row will
be used to configure the header columns.
11. In the Configure loader file mappings dialog box, right-click and from the menu that
appears select Auto Create Aliases.
12. In the Configure loader file mappings dialog box, right-click and from the menu that
appears select default types to number.
13. Manually adjust individual entries in the Type and Data Format columns as necessary and
then click OK.
14. On the Table Settings tab of the Configure Report window, click the Configure
mappings button.
15. In the Configure loader table mappings dialog box, click to load column data for the
destination table from the database.
16. Click to match aliases to the destination table columns (where a match exists).
Important: In a Unix environment, omit .exe from the above command line.
Once the Loader has finished, check the log file and the LOADER_LOG table.
For the Loader Configuration window (which can be used to configure both the External Table
Loader and the Direct Path Loader):
• opx_LOD_GEN_110_GUI.exe (Windows)
Important: You must install the backend components using the OPTIMA combined backend
package, which will install the required components quickly and easily. For more information on
how to use this, see Installing the OPTIMA Combined Backend Package on page 21.
You must also ensure that you have installed or upgraded your AIRCOM OPTIMA database and all
of the required packages.
To run the Loader, type in the executable file name and the configuration (INI) file name into the
command prompt. For example:
$OPTDIR/bin/opxLoad opxLoad_000000001.ini
2. Type a username and password. You can see a list of recently used usernames by clicking
the Browse button.
3. From the list, select the database to which the Loader will send the data.
Note: The database name must match the database alias on the local machine for the
remote database, which is normally configured in the tnsnames.ora file.
4. Click OK.
1. From the Machine list, select the name of the machine on which the Loader will be run.
2. Click OK.
This table describes the information that is shown for each report:
Column Description
PRID The automatically-assigned PRID uniquely identifies each instance of the application. It
is composed of a 9-character identifier, made up of Interface ID, Program ID and
Instance ID.
The Interface ID is made up of 3 numbers but the Program ID and Instance ID are both
made up of 3 characters, which can be a combination of numbers and uppercase letters.
For more information, see About PRIDs on page 29.
Report Name The user-assigned name for the report.
Table Name The name of the table to be loaded.
Load Type Internal use only.
Log Severity The severity level of the log.
From the ETL Loader Configuration window, you can add, modify or delete loader configuration
reports.
Configuring Reports
When you select to add or modify a loader configuration in the ETL Loader Configuration window,
the Configure Report window appears. This window has the following tabs, which you can use to
configure the loader:
• General
• Files and Directories
• DB and Processing
• Table Settings
• Log Messages
• Validator Options
Important: When configuring reports in the Loader, it is recommended that you also read Tuning
the Loader on page 237, which suggests how to get the best results from it.
General tab
This table describes the information to complete the Files and Directories tab:

Field Description
Input File Mask Type in the file mask which the Loader will match when selecting
files to process.
Input Directory Type the location of the input directory from which the raw data files
will be processed.
Input Temp Directory Type the location of the temporary directory.
Field Delimiter Type the value you want to configure the Loader to use as a
delimiter. If you want tab or space to be used as a delimiter, select
the appropriate checkbox instead.
Tab Select this checkbox if you want to configure the Loader to use tab
as a delimiter.
Space Select this checkbox if you want to configure the Loader to use
space as a delimiter.
UNIX Platform Select this radio button if the input files use a Unix style end of line
character (Hex = 0A).
Windows Platform Select this radio button if the input files use a Windows style end of
line character (Hex = 0D 0A).
Missing Field Values are Null Select this checkbox if you want a null value to be used when a
value is missing from a record in the Loader input file. If you do not
select this checkbox and there are missing field values, then an
error will occur.
Header Lines to Skip Select the number of lines from the top of the input file which you do
not want to be loaded. For example, you can use this option to skip
lines which contain headers or bad data.
Important: When you are deciding how many lines to skip, consider
whether you are going to be using the setting which causes the
header line to be removed. For more information on removing the
header line, see Defining the Validator Options for the Loader on
page 232. If the header line is to be removed, you will have one less
line to skip.
The direct path loader requires the header line. For more
information on using this setting for direct path loading, see
Configuring for Direct Path Loading on page 252.
Input Threshold (BYTE) Type the value of the input threshold in bytes.
The Loader can load several CSV files at once by combining them
into a single external file. The input threshold is the maximum size of
the single external file.
Copy File to Backup on Successful Load Select this checkbox if you want a copy of the input file to be stored in the backup directory when the Loader process is successful.
Copy File to Error Directory when Load Unsuccessful Select this checkbox if you want a copy of the input file to be stored in the error directory when the Loader process is unsuccessful.
Log File Directory Type the location of the log file directory.
Security Level Select the severity level for logging information when processing this
report.
Move File to Error Directory if Less than % Successfully Processed Set the minimum percentage of records to be loaded. If the percentage of successfully loaded records is less than this value, the input file is moved to the error directory.
Error Directory Type the location of the error directory.
Backup Directory Type the location of the backup directory.
Monitor File Directory Type the location of the directory where the Loader instance's PID will be stored.
INI File Directory Type the location of the directory where the initial configuration will
be written.
1. Select the appropriate load and error logging options, depending on your requirements:
Important: There are a number of other load options, which are not available on the
Loader GUI. You can only set these values by editing the database. For more information,
see About the Loader Options and Database Values on page 239.
2. Select the required staging option. This determines whether loading is performed with
external tables or with a direct path array using a Global Temporary Table. For more
information, see About Direct Path Loading on page 251.
3. This table describes the remaining fields on the DB and Processing tab:
Hints APPEND hint Use the APPEND hint option when loading the database with data
that will be appended to the end of a table.
This could provide increased performance under certain
circumstances. Contact TEOCO Support for more information, or
consult your Oracle documentation.
PARALLEL hint Use the PARALLEL hint option when loading the database with data
that will be divided between multiple threads.
This could provide increased performance under certain
circumstances. Contact TEOCO Support for more information, or
consult your Oracle documentation.
Degree of Parallelism Use the up and down buttons to set the degree of parallelism.
Important: If you set the degree of parallelism to a value greater
than one, a warning appears and the APPEND hint and PARALLEL
hint options are disabled.
DB Name Type the name of the database as defined on the Unix loader client
machine containing the performance data table.
Username Type the username for the loader configuration instance.
Loader Error Codes Use this list to specify error codes that the Loader should ignore during loading. If the Loader encounters any of the error codes in this list during loading, it will ignore them and behave as if the loading process was successful.
For more information, see Adding and Removing Loader Error Codes on page 225.
Important: You cannot use this option to ignore loading errors for
the 'bulk load with error tables' or 'bulk then upsert with error tables'
methods.
Important: You should not select either of the hint options if there is more than one loader
report for a raw table, or you are using any of the error logging load options. For more
information, see Tuning the Loader on page 237.
Notes:
o You can also set the hint options directly in the database. For more information, see
About the Loader Options and Database Values on page 239.
o For the "Insert Only with Error Logging Tables" and "Bulk then Error Log Insert" logging
options, rows generating error codes which are in the "Loader Error Codes" list will be
counted as successful. This impacts the ErrThreshold parameter (for more information,
see About the Loader Configuration (INI) File Parameters on page 247).
For example, to avoid files being sent unnecessarily to the error directories when some
rows have duplicate data, add:
"ORA-00001"
(unique constraint violation) to the list. A message will be generated in the
LOADER_LOG table:
"144: Counting <nnn> rows from error log table as successful."
1. On the DB and Processing tab, click the Add Error Code button.
Note: You must type a valid Oracle Error Code otherwise an error message will be
displayed.
3. Click OK.
1. On the DB and Processing tab in the Loader Error Code list, select the error code that
you want to remove.
This table describes the information to complete the Table Settings tab:
External Table Directory (Data File) Provided that you have selected the External Table staging
option on the DB and Processing tab (if you have not, this
field and the next two fields will not be enabled), type the
location of the directory where the data file will be copied to.
This will usually be a mapped drive pointing to the directory
specified in the External Table Directory (DB) field.
Note: If the loader client is running on the database server,
then this location will be the same as the External Table
Directory (DB) location.
External Table Directory (DB) Type the location of the external table directory on the
database server.
Note: TEOCO recommends that this directory is always a
local directory on the database server and not a mapped
drive pointing to a directory on another machine.
Destination Table Name Type the name of the database target table.
Click the Configure Mappings button to:
• Define the one-to-one or counter expressions mappings
for raw data held in the external table to columns held in
the destination table.
• Define Threshold Crossing Alerts (TCAs), which are
loader-specific alarms raised on the data as it is loaded
into OPTIMA
For more information, see Configuring Loader Table
Mappings on page 228.
Threshold Crossing Alerts Select the Alarms enabled option if you want to enable any
TCAs that you have defined during the mapping
configuration.
From the Alarms Severity drop-down list, choose the severity
level for any TCAs that are raised.
SNMP If you have enabled TCAs - or want to use them in the future
- select the Forward SNMP traps option to send TCA
notifications by SNMP.
Select the type of event and probable cause for the TCA from
the available lists.
Column Description
Header The unique label given to the data position in the record. These are placeholder
strings which are redefined to meaningful names by the Alias mapping.
Alias Meaningful name given to the column position, which can be used in loadmap
expressions.
Type Oracle data type.
Size Oracle data size.
Date Format If the data type is specified as Date then PL/SQL format string for the expected
date format is shown here.
Header Position The position of the header in the input file.
To configure the aliases, click one of the buttons as described in the following table:
Load Headers From File Populate the header column from the first row of a data file. If the file does not contain a header row, then the first row of data is used.
Auto Assign Alias Map the loaded headers directly to the alias column. Use this where the input file
will provide meaningful headers.
Define Alias Open the Assign Alias dialog box. Use this to modify an alias definition.
Import Alias From File Read alias definitions from a file.
Export Alias From File Write alias definitions to a file.
Column Description
Alias Name Name of the defined aliases representing the data which is to be mapped.
Column Name Name of the target column when loading to the database.
Data Type The data type of the database column.
Position The column position in the table.
Formula The PL/SQL formula used to map aliased data to a column in the database (see the example after this table).
Load States if the column in the database is to be loaded and under what circumstances.
Right-click on a value in the Load column to access these options to be applied to the
whole column:
• Replace All - Load to Load if not null - This changes all instances of "Yes" in the
Load column to "Yes if not null". "Yes if not null" means load the value from the
input file provided it is not null.
• Replace All - Load if not null to Load - This changes all instances of "Yes if not
null" in the Load column to "Yes". "Yes" means load the value from the input file
irrespective of whether it is null or not.
PK States if the column in the database is a primary key column.
Operator, Alarm Value Define these values if you want a Threshold Crossing Alert (TCA) to monitor the value
of this column when it is loaded into the database, and signal if any of the loaded
values are incorrect.
TCAs are loader-specific alarms, which are raised as data is loaded into the OPTIMA
database using the Loader. They indicate a discrepancy between the expected values
according to the defined thresholds and the data loaded into the database after any
modification during the loading process.
A potential standard use may be to report on NULL values being inserted at load for
faster reporting. This needs evaluation against Data Quality Nullness reports.
• Set the operator such as =, >, < or BETWEEN
Note: If you select BETWEEN or NOT BETWEEN as the operator, you must enter
two values separated by a comma, representing the limits of the range.
• Set the value (used in the conjunction with the operator) for which an alarm will be
raised.
In the example picture above, TCAs have been set to trigger if the loaded value of
COUNTER1 is greater than 10, and/or the value of COUNTER2 is greater than 56.
Note: Like performance and system alarms, raised TCAs are written to the ALARMS
table, and can be forwarded using SNMP. For more information, see Defining the
Table Settings for the Loader on page 225.
These criteria are only available if the primary key of the destination table contains a
date.
Important: When you set the Operator and Alarm Values criteria, ensure that you
specify alarm values rather than acceptable values.
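For illustration, a Formula entry is a PL/SQL expression over the defined aliases. Hypothetical examples (the alias names are illustrative only):

NVL(COUNTER_A, 0) + NVL(COUNTER_B, 0)
TO_DATE(SDATE || ' ' || STARTTIME, 'YYYY-MM-DD HH24:MI')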
To configure the table mappings, click one of the buttons as described in the following table:
Remove Alias When Remove entries from the alias name column when a one-to-one match is found in the
Matched formula.
Load column data for the destination table from the database. For example Column
name, Data type, Position, Load and PK fields.
Match aliases to the destination table columns where a match exists.
Run Open the Locate Records dialog box. In the Locate Records dialog box, you can
specify WHERE conditions to use when loading data between the external table and
destination table.
Clear Clear the Where Condition pane.
1. In the Configure Loader Table Mappings dialog box, click Run. The Locate Records
dialog box appears. This picture shows an example:
3. Click the Add button. The new condition is added to the Expression Builder pane.
Tip: To remove a condition from the Expression Builder pane, click the Clear button.
4. If you want to add an AND clause to your WHERE condition, repeat steps 2 to 3.
5. If you want to add an OR clause to your WHERE condition, click the OR tab at the bottom
of the Expression Builder pane and then repeat steps 2 to 3.
Tip: To remove all of the conditions you have added, click Reset.
6. When you have finished, click OK to save your changes and close the Locate Records
dialog box.
Your WHERE condition is added to the Where Condition pane in the Configure Loader
Table Mappings dialog box.
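For example, a WHERE condition built in this way (the column names are hypothetical) might restrict loading to rows for a single element and the current day:

RNC_ID = 'RNC01' AND SDATE >= TRUNC(SYSDATE)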
TCAs are loader-specific alarms, which are raised as data is loaded into the OPTIMA database
using the Loader. They indicate a discrepancy between the expected values according to the
defined thresholds and the data loaded into the database after any modification during the loading
process.
To define TCAs:
1. On the Configure report dialog box, click the Table Settings tab.
3. In the Configure loader table mappings dialog box, define the TCA threshold for each
column for which you want to raise TCAs.
Select the operator and the corresponding alarm value - for example, '>' and '10' to raise a
TCA if the column value is greater than 10:
4. Click OK.
This table describes the information that is shown for each log message:
Column Description
DATETIME The date and time when the message was logged.
PRID The automatically-assigned PRID uniquely identifies each instance of the
application. It is composed of a 9-character identifier, made up of Interface ID,
Program ID and Instance ID.
The Interface ID is made up of 3 numbers but the Program ID and Instance ID
are both made up of 3 characters, which can be a combination of numbers
and uppercase letters.
For more information, see About PRIDs on page 29.
MESSAGE_TYPE The type of message that was logged.
SEVERITY The severity level of the log message.
MESSAGE The message that was logged.
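As a hedged example, assuming the LOADER_LOG table exposes the columns described above (the table owner and exact column set may vary in your installation), recent log messages could be inspected with a query such as:

-- Show the last day of Loader log messages
SELECT datetime, prid, message_type, severity, message
FROM loader_log
WHERE datetime > SYSDATE - 1
ORDER BY datetime;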
2. Choose any trimming options that you want to use when validating the data. This table
describes the options:
Option Description
Trim Header Removes any spaces found around the header columns.
Trim Data Removes any spaces found around the data values.
3. Select the required separator for input files - comma, SPACE, TAB or another character.
4. Choose any additional options that you want to use when validating the data. This table
describes these options:
Option Description
Windows Input Files Select this option if the files that are to be loaded/validated are in
Windows format (where the lines end with \r\n), and you want to convert
them to UNIX.
Important: If you have already set the Platform to be Windows on the
Files and Directories tab, then you do not need to set this value here as
well.
Remove Header Does not include the header in the output file.
Important: If you remove the header line, do not count it among the
Header Lines to Skip. For more information on skipping header lines, see
Defining the Files and Directories for the Loader on page 220.
Columns Case Sensitive Compares the header columns to ensure that they are the same case.
5. In the Missing Value box, type the value to be used for any columns which are not in the file
and are to be added to the database.
6. In the Header Line Number box, specify the number of lines that need to be skipped in
order to process the data.
Safe Mode enables you to generate a file containing the data for any new counters (or columns in the parser file header) that the parser outputs but that were not expected based on the configuration of the original report.
Column Description
Primary Primary columns are those which will be needed to load the new counter file.
To add a primary column, click the Add Primary column button, type the name of
the column and then click OK.
Ignore Ignore columns are columns for any new counters that you know have been added since the validation report was created, but that you are not interested in and want to exclude from the file.
To add an ignore column, click the Add Ignore column button, type the name of
the column and then click OK.
To do this:
2. In the Confirm dialog box, click OK to create an INI configuration file locally. The file is
created in the location specified in the Loader report configuration.
If you are loading on a Unix platform, then the INI file must be transferred to the
OPTIMA Unix platform and passed as a parameter to the Loader.
3. In the next Confirm dialog box, click OK. This creates Oracle directory objects in the
database that the Loader uses during processing.
In normal operation, the Loader should not need any special maintenance. However, TEOCO recommends the following basic maintenance checks are carried out for the Loader:
• Input directory for a backlog of files (weekly): Files older than the scheduling interval should not be in the input directory. A backlog indicates a problem with the program.
• Error directory for files (weekly): Files should not be rejected. If there are files in the error directory, analyze them to identify why they have been rejected.
• Log messages for error messages (weekly): In particular, any Warning, Minor, Major and Critical messages should be investigated.
A new log file is created every day. The information level required in the log file is defined in the
General Settings dialog box and will be one of the following:
• Debug
• Information
• Warning
• Minor
• Major
• Critical
These levels enable the user to filter out lower-severity logging if required. For example, if Minor is selected, then only Minor, Major and Critical logging will occur.
Note: Utilities are provided for searching and filtering log messages and also for loading the
messages into the database. For more information, see Checking Log Files on page 41.
Important: The filename will only be given in the LOADER_LOG table if the
INPUT_FILE_NAME has been defined as one of the aliases in the external table settings
(Loader File Mappings and Loader Table Mappings).
Also, the function test_for_filename will not log errors per file if a column name other than
INPUT_FILE_NAME is used.
• The ERROR_LOG table (called ERR_PRID, where PRID is the PRID value for the
instance) gives a detailed description of the load failures for each offending row. This
picture shows an example, as seen in TOAD:
You can either obtain the version details from the log file or you can print the information by typing
the following command at the command prompt:
In Windows:
opx_LOD_GEN_110.exe -v
- or -
opx_LOD_GEN_112.exe -v
In Unix:
opx_LOD_GEN_110 -v
- or -
opx_LOD_GEN_112 -v
For more information about obtaining version details, see About Versioning on page 33.
To ensure a program is running, the Process Monitor will examine the PIDs directory for a PID file
matching the PRID. If the PRID file exists, the PID is tested to check that it exists in the process
table. If it does not, the PID is removed and the next scheduled Loader invocation should restart the
Loader process. The Process Monitor can also be configured to periodically terminate the Loader
instance to ensure that a zombie process does not run unchecked for an extended period of time.
The Process Monitor functionality can also be performed at the command line using standard Unix
commands, for example (using external table loading):
• To identify the PID of the running opx_LOD_GEN_110, go into the PIDs directory:
cd $OPTDIR/pids
• To display the file contents:
cat <hostname>_opx_LOD_GEN_110_000000001.pid
[PID]
PID = 9191
• To identify the PID in the process table:
ps -ef | grep 9191
If the process cannot be found in the process table, the program has terminated. The PID file
should be removed from the monitor directory as other attempts to invoke the Loader will fail whilst
the PID file exists.
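This manual check can also be scripted. A minimal sketch, assuming the PID file layout shown above (substitute your mediation server's hostname for the hypothetical "myhost"):

# Read the PID from the monitor file and remove the file if the process is gone
PIDFILE=$OPTDIR/pids/myhost_opx_LOD_GEN_110_000000001.pid
PID=$(grep '^PID' "$PIDFILE" | awk -F'= *' '{print $2}')
if ! ps -p "$PID" > /dev/null 2>&1; then
    rm "$PIDFILE"
fi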
The load option that you choose depends on whether or not you are using combiners:
• Loader with Combiner: Upsert (Update and Insert); Bulk Upsert using Error Logging Tables (12); Bulk Load then Upsert (5)
• Loader without Combiner: Insert; Bulk Load then Error Log Insert (15); Insert Only with Error Logging Tables (11)
Note: The specified database values indicate the LOAD_TYPE value stored in the
LOADER_PARAMETERS table. If you cannot access the Loader GUI, you can set the load type by
updating the LOADER_PARAMETERS table with the required LOAD_TYPE value. For more
information, see About the Loader Options and Database Values on page 239.
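For example, to switch a report to Bulk Upsert using Error Logging Tables (LOAD_TYPE 12), assuming the report can be identified by a PRID column in LOADER_PARAMETERS (the PRID value shown is hypothetical):

UPDATE loader_parameters
SET load_type = 12
WHERE prid = '001110001';
COMMIT;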
When setting the hint option(s) on the DB and Processing tab of the Configure Report dialog box in
the Loader GUI, then you should follow these guidelines:
• If there is more than one loader report for a raw table, then all loader reports loading into
the same raw table should not have any hint options selected (in other words,
SQL_HINT_OPTION should = 0 for these reports). You can use this query for checking to
which reports this applies:
select schema, dest_table_name, count(*)
from loader_parameters
group by schema, dest_table_name
having count(*) > 1;
You should then go through the list of loader reports that are returned and manually update
the SQL_HINT_OPTION.
• For tables with one loader that have performance problems, you should select the
APPEND hint - in other words, SQL_HINT_OPTION should = 1 for these tables. You can
use this query for checking to which reports this applies:
select schema, dest_table_name, count(*)
from loader_parameters
group by schema, dest_table_name
having count(*) = 1;
You should then go through the list of loader reports that are returned and manually update
the SQL_HINT_OPTION.
Note: If any of the reports use an Error Logging load option, then hints will be switched off
automatically. This is because an APPEND or PARALLEL hint may cause an insert using error
logging to fail.
On the Files and Directories tab of the Configure Report dialog box, it is recommended that you set
the input threshold to 10 million bytes (10MB).
You should only use Debug mode (1) as a temporary value in order to diagnose errors.
Use the Validation Options in the Loader, rather than using a separate Validator
It is recommended that you configure validation by using the Validation Options tab of the
Configure Report dialog box, rather than using a separate Validator.
You should only use a separate Validator if a parser output file needs to be split into two loader input files, to be loaded into two different raw tables.
Loaders that are used for loading mediation machine-level files (for example, the common_logs log
loader and the maintain_dir loader) should have one loader per mediation device rather than one
loader per interface.
The OPTIMA Installation Tool currently creates one loader per interface; only one of these loaders
should be deployed per mediation machine per type.
Load Types
In the Configure Report dialog box, on the DB and Processing tab, each combination of Load Option and Error Logging Option has a unique LOAD_TYPE value, which is stored in the LOADER_PARAMETERS table. If you cannot access the Loader GUI, you can change the Load Option and/or Error Logging Option by editing the LOAD_TYPE value.
In addition, there are a number of load options that are not available on the DB and Processing tab. They are described below (with their LOAD_TYPE values in parentheses):
Bulk load then single insert (2): This option will use Bulk Load but, if the bulk load fails, the same data will be loaded using single inserts.
Not recommended. This option was previously the recommended way of inserting data, where Combiners were not being used and the data was clean most of the time. However, it has now been replaced by Bulk Load then Error Log Insert, which uses error logging inserts instead of single inserts.
If the bulk insert will succeed for 50% or more of the files loaded, use Bulk Load then Error Log Insert; otherwise use Insert Only with Error Logging Tables.
Single insert only (3): This option will insert only one record at a time. This method of loading is significantly slower than Bulk Load.
Not recommended. This option is very slow; it is now much faster to use Insert Only with Error Logging Tables when PK errors are expected. Insert Only with Error Logging Tables will ignore PK and other violations, and will perform much faster than single inserts.
Bulk then Upsert (5): This option will initially use Bulk Insert but, if this fails, it will then run the MERGE query.
Recommended in some circumstances. This option can be used if the Combiners have all counter groups available when combining for the majority of the time. Do not use this option if the raw table contains more than 500 columns.
If updates are required in at least half of the files, use Bulk Upsert using Error Logging Tables instead.
If you want to use any of these options, you can do so by setting the appropriate LOAD_TYPE
value.
Hint Options
In the Configure Report dialog box, on the DB and Processing tab, each hint option has a unique SQL_HINT_OPTION value, which is stored in the LOADER_PARAMETERS table. If you cannot access the Loader GUI, you can change the hint option by editing the SQL_HINT_OPTION value.
No hint selected - 0
Note: If you select a value greater than one for the Degree of parallelism option on the DB and
Processing tab, an SQL_HINT_OPTION value of 0 is stored.
Troubleshooting
Loader Application
Problem: Application not processing input files.
Possible causes: Application has not been scheduled; crontab entry removed; application has crashed and Process Monitor is not configured; incorrect configuration settings.
Solutions: Use Process Monitor to check last run status. Check crontab settings. Check configuration settings. Check the process list and monitor file: if there is a monitor file and no corresponding process with that PID, then remove the monitor file.
Note: The process monitor will do this automatically.
Problem: Application exits immediately.
Possible causes: Another instance is running; invalid or corrupt (INI) file.
Solutions: Use Process Monitor to check the instances running. Check that the INI file is configured correctly - recreate it if corrupted.
Problem: Files in Error Directory.
Possible causes: Incorrect configuration settings; invalid input files.
Solutions: Check the log file for more information on the problems. Check the error file format.
1000 Loader instance started. Creating list of files in the Input Directory. DEBUG
Loader instance finished processing the Input Directory. DEBUG
1003 Skipping input file because no longer exist or is a core dump file. DEBUG
Requesting database workers to stop when move temp external file queue is empty. DEBUG
Requesting backup and error workers to stop when backup and error queue is empty. DEBUG
1004 Input file is an empty file. DEBUG
1005 Input file is an empty file. DEBUG
1006 File group started: <fileGroupId>. DEBUG
1007 File group ready for processing: <fileGroupId>. DEBUG
1008 Still below file group threshold: <fileGroupId>. DEBUG
1009 Above file group threshold so start processing file group: <fileGroupId>. DEBUG
1010 Main mediation thread finished processing input directory. DEBUG
Processing last file. DEBUG
Could not rename temp external file to external file, is external file locked? MAJOR
Expected to find a temp external file. MAJOR
Started processing temp external file for file group <fileGroupId>. DEBUG
Finishing processing temp external file for file group <fileGroupId>. DEBUG
Started processing backup and error for file group <fileGroupId>. DEBUG
Finishing processing backup and error for file group <fileGroupId>. DEBUG
[MAIN]
InterfaceId=001
ProgramId=110
InstanceId=009
PRID=001110009
LogGranularity=3
LogSeverity=2
Verbose=0
RunContinuous=0
Pollingtime=10
Standalone=0
Iterations=5
UseFolderFileLimit=1
FolderFileLimit=10000
[LoaderConfiguration]
Database=OPTPROD62
UserName=AIRCOM
Password=ENC(l\mlofhY)ENC
ExtFileName=opx_LOD_GEN_110_001110009.ext
DoCpyToErr=1
DoBackup=0
FileMask=*.csv
ErrThreshold=100
NumberOfHeaderLines=1
InputThresHold=0
ValidateInputFile=1
TimeInterval=10
ExeName=opx_LOD_GEN_110
ReportName=CSV_NA_swFCPort_DC
[DIR]
LogDir=/OPTIMA_DIR/<application_name>/log
TempDir=/OPTIMA_DIR/<application_name>/tmp
PIDFileDir=/OPTIMA_DIR/<application_name>/prids
InputDir=/OPTIMA_DIR/<application_name>/out
BackupDir=/OPTIMA_DIR/<application_name>/backup
ErrDir=/OPTIMA_DIR/<application_name>/error
ExtTblDir=/OPTIMA_DIR/<application_name>/extdir
[VALIDATECONFIGURATION]
TrimHeader=1
TrimData=0
SeparatorIn=,
separatorOut=,
HeaderLineNumber=1
WindowsInputFiles=0
AvoidLineWithSubStrings=
InputFileNameAsColumn=0
MissingValue=
RemoveHeader=0
ColumnsCaseSensitive=0
SafeMode=1
[SAFE]
SafeDir=C:\Development\Test\opx_LOD_GEN_110\newCounters
IgnoreColumns=0
PrimaryColumns=1
PrimaryColumn1=DateTime
[REPORTS]
Number=1
Report1=VALDATION_REPORT
[VALDATION_REPORT]
ColumNumber=34
Column1=DateTime
Column2=IPADDRESS
Column3=PORT
Column4=Index
Column5=swFCPortCapacity
Column6=swFCPortIndex
Column7=swFCPortTxWords
Column8=swFCPortRxWords
Column9=swFCPortTxFrames
Column10=swFCPortRxFrames
Column11=swFCPortTxC2Frames
Column12=swFCPortRxC3Frames
Column13=swFCPortRxLCs
Column14=swFCPortRxMcasts
Column15=swFCPortTooManyRdys
Column16=swFCPortType
Column17=swFCPortNoTxCredits
Column18=swFCPortRxEncInFrs
Column19=swFCPortRxCrcs
Column20=swFCPortRxTruncs
Column21=swFCPortRxTooLongs
Column22=swFCPortRxBadEofs
Column23=swFCPortRxEncOutFrs
Column24=swFCPortRxBadOs
Column25=swFCPortC3Discards
Column26=swFCPortMcastTimedOuts
Column27=swFCPortPhyState
Column28=swFCPortTxMcasts
Column29=swFCPortLipIns
Column30=swFCPortLipOuts
Column31=swFCPortOpStatus
Column32=swFCPortAdmStatus
Column33=swFCPortLinkState
Column34=swFCPortTxType
Important: In the [DIR] section (also called [Directory Parameters] in some INI files), you should
replace:
• OPTIMA_DIR with the OPTIMA home directory, which is set as an environment variable,
and will be different depending on whether you are using Windows or UNIX
• <application_name> with the name of the application (minus any file extension)
• Forward slashes with back slashes, if you are using Windows
Note: Environment variables can be used within the directory specification, except for the External
Table Directory - ExtTblDir.
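For example, a sketch of a [DIR] section on UNIX (the paths are hypothetical, and this assumes your environment expands the $OPTDIR variable; ExtTblDir must remain a literal path):
[DIR]
LogDir=$OPTDIR/loader/log
TempDir=$OPTDIR/loader/tmp
ExtTblDir=/optima/loader/extdir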
Warning: Direct modification of the GUI-generated configuration file is not recommended.
This table describes the entries found in the [MAIN] section of the ETL GUI generated INI
configuration file:
Parameter Description
FolderFileLimit The maximum number of output files that can be created in each output (sub)
folder.
This must be in the range of 100-100,000 for Windows, or 100-500,000 on
Sun/UNIX, otherwise the application will not run.
Warning: Depending on the number of files that you are processing, the lower
the file limit, the more output sub-folders that will be created. This can have a
significant impact on performance, so you should ensure that if you do need to
change the default, you do not set the number too low.
The default value is 10,000.
Instance ID The three-character program instance identifier (mandatory).
Interface ID The three-digit interface identifier (mandatory).
Iterations Used when the application does not run in continuous mode: it defines how many times the application checks the input folder for input files before exiting. Integer values (1, 2, 3, 4 and so on) are allowed.
LogGranularity Defines the frequency of logging, the options are:
0 - Continuous
1 - Monthly
2 - Weekly
3 - Daily
LogLevel (or Sets the level of information required in the log file. The available options are:
LogSeverity)
1 - Debug
2 - Information (Default)
3 - Warning
4 - Minor
5 - Major
6 - Critical
PollingTime (or The pause (in seconds) between executions of the main loop when running
RefreshTime) continuously.
PRID The automatically-assigned PRID uniquely identifies each instance of the
application. It is composed of a 9-character identifier, made up of Interface ID,
Program ID and Instance ID.
The Interface ID is made up of 3 numbers but the Program ID and Instance ID
are both made up of 3 characters, which can be a combination of numbers and
uppercase letters.
For more information, see About PRIDs on page 29.
Program ID The three-character program identifier (mandatory).
Parameter Description
UseFolderFileLimit Indicates whether the folder file limit should be used (1) or not (0).
The default value is 0 ('OFF').
Verbose 0 - Run silently. No log messages are displayed on the screen.
1 - Display log messages on the screen.
This table describes the entries found in the [Loader Configuration] section:
Parameter Description
LogDir The location of the directory where log files will be stored.
TempDir The location of the directory where temporary files will be stored.
PIDFileDir The location of the directory where PID files will be created.
InputDir The location of the input directory from where the raw data files will be
processed.
BackupDir The location of the raw file backup directory.
ErrDir The location of the error directory.
ErrThreshold The error threshold.
Parameter Description
WindowsInputFiles This parameter should be used when the input files are in the Windows
format (the lines end with \r\n), and you want the Validator to convert the
line endings for UNIX.
0 (Default) - Input files are not Windows
1 - Input files are Windows
Important:
• If you have already set the Platform to be Windows, then you do not
need to set this value here as well.
• If this parameter is not set correctly for the input files that are used,
then the data is still processed, but because of the extra character
added while transferring, the last column is ignored and the value for
this is filled up using the Missing Value parameter.
This table describes the entries found in the [SAFE] section, which is only produced if the
SafeMode option is selected on the Validator Options tab of the Loader:
Parameter Description
IgnoreColumn n The name of each ignore column, where n is the column number.
IgnoreColumns The total number of ignore columns, which are columns for any new counters
that you know have been added since the validation report.
PrimaryColumn n The name of each primary column, where n is the column number.
PrimaryColumns The total number of primary columns, which are those which will be needed
to load the new counter report.
SafeDir The location of the directory of the new counter report generated in safe
mode.
If you have chosen to use validation, then this table describes the entries found in the [REPORTS]
section:
Parameter Description
Each validation report will have its own section, containing the following entries:
Parameter Description
Column n The name of each column (new counter), where n is the column number.
ColumnNumber The total number of columns (new counters) in the report.
With direct path loading, data is loaded into a Global Temporary Table (GTT). The GTT is loaded
with files directly from the input directory until the input threshold has been met or exceeded. No file
append is needed as the data is loaded directly from the files in the input directory.
Note: No other Oracle Session can see the data that is loaded into the GTT by the direct path
loader client.
This picture shows a simplified representation of the direct path loading process.
Important: When migrating loaders that are based on combined measurement objects, ensure that
no column headings in a combined file have the same name. If there is more than one column with
the same name, the Direct Path Loading process will fail with the error
‘DIRPATHSETCOLUMNS_ERROR’.
4. Update configuration to use direct path loading, for example to migrate all loaders for the
ERICSSON_UTRAN schema to work with the direct path loader client, execute the
following SQL:
5. Call the loader package to generate the SQLs and staging table:
On the DB and Processing tab of the Configure Report window, select Direct Path.
The direct path configuration options become available. They are described below (with default values in parentheses):
Oracle Connections (default: 1)
The number of Oracle Connections used for this loader PRID. This equates to the number of threads used in the load file worker. Each separate thread populates its own copy of the staging Global Temporary Table with data until the input threshold is met or exceeded. It then calls the Loader Package to transfer the data.
For small tables, use a value of 1. For large tables where performance is critical, you can specify up to 5 threads. Increasing the number of Oracle connections significantly decreases loading times but uses additional resources on the database and mediation machines.
Rows per Load (default: 1000)
Important: Under normal circumstances this parameter should be left NULL and you cannot change it here. The loader will then insert 1000 records into each array.
During direct path load into the staging table, the loader input file is read by line and inserted into an array. This parameter determines how many rows are added to the array before the array is sent to the database to be loaded into the Global Temporary Table.
Note: A commit is NOT done at the end of inserting the array. The commit frequency is determined by the input threshold.
Buffer Size (bytes) (default: MAX_ROWS_PER_LOAD multiplied by Max Record Size)
Important: Under normal circumstances this parameter should be left NULL and you cannot change it here. It determines the buffer size required to store the array and is calculated automatically. If the buffer size is not big enough, the direct path load will fail with an OCI_DPR_FULL error, for example:
WARNING 112106
"Staging loader direct path convert error with file C:\optdir\loader\in\950_201007220000_RNC_0.csv - OCI_DPR_FULL - the internal stream is full - (Row=0,Column=11) OCI_Error:"
The automatic calculation reads the value of the rows per load (MAX_ROWS_PER_LOAD) and multiplies this by the record size in bytes. The record size is calculated from the LOADER_FILE_MAPPINGS configuration. For more information, see Configuring Loader File Mappings.
Important: When configuring loader file mappings, keep size values to a minimum. For example, use VARCHAR2(50) rather than VARCHAR2(500) where possible.
Notes:
• The settings on the Validator Options tab of the ETL GUI are not applicable to direct path
loading and you will not be able to access them if you have selected the Direct Path
Staging Option.
• Unlike external table loading, direct path loading requires a header line which must be read
even if it is not loaded. On the Files and Directories tab of the ETL GUI you can specify
Header lines to skip.
The following examples show what the Header lines to skip setting does in each case:
Setting 1 - For external table loading: skip line 1, load line 2 onwards. For direct path loading: read line 1, load line 2 onwards.
Setting 2 - For external table loading: skip lines 1 and 2, load line 3 onwards. For direct path loading: skip line 1, read line 2, load line 3 onwards.
Setting 3 - For external table loading: skip lines 1, 2 and 3, load line 4 onwards. For direct path loading: skip lines 1 and 2, read line 3, load line 4 onwards.
• With direct path loading, header names in input files are processed as case insensitive. This allows the case to be different for the same column between input files, but does not support two unique column headers that differ only in their case.
The Global Temporary Table used as the staging table in the direct path loader client cannot show data from Oracle sessions other than the current session. This means that the data loaded into the GTT by the loader is not visible to an OPTIMA Administrator.
If there is a data error such that a non-numeric character is loaded into a NUMBER column, then
the following message will be displayed:
Note: This error is only displayed if there is a datatype mismatch between the CSV data and the
GTT (loader file mappings). If there is a datatype mismatch between the GTT and the raw table
then the error is not displayed in the log.
The direct path loader normally loads 1000 records into an in-memory array. It then converts this to the correct data types, and sends it to the database to be loaded. If there is an error in the conversion, then the direct path loader will fail to load more than just the invalid record: all the input files with rows in the 1000-record array are sent to the error directory. In the above log message, 5 files will be sent to the error directory. These include the file listed, which was the file loaded into record 1000 of the array, but not necessarily the file containing the error.
Note: The GTT column which has the invalid data is listed in the error message. The first column is column 0. In the above message, the 4th GTT column has the invalid data, which may not necessarily be the 4th column in the input file. The LOADER_FILE_MAPPINGS table shows which input file column is loaded into the GTT column that has caused the error.
If the direct path convert error is reported on row 0, column 0 then the error could be caused by a
missing INSERT grant on the GTT table to the OPTIMA_LOADER_PROC user (through the
OPTIMA_LOADER_PROCS role). The GTT tablename will be ETL_<PRID> and will exist in the
same schema as the raw table being loaded.
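For example, a sketch of the remedial grant, assuming the raw table lives in the ERICSSON_UTRAN schema and the loader PRID is 001110009 (both hypothetical values):
GRANT INSERT ON ERICSSON_UTRAN.ETL_001110009 TO OPTIMA_LOADER_PROCS;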
If the log file contains the error "<inputfile> has no matching columns of the expected header
columns" check that the "Field Delimiter" has been defined correctly in the Loader Configuration
dialog box.
[MAIN]
[LoaderConfiguration]
Database=OPTPROD62
UserName=AIRCOM
Password=ENC(l\mlofhY)ENC
DoCpyToErr=1
DoBackup=0
FileMask=*.csv
ErrThreshold=100
NumberOfHeaderLines=1
InputThresHold=0
[DIR]
InputDir=/OPTIMA_DIR/<application_name>/out
BackupDir=/OPTIMA_DIR/<application_name>/backup
ErrDir=/OPTIMA_DIR/<application_name>/error
Note: For more information on these parameters, see About the Loader Configuration (INI) File Parameters on page 247.
About the Process Monitor
The Process Monitor continuously checks the running of OPTIMA backend applications on a
particular machine to ensure that they have not crashed, run away or hung.
Each backend application creates a monitor file, which is used by the Process Monitor to identify
and check the health of these applications. If a process crashes, runs away or hangs, the Process
Monitor will remove that instance of the application to ensure the smooth running of the data
loading process.
The Process Monitor uses a configuration file (INI) to store its settings. The configuration file can be
edited using a suitable text editor.
The Process Monitor uses global settings to monitor applications but you can also specify
monitoring requirements for individual applications by defining reports. For more information about
reports, see Defining Monitoring Settings for an Application on page 261.
[Figure: Process Monitor overview. The Process Monitor on a machine (for example, MACHINE 001) checks whether each monitored process (for example, COMBINER (002) and FTP (003)) exists in the machine's process list; if a monitor file's last timestamp is older than the acceptable grace period, it issues SIGTERM to the program and adds it to the SIGTERM list.]
If the configuration (INI) file is modified while the Process Monitor application is running, it has to be
restarted for the changes to have an effect.
Function Action
Logging Status and error messages are recorded in a daily log file.
Monitor Files The application runs in a scheduled mode. A monitor (PID) file, created
each time the application is started, ensures that multiple instances of the
application cannot be run. The PID file is also used by the OPTIMA
Process Monitor to ensure that the application is operating normally.
PRID The automatically-assigned PRID uniquely identifies each instance of the
application. It is composed of a 9-character identifier, made up of Interface
ID, Program ID and Instance ID.
The Interface ID is made up of 3 numbers but the Program ID and Instance
ID are both made up of 3 characters, which can be a combination of
numbers and uppercase letters.
For more information, see About PRIDs on page 29.
Important: An instance of the Process Monitor will need to be created for each distinct machine
that you are using. This is because the Process Monitor uses the hostname (the environment
variable that identifies the machine on which the backend application is running) to filter the monitor
directory and only monitors instances running on the same machine. The hostname environment
variable should be defined in the .profile (or equivalent, depending on the UNIX shell) running the
backend application.
Tip: To check that the hostname environment variable has been defined:
o Run the hostname command on any console (WIN/UNIX) and a value should be
returned (for example, server1).
o On UNIX check the .profile and/or .bash_profile file(s) for the HOSTNAME environment
variable (shown in capital letters, unlike the command). This should be equal to the
value returned by the command, for example HOSTNAME=server1.
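For example, a quick check on UNIX (the outputs of the two commands should agree):
hostname
grep HOSTNAME ~/.profile ~/.bash_profile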
1. On start up, it loads all the configuration settings from the Process Monitor INI file into
memory. The settings contain information on all the backend processes to be monitored.
2. The monitor files, created by each backend application, uniquely identify the application
instance using the PRID contained in its filename and the hostname. The Operating
System process identifier (PID), which identifies the unique process ID of a backend
application, is also written to the file. Each backend application regularly updates the
timestamp of the monitor file, which works as a 'heartbeat' for the process.
3. In the Process Monitor INI file, a GlobalTimeOut period (also known as the 'grace period') is
specified. This is the maximum amount of time that the Process Monitor will allow between
'heartbeats' of the monitored process. As it runs, the Process Monitor regularly checks all
monitor files in the common monitor directory to ensure the grace period has not been
exceeded. Then:
o If the grace period has expired, then the Process Monitor issues a SIGTERM request,
which requests that the program cleanly shuts down the process.
o If the grace period has not expired, the Process Monitor checks that the PID in each file
is still in the current OS process list. If it is, then everything is working as it should, and
the Process Monitor moves on to check the next process. If not, this means that the
associated program has crashed, in which case the Process Monitor program removes
the monitor file.
4. The Process Monitor stores a list of SIGTERM requests that have been sent out, and
during the next iteration of monitoring, it checks if the process is still running. If it is, and the
elapsed time since the SIGTERM request is greater than the GlobalTimeOut period, then
the Process Monitor issues a SIGKILL, which forces the OS to terminate the process.
5. After it has issued a SIGKILL, the Process Monitor will wait a period of time for this to
succeed, as determined by the KillProcessDelay parameter.
When this time period has been exceeded, if the process has been terminated then the
monitor file is deleted. If it has not been terminated, then an error message is returned,
because this indicates a problem with the termination.
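For reference, the manual equivalent of this escalation on UNIX would be (a sketch, reusing the example PID 9191 from the Loader section):
kill -TERM 9191
kill -KILL 9191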
Important: You must install the backend components using the OPTIMA combined backend
package, which will install the required components quickly and easily. For more information on
how to use this, see Installing the OPTIMA Combined Backend Package on page 21.
Type the executable file name and the configuration (INI) file name at the command prompt:
In Windows:
opx_MON_GEN_510.exe opx_MON_GEN_510.ini
In Unix:
opx_MON_GEN_510 opx_MON_GEN_510.ini
Note: In usual operation within the data loading architecture, all applications are scheduled. In
usual circumstances, you should not need to start the program. For more information, see Starting
and Stopping the Data Loading Process on page 40.
Parameter Description
LogDir The location of the directory where log files will be stored.
PIDFileDir The location of the directory where monitor (PID) files will be created.
TempDir The location of the directory where temporary files will be stored.
Parameter Description
GlobalTimeout Type the maximum time the Process Monitor should allow for the process being
monitored to update its timestamp.
This is known as the grace period. For information on how this is used, see How
the Process Monitor Works on page 258.
KillProcessDelay Type the maximum time the Process Monitor should wait after a SIGKILL signal
before deleting the monitor file.
TimeScale The time scale for the GlobalTimeout parameter. The available options are:
SEC - seconds
MIN - minutes
HOUR - hours
DAY - days
MONTH - months
YEAR - years
You define reports by editing parameters in the configuration (INI) file with a suitable text editor.
The following table describes the parameters in the [REPORTS] section:
Parameter Description
NoOfReports The number of reports to create, one for each application being monitored.
Reportn Type the unique name of the report, where n is the execution order position
of the report, for example, Report1 will be executed before Report2.
Parameter Description
Timescale The time scale for the MaximumRunningTime parameter. The available
options are:
SEC - seconds
MIN - minutes
HOUR - hours
DAY - days
MONTH - months
YEAR - years
The following example shows the definitions for two reports called XMLParser and PAR_ERI720:
[Reports]
NoOfReports=2
Report1=XMLParser
Report2=PAR_ERI720
[XMLParser]
InterfaceID=000
ProgramID=711
InstanceID=001
EXEname=opxNorXML
UseHostname=0
Comments=XML parser
Monitor=1
MaximumRunningTime=10
TimeScale=SEC
[PAR_ERI720]
InterfaceID=001
ProgramID=720
InstanceID=001
EXEname=opx_PAR_ERI_720
UseHostname=0
Comments=parser ericsson 720
Monitor=1
MaximumRunningTime=30
TimeScale=SEC
For more information, see Example Process Monitor Configuration (INI) File on page 265.
Maintenance
In usual operation the Process Monitor should not need any special maintenance. During
installation the OPTIMA Process Monitor will be configured to maintain the backup and log
directories automatically.
However, TEOCO recommends the following basic maintenance check for the Process Monitor:
• Log messages for error messages (weekly): in particular, any Warning, Minor, Major and Critical messages should be investigated.
A new log file is created every day. The information level required in the log file is defined in the
General Settings dialog box and will be one of the following:
• Debug
• Information
• Warning
• Minor
• Major
• Critical
These levels help the user to restrict low level severity logging if required. For example, if Minor is
selected then only Minor, Major and Critical logging will occur.
Note: Utilities are provided for searching and filtering log messages and also for loading the
messages into the database. For more information, see Checking Log Files on page 41.
If run continuously, then the Process Monitor process will monitor the working of all the programs
continuously. In this case, the application can be terminated. For more information, see Starting
and Stopping the Data Loading Process on page 40.
You can either obtain the version details from the log file or you can print the information by typing
the following command at the command prompt:
In Windows:
opx_MON_GEN_510.exe -v
In Unix:
opx_MON_GEN_510 -v
For more information about obtaining version details, see About Versioning on page 33.
5111 (Could not remove PID <Pid> from OS process list. File: <PridFile>). WARNING
5112 PidNotWithinTimeAllowed: Last update for prid file was <timeSinceLastTouch> seconds. DEBUG
IsPidTimeAllowedToRunUp: Last update for prid file was <nTimeSinceLastUpdate> seconds. DEBUG
5113 Using MaximumRunningTime time <seconds> seconds. DEBUG
5114 Using GlobalTimeout time <GlobalTimeout> seconds. DEBUG
5115 Deleted Prid file File: <PridFile>. DEBUG
5116 Could not delete PRID file. File: <PridFile>. WARNING
5117 Processing complete. INFORMATION
5119 Pid file <PRIDFile> no longer exist, no processing needed. DEBUG
5120 Could not delete PRID file. PRID file no longer exist. File: <PridFile>. WARNING
5129 (Pid value <Pid> not been updated since maximum time allowed.). DEBUG
5130 Pid been running <timeSincePidStarted> seconds. DEBUG
5131 PID value <Pid> is being using by a new process because number off seconds since PID process started is less then number of seconds since the PRID file was last updated. DEBUG
5140 Ignoring own PRID file <PRIDFile>. DEBUG
5141 Error in checking to see if <Pid> was on OS task list from file <PridFile>. WARNING
5142 Error in checking elapsed time of <Pid> on OS task list. File: <PridFile>. WARNING
5143 Error when trying to kill <Pid> on OS task list. File: <PridFile>. WARNING
Troubleshooting
The following table shows troubleshooting tips for the Process Monitor:
Problem: Cannot save configuration (INI) file.
Possible causes: The user has insufficient privileges on the configuration (INI) file or directory; the file is read-only or is being used by another application.
Solutions: Enable permissions. Make the file writable. Close the Process Monitor to release the configuration (INI) file.
Problem: Process Monitor does not use new settings.
Possible causes: Settings are not saved to the configuration (INI) file; the file was created in the wrong location; the Process Monitor has not restarted to pick up the new settings.
Solutions: Check the settings in the file and the (INI) file location. Restart the Process Monitor backend application.
Problem: Application not monitoring programs.
Possible causes: Application has not been scheduled; crontab entry removed; application has crashed and Process Monitor is not configured; incorrect configuration settings.
Solutions: Use Process Monitor to check last run status. Check crontab settings. Check configuration settings. Check the process list and monitor file: if there is a monitor file and no corresponding process with that PID, then remove the monitor file.
Note: The process monitor will do this automatically.
Problem: Application exits immediately.
Possible cause: Invalid or corrupt (INI) file.
Solution: Use Process Monitor to check the instances running.
[MAIN]
InterfaceID=001
ProgramID=510
InstanceID=001
PollingTime=5
LogGranularity=3
LogSeverity=1
UseFolderFileLimit=0
FolderFileLimit=10000
StandAlone=0
RunContinuously=0
[OPTIONS]
TimeScale=SEC
GlobalTimeOut=60
[Reports]
NoOfReports=3
Report1=NortelXMLParser
Report2=CellStat
Report3=EricssonParser
[NortelXMLParser]
InterfaceID=000
ProgramID=711
InstanceID=001
EXEname=opxNorXML
Comments=nortel XML parser
Monitor=1
MaximumRunningTime=10
TimeScale=SEC
[CellStat]
InterfaceID=001
ProgramID=110
InstanceID=001
EXEname=CellStat
Comments=CellStat loader
Monitor=1
MaximumRunningTime=20
TimeScale=SEC
[EricssonParser]
InterfaceID=001
ProgramID=712
InstanceID=001
EXEname=EricssonParser
Comments=Ericsson Parser
Monitor=1
MaximumRunningTime=40
TimeScale=SEC
Important: In the [DIR] section (also called [Directory Parameters] in some INI files), you should
replace:
• OPTIMA_DIR with the OPTIMA home directory, which is set as an environment variable,
and will be different depending on whether you are using Windows or UNIX
• <application_name> with the name of the application (minus any file extension)
• Forward slashes with back slashes, if you are using Windows
About the Directory Maintenance Application
During the data extraction and loading process, a large number of directories are used for various
purposes. These directories need maintenance on a regular basis to ensure smooth running and
good performance for the whole system.
The Directory Maintenance application reports on and maintains user-specified directories based
on user-defined maintenance parameters.
The Directory Maintenance application uses a configuration file (INI) to store information about the
maintenance parameters. The configuration file can be edited using a suitable text editor.
Function Action
Logging Status and error messages are recorded in a daily log file.
Monitor Files The application runs in a scheduled mode. A monitor (PID) file, created each
time the application is started, ensures that multiple instances of the
application cannot be run. The PID file is also used by the OPTIMA Process
Monitor to ensure that the application is operating normally.
PRID The automatically-assigned PRID uniquely identifies each instance of the
application. It is composed of a 9-character identifier, made up of Interface
ID, Program ID and Instance ID.
The Interface ID is made up of 3 numbers but the Program ID and Instance
ID are both made up of 3 characters, which can be a combination of numbers
and uppercase letters.
For more information, see About PRIDs on page 29.
The application polls each configured directory at user-defined polling intervals to check if the files have met the maintenance criteria, which include maintenance by age and by file count. If a file mask is specified in the settings, only those types of files are considered in the maintenance process. Subdirectories are also maintained if that particular option is chosen.
If the selected criterion is age then the files are maintained by age. Files older than the age
specified will be deleted or archived depending on the selected option.
If the selected criterion is file count, the number of files in the particular directory is considered for
maintaining the directory. If the file count is greater than the value specified, the excess files will be
archived or deleted according to the selected option.
The Directory Maintenance application displays the results of maintenance in a maintenance report.
Important: You must install the backend components using the OPTIMA combined backend
package, which will install the required components quickly and easily. For more information on
how to use this, see Installing the OPTIMA Combined Backend Package on page 21.
In Windows:
opx_MNT_GEN_610.exe opx_MNT_GEN_610.ini
In Unix:
opx_MNT_GEN_610 opx_MNT_GEN_610.ini
Note: In usual operation within the data loading architecture, all applications are scheduled. In
usual circumstances you should not need to start the program. For more information, see Starting
and Stopping the Data Loading Process on page 40.
Parameter Description
RootDir Type the root of the directory tree that the Directory Maintenance application
will report on and maintain.
ReportDir Type the location where Directory Maintenance report will be stored.
LogDir Type the name of the directory in which log files will be created.
TempDir Type the name of the directory in which temporary files will be created. The temporary file is deleted once the directory is maintained.
PIDFileDir Type the name of the directory in which the program monitor file will be
created.
DefaultArchiveRootDir Type the default root of the archive directory tree. Maintained directories will
be backed up here if the archive option is on for these directories.
The Directory Maintenance application uses the tree structure of the
directory maintained. For example, if RootDir=/dev/optima/, the folder
/dev/optima/parser is archived to DefaultArchiveRootDir/optima/parser.
Notes:
• The Directory Maintenance application will not maintain any folder
matching path mask DefaultArchiveRootDir/*
• The program will append a path separator to the end of the directory path if it is missing.
• The folder must be created before the application runs.
• This parameter is required if NumberOfDir is not zero.
TarDirExe The location of the gtar executable that is used to tar the files before they
are moved to the archive directory.
This location should take the format '.../<... path ...>/gtar'.
If this parameter is blank or missing, the files cannot be tarred.
GzipDirExe The location of the gzip executable that is used to gzip the files before they
are moved to the archive directory.
Important: This can only be done after the files are tarred, so if the
TarDirExe parameter is not set, then this will not be done.
Parameter Description
LogLevel or LogSeverity Sets the level of information required in the log file. The available options
are:
1 - Debug
2 - Information (Default)
3 - Warning
4 - Minor
5 - Major
6 - Critical
RunContinuously 0 - Have the Directory Maintenance application run once.
1 - Have the Directory Maintenance application continuously monitor for
input files.
PollingTime (or The pause (in seconds) between executions of the main loop when running
RefreshTime) continuously.
StandAlone 0 – Run the application without a monitor file. Do not select this option if
the application is scheduled or the OPTIMA Process Monitor is used.
1 – Run the application with a monitor file.
Iterations Used when the application does not run in continuous mode: it defines how many times the application checks the input folder for input files before exiting. Integer values (1, 2, 3, 4 and so on) are allowed.
MaxFilesInArchive If the location of the gtar executable has been set, then this should be used
to define the maximum number of files to place inside a tar file.
Important: By default, this is set to 100, but if the backup has already been
tarred using the FTP, then this should be set to 1 here.
Parameter Description
DefaultFileMask Type the file mask of files to be reported on and maintained. For
example, DefaultFileMask=*.csv, will report and maintain all CSV
files.
MainThreadSleepMilliSeconds The time in milliseconds the main thread of the application will
sleep for in its main logic loop.
MaxNumberOfThreads The maximum number of threads the application can use while
running.
On UNIX this cannot be greater than 255 threads. On Windows
the maximum is slightly higher.
The following table describes the parameters in each [REPORT] section. There is one of these for
each directory that you want to maintain or monitor using settings different to the defaults defined in
the [MAIN] section:
Item Description
PathMask This parameter specifies the path mask for this maintenance
section.
The program will append a path separator to the start of the
field if missing.
The program will append a path separator to the end of the field
if the field does not end with a path separator or *.
Example 1:
RootDir=/dev/optima
PathMask=parser/abc/*
Any directory matching path mask /dev/optima/parser/abc/* will
be maintained recursively using these settings.
Example 2:
RootDir=/dev/optima
PathMask=parser/abc
Only the directory matching the path mask
/dev/optima/parser/abc/ will be maintained using these settings.
Example 3:
RootDir=/dev/optima
PathMask=/*
Any directory matching path mask /dev/optima/* will be
maintained using these settings. Every directory found will use
these settings if the directory does not match a path mask in
another section.
If a directory matches more than one section path mask then
the least general path mask will be used. For example:
[Section1]
RootDir=/dev/optima
PathMask=/parser/*
[Section2]
RootDir=/dev/optima
PathMask=/parser/tmp/
In this case, directory /dev/optima/parser/tmp/a/ will use Section
2 settings.
ExcludePathMasks Type a comma-separated list of path masks to use to exclude
directories which match the PathMask parameter and also
match ExcludePathMasks.
Notes:
• The program will ignore blank fields in the comma
separated list.
• The program will append a path separator to start of the
field if it is missing.
• The program will append a path separator to end of the
field if the field does not end with a path separator or *.
FileMask Type the file mask for this maintenance section.
Maintenance
In usual operation, the Directory Maintenance application should not need any special
maintenance. During installation the OPTIMA Directory Maintenance application will be configured
to maintain the backup and log directories automatically.
However, TEOCO recommends the following basic maintenance checks for the Directory Maintenance application:
• Input directory, for a backlog of files meeting the maintenance criteria (weekly): files meeting the maintenance criteria should not be in the input directory. A backlog indicates a problem with the program.
• Log messages for error messages (weekly): in particular, any Warning, Minor, Major and Critical messages should be investigated.
A new log file is created every day. The information level required in the log file is defined in the
General Settings dialog box and will be one of the following:
• Debug
• Information
• Warning
• Minor
• Major
• Critical
These levels help the user to restrict low level severity logging if required. For example, if Minor is
selected then only Minor, Major and Critical logging will occur.
Note: Utilities are provided for searching and filtering log messages and also for loading the
messages into the database. For more information, see Checking Log Files on page 41.
If run continuously, then the Directory Maintenance application will monitor the directories
continuously. In this case, the application can be terminated. For more information, see Starting
and Stopping the Data Loading Process on page 40.
You can either obtain the version details from the log file or you can print the information by typing
the following command at the command prompt:
In Windows:
opx_MNT_GEN_610.exe -v
In Unix:
opx_MNT_GEN_610 -v
For more information about obtaining version details, see About Versioning on page 33.
Troubleshooting
The following table shows troubleshooting tips for the Directory Maintenance application:
Invalid or corrupt (INI) file. Check that the INI file is configured correctly -
recreate if corrupted.
[MAIN]
InterfaceID=001
ProgramID=610
InstanceID=001
PollingTime=5
StandAlone=1
RunContinuous=0
LogGranularity=3
LogSeverity=2
[OPTIONS]
DefaultFileMask=*
MainThreadSleepMilliSeconds=500
MaxNumberOfThreads=50
ThreadScope=0
DefaultMaxThreadRunningSeconds=300
[MAIN]
InterfaceID=101
ProgramID=610
InstanceID=001
LogGranularity=3
LogSeverity=2
RunContinuously=0
PollingTime=5
StandAlone=0
Verbose=1
Iterations=1
[OPTIONS]
DefaultFileMask=*.csv
MainThreadSleepMilliSeconds=500
MaxNumberOfThreads=10
ThreadScope=0
DefaultMaxThreadRunningSeconds=300
#DoNotProcessPathMasks=/bin,/run,/lib
DefaultArchive=1
DefaultMaxFilesToKeep=2
DefaultMaxFileAgeToKeep=5
DefaultMaxFileAgeTimeScale=0
NumberOfDir=2
Dir1=interface
Dir2=backup
[interface]
#PathMask=*\interface\*
#PathMask=*\root\*
PathMask=\*
MaintenanceType=2
MaxFileAgeToKeep=2
MaxFileAgeTimeScale=0
#MaxFilesToKeep=1000000000000
Archive=1
[backup]
#PathMask=/backup/*
#PathMask=*/backup/*
#PathMask=backup/*
PathMask=*\backup\*
MaintenanceType=2
MaxFileAgeToKeep=2
MaxFileAgeTimeScale=0
#MaxFileAgeTimeScale=3
Archive=1
About the OPTIMA Summary Application
The OPTIMA Summary application summarizes data within the OPTIMA database.
The OPTIMA Summary application is a database-based program that runs within the Oracle server.
It uses configuration tables in the database to store information about aggregating data. These can
be modified using the configuration utility.
Time and Element Aggregation Aggregates data from a primary table over time and/or element and
inserts this data into a secondary table.
Busy Hour Calculation Calculates a busy hour from data in a primary table and stores it in a
secondary table.
Busy Hour Summarization Populates the busy hour summary tables using the data in the busy
hour tables and a specified raw table.
Direct Database Loading Loads data from any other third-party database directly into OPTIMA
over a direct database link.
[Figure: Summary architecture. The SUMMARY GUI configures the SUMMARY_REPORTS and SUMMARY_SCHEDULES tables, which the summary packages read; the packages log to the SUMMARY_LOG table.]
The OPTIMA_SUMMARY package reads its configuration from the SUMMARY_REPORTS and the
SUMMARY_SCHEDULES tables. It then calls the DIFFERENCE_ENGINE package to compare
the source table with the destination table. The OPTIMA_SUMMARY package then inserts and
updates the summary table with the new data from the DIFFERENCE_ENGINE comparison.
Note: For more information on the DIFFERENCE_ENGINE package, see About the
DIFFERENCE_ENGINE Package on page 282.
The packages log their messages to the SUMMARY_LOG table. The SUMMARY GUI is a
Windows application which is used to configure the SUMMARY_REPORTS and
SUMMARY_SCHEDULES tables and monitor the processing of the OPTIMA_SUMMARY package.
Quick Start
This section is intended to indicate the steps you must take to get the OPTIMA Summary
Application running for demonstration purposes. It covers the essential parameters that must be
configured. Where more parameters exist but are not mentioned, the default settings will suffice.
For more information on the use of all the parameters that determine the behavior of the OPTIMA
Summary Application, see the remainder of this chapter.
Prerequisites
To run the OPTIMA Summary Application you will need to have:
• Created an OPTIMA database
• Run the OPTIMA Backend Installer
• Run Create_Optima_Summary.sql
• Run the OPTIMA Console
• Checked that the LOGS.SUMMARY_LOG table has a partition for the current date
If you use the OPTIMA Installation Tool you do not need to add grants, as this is done automatically. For more information, see the OPTIMA Installation Tool User Reference Guide.
Add Grants
Make these grants to the AIRCOM user:
• SELECT on the source table
• SELECT, INSERT and UPDATE on the destination table
GRANT SELECT ON <SCHEMA>.<SRC_TABLE> TO AIRCOM;
GRANT SELECT, INSERT, UPDATE ON <SCHEMA>.<DST_TABLE> TO AIRCOM;
1. From the OPTIMA Console, click the New Summary Report button. The Select Interface/Machine window appears. This picture shows an example.
2. Click the row representing the interface and machine to be used in the PRID for the new
report, then click OK. The Summary Report window appears with the Report
Configuration tab showing.
3. Ensure that the Report Enabled option at the top left of the dialog box is selected.
4. In the Report Configuration pane, select the Summary Time Aggregation type. If you
select Element Aggregation Only then you must also specify an amount and units
(minutes, hours, days or weeks) for the Summary Table Granularity.
7. Click the SQL Query tab and type in an SQL query defining what you want your report to
summarize. This should be a SELECT clause which contains the following filter:
(assuming DATETIME is the name of the Primary Key date column in the source table).
8. Click the Column Mappings tab and map the columns of the SQL query on the left, to the
columns of the summary table on the right, using the Match Highlighted button.
Schedule Reports
To run a schedule:
1. Open the schedule from the report edit page or from the schedule explorer.
4. Ensure that the summary_log table has a partition for today's date, and that the summary
destination table has a partition for the date of the data it should contain.
begin
optima_summary.do_work;
end;
6. Check:
o the SUMMARY_LOG table for any errors
o the summary destination table to see if the data has been summarized
From the OPTIMA Console toolbar, click the Log Viewer button.
The Log Viewer is displayed. You can filter on the report that you have just created, using the
PRID.
These tables are named using the format 'DIFFERENCE_OUTPUT_*', and are stored in the
dedicated OPS schema.
By default, the DIFFERENCE_OUTPUT tables are created in the CODESD tablespace, but you can choose a different tablespace if required. To do this:
Any new DIFFERENCE_OUTPUT tables will be created in this tablespace. Tables already
created in other tablespaces are not moved.
When the DBMS_SCHEDULER job runs for the first time, it locates the current JOB_NAME using the current SID, and records the DIFFERENCE_OUTPUT table to use in the SUMMARY_JOBS table. A record of the mapping between the DBMS_SCHEDULER jobs and DIFFERENCE_OUTPUT tables is stored in the OPS.DIFFERENCE_JOBS table.
Each time the job is subsequently run, the same DIFFERENCE_OUTPUT table is used, based on
the OPS.DIFFERENCE_JOBS table.
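To inspect the current mapping, you can query this table directly, for example:
SELECT * FROM OPS.DIFFERENCE_JOBS;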
Summary types and their load options:
• Element Aggregation Without Time Aggregation: Insert and Update / Date Insert with Delete
• Time/Element with Element Filter: Insert and Update / Date Insert with Delete
• Basic Busy Hour: Insert and Update / Date Insert with Delete
• Busy Hour with Multi Rank (for example, Top 3 BH) and Rolling Busy Hour: Insert and Update / Date Insert with Delete
• Busy Hour Summary Standard: Date Insert with Delete
• Busy Hour Summary Rolling: Date Insert with Delete
• RollUp Busy Hour: Date Insert with Delete
• Rolling Rollup Busy Hour: Date Insert with Delete
• Rollup Rolling Busy Hour Summary
2. Follow the on-screen instructions to install the products and options that you require,
including:
o Entering your user name and company name
o Choosing the Setup Type you require
Complete: If you choose Complete setup, then the installer will install the following:
o OPTIMA Summary Application
o Mediation Device Binaries
Custom: If you choose Custom setup, then you will have the option to select which
application you would like to install
3. When the InstallShield Wizard Completed dialog box appears, click Finish.
4. In TOAD, sqlplus or a similar editor, log into the database as a SYS user, and grant the
following permission to the AIRCOM user:
After you have finished installing the OPTIMA Summary application, you will need to connect to the
database to get it running.
To connect to a database:
1. From the Start menu, point to Programs, select Aircom International, AIRCOM OPTIMA
Backend 8.0, AIRCOM OPTIMA Summary.
3. Click Connect. If there is any error, you will need to ensure that the major, minor, and
interim version numbers for the tool and packages are the same. For more information, see
About OPTIMA Summary Version on page 285.
4. The OPTIMA Summary Configuration dialog box appears, in which you can configure the
OPTIMA Summary process.
Tip: If the OPTIMA Summary Configuration dialog box is not displayed, from the Tools menu, click Summary.
1. In the OPTIMA Summary Configuration dialog box, click the About Summary button.
2. In the dialog box that appears, click the Version Info button.
A basic compatibility check is made to check whether the tool and packages in the relevant
schemas have the same major, minor and interim version numbers.
The results of the check, including any compatibility errors, are displayed in the Summary
Version Information dialog box:
To check which patch version of the OPTIMA Summary packages you are using:
Run the following command from an SQL window, depending on which package you want to check:
• For the Summary package, run:
SELECT AIRCOM.OPTIMA_SUMMARY.GET_VERSION FROM DUAL;
• For the Difference Engine package, run:
SELECT OPS.DIFFERENCE_ENGINE.GET_VERSION FROM DUAL;
1. Ensure that the LOGS.SUMMARY_LOG table has a partition for the current date. The partitions for this table should be created by the OSS Maintenance package. You can list the current partitions with a query like the one shown below.
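For example, a quick way to list the current partitions (a sketch using Oracle's standard ALL_TAB_PARTITIONS dictionary view):
SELECT PARTITION_NAME, HIGH_VALUE
FROM ALL_TAB_PARTITIONS
WHERE TABLE_OWNER = 'LOGS'
AND TABLE_NAME = 'SUMMARY_LOG';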
2. Ensure that raw and summary tables are created with correct partitions. You can do this manually or by using the OPTIMA Installation Tool (OIT).
3. After the raw and summary tables have been created, configure the report by setting the parameters for Source and Summary tables. For more information, see The Report Configuration Tab.
Important: If you want to use sub-hourly summaries, then you must first configure your
system to allow them. For more information, see Configuring Sub-Hourly Summaries on
page 289.
4. Create Schedule(s).
After configuring a report, schedules are created by default. These schedules decide when
a particular report will run. You can edit these schedules to change the run time
parameters. For more information, see Viewing and Editing Report Schedules on page
314.
As well as these default report schedules, you can also create your own to correspond to
different time zones, for example. For more information, see Adding Report Schedules on
page 311.
Ensure that the report and schedules are enabled. A schedule or a report will not run if it is
not enabled.
In addition, ensure that the Next Run Date in the schedule(s) is set to SYSDATE. For more information, see Adding Report Schedules on page 311.
begin
aircom.optima_summary.do_work('SCHEMA', 'TABLE');
end;
Where 'SCHEMA' and 'TABLE' are optional filters defining the schema and table on which
the job should be run.
Tip: You can specify more than one schema in this filter, by separating each with a comma - for example, 'ERICSSON_UTRAN,NOKIA_GPRS'.
The interval for this job should be set to SYSDATE+(1/24/60). This means that the
DBMS_Scheduler will wait one minute before the next job runs.
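For example, a minimal sketch of submitting such a job; this uses the older DBMS_JOB API, whose interval parameter takes exactly this expression, and your deployment may instead use DBMS_SCHEDULER with an equivalent repeat interval:
DECLARE
  l_job BINARY_INTEGER;
BEGIN
  DBMS_JOB.SUBMIT(
    job       => l_job,
    what      => 'aircom.optima_summary.do_work;',
    next_date => SYSDATE,
    interval  => 'SYSDATE+(1/24/60)');
  COMMIT;
END;
/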
Tip: You can also configure the OPTIMA Summary to process more than one schedule
each time it runs. For more information, see Processing Multiple Schedules Per Session
on page 288.
When a report is run, log messages are generated. You can check the log messages to
make sure that the data has arrived.
Add/Configure Report
To set this parameter, insert a new record into the OPTIMA_COMMON table as follows:
COMMIT;
1. Install the TRUNC_MINS function for the AIRCOM schema. This should have a public
synonym created for it. This returns the date truncated to a defined number of minutes.
Where
From the OPTIMA Summary toolbar, you can select these options:
New Summary Report: Create a new summary report. For more information, see Adding a New Summary Report on page 291.
Edit Report: Make changes to the configuration of a single report. For more information, see Editing and Deleting a Summary Report on page 320.
Edit Multiple Reports: Make changes to a number of reports simultaneously. For more information, see Editing Multiple Reports on page 320.
Delete Report: Delete an existing report. For more information, see Editing and Deleting a Summary Report on page 320.
Log Viewer: View the list of log messages. For more information, see Viewing Log Messages on page 325.
Schedule Explorer: View and edit the list of report schedules. For more information, see Viewing and Editing Report Schedules on page 314.
Job Explorer: View the list of Oracle DBMS_SCHEDULER jobs. For more information, see About Oracle DBMS_SCHEDULER Jobs on page 327. Note: This tab is for use with Oracle versions prior to 10g. For subsequent versions, use the Schedule Explorer, which identifies running summaries with "Current Session ID".
About Summary: View version information for the OPTIMA Summary process. For more information, see About OPTIMA Summary Version on page 285.
GRANT SELECT ON <SCHEMA>.<SRC_TABLE> TO AIRCOM;
GRANT SELECT ON <SCHEMA>.<DST_TABLE> TO AIRCOM;
• The following tables must have a valid partition for the current day:
o Summary_Log
o Destination Table
Note: It is important for all summary tables to have an ENTRIES column to enable
resummarization.
To add a new summary report in the OPTIMA Summary Configuration dialog box:
1. Click the New Summary Report button.
-or-
Right-click in the OPTIMA Summary Configuration dialog box and from the menu that
appears, click New Report.
2. In the dialog box that appears, click a particular column to select the machine/interface to
be used in the PRID for the new report.
Note: For more information on PRIDs, see About PRIDs on page 29.
3. Click OK. The Create New Summary Report dialog box appears.
Report Configuration: Configure the report, source table and summary table.
1. If you want the report to be run by the OPTIMA Summary processing package, select the
Report Enabled option.
2. In the Report Configuration pane, from the Summary Time Aggregation drop-down list,
select the aggregation type. This will be applied to the PK date column in the source table
to aggregate the data to the granularity of the summary table. This means that the
granularity of the summary table will be based on the value in the Summary Time
Aggregation field.
For example, if you select Daily, then the summary table will have data for each day from
the source table.
Important:
o If you select the Element Aggregation value, then the granularity of the source table will be
the same as the granularity of the summary table. Granularity is required for element
aggregation to ensure that the scheduling is configured correctly.
o If you specify element aggregation without time aggregation, then you cannot use the
Managed Element Insert or Managed Element with Delete load options. For more
information, see Setting Advanced Options on page 296.
o If you select Element Aggregation Only, then you will need to specifically select the
Summary Table Granularity.
o If you want to use the 'Minutes' time aggregation granularity, then you must ensure that
you have defined the TRUNC_MINS function correctly. For more information, see
Configuring Sub-Hourly Summaries on page 289.
o If you select IW-Weekly or D-Weekly you can use the Weekly Offset drop-down list to
specify the start of a non-standard week. For example, for IW-Weekly the default start
of week is Monday so selecting an offset of 2 would result in a week starting on
Wednesday. For D-Weekly, the default start of week is determined by the Oracle NLS
setting and the offset is applied to that default.
3. If you have selected Element Aggregation as the aggregation type, in the Summary Table
Granularity field, define the granularity of the summary table in terms of minutes, hours,
days or weeks.
For example, if the granularity is specified as 1 day, then the summary table will have data
for each day.
4. In the Description text box, you can optionally type a text description for the report for
identification purposes.
1. If the summary has two source tables, select the Enable Second Source Table option.
For example, you may require a join of two raw tables, or a raw table with a busy hour
definition table. The two tables will be joined by the OPTIMA Summary PL/SQL package.
If you select this option, then you will have to set the configuration for two source tables:
o Source 1 - Primary
o Source 2 - Secondary
If you do not select this option, then you will only have to set the configuration for the
primary source table.
Using a second source table enables the OPTIMA Summary to check that data is present
in both tables before summarizing/re-summarizing.
Important:
o You must still specify both tables in the SQL Query with the correct join.
o If you are using the Managed Element Insert or Managed Element with Delete load
options, you cannot specify a second source table.
2. From the Source # drop-down list, select which source table you want to configure - either
primary or (if you have selected the Enable Second Source Table option) secondary.
3. From the Schema drop-down list, select the schema in which the source table exists.
4. From the Table drop-down list, select the source table for the summary report.
5. From the Datetime Column drop-down list, select the Oracle date column that is used as
the primary key of the source table.
Note: If your secondary source is a CFG table with no date field in the primary key, then
this can be left empty.
6. If you are generating summaries across multiple time zones, and want to aggregate the
correct data using timestamp aggregation:
o Select the Enable Timestamp Aggregation option.
o From the Timestamp Column drop-down list, select the source table's timestamp
column name. This will be the column that is read across all of the raw tables in order
to ensure time zone consistency.
Important:
o When configuring the summary table, you should ensure that you choose the correct
time zone option, either Natural or Selected. For a description of these options, see
Configuring the Summary Table below.
o If you select the Enable Timestamp Aggregation option as well as the Enable
Second Source Table option, you must use the 'Override SQL' option in the
Advanced Options to specify the SQL that will be used to join the tables. For more
information, see Setting Advanced Options on page 296.
7. In the Source Join Elements box, use comma-separated values to define which common
elements join the two source tables.
The first column in the primary list should match the first column in the secondary list, and
so on.
For example, BSC in the primary source table may correspond to BSC_NAME in the
secondary source table, CELL may correspond to CELL_NAME and so on.
8. In the Entries Formula box, specify the formula used for the CRC check.
The Entries Formula is used to load the ENTRIES column, whose column name is
specified in the Summary Table Configuration. In normal usage for daily, weekly, and
monthly summaries, you need to specify COUNT(*).
When the table is loaded from multiple counter groups, each counter group in the raw table
can be checked by selecting a column from each counter group. For example, NVL(Col
A,0) + NVL(Col B,0) + NVL(Col C,0), where Col A, Col B and Col C are single columns from
each counter group.
If the source table for the summary is a summary table in another report, for example, a
daily summary can be a source for a weekly summary, then the ENTRIES formula should
be SUM (ENTRIES), where ENTRIES is the ENTRIES column in the daily summary report.
Note: If you are defining two source tables, the entries formula will be defined once for
both. You should differentiate any columns that have the same name in the primary source
and secondary source by using the appropriate prefix, either 's1' or 's2' respectively.
9. In the Filter box, type the filter. This filter applies to the source table selected in the
Source# drop-down list. The filter enables you to restrict the number of rows in the source
table to be summarized.
This field can have either a date filter or an element filter or both.
Important: If you are using either of the Managed Element load options, then you do not
need to define a report filter.
An example of the Element filter is BSC = 'BSC1'. In this example, only BSC1 will be
summarized.
An example of the Time filter is a working week. A working week will summarize only the
working days and not Sunday.
TO_CHAR(DATETIME,'D') IN (1,2,3,4,5,6)
where:
DATETIME is the date PK column name, and (1,2,3,4,5,6) means Monday to Saturday -
that is, Sunday is excluded. If you want to exclude Saturday as well, use (1,2,3,4,5). Note
that the day numbers returned by TO_CHAR(date,'D') depend on the NLS territory settings
of the database.
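Because the Filter box can hold a date filter, an element filter, or both, the two kinds can be combined with AND - for example:
BSC = 'BSC1' AND TO_CHAR(DATETIME,'D') IN (1,2,3,4,5,6)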
Note: If you are defining two source tables, the filter will be defined once for both. You
should differentiate any columns that have the same name in the primary source and
secondary source by using the appropriate prefix, either 's1' or 's2' respectively.
10. In the Aggregated Elements box, specify the remaining non-date part of the logical
primary key of the data in the SQL query:
o If element aggregation is not being used, this is the primary key of the source table
minus the date column.
- or -
o If element aggregation is being used, this is the primary key of the source table query
minus the date column and any columns which are at a lower level to the aggregated
level.
- or -
o If the Managed Element Insert or Managed Element with Delete load option is being
used, this is a single column representing the managed element - for example, BSC
for a 2G network or RNC for a 3G network. This column must exist in the source and
summary tables, and for optimum performance it should be defined as the second
column in the primary key (after DATETIME).
For example, to aggregate CELLSTATS with primary key (DATETIME, BSC, CELL) to a
BSC level, you would enter BSC, whereas for a CELLSTATS hourly summary without
element aggregation, you would enter BSC, CELL.
If you want to use element aggregation by CFG table, these columns do not have to be the
same as the Source Join Elements.
Note: If you are defining two source tables, the aggregated elements will be defined once
for both. You should differentiate any columns that have the same name in the primary
source and secondary source by using the appropriate prefix, either 's1' or 's2'
respectively.
Important: The number of columns and the data inside the Aggregated Elements box
for both the source table and the summary table must match. The column names may be
different, but everything else must be identical, because the source aggregated elements
columns will be joined to their equivalent summary aggregated elements columns (first to
first, second to second, and so on).
The Advanced Options dialog box enables you to tune your summary report configuration further,
across a number of tabs.
Item Description
Log Severity: Set the severity of the log message that will be logged by the OPTIMA Summary
package. The log severity levels are:
• Debug
• Information
• Warning
• Minor
• Major
• Critical
Difference Engine Hint: The hint to use for the difference engine.
Tip: It is recommended to use the Hash hint.
Load Option: Select one of the load options, which determines how data is handled in the
summary tables. For more information, see About the Load Options for Summary Reports on page
299.
Before running a particular report, the package can run user-defined SQL to tune the database
session before the main SQL Query in the SQL Query tab is executed.
In the User-defined SQL text box, enter the SQL that should be executed before the
report is run. You can enter SQL to change a database parameter for the session. This will
only apply to the current report, and the session will close after the report has run, so any
changes to session parameters will be reset.
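For example (illustrative only; any session-modifiable parameter can be set in the same way), the following changes the optimizer mode for the session:
ALTER SESSION SET optimizer_mode = ALL_ROWS;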
It is also possible to call a PL/SQL procedure before the report is run. For example, to call a
SUMMARY_EXAMPLE procedure passing the parameter 100, type the following:
begin
aircom.summary_example(100);
end;
Important: If you change a database parameter for the session and have also chosen to
process multiple schedules per session, then this session parameter change is applied to
all schedules for this session, not just the one for which it is defined.
For more information, see Processing Multiple Schedules Per Session on page 288.
Tip: If you need to clear the SQL, you can use the Clear SQL button.
These tabs show the query that will be run. This is dependent on the option selected in the Load
Options drop-down list in the Advanced Options tab.
Note: Minor changes are made to the SQL before it is run - for example, to replace the schedule
filter placeholder with the filter defined for the current schedule.
When adding a new summary report, on the Advanced Options tab of the Report Advanced
Options dialog box you can specify the load option. This option is used to determine how data is
handled in the summary tables.
Important: Some load options also require you to define the aggregated elements fields in the
report configuration in a particular way. You must complete this configuration correctly for the
summary report to work as required.
Insert with Delete
Level: Date. Aggregated Elements: Not required.
Inserts new summary periods into the Summary table. If re-summary is required, it deletes the
entire period and inserts it again.
Insert then Update
Level: Primary Key. Aggregated Elements: All primary key columns except for date, in a
comma-separated list.
Inserts data of new periods into the Summary table. If re-summary is required, it inserts the
missing rows and updates incomplete or incorrect rows using the primary key.
Insert Only
Level: Primary Key. Aggregated Elements: All primary key columns except for date, in a
comma-separated list.
Inserts data of new periods into the Summary table, with no re-summary.
Update Only
Level: Primary Key. Aggregated Elements: All primary key columns except for date, in a
comma-separated list.
Only updates the primary keys that have changed.
Date Insert Only
Level: Date. Aggregated Elements: Not required.
Inserts new periods into the Summary table and does not re-insert, update or delete the data
after it has been inserted.
Managed Element Insert (recommended for Recent schedules)
Level: Managed Element. Aggregated Elements: A single managed element column - for
example, RNC.
Inserts data on a per managed element basis into the Summary table. The managed element is
the element producing the files which are loading the table - for example, BSC for 2G data or
RNC for 3G data. If a managed element has been partially summarized for a period, it will not be
re-summarized.
Note: The managed element corresponds to the GPI (Grouping Primary Identifier) column in an
OIT interface template.
Managed Element with Delete (recommended for Historic schedules)
Level: Managed Element. Aggregated Elements: A single managed element column - for
example, RNC.
Inserts data on a per managed element basis into the Summary table. If a managed element has
been partially summarized for a period and requires re-summarizing, the data for that managed
element for that period is deleted and re-inserted.
Note: For a more detailed description of the load types and their equivalent database values, see
Tuning the OPTIMA Summary on page 323.
Important:
• The Managed Element Insert load option enables you to summarize managed elements
that exist in different timezones, and therefore avoid the need to create schedules for each
timezone. This is because it excludes the most recent period per managed element,
which means that you must have some data in the time period after the one you want to
summarize. For example, for a daily summary, there must be some data for Tuesday in the
source table in order to summarize Monday; for a weekly summary, there must be some
data for Week 2 in the source table in order to summarize Week 1.
• You can override the load type at the schedule level; for example, you could set a recent
schedule to process managed element insert whereas the historic schedule could process
managed element with delete. However, you cannot have both a Managed Element-level
and a Primary Key-level schedule for the same summary report.
To support summarization of data in time zones that are in the future with respect to the database's
SYSDATE:
• If the Summary PRID is based on a daily summary (that is, weekly/monthly summaries)
and the 'DIFF_ENGINE_SAFETYPERIOD_HOURS' parameter has been set in
OPTIMA_COMMON, then this parameter will define the safety period in hours. This allows
weekly/monthly summaries to summarize the previous week/month on the first day of the
next week/month.
• Otherwise for all standard managed element insert summaries, the difference engine will
exclude the latest period for each managed element, and summarize all remaining periods.
This means that if there is 15-minute data for 15:00, then the 14:00-14:59 hourly summary
period can be summarized, even if the SYSDATE is 11:00.
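As an illustration of setting the safety period, and assuming OPTIMA_COMMON uses a simple name/value structure (the column names below are assumptions - check the table definition in your installation):
INSERT INTO aircom.optima_common (name, value)   -- column names are assumptions
VALUES ('DIFF_ENGINE_SAFETYPERIOD_HOURS', '48'); -- 48 hours is an arbitrary example
COMMIT;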
1. From the Schema drop-down list, select the schema in which the summary table exists.
2. From the Table drop-down list, select the summary table. When you select a summary
table, the Datetime Column, Aggregated Elements, and the Entries Formula fields
acquire default values:
Field Description
Datetime: The Oracle Date column in the Summary table that is used in the Primary Key.
Aggregated Elements: The Summary table's primary key minus the date column. The
Aggregated Elements column names should be specified as a comma-separated list.
Important: The number of columns and the data inside the Aggregated Elements box for both
the source table and the summary table must match. The column names may be different, but
everything else must be identical, because the source aggregated elements columns will be
joined to their equivalent summary aggregated elements columns (first to first, second to
second, and so on).
Entries Column: The column used to store the Entries Formula value.
3. If you have chosen to enable timestamp aggregation, select the required time zone option.
This table describes these options:
Item Description
Natural Timezone: If you select this option (the default), the time zone value will be ignored - in
other words, 10:00 in time zone 1 will be the same as 10:00 in time zone 2, time zone 3 and so
on. The timestamp is returned as a date with the time zone information ignored, and the data is
then aggregated to daily, weekly, monthly and so on based on this.
Selected Timezone: If you select this option (and then select a time zone from the drop-down
list), the time zone value will be used to aggregate the data at the correct time across multiple
time zones. For example, you have three time zones: West (-1 hour), Central (the meridian) and
East (+1 hour). If you choose Central as the Selected Timezone, then a summary report
configured to summarize the data across all 3 time zones at 10:00 Central time will aggregate
the 09:00 data from West, 10:00 data from Central and 11:00 data from East.
An example of the SQL Query tab of the Summary Report dialog box
Note: This example SQL query does not use time zones. To view a query that includes time zones,
see Example SQL Query Using Time Zones on page 304.
On the Report Configuration tab, you can choose a filter to apply to the source table selected in
the Source# drop-down list. These filters enable you to restrict the number of rows in the source
table to be summarized.
To use these filters, click the appropriate filter button. For example, click Source1 Filter to retrieve
the value for the Element filter that has been set for Source 1.
When you click any one of these buttons, placeholders are inserted and values are picked based
on the ones that you have specified in the Report Configuration tab.
Note: It is important to type AND or WHERE in the SQL statement where applicable, as per SQL
rules.
2. Type the SQL to select the data to summarize from the Source Table.
You need to keep in mind the following rules for the Select query:
o It must include the clause WHERE %DATE1, or WHERE DATETIME BETWEEN :STARTDATE
AND :ENDDATE.
If you have a second source table, click Source2 Date Filter to insert the %DATE2
placeholder.
o The query must give an alias for the date column to itself if there is a date truncation
applied. For example, SELECT TRUNC(DATETIME,'DD') DATETIME
Notes:
If the date column is to have a different name in the summary table, it must still have
the alias for the date column name in the source table (it will still be mapped by
position to the correct column, as defined on the Column Mappings tab).
If you are using sub-hourly summaries, then you should use the TRUNC_MINS function
instead of TRUNC.
o An alias should not be applied to the columns in the aggregated elements PK list. Only
the date column should be aliased in the Primary Key.
o An alias should be given to the remainder of columns (counters).
o The query should not have :GROUPELEMENT bind variable. The only bind variables in
the query should be :STARTDATE and :ENDDATE.
For example, the following query produces a daily summary of SUMTEST.CELLSTATS:
SELECT TRUNC(DATETIME,'DD') DATETIME,
BSC,
CELL,
SUM(COL1) COL1,
SUM(COL2) COL2,
SUM(COL3) COL3,
SUM(COL4) COL4,
SUM(COL5) COL5,
COUNT(1) ENTRIES
FROM SUMTEST.CELLSTATS
WHERE %DATE1
GROUP BY TRUNC(DATETIME,'DD'),BSC,CELL
SUM(COL1) COL1,
SUM(COL2) COL2,
SUM(COL3) COL3,
COUNT(*) ENTRIES
FROM ERICSSON2G.SUM_CELLSTATS_TZ
This picture shows this SQL query on the SQL Query tab:
An example of the Column Mappings Tab of the Summary Report dialog box
Load Column: Indicates whether a particular column of the SQL query is mapped to the
corresponding column of the summary table. It takes the value 1 to indicate that the column is
mapped; otherwise its value is 0.
Is PK: Indicates whether the column is a primary key of the summary table.
1. Click the Column Mappings tab. You will see that the left hand side pane has the values
for the SQL query columns and the right hand side has the values for the summary table
columns. The first three columns will be empty.
2. Select a particular row on the right hand side and click Match Highlighted->. The system
will map a column from the left hand side to the selected column on the right hand side,
delete the column from the left hand side and populate the row on the right hand side. After
you click Match Highlighted, the value of Load Column changes to 1 as the column has
been mapped.
Note: The color of the selected row on the right hand side indicates the following:
o No Color: The SQL query column is correctly mapped to the summary table column.
o Green: There is no mapping between the SQL query column and the summary table
column.
o Yellow: The SQL query column is mapped to the summary table column but the
Name/Data Type/Position is different between the SQL query column and summary
table column. However, as the column is not a primary key of the summary table, the
summary will work despite the mismatch.
o Red: This means that the Summary application will not work and it will result in an error
due to any one of the following reasons:
First, an SQL query column is mapped to a summary table column but the Query
Alias Name of the SQL query column is different from the Query Alias Name of the
summary table column and this column forms part of the Primary Key of the summary
table. To rectify this error, ensure that the query alias name of the summary table
column is same as the query alias name of the SQL query column for the primary key.
Second, the Query Alias Name is not displayed in the 'Datetime Column' or
'Aggregated Elements' fields in the source table configuration in the Report
Configuration tab and the field forms part of the Primary Key of the summary table. To
rectify this error, ensure that the Query Alias Name is displayed in the 'Datetime
Column' or 'Aggregated Elements' fields.
3. Click Clear and Load Summary Columns to remove all the mappings, reload the column
list from the summary table and repopulate the query columns in the left hand grid.
- or -
Select a particular row and click Remove Current Match to remove the mapping for that
row.
Tips:
o Click Sync By Name to map the columns with the same name.
o Click Sync By Position to map the columns with the same position.
When you click the Schedules tab, two default schedules for that report are automatically created.
These are:
• Recent Schedule:
A recent report checks the recent periods. The date period for a recent schedule is
between SYSDATE-RECENT and SYSDATE.
The following table lists the recent periods and next data formulas for various granularity
levels:
Important: Weekly and monthly recent summaries will run on the first two days of the
week/month, at the same time of day. The summaries work as follows:
o If the summary is running on the first day of the week/month, then it will only
summarize the previous week/month if the last day in the previous week/month is
present for each managed element
o If the summary is running on the second day of the week/month, then it will always
summarize the previous week/month if it has not been summarized already on the
previous day for each managed element
• Historic Schedule:
When it first runs, a historic report will process a set of past data equal to the default period
defined in the report, to ensure that the table has a set of historic data to start with. If you
want more historic data than this, then you must temporarily change the start date. The
date period for a historic schedule is between SYSDATE-HISTORIC and SYSDATE-
RECENT.
The following table lists the historic periods for various granularity levels:
(Table columns: Destination Data Granularity, Date Period to Look Back Every Time,
Truncations, Next Schedule Date Formula.)
Note: The recommended load options for schedules using standard time-based aggregations (HR,
DY, WK or MO summaries) are:
• Recent Schedule - Managed Element Insert
• Historic Schedule - Managed Element with Delete
Option Description
Parent_Schedule_ID: Indicates the ID of the schedule which, if updated, will require the current
schedule to be run.
Note: A historic schedule will normally be the parent schedule of another historic schedule, and
a recent schedule will be the parent schedule of another recent schedule.
Start_Period: The start date of the period. This is relative to the current date (the SYSDATE)
and is specified as (SYSDATE - x), where x is in days.
For example, if it is specified as (SYSDATE-1), the start period for the data will be from
yesterday, as SYSDATE is the current date.
End_Period: The end date of the period. This is also relative to the current date and is specified
as (SYSDATE - x), where x is in days.
Note: A second is always subtracted from the end date. So, in order to process a daily summary
of the 24th, you should pass the start date as 24/11/2008 00:00:00 and the end date as
25/11/2008 00:00:00, which is then converted to process the 24th.
For example, if the end period is specified as (SYSDATE-0), then only the data up to today will
be picked up. Hence, if the start period is (SYSDATE-1) and the end period is (SYSDATE-0),
then the data will be processed only for yesterday.
Priority: The priority of the schedule. The most urgent schedule should be given a 1, while a
lower priority schedule should be given a higher number. The Oracle job will process a higher
priority schedule before a lower priority schedule.
Note: More than one schedule can have the same priority number.
Enabled: Indicates whether the schedule will run.
Current_Process_Start_Date: Indicates the date and time when the current schedule started
processing.
Next_Run_Date: Indicates the date when the schedule is next scheduled to run.
Last_Run_Date: Indicates the date when the schedule was last run.
Next_Schedule_Date_Formula: An Oracle formula that is used to calculate when the schedule
should next be run after it has finished processing.
For example, a value of SYSDATE + (15/24/60) means that the schedule will run 15 minutes
after it has completed processing, and a value of TRUNC(SYSDATE+1) + (20/24) will run the
next day at 8pm.
Note: Schedules are also run according to the Run Order. The run order lists the order in
which the schedules will be run, and is determined by an algorithm that takes into account
how long the schedule has been waiting to run. This algorithm increases the priority of a
schedule for each hour that the schedule is delayed from running (by subtracting one from
the priority value), meaning that it will be higher in the run order. This means that a lower
priority schedule that has been waiting a longer time than a higher priority schedule could
be run first.
For example, consider two schedules A and B; Schedule A has a priority of 3 while
Schedule B has a priority of 5. Schedule A has been delayed from running by 1 hour
meaning that its run order is 2 (Priority minus 1 [hour]). However, Schedule B has been
delayed from running by 4 hours, so its run order is 1 (Priority minus 4 [hours]). Therefore
Schedule B will be run first.
4. In the Close Report dialog box that appears, click Yes to save the report.
Tip: You also have the option to edit these schedules. For more information, see Viewing
and Editing Report Schedules on page 314.
As well as the two default schedules, you can create your own additional recent report schedules.
This is particularly useful if your network spans multiple timezones, because it means that you can
create separate schedules for each timezone. For more information, see Adding Report Schedules
on page 311.
Important: If you use the Managed Element Insert load option, then you do not have to create
separate schedules per timezone. For more information, see About the Load Options for
Summary Reports on page 299.
Adding Report Schedules
This is particularly useful if your network spans multiple timezones, because it means that you can
create separate recent schedules for each timezone. Different timezones need to be processed at
different times, based on when the data has loaded into the raw table - for example, based on your
own location, each other time zone used will have a different 00:00, which could be before or after
your own, and therefore the point at which a day's worth of data is collected will be different as well.
Different schedules will be needed to compensate for this.
Important: If you use the Managed Element Insert load option, then you do not have to create
separate schedules per timezone. For more information, see About the Load Options for Summary
Reports on page 299.
1. On the Schedules tab, click the Add Recent Schedule button or the Add Historic Schedule button.
- or -
Right-click in the Schedules pane, and from the menu that appears, click either Add
Recent Schedule or Add Historic Schedule as appropriate.
The Summary Schedule dialog box appears. This picture shows an example recent
schedule:
3. Define the details for the new schedule, as described in the following table:
Schedule Period: Change the Start Date and the End Date of the schedule, calculated based on
the SYSDATE. To allow for time zone differences, you can specify start points before or after the
SYSDATE in terms of days and hours.
If you want the start date and the end date to be calculated by truncating to the midnight of the
day that has been selected, ensure that the Truncate to Day checkbox is enabled.
Schedule Configuration: Set the priority of the schedule. A lower number indicates higher
priority.
From the Next Schedule Date Formula drop-down list, select the Oracle formula that will be
used to calculate when the schedule should next be run after it has finished processing.
Tip: You can include a CASE statement in the Next Schedule Date Formula, to enable more
advanced scheduling (see the example after this table).
Schedule Dependencies: Select the Dependencies checkbox if you want the schedule to be
dependent on a parent schedule. This means that after the parent schedule has run, and if it is
determined that the current schedule can be run, then the NEXT_RUN_DATE of the current
schedule is set to the current date.
From the Parent Schedule ID drop-down list, select the schedule ID for this schedule.
Notes:
• This option is ignored if either of the Managed Element load options is being used.
• You can select a summary report which populates either the source1 table or the source2
table in the Report Configuration tab. If the source tables are not populated by the summary,
then dependencies cannot be used for the current schedule. The Parent Schedule ID is
automatically set by the Summary GUI. If the current schedule is a recent schedule, then the
recent schedule for the parent report is selected as the Parent Schedule ID. If the current
schedule is a historic schedule, then the historic schedule for the parent report is selected as
the Parent Schedule ID.
Load Options: Select the Override Report Load Option if you want this particular report
schedule to use a different load option to the one defined for the report. Select the required load
option for the schedule from the drop-down list.
This means that you can have different load options for Recent and Historic schedules - for
example, you may just want to use Insert Only for the Recent schedule to load data only, but
then use Insert with Delete for the Historic schedule to 'clean up' the summary tables.
For more information on the load options, see Setting Advanced Options on page 296.
Important: You cannot have schedules for the same PRID using a combination of primary key
load options (Insert Only, Insert then Update, Update Only) and managed element load options
(Managed Element Insert, Managed Element with Delete). For example, you cannot have a
Recent schedule that uses Managed Element Insert and a Historic schedule that uses Insert
then Update.
This is because the Aggregated Elements field on the Report Configuration tab is defined
differently for each of these load options. For more information, see The Report Configuration
Tab.
This does not apply to Insert with Delete and Date Insert Only, because these do not use the
difference engine.
Note: The recommended load options for schedules using standard time-based aggregations
(HR, DY, WK or MO summaries) are:
• Recent Schedule - Managed Element Insert
• Historic Schedule - Managed Element with Delete
These are the defaults created by the OPTIMA Installation Tool.
Schedule Filter SQL: Select the Schedule Filter option, and then define the SQL query that you
want to use to filter the data.
For example, if you are creating separate schedules for each timezone within your network, you
should use this to filter on timezone, in order to filter out data that has not been completely
loaded yet because it is in a different timezone that begins loading later than the other
timezones.
Note: This filter is used in addition to the report-level filter. Both of these filters are optional and
are not required for any of the load options.
Schedule Information: Click the Set to SYSDATE button to set the NEXT_RUN_DATE to the
current date. In this case, the schedule will run immediately.
Important: This needs to be done when the schedule is first created, otherwise it will never run.
The next run time will then be calculated based on the Next Schedule Date Formula.
Click the Reset Currently Processing button to reset the data. You should click this button if
the DBMS_SCHEDULER job session crashes while this schedule is running, in order to run the
schedule again.
Important: You should first investigate the cause of the crash before resetting the data.
4. Click Save.
1. In the OPTIMA Summary Configuration dialog box, click the Schedule Explorer button.
The Schedule Explorer appears, displaying the report schedules for all the reports.
This dialog box displays all the schedule parameters. For more information on schedule
parameters, see The Schedules Tab on page 307.
To edit a schedule:
1. In the Schedule Explorer, double-click the schedule that you want to edit.
- or -
While creating a report, in the Schedules tab of the OPTIMA Summary Configuration
dialog box, double-click a schedule.
- or -
Select the required schedule, and then click the Edit Single Schedule button.
2. Edit the schedule details as required. For more information on these, see Adding Report
Schedules on page 311.
3. Click Save.
Tip: For information on how to edit several schedules at once, see Editing Multiple Report
Schedules on page 315.
As well as editing individual report schedules, you can also edit multiple report schedules
simultaneously.
To do this:
3. Change the details of the schedules as required. The parameters are the same as those for
individual schedules, although a few have slightly different names; for example, the 'Set all
Selected Schedule's Next Run Dates to SYSDATE' option is a checkbox rather than a
'Set To SYSDATE' button.
Diagram: the recent schedule period, shown on a timeline relative to the current time (Now).
1. When a summary table is configured by the OPTIMA Summary, a report is created which
will have a PRID. The OPTIMA Summary Process will then generate the following two
schedules for the report:
o Recent Schedule
o Historic Schedule
Note: For more information on PRIDs, see About PRIDs on page 29.
2. The recent schedule will summarize and resummarize the recent data in the raw table, for
example, from SYSDATE-3 to SYSDATE.
3. The historic schedule will run less often and will resummarize any late data that has loaded
into the raw table. The historic schedule will therefore process an older period, for example,
from SYSDATE-15 to SYSDATE-3.
The end period of the historic schedule should match the start period of the recent
schedule; the summary will subtract one second from the end period so that the data
queried will not overlap. When the historic schedule executes for the first time, it will
process a much longer period, to allow all the data in the raw table to be summarized.
You can do this for all reports within a particular schema, to ensure that data from each time zone
is processed by a separate schedule.
To do this:
1. Ensure that:
o You have uploaded and successfully activated the required interface with summaries
using the OPTIMA Installation Tool.
o You have installed the Summary table SUMMARY_GLOBAL_FILTERS, using the
SUMMARY_GLOBAL_FILTERS.SQL file.
o The SUMMARY_LOG table is partitioned, and has an INSERT grant assigned to you.
o The SUMMARY_REPORTS table has INSERT and UPDATE grants assigned to you.
2. In the SUMMARY_GLOBAL_FILTERS table of your database, add a new record for each
time zone/schedule required.
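For example, a record for an 'EAST' time zone might look like the following. Only the SCHEDULE_FILTER_SQL and IS_PARENT_SCHEDULE columns are named in this guide; the filter text and any other details shown here are assumptions, so check SUMMARY_GLOBAL_FILTERS.SQL for the actual structure:
INSERT INTO aircom.summary_global_filters
  (schedule_filter_sql, is_parent_schedule)   -- other columns may be required
VALUES
  ('TIMEZONE = ''EAST''', 1);                 -- filter text is illustrative
COMMIT;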
4. In the OPTIMA Summary Configuration dialog box, click the Schedule Explorer button.
6. From the Schema drop-down list, select the schema that you have activated, and then click
the Create Schedules button.
Note: You should check the SUMMARY_LOG table for any errors.
This replaces the previous recent schedule, and differs in two ways - the dependencies
will be switched off and the SCHEDULE_FILTER_SQL will be populated from the
values in the SUMMARY_GLOBAL_FILTERS table.
o The dependencies for the weekly and monthly schedules are set to the daily schedule
that has the SCHEDULE_FILTER_SQL in which the IS_PARENT_SCHEDULE value
in the SUMMARY_GLOBAL_FILTERS table is set to 1. This means that the weekly
and monthly schedules will be dependent on the last daily schedule to run.
o For all hourly and daily reports, the SQL queries (Select, Date Insert, Pk Insert and Pk
Update) are updated to use the %SFILTER schedule filter.
About Dependencies
A dependency is a situation when one schedule depends on data to arrive from another schedule
before it executes. Hence, dependencies have parent and child schedules. The schedule that is
dependent is the child schedule while the one for which the child schedule waits is the parent
schedule. A child schedule will execute only after the parent schedule has executed and data from
the parent schedule has arrived.
For example, a daily schedule will be dependent on an hourly schedule as it will be executed only
after data for all the hours in a day has come in after executing an hourly schedule. In this case, the
daily schedule is the child schedule and the hourly schedule is the parent schedule.
The two types are processed differently. The Schedule_Type column determines the type of
dependency:
If the schedule type is 3 or 4, it means that such schedules do not have any dependent schedules.
Note: Dependencies are not used for schedules that use the Managed Element Insert or
Managed Element with Delete load options.
If the schedule type is 1, then any child schedules will be scheduled to process immediately if the
summary process has processed the last period in the parent schedule.
For example, if the 11pm data is processed, then a daily (child) recent schedule is set to run
immediately. Similarly, if a daily summary schedule has processed the last day of the month,
then the monthly schedule will be set.
To improve efficiency, before a schedule processes a set of periods, it will generate a list of child
schedule IDs together with the date and time that must be processed to cause the child schedule to
be set. Every time it has completed processing a period, it will check the date and time of the period
with the list and if it finds any matches it will set the NEXT_RUN_DATE of the child schedule to
SYSDATE.
If the schedule type is 2, and any data - even row 1 - changes in the parent schedule, then the child
historic is set to run immediately. There is no check on the date period being processed, and the
assumption is that the processing period of the child schedule is the same as the parent schedule.
The dependencies check is therefore done when a schedule has finished processing a period. If
any rows have been updated or inserted then all child schedules with PARENT_SCHEDULE_ID
equal to the current SCHEDULE_ID are set to run immediately (NEXT_RUN_DATE is set to
SYSDATE).
1. Select the report that you want to edit and click the Edit Report button.
-or-
Right-click and from the menu that appears, click Edit Report.
4. In the dialog box that appears, click Yes to save your changes.
1. Select the report that you want to delete and click the Delete Report button.
-or-
Right-click and from the menu that appears, click Delete Report.
2. In the dialog box that appears, click Yes to delete the report. The selected report is
removed from the summary reports table.
To do this:
1. In the OPTIMA Summary Configuration dialog box, select multiple reports that you want
to edit.
2. Click the Edit Multiple Reports button.
- or -
Right-click and from the menu that appears, click Edit Multiple Reports.
This picture shows an example of the Edit Multiple Reports dialog box:
The different PRIDs that will be affected by your changes are displayed in the first pane.
These PRIDs belong to the different reports that you have selected.
Report Enabled: Click Update Report Enabled Status to activate the Enabled/Disabled
options. Select the desired option depending on whether you want to enable or disable the
selected reports.
Entries Formula (Source 1 Table): Select the Update Source 1 Entries Formula checkbox to
activate the New Value text box. In the New Value text box, type the new entries formula for the
Source 1 table.
Entries Formula (Source 2 Table): Select the Update Source 2 Entries Formula checkbox to
activate the New Value text box. In the New Value text box, type the new entries formula for the
Source 2 table.
Entries Formula (Destination Table): Select the Update Destination Entries Column checkbox
to activate the New Value text box. In the New Value text box, type the new entries formula for
the Destination table.
Load Option: Select the Update Load Option checkbox to activate the New Value drop-down
list. From the New Value drop-down list, select the new load option.
Log Severity: Select the Update Log Severity option to activate the New Value drop-down list.
From the New Value drop-down list, select the new log severity.
However, you may in very rare circumstances need to force the OPTIMA Summary to stop
processing a report. To do this, an entry in the OPTIMA_COMMON table needs to be updated.
OPTIMA_COMMON is a general purpose table used by many OPTIMA applications.
To do this:
Ensure that the terminate parameter has been defined in the OPTIMA_COMMON table, and then
commit the change.
To stop the OPTIMA Summary package processing any further reports and halt as soon as
the current period has been processed, update the terminate parameter in the OPTIMA_COMMON
table, and then commit the change.
A recent schedule should be configured to use Managed Element Insert to summarize all
of the managed elements that have no data in the Summary table. A historic schedule
should also be set up to use Managed Element with Delete to resummarize older data, in
order to capture late-arriving data.
In the Summary Schedule dialog box, you should select the Override Report Load Option
checkbox to configure the recent and historic schedules to use different load options. For
more information, see Adding Report Schedules on page 311.
• Use Other Load Types for Element Aggregation, BH and BH Summary
Configurations: For more information, see Supported Summary Types on page 283.
• Select the most appropriate load type according to your needs: The specified
database values indicate the LOAD_METHOD value stored in the SUMMARY_REPORTS
table. If you cannot access the OPTIMA Summary GUI, you can specify the load type by
updating the SUMMARY_REPORTS table with the required LOAD_METHOD value.
Tip: For a general overview of the load types, see About the Load Options for Summary
Reports on page 299.
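A sketch of such an update (the LOAD_METHOD value and the PRID column used to identify the report are placeholders - check the load types table and your SUMMARY_REPORTS definition):
UPDATE aircom.summary_reports
SET    load_method = 5    -- the database value for the required load type
WHERE  prid = 1234;       -- identify the report; column name is an assumption
COMMIT;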
• Schedule the DO_WORK Oracle Job: The DO_WORK Oracle Job should be scheduled
to run using the DBMS_SCHEDULER. Multiple jobs should be set up to run the summary
concurrently. This table describes the four methods of calling DO_WORK:
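As a sketch of one such method (the job name is arbitrary), a DBMS_SCHEDULER job that calls DO_WORK once a minute can be created as follows:
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'OPTIMA_SUMMARY_WORKER_1',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'begin aircom.optima_summary.do_work(); end;',
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=MINUTELY;INTERVAL=1',  -- run every minute
    enabled         => TRUE);
END;
/
Creating several such jobs, each with a different job name, allows several schedules to be processed concurrently.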
Problem: The sessions for summary schedules occasionally terminate abnormally.
Solution: Create a single daily job to run the following PL/SQL:
begin
optima_summary.reset_expired_schedules;
end;
This will log a warning message for the updated schedules, for example
'Warning - Schedule ID 297 will be reset, as this schedule started
processing over 24 hours ago and the session no longer exists'.
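A sketch of creating that daily job with DBMS_SCHEDULER (the job name and start time are arbitrary):
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'RESET_EXPIRED_SCHEDULES',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'begin optima_summary.reset_expired_schedules; end;',
    start_date      => TRUNC(SYSDATE) + 1,  -- from midnight tonight
    repeat_interval => 'FREQ=DAILY',
    enabled         => TRUE);
END;
/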
Important: It is essential that the Summary_Log table is partitioned for the current date and time
for the log messages to appear in the Log Viewer.
To view the log messages in the OPTIMA Summary Configuration dialog box:
1. Click the Log Viewer button.
2. From the Select the minimum date and time to display drop-down options, select the
date and time after which you want to view the log messages.
Tip: The log messages that will be displayed will be for the time period between the
selected date and time and the present.
3. From the Select prid to display drop-down list, select a particular PRID for which the log
messages will be displayed.
The following table lists the information that you can view for log messages:
Diagram: several Oracle Jobs each call optima_summary.do_work(); waiting schedules are
processed ordered by 1. Priority, then 2. next_run_date.
Execution of a schedule
1. Oracle Jobs are set up within the database to run every minute and call
OPTIMA_SUMMARY.DO_WORK().
2. The schedules are selected from the list in the SUMMARY_SCHEDULES table based on the
following criteria:
o Schedule’s next run date is before the current date (NEXT_RUN_DATE < SYSDATE)
o Schedule is not currently processing (current_process_start_date IS NULL)
o Schedule is enabled (ENABLED=1)
4. The job will process the highest priority schedule and then terminate. If there are more
schedules to process, they will be picked up by the next available job.
Each job therefore represents a concurrent execution of the summary. If there are five jobs,
then five schedules can be processed at the same time.
Note: As the list of schedules ages, the Run Order value - stored as a parameter in the
Scheduler Explorer of the OPTIMA Summary GUI - will be taken into account. The run
order is determined by an algorithm that takes into account how long the schedule has
been waiting to run. This algorithm increases the priority of a schedule for each hour that
the schedule is delayed from running (by subtracting one from the priority value), meaning
that it will be higher in the run order. This means that a lower priority schedule that has
been waiting a longer time than a higher priority schedule could be run first.
For example, consider two schedules A and B; Schedule A has a priority of 3 while
Schedule B has a priority of 5. Schedule A has been delayed from running by 1 hour
meaning that its run order is 2 (Priority minus 1 [hour]). However, Schedule B has been
delayed from running by 4 hours, so its run order is 1 (Priority minus 4 [hours]). Therefore
Schedule B will be run first.
DBMS_Scheduler enables you to perform resource plan management, so that you are able to
control:
• The number of concurrent jobs for a particular job_class
• The order of execution of a job or groups of jobs
• The switching of jobs from one resource plan to another during the day
• And much more
Scheduler Components
The Scheduler uses three basic components to handle the execution of scheduled tasks. An
instance of each component is stored as a separate object in the database when it is created:
Component Description
Program: Defines what the Scheduler runs - a PL/SQL block, a stored procedure, or an external
executable.
Schedule: Defines when and how often a task should run.
Job: Associates a program (or an inline action) with a schedule; this is the object that the
Scheduler actually executes.
To do this:
1. Create a database link that enables you to access the source database from OPTIMA using
the login details for the source database. A database link is a schema object in one
database that enables you to access objects on another database.
You can run the following script within TOAD to create the database link while logged into
OPTIMA as the DBA:
CREATE DATABASE LINK OMCDBLINK
CONNECT TO OMC_USER IDENTIFIED BY password
USING 'OMCDB';
Where:
o OMCDBLINK is the name of the link you want to create
o OMC_USER is the name of the user on the source database (OMCDB)
o password is the password string
o OMCDB is the name of the source database from where the data is to be loaded into
OPTIMA.
2. In the OPTIMA database, define your summary report configuration in the usual way, using
the following guidelines:
o The Summary Time Aggregation option should be set as 'Element Aggregation Only'
o The Summary Table Granularity should match the granularity of the raw and
destination tables
o For the Source Table, the schema should be the one in the source database that you
want to load, and the table should be written as 'tablename@database link name'
o The Entries Formula should be COUNT(*) and the Entries Column should be 1 in order
to ensure a 1:1 mapping of primary key records
o The Summary Table that you define should be the table in the destination database
into which you want to load the data
For the SQL Query, define a query like the following (the exact column list depends on your
table):
SELECT *
FROM ERICSSON_GERAN.BSCQOS@OMCDBLINK
WHERE %DATE1
This query will load all rows from the BSCQOS table in the ERICSSON_GERAN schema of
the source database referenced by the OMCDB database link.
Note: The query should not have a 'group by' clause, as it will be comparing individual
rows rather than groups.
About the Data Quality Package
The Data Quality package enables you to configure reports on the quality of data, for example, data
that is incomplete or missing. By default, the Data Quality package processes the data on a daily
basis, and calculates all its results at a daily level (although it can be configured to process at
different periods; for more information, see Configuring Period Processing on page 348).
different periods; for more information, see Configuring Period Processing on page 348).
The results are stored in specially-designated tables in the OPTIMA database, and you can use
several OPTIMA Excel reports to present these results in a more useful way to users. OPTIMA
supports Office 2010.
Important: You must install the backend components using the OPTIMA combined backend
package, which will install the required components quickly and easily. For more information on
how to use this, see Installing the OPTIMA Combined Backend Package on page 21.
3. Click Connect.
1. Global Configuration - this includes column set configuration, data source attribute
configuration and granularity configuration. For more information, see About Global
Configuration on page 333.
2. Data Source - this involves adding and configuring the data sources for the Data Quality
package. For more information, see About Data Source Configuration on page 341.
3. Data Quality - this determines the processes and groups with which the Data Quality
package will run. For more information, see About Data Quality Configuration on page 353.
Each of these configurations corresponds to a folder in the first level of the Data Quality Console tree.
Important: Although you can configure the Data Quality package manually (as described in this
chapter), it is recommended that you use the OPTIMA Installation Tool to ensure that the correct
options are chosen and therefore the reports include all of the statistics.
For more information, see OPTIMA Installation Tool User Reference Guide.
Important: It is strongly recommended that you run the Data Quality package only on the Table
and Managed Element column set levels. The Data Quality package should not be run on the
Reported Element level, in the majority of circumstances, as this causes too much of a load on the
OPTIMA database.
There are also other system column sets which help you describe the data. For example, you must
use the Reported DateTime column set to store the table’s DATE field. The DATE field contains the
date of the data, for example, the DATETIME column.
You can also define your own user-defined (non-system) column sets. This enables you to check
for data quality on user-defined levels, for example, for regions of a country. For information about
how to create column sets, see Creating Column Sets on page 334.
As part of the Column Sets configuration, you can also create user-defined column sets:
To do this:
1. In the Data Quality Console, in the tree view, select the Column Sets folder.
2. In the right hand pane, right-click and, from the menu that appears, click Create Column
Set.
Right-click the column set you want to edit and, from the menu that appears, click
Properties.
- or -
2. In the Column Set Properties dialog box that appears, make the required changes.
4. Click OK to close the Column Set Properties dialog box and return to the Data Quality
Console.
1. In the Data Quality Console, in the right hand pane, select the column set you want to
delete.
- or -
Creating Attributes
1. In the Data Quality Console, in the tree view, select the Data Source Attributes folder.
2. In the right hand pane, right-click and, from the menu that appears, click Create New
Attribute.
To edit an attribute:
Right-click the attribute you want to edit and, from the menu that appears, click Properties.
- or -
2. In the Attribute Properties dialog box that appears, on the General tab, make the required
changes.
3. On the Pick List tab, you can type the range of values the attribute can have. For example,
the System Key attribute has the values Yes and No to determine if the column is a key
field.
5. Click OK to close the Attribute Properties dialog box and return to the Data Quality
Console.
To delete an attribute:
1. In the Data Quality Console, in the right hand pane, select the attribute you want to delete.
- or -
Configuring Granularities
OPTIMA performance tables have different granularities of data. For example, raw data tables often
have hourly data, whereas summary tables often have daily or weekly data. There are a large
number of system granularities which come pre-configured with the Data Quality package, for
example, 15 minutes, daily or weekly.
You can also create new granularities as required. For more information, see Creating Granularities
on page 339.
Creating Granularities
1. In the Data Quality Console, in the tree view, select the Granularities folder.
2. In the right hand pane, right-click and, from the menu that appears, click Create New
Granularity.
5. To set the times of day when the granularity appears, you must edit the granularity. For
information about how to do this, see Editing and Deleting Granularities on page 339.
To edit a granularity:
Right-click the granularity you want to edit and, from the menu that appears, click
Properties.
- or -
2. In the Granularity Properties dialog box that appears, on the General tab, make the
required changes.
3. On the Times of Day tab, type the times of day when the granularity occurs. This setting is
used to check that data is present at the specified times.
Note: Times must be entered in HH24MI format, as a 4-digit 24-hour time with no
punctuation. For example, for a 6 hour granularity you would type: 0000, 0600, 1200, 1800.
For granularities of 1 day and less, simply type 0000 or the time of day when the data is
present.
5. Click OK to close the Granularity Properties dialog box and return to the Data Quality
Console.
To delete a granularity:
1. In the Data Quality Console, in the right hand pane, select the granularity you want to
delete.
- or -
Adding Data Sources
Important: You should not generally add the AIRCOM, GLOBAL, and OSSBACKEND schema
tables used for configuring OPTIMA as data sources.
The data sources you configure for Data Quality can be viewed in the Data Source tree in the Data
Quality Console.
Category   Description
All   Shows a list of all data sources added to the Data Quality package.
Schemas   Shows the data sources by the schema they are in. Each schema appears as a sub-item.
Attributes   Shows the data sources by attribute. Each different attribute appears as a sub-item.
Note: By default, when you first install the Data Quality package, no data sources are configured.
To add data sources:
1. In the Data Quality Console, in the tree view, select the All folder.
2. In the right hand pane, right-click and, from the menu that appears, click Add Data Source.
3. On the first page of the Wizard, choose a method to generate a list of available data
sources. The options are described in the following table:
Select this option   To
Oracle Data Dictionary   Search the Oracle Data Dictionary tables for all schemas in the database with the exception of the AIRCOM, SYS and SYSTEM schemas. This allows you to add tables that have not been configured before for any OPTIMA application.
OSS Data Dictionary   Generate a list of tables used in the OPTIMA Data Dictionary which is populated by the OPTIMA Interface Template. If the OPTIMA Interface Template has been used to install the database, then this will provide a quick method to select only tables relevant to Data Quality.
Configuration Tables   Search for tables configured in OPTIMA backend applications such as the OPTIMA Summary and the ETL Loader. This may be the best option when the Summary and Loader have been configured and Data Quality is needed for the tables used by these applications.
4. Click Next.
5. On the next page of the Wizard, in the right hand pane, select the data sources you want to
add and use the right arrow button to move them to the left hand pane.
The selected data sources are added to the Data Source tree in the Data Quality
Console.
Once you have added the data sources, you can configure them by either setting their properties or
by using the Add Columns Wizard. For more information, see Setting Data Source Properties on
page 343 and Configuring the Data Quality Package Using the Add Columns Wizard on page 350.
Setting Data Source Properties
Note: Not all properties are available when setting data source properties globally.
1. In the Data Quality Console, in the right hand pane, select the data source(s) whose
properties you want to set.
Tip: Use the Shift and Ctrl keys to select more than one data source at a time.
Loading Type   Select the type of data the data source contains from the drop-down list, either raw or summary data.
Granularity   Select the granularity of the data source from the drop-down list, for example, 15 minute data.
Use Topology Table   Select this option if you want the Data Quality package to obtain a list of elements which should be present at each granularity from a topology configuration (CFG) table. If you enable this option, you must also select the Owner (schema) and Name of the topology table you want to use from the drop-down lists.
4. On the Attributes tab, you set values for any attributes created during Data Source
Attributes configuration. For more information, see Configuring Data Source Attributes on
page 336.
To set an attribute value, click the Value field you want to set and select the value from the
drop-down list:
Note: The attributes are used only for reporting purposes and are not required by the Data
Quality package.
5. On the Columns tab, you can view the column sets created during Column Sets
configuration by selecting a column set from the drop-down list. For more information, see
Configuring Column Sets on page 333.
The default values for the system column sets are then populated by the Data Source
Wizard.
Note: If you are setting properties for multiple data sources, the Columns tab is disabled.
6. On the Data Quality Configuration tab, you configure the Data Quality processes. For
more information about Data Quality processes, see About Data Quality Configuration on
page 353. The following table describes the options:
On this sub-tab   Do this
Completeness Select the levels you want the Data Quality package to process for Completeness.
Completeness is the percentage of available data for the period loaded.
Tip: It is recommended that you enable the Table and Managed Element levels.
To change a threshold value for a column set:
• Right-click the column set and, from the menu that appears, click Change
Threshold.
• In the dialog box that appears, set the required threshold and click OK.
To view the column details for a column set, right-click the column set and, from the
menu that appears, click Column Details.
To add a column set:
• Click the Add button.
• In the dialog box that appears, select the column set from the drop-down list, set a
threshold value and click OK.
Note: You can only add column sets that contain one or more columns.
To remove a column set, select the column set and click Remove.
To enable Completeness processing for the data source, ensure that the Enable
Process option is selected.
Note: If you are setting properties for multiple data sources, then only the Enable
Process option is available.
Tip: It is recommended that you configure the Data Quality package to have
Completeness enabled for all tables in each interface. However, it is not required for
tables with a granularity of a day or more.
Availability   Select the levels you want the Data Quality package to process for Availability.
Availability is the percentage of elements which are completely missing for a day.
The options for Availability are the same as those on the Completeness sub-tab, as
described above.
Tip: It is recommended that you enable the Table and Managed Element levels. You
should also enable the Reported Element level:
• For tables with a granularity of a day or more
• For tables with a granularity of less than a day that require troubleshooting
Note: There is no need to select these levels for CFG tables, as you only need to
define these for the performance management tables which are being reported on.
Tip: It is recommended that you configure the Data Quality package to have Availability
enabled for all tables in each interface.
Nullness Select the column set(s) you want the Data Quality package to process for Nullness.
Nullness is the number of null entries in the table for a day for a specified list of
columns.
The options for Nullness are the same as those on the Completeness sub-tab, as
described above.
In addition, you can choose which columns in the data source to check for null values.
To do this:
• Click the Choose Columns button.
• In the dialog box that appears, select the column(s) you require in the list and click
OK.
Tip: It is recommended that Nullness is only used when you need to perform
troubleshooting on specific tables.
Last Load Select the column set(s) you want the Data Quality package to process for Last Load.
Last Load is the last date a table loaded, that is, the maximum date of the table.
The options for Last Load are the same as those on the Completeness sub-tab, as
described above.
Tip: It is recommended that Last Load is only used when you need to perform
troubleshooting on specific tables.
7. On the DQ Period Processing tab, you can configure the period processing options.
For more information on how period processing works, see Configuring Period Processing
on page 348.
For more information on these, see Configuring Column Sets on page 333.
o Click the View Config/Scheduling Info button to check that the tables are configured
correctly.
Tip: Click the Current button to see the information for the current datasource, or click the
All button to see the information for all datasources.
• A CFG table is configured, with the Managed and Reported columns set for the CFG
table as well as the raw table
• The data in the CFG columns must match the data in the raw table columns
• The order of the CFG columns must match that of the raw table columns
Important: You cannot directly edit the data displayed in this dialog box; instead, you must
edit the data on the other tabs of the Data Source Properties dialog box.
o To choose the columns that you want to report on, click the Choose Columns button,
and in the dialog box that appears, select the required columns:
Note: The available columns for period processing are the same as those available for
daily processing.
o Click OK.
9. Click OK to close the Data Source Properties dialog box and return to the Data Quality
Console.
Configuring Period Processing
By default, all of the data quality reports produce daily results. This means that although the
granularity of the raw data may be based on a period of minutes (for example, 10 minutes or 30
minutes), the data quality reports show the results for a whole day.
However, if you require the daily quality reports to be processed for a different period, for example
hourly, you can do this as well. There are two stages to this:
1. On the DQ Period Processing tab of the Data Source Properties dialog box:
o Specify the levels that you want to process
o Check that the tables are configured correctly.
2. When you run the DQ_PERIOD_PROCESSING package, set the PERIOD_START and
PERIOD_END to be the required period.
Note: A second is subtracted from the end date when you run the package in a job or
script. So, for example, to process Report 5 for 02/07/2009 for the 6pm hour, run the
package with the following parameters:
(5, to_date('020720091800','ddmmyyyyhh24mi'), to_date('020720091900','ddmmyyyyhh24mi'))
Note: You can only configure period processing for availability and nullness reporting. This is
because:
• Completeness reporting uses a variable number of loading periods - it may report based on
a single period, or on a number of periods
• For last load reporting, the maximum date of the data in the table cannot be applied to sub-
periods
Important: Period processing reports on the period as a whole. For example, if you run the
availability report for 3 hours and 1 hour has a missing cell, then this will not be picked up.
However, if you run each hour individually, the missing cell will be detected. Similarly, the nullness
report reports on the total rows and null rows for the period it is run as a whole.
To edit a data source:
Right-click the data source you want to edit and, from the menu that appears, click
Properties.
- or -
2. In the Data Source Properties dialog box that appears, make the required changes. For
more information, see Setting Data Source Properties on page 343.
4. Click OK to close the Data Source Properties dialog box and return to the Data Quality
Console.
To delete a data source:
1. In the Data Quality Console, in the right hand pane, select the data source(s) you want to
delete.
Tip: Use the Shift and Ctrl keys to select more than one data source at a time.
- or -
Configuring the Data Quality Package Using the Add Columns Wizard
You can use the Add Columns Wizard as an alternative method of configuring the Data Quality
package. The main benefit of using the Add Columns Wizard is that it reduces the time taken to
configure columns contained in the column sets of each data source.
Important: Before using the Add Columns Wizard, first ensure you have:
o Added the data sources. For more information, see Adding Data Sources on page 341.
o Configured the column sets. For more information, see Configuring Column Sets on
page 333.
To configure the Data Quality package using the Add Columns Wizard:
1. In the Data Quality Console, in the right hand pane, right-click and, from the menu that
appears, click Add Columns.
2. On the first page of the Wizard, select the data source(s) you want to configure and use the
double right arrow button to move them to the left hand pane.
Tip: Use the Shift and Ctrl keys to select more than one data source at a time.
3. Click Next.
4. On the next page of the Wizard, select the column set(s) you want to configure and use the
double right arrow button to move them to the left hand pane.
Tip: If you do not need to select the columns for each column set and only need to be able
to enable and disable levels, click Skip Column Sets Mapping and proceed to step 10.
5. Click Next.
6. On the next page of the Wizard, select the columns to include in the column sets. If you
only want to include columns that form the primary key of the data source, select the PK
Cols Only option.
Tip: You can change the sort order for each column alphabetically by clicking the column
headings.
7. Click Next.
8. On the next page of the Wizard, set the order of the columns in the column set by selecting
columns and using the Up and Down buttons to position them.
Note: The column order you set applies to all column sets and all data sources selected.
You should choose an order for all columns, for example, the BSC column always above
the Cell column, the MSC column always above the BSC column.
9. Click Next.
10. On the last page of the Wizard, select the Data Quality processes that you want to be
enabled for the column sets that you selected in step 4. For a detailed description of the
processes, see About Data Quality Configuration on page 353.
Tips:
It is recommended that:
o You enable Completeness and Availability at the Managed Element level for all data
tables in each interface (although Completeness is not required for tables with a
granularity of a day or more)
o You enable Completeness and Availability at the Reported Element level for all data
tables with a granularity of a day or more, and for data tables with a granularity of less
than a day that you want to troubleshoot
o Last Load and Nullness are only used for troubleshooting on particular tables
To enable Data Quality processing for a data source:
In the Data Quality Console, in the right hand pane, right-click the data source for which
you want to enable processing and, from the menu that appears, point to Data Quality and
then click the processing type you require.
Any processes that are already enabled are shown with a selected checkbox.
About Data Quality Configuration
You can also create new processes after you have completed Data Source configuration. For
information about how to do this, see Creating Processes on page 354.
In the Data Quality Console, in the Data Quality tree, you can view the Data Quality processes by
type and by group:
The Type folder contains the different processing types; these are:
Processing Type   Description
Availability   The percentage of elements which do not appear at least once for a period.
This type shows which elements are missing in the defined period, and also any new
elements that have appeared, with statistics for this.
In order to know which elements should be appearing, the process looks at the
configuration table for the data table, and compares this against the elements listed. If
a configuration table is not available, then it will look back in the data table for a
configurable number of days (7 by default).
For more information, see About the Standard Data Quality Reports on page 362.
Completeness   Provides statistics to show how complete the available data is.
The process calculates how many records are expected in the table for the day,
based on the available elements and periods in the table. If an element is incomplete
(in other words, does not have results for all periods), the element and the missing
period are stored and listed in the reports.
Important: In order to fully understand the data quality of any table, the Availability
and Completeness statistics need to be presented together, using the reports
provided.
Last Load Status   The last date a table loaded, that is, the maximum date of the table.
Important: This process is only relevant for troubleshooting specific problems, so
it is recommended that this process is initially disabled, and only used for troubleshooting
specific tables.
Nullness The number of null, or missing, entries in the table for a specified list of columns, for
a period.
For more information, see About the Standard Data Quality Reports on page 362.
Warning: This process is particularly intensive and if executed for a large number of
tables may impact the database performance. Therefore it is recommended that this
process is initially disabled, and only used for troubleshooting specific tables.
The Groups folder contains the processing groups. For more information about processing groups,
see Scheduling Data Quality with Process Groups on page 357.
Creating Processes
To create a new process:
1. In the Data Quality Console, in the tree view, select the folder for the type of process you
want to create.
2. In the right hand pane, right-click and, from the menu that appears, click Create Process.
The Create New Process dialog box appears.
3. In the Create New Process dialog box, complete the following information:
In this field   Do this
Data Source   Select the data source for this process from the drop-down list.
Days Back   Set the number of days back that this process covers. For example, if the Data
Quality package runs daily and Days Back is set to 3, then it will process 3 days of data
each day.
Tip: Alternatively, if you just want to process today's data only, select the Today Only
checkbox.
Group Set the number of the processing group that this process runs under.
For more information about processing groups, see Scheduling Data Quality with Process
Groups on page 357.
Active Enable the process by selecting the Active option. If the process is enabled, it will run if it
is scheduled.
Severity Log Indicates the information level of the messages that will be sent to the log file.
Anything below the selected level will not be reported - for example, if Minor is selected
then only Minor, Major and Critical logging will occur.
Comments Type any comments you want to add for the process. This option is for information only
and is not used in processing.
Note: This option is not available if you are setting properties for multiple processes.
Setting Process Properties
Note: Not all properties are available when setting process properties globally.
1. In the Data Quality Console, in the right hand pane, select the process(es) whose
properties you want to set.
Tip: Use the Shift and Ctrl keys to select more than one process at a time.
Data Source Select the data source for this process from the drop-down list.
Tip: If you are defining an Availability process, you can also filter out old
elements present in a CFG table. For more information, see Filtering Data for
Availability Reports on page 357.
Days Back   Set the number of days back that this process covers. For example, if
the Data Quality package runs daily and Days Back is set to 3, then it will
process 3 days of data each day.
Tip: Alternatively, if you just want to process today's data only, select the
Today Only checkbox.
Group Set the number of the processing group that this process runs under.
For more information about processing groups, see Scheduling Data Quality
with Process Groups on page 357.
Active Enable the process by selecting the Active option. If the process is enabled, it
will run if it is scheduled.
Severity Log Indicates the information level of the messages that will be sent to the log file.
Anything below the selected level will not be reported - for example, if Minor is
selected then only Minor, Major and Critical logging will occur.
Comments Type any comments you want to add for the process. This option is for
information only and is not used in processing.
Note: This option is not available if you are setting properties for multiple
processes.
4. On the Stats tab, you can view information about the running of the process.
6. Click OK to close the Process Properties dialog box and return to the Data Quality
Console.
Filtering Data for Availability Reports
When defining the data source for an Availability process, you can also choose to filter out
elements from a CFG table based on their LAST_STAT value, which is a field storing the date of
the most recent data that has been loaded for that element.
To do this:
Right-click the process you want to edit and, from the menu that appears, click Properties.
- or -
2. In the Process Properties dialog box that appears, on the General tab, make the required
changes. For more information, see Setting Process Properties on page 355.
4. Click OK to close the Process Properties dialog box and return to the Data Quality
Console.
To delete a process:
1. In the Data Quality Console, in the right hand pane, select the process(es) you want to
delete.
Tip: Use the Shift and Ctrl keys to select more than one process at a time.
- or -
Scheduling Data Quality with Process Groups
Tip: It is recommended that you schedule each process type into a different group, and also divide
the tables so that raw tables and summary tables are processed separately. For example, you
could schedule the following groups:
• Group 1 - Availability for all raw tables
• Group 2 - Availability for all summary tables
• Group 3 - Completeness for all raw tables
• Group 4 - Completeness for all summary tables
... and so on
Including the other two processing types, this will lead to eight process groups per interface.
Note: If the interface has no sub-daily summary tables, then Completeness does not need to be
scheduled for summary tables. This is how the OPTIMA Installation Tool will configure the
processes if it is used to configure the Data Quality package.
You can define the groupings by specifying the group numbers in the Process Properties dialog
box. The Group option can be configured for multiple processes at once. For more information, see
Setting Process Properties on page 355.
In order to run a process group, you must schedule it in the database using the Oracle Scheduler.
You should use the following procedure:
OSSBACKEND.DQ_PROCESSING.RUN_PROCESS_GROUP (n);
where n is the number of the process group to run.
Note: If you have installed the Data Quality package in a schema other than OSSBACKEND, then
substitute the schema name, which is at the beginning of the example above.
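For example, a minimal sketch of such a schedule for process group 1 (the job name and run time
are illustrative, not mandated):
begin
  dbms_scheduler.create_job(
    job_name        => 'OSSBACKEND.DQ_PROCESS_GROUP_1'
   ,job_type        => 'PLSQL_BLOCK'
   -- run Data Quality process group 1 once a day at 04:00
   ,job_action      => 'BEGIN OSSBACKEND.DQ_PROCESSING.RUN_PROCESS_GROUP(1); END;'
   ,start_date      => SYSDATE
   ,repeat_interval => 'FREQ=DAILY;BYHOUR=04'
   ,enabled         => TRUE
   ,comments        => 'Data Quality - run process group 1');
end;
/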
To start with, you only need to configure the groups for Availability and Completeness. If Nullness
or Last Load processes are required later for troubleshooting, they can be scheduled as required.
Important: If OPTIMA is deployed in an Oracle RAC environment, you must ensure that the
schedules are created to support node affinity. Node affinity is used to reduce interconnect traffic
between the database nodes by ensuring that all processing on a particular set of data (usually an
interface) is performed on the same node.
To do this, you should configure the Oracle resource plans accordingly and use the scheduled job
classes to map a schedule to a particular resource plan.
The Data Quality Configuration package currently configures data quality for raw tables to enable
DQ reports to be run on the loaded data. It adds OIT CFG tables to the Data Quality data dictionary
as well as raw tables, so that the CFG tables can be used to calculate the expected elements for a
period. Data Quality is enabled on the Table and Managed Element levels for every raw table found
in the OIT interface. The Data Quality Configuration package also configures data quality reports
for Summary data with hourly, daily or weekly periods ('HR', 'DY' or 'WK').
Important: After the package has been used, the installer should check the COMMON_LOGS table
for any processing errors. These may require some additional manual DQ configuration if the
OIT template used does not contain all of the required configuration.
The Data Quality Configuration package uses the following information from the OIT data
dictionary:
• Schema name
• CFG table names
• Raw table names
• Raw table granularities
• Raw table's default CFG table
• Raw table's counter groups and columns
• Summary table names
• Summary periods
The OIT interface must have been activated (not only uploaded) and the raw, CFG, or summary
tables must exist in order to use the Data Quality package. The Data Quality Configuration package
will then use the Oracle data dictionary to determine column information and so on.
The 'process days back' setting will be set to 3 (days) for all processes (completeness, availability,
null and last load daily processes).
You can modify this by changing the value of the 'c_process_days_back' constant in the
'DQ_OIT_CONFIG.pks' package header. However, it is recommended that this is left at the default
of 3 days.
Data Quality NULL reporting checks individual columns for NULL values. These columns can be
configured in the Data Quality Console.
The Data Quality Configuration package will add the first counter (NUMBER) column from each
table to this column set.
If the table’s counters come from more than one input file (for example, when the OPTIMA
Combiner is used to combine the files before loading), then the Data Quality Configuration package
will also add the first counter from each counter group/input file to the NULL column set.
Note: This will not include columns configured as the REMOVE or KEY combiner types in the
Microsoft Excel OIT Interface Template.
The Data Quality Configuration package will enable sub-daily availability processing on the Table
and Managed Element levels on all raw tables that have a CFG table defined in the OIT.
You can configure Completeness reporting for Summary tables, but only on Hourly (‘HR’)
Summaries.
To run the Data Quality Configuration package for raw tables:
1. Ensure that the OIT interface has been activated (not only uploaded) and that the raw,
CFG and summary tables exist.
2. Retrieve the reference to this interface from the OIT data dictionary: query the OIT data
dictionary and note the value of INTRFC_ID for the OIT interface that you would like to
configure in Data Quality.
3. Log on as OSSBACKEND and run the following script, replacing 1 with the value of
INTRFC_ID:
BEGIN
OSSBACKEND.DQ_OIT_CONFIG.CONFIGURE_DQ_FROM_OIT(1);
END;
4. After the package has been run, check the COMMON_LOGS table for errors.
Note: If the auto-populate chooses the wrong column, remember to configure the managed
element columns correctly. For more information, see Troubleshooting the Data Quality
Configuration Package on page 361.
After you have configured Data Quality for raw tables from the OIT data, you can then configure
Data Quality for summary tables from the OIT data.
A new view, DQ_OIT_SUMMARY_TABLES, lists the available summary tables with their respective
summary periods and GPI_COL identifiers. The GPI_COL identifier is the managed element on
which the summary is aggregated. This is not necessarily the same element that the summary
reports on, because the summary aggregation reporting element may be at a finer granularity and
would generate too much data in the DQ Report.
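For example, to review the available summary tables and their periods before configuring them
(assuming the view was installed in the OSSBACKEND schema):
select * from OSSBACKEND.DQ_OIT_SUMMARY_TABLES;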
To run the Data Quality Configuration package, run the following script, replacing 1 with the value
of INTRFC_ID and specifying the required summary period as 'HR' (hourly), 'DY' (daily) or 'WK'
(weekly):
BEGIN
OSSBACKEND.DQ_OIT_CONFIG.CONFIGURE_DQ_SUMMARY_FROM_OIT(1, 'HR');
END;
By default, debug messages will not be logged but an information level message will be logged for
every raw, summary, and CFG table added. Any other messages should be carefully read and may
require a response.
Example
Log message:   Action required from Installer: Raw table RPPLOAD has managed
element column BSC but its CFG table PCU_CFG has managed element column PCU. The
data in these columns must match otherwise the managed columns must be corrected
manually. {Owner=ERICSSON_GSM}.
Cause:   This states that the managed element configured for the raw table RPPLOAD is
different from the managed element defined for its CFG table PCU_CFG. Data Quality will
allow these column names to be different, but they must represent the same data. In the
example above, the BSC column represents different data to the PCU column.
Solution:   Change the managed element column defined for the raw or CFG table (or both) in the
Data Quality GUI. To do this:
1. In the Data Quality Console, in the tree view, click Data Source, Schemas and then the required
schema.
2. Right-click the raw/CFG table and, from the menu that appears, click Properties.
3. Click the Columns tab.
4. Filter by Managed Element to see the columns currently defined for managed element.
5. Right-click a column and, from the menu that appears, click Managed Element. The managed element
column should be the parent element of the data (for example, the BSC for cell data in a 2G network).
6. To remove a column from the managed element list, right-click the column and deselect the Managed
Element option. The managed element of the raw table must match the managed element of its CFG
table (defined in the General tab) for Data Quality to work correctly.
The Data Quality Configuration package will attempt to use the raw table's GPI
configuration in the Loaded Counters sheet in order to determine the managed element
column. The CFG table's managed element will be set to the most commonly used GPI
column for the raw tables associated with it. If this GPI column does not exist in the CFG
table, it will use the next most commonly used GPI column, and so on. The raw table's
managed element column will be set to the GPI column, if defined.
If the GPI columns are not defined, then the configuration package will use the first PK
column which is not of the date type (for raw and CFG tables).
Tip: The DQ_OIT_CONFIG package will work best (and require minimum manual
configuration) if:
• The GPI column is defined for every raw table in the OIT Interface template Excel
spreadsheet
• The GPI column also exists with the same name in the CFG table that has been
defined for that raw table
To save configuration information to Excel:
1. In the Data Quality Console, in the tree view, select the folder that contains the
configuration information that you want to save to Excel.
2. In the left-hand pane, right-click and, from the menu that appears, click Save to Excel.
3. In the Save As dialog box that appears, browse to the appropriate folder, type a name, and
click Save.
Availability
CFG_ELEMENTS_NOT_IN_DATA   The total number of missing elements - in other words, the number of elements defined in the CFG table that were not found in the raw table.
DATA_ELEMENTS_NOT_IN_CFG   The total number of new elements - in other words, the number of elements found in the raw table that were not defined in the CFG table.
Nullness
There is a single report for nullness, which corresponds to the DQP_NULL_STATS database table,
and reports on the total number of rows and the number of null rows, per table, per level and per
column.
Tab Contents
1 An introduction to the report and each tab. The report has filters on the interface (or schema) name
and the date range.
2 and 3 The Data Quality statistics:
• At table level
• At managed element level
4 The missing elements.
5 The incomplete elements.
6 A summary of the processing for the tables.
You should investigate any tables where the data quality is poor. This could be caused by files not
being made available on the OSS for OPTIMA to collect, or by errors within the OPTIMA mediation
process.
The Managed Element listed will indicate the files missing, depending on its status:
• If the managed elements are completely missing, no files have been loaded into that table
for the day.
• If the managed elements are incomplete, not all of the files have been loaded - the period
will indicate which ones.
• If a managed element is missing from all the tables within a particular interface, it suggests
that the files were never provided to OPTIMA. However, you should confirm this by looking
at the FTP logs and the OSS server.
• If managed elements are loaded on some tables and not others, this would indicate that
there are problems with the mediation process for that table, or that the measurement
object(s) associated with that table are no longer being collected.
Note: In order for tables to be displayed fully in the reports, you must have executed both
Availability and Completeness for that table. If a table is missing, then you should check the last tab
to verify if the processes have run for that table. The Health Check report (described in About the
Health Check Report on page 365) can then be used for further troubleshooting.
Tab Contents
1 An introduction to the report and each tab. The report has filters on the interface (or
schema) name and the date range.
2-4 The Data Quality statistics.
5 The missing elements.
6 The incomplete elements.
7 A summary of the processing for the tables.
You should investigate any tables where the data quality is poor. This could be caused by files not
being made available on the OSS for OPTIMA to collect, by errors within the OPTIMA mediation
process or by errors in the OPTIMA Summary process.
The statistics also show the number of records available in the source table for the summaries. If
the number of records found in the summary table matches this value, this indicates that the
Summary process is OK, and that any missing records have not been loaded. Therefore, you
should focus your investigations on the raw tables.
Note: In order for tables to be displayed fully in the reports, you must have executed both
Availability and Completeness for that table. If a table is missing, then you should check the last tab
to verify if the processes have run for that table. The Health Check report (described in About the
Health Check Report on page 365) can then be used for further troubleshooting.
The results show the various statistics rolled up for each interface per day, by simply aggregating
the results from all the tables within that interface.
The managed element is the element level at which the files are produced, and therefore gives a
direct indication of the files that have been received and loaded into the OPTIMA database.
The rest of this section describes these steps, and the rest of the Data Quality package, in more
detail.
2. Check the log file for errors and ensure that all package bodies in the OSSBACKEND
schema are compiled correctly.
3. Run the following package function to create the partitions, and afterwards check that they
have been created (see the example query below):
AIRCOM.OSS_MAINTENANCE.MAINTAIN_TABLE_PARTITIONS
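For example, a quick way to confirm that the partitions exist (the owner and table name here are
illustrative):
select partition_name, high_value
from   all_tab_partitions
where  table_owner = 'AIRCOM'
and    table_name  = 'COMMON_LOGS'
order  by partition_position;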
6. In the Data Quality Console dialog box, select the Data Source folder, and then the All
folder.
7. In the right-hand pane, right-click and, from the menu that appears, click Add Data Source.
9. In the Data Quality Console dialog box, select the Schemas folder, and then the folder
corresponding to the schema name.
10. In the right-hand pane, right-click the CFG table that you want to use for data quality
processing, and from the menu that appears, click Properties.
11. In the dialog box that appears, click the Columns tab and configure the Managed Element
and Reported Element columns correctly.
Ensure the column order is the same as will be defined for the raw table.
13. From the Filter By Sets drop-down list, select Managed Element, and ensure the column
order is the same as will be defined for the raw table.
15. Right-click the raw table, and from the menu that appears, click Properties.
16. In the dialog box that appears, on the General tab, set the loading type and granularity.
17. Select the Use Topology Table option, and select the CFG table that you have chosen to
use for data quality processing.
18. On the Columns tab, configure the Managed Element, Reported Element and Reported
Datetime columns correctly.
19. In the Filter By Sets drop-down list, select each of the three groups in turn, and ensure the
column order matches that used in the CFG table for the same level.
20. On the DQ Period Processing tab, in the Availability pane, select the levels on which you
want to report.
21. Click the View Config/Scheduling Info button to check the configuration and retrieve the
DQP_CONFIG_ID.
22. Click Apply to save the configuration, and then click OK.
23. On all datasources (CFG and data/raw) grant the select privilege to ossbackend:
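For example, a sketch using the illustrative schema and table names from the troubleshooting
example earlier in this chapter (substitute your own names):
grant select on ERICSSON_GSM.RPPLOAD to OSSBACKEND;
grant select on ERICSSON_GSM.PCU_CFG to OSSBACKEND;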
24. Check that the data quality tables DQP_AVAIL_ELEMENT and DQP_AVAIL_STATS have
partitions for the startdate of the period you wish to check.
25. In an SQL script run Data Quality Period Processing for availability using the following
script as the ossbackend user:
execute OSSBACKEND.DQ_PERIOD_PROCESSING.PROCESS_AVAIL (
:DQP_CONFIG_ID, :PERIOD_START, :PERIOD_END );
Where
DQP_CONFIG_ID is the configuration ID, specifying the data source configuration that you
are using (as defined in the Data Source Properties dialog box)
PERIOD_START and PERIOD_END define the range of the time period you want to report
on.
For future troubleshooting, you could also configure, but not enable, the following:
• Nullness at the Table and Managed Element levels.
This table shows an example of the expected configuration that would be displayed in the Health
Check report:
Scheduling Oracle Jobs
About Scheduling
For OPTIMA to be fully effective, you must create Oracle DBMS_SCHEDULER jobs and run them
in the background.
The common processes that are always required and need to be scheduled are:
• AIRCOM.OSS_MAINTENANCE.RUN_OSS_MAINTENANCE_DAILY ('LOGS')
• AIRCOM.OPTIMA_SUMMARY.DO_WORK
Multiple versions of the DO_WORK process may be needed (with unique names); the
number required will be determined by the overall number of summaries and the resources
available to process them.
These processes may also be needed dependent on the individual customer setup:
• AIRCOM.OPTIMA_SANDBOX.REMOVE_EXPIRED_OBJECTS
• AIRCOM.OSS_MAINTENANCE.MAINTAIN_TABLESPACES
• AIRCOM.OPTIMA_ALARMS.MAINTAIN_ALARMS_TABLE
For each Vendor Interface the following processes will need to be scheduled:
• AIRCOM.OSS_MAINTENANCE.RUN_OSS_MAINTENANCE_DAILY ('<schema>')
• All CFG population procedures for that schema.
The gather stats process uses the following metadata tables:
1. STATS_CONTROL - Controls full execution of the package and comes with the default
settings that will be used to populate the metadata tables
4. STATS_EXCLUDE_TABLES - Controls which tables will not be included in the gather stats
process
These metadata tables must be populated for the gather stats process to work. This is
automatically done when you run the Installation/Upgrade scripts for OPTIMA 8.0.
When new Interfaces are added via the OIT, they are automatically added to the gather stats
process.
The STATS_CONTROL table comes with pre-defined settings for gathering your statistics. For
most AIRCOM OPTIMA installations these default settings are appropriate.
To create a daily gather stats job, run the following script:
begin
  dbms_scheduler.create_job(
    job_name        => 'AIRCOM.GATHER_STATS_OPTIMA'
   ,job_type        => 'PLSQL_BLOCK'
   ,job_action      => 'BEGIN GATHER_STATS.collect_schema_stats; END;'
   ,start_date      => SYSDATE+1
   ,repeat_interval => 'FREQ=DAILY;BYHOUR=03'
   ,enabled         => TRUE
   ,comments        => 'Gather Stats');
end;
/
Warning: The scheduled job should be created under the schema that the Gather Stats package
has been installed on. Failure to do so may prevent the job from running successfully.
To see if the last run for the scheduled job was successful:
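You can query the standard Oracle scheduler view, for example (a minimal sketch; the job name
assumes the daily job created above):
select job_name, status, actual_start_date, run_duration
from   all_scheduler_job_run_details
where  job_name = 'GATHER_STATS_OPTIMA'
order  by actual_start_date desc;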
To run the job manually:
begin
  dbms_scheduler.run_job('GATHER_STATS_OPTIMA', false);
end;
/
To stop a running job:
begin
  dbms_scheduler.stop_job('GATHER_STATS_OPTIMA');
end;
/
To create a separate gather stats job for each schema, spreading the jobs across the early morning
hours, run the following script:
DECLARE
  vhour varchar2(50);
begin
  -- rownum numbers the schemas so that the jobs are spread across hours 00-03
  for i in (select schema_name, rownum num from stats_schema_metadata) loop
    vhour := 'FREQ=DAILY;BYHOUR=0'||mod(i.num,4);
    begin
      -- drop the gather_stats job if it already exists
      dbms_scheduler.drop_job('GT_'||i.schema_name);
    exception
      when others then
        null;
    end;
    dbms_scheduler.create_job(
      job_name        => 'GT_'||i.schema_name
     ,job_type        => 'PLSQL_BLOCK'
     ,job_action      => 'BEGIN GATHER_STATS.collect_schema_stats(pschema=>'||q'[']'||i.schema_name||q'[']'||'); END;'
     ,start_date      => SYSDATE
     ,repeat_interval => vhour
     ,enabled         => TRUE
     ,comments        => 'Gather Stats');
  end loop;
end;
/
To disable Oracle's automatic optimizer statistics collection job:
BEGIN
DBMS_AUTO_TASK_ADMIN.DISABLE(
client_name => 'auto optimizer stats collection',
operation => NULL,
window_name => NULL);
END;
/
To verify that the Autotask Background Job has been disabled successfully:
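For example, using the standard Oracle view (the STATUS column should show DISABLED):
select client_name, status
from   dba_autotask_client
where  client_name = 'auto optimizer stats collection';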
If you are unsure of the Oracle Database version that you have installed:
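For example:
select banner from v$version;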
To schedule the gathering of dictionary statistics once a month:
begin
  dbms_scheduler.create_job(
    job_name        => 'GATHER_DICTIONARY_FIXED_STATS'
   ,job_type        => 'PLSQL_BLOCK'
   ,job_action      => 'BEGIN gather_stats.collect_dictionary_stats; END;'
   ,start_date      => SYSDATE
   ,repeat_interval => 'FREQ=MONTHLY;BYMONTHDAY=01;BYHOUR=01'
   ,enabled         => TRUE
   ,comments        => 'Gather Stats');
end;
/
STATS_EVENT_LOG
STATS_TABLES_LOG
Purpose: Log table level information for each table/interface, with execution times and CPU/IO
metrics.
About the OSS Maintenance Package
The OSS Maintenance package enables you to maintain your database. You can use the package
to:
• Maintain partitions
• Maintain tablespaces
• Gather statistics
Warning: OSS Maintenance should not be used to manage the data files for a database using
ASM.
Important: You must install the backend components using the OPTIMA combined backend
package, which will install the required components quickly and easily. For more information on
how to use this, see Installing the OPTIMA Combined Backend Package on page 21.
Partitioning involves dividing table data across data partitions according to values held in one or
more table columns. This facilitates the writing and retrieval of data, administration, index
placement and query processing.
The size of the partitions will vary, depending upon the size and quantity of the data that is loaded
into the tables. For example, raw data tables storing hourly data could be partitioned into daily
partitions, daily summary tables could be partitioned into weekly partitions and monthly summary
tables could be partitioned into monthly or yearly partitions. The OSS Maintenance package
enables you to maintain the table partitions in your database.
As a guideline, raw interface data is typically stored for 1 to 3 months, whereas summary and busy
hour data is stored for 2 to 5 years. The physical storage of the database server must be sized to
accommodate the desired retention periods.
To use partition maintenance, tables must be created with a single partition in the very distant past
using the naming format PYYYYMMDD, for example P19700101. By default, the first partition created
by the OPTIMA Installation Tool uses this date. This is because Oracle cannot move data between
partitions, so the first partition must be dated at a time when there is no possibility of data existing.
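For illustration, a minimal sketch of such a table definition (the column list and partition boundary
are illustrative only; real OPTIMA tables are created by the OPTIMA Installation Tool):
create table ERICSSON_GSM.EXAMPLE_RAW (
  datetime  date,
  bsc       varchar2(30),
  counter1  number
)
partition by range (datetime)
-- a single partition in the very distant past, named in PYYYYMMDD format
(partition P19700101 values less than (to_date('19700102','YYYYMMDD')));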
Note: Sub-partition templates will be gathered by the CASCADE option. However, the OSS
Maintenance package does not currently support sub-partition by interval.
Based on the PARTTYPE that has been set, the OSS Maintenance application will add partitions
until it reaches the PARTADVANCE threshold.
For example, if the PARTTYPE is 1 (Daily) and the PARTADVANCE is 6, then the OSS Maintenance
application will create 6 daily partitions in advance of the sysdate.
Based on the PARTTYPE that has been set, the OSS Maintenance application will delete any
partitions that are older than the PARTRETENTION threshold, in relation to the sysdate.
For example, if the PARTTYPE is 1 (Daily) and the PARTRETENTIONPERIOD is 6, then the OSS
Maintenance application will delete any partitions that are more than 6 days old (in other words, 6
partitions behind the sysdate).
When using OPTIMA, you may find that you need to change from one PARTTYPE to another (for
example, from hourly partitioning to daily, or vice versa). If you intend to do this, you should first
consider the following:
• Because each existing partition may contain data, the OSS Maintenance application cannot
create new partitions within an existing range.
Therefore, it will not create any new partitions using the new PARTTYPE until all of the
partitions for the old PARTTYPE have been wholly aged out - that is, until the sysdate has
reached the last of the future partitions determined by the PARTADVANCE.
As a basic example, consider the following scenario where you are currently using monthly
partitioning:
In this scenario:
o The sysdate is 01/04/2010
o The PARTADVANCE is 4
Therefore, the partitions that will be created for the future are P20100501, P20100601,
P20100701 and P20100801.
If you then switch to weekly partitioning, the OSS Maintenance application will not
start to create weekly partitions until the sysdate has reached 08/07/2010. At this point in
time, the last monthly partition in the range (P20100801) has already been created, and
because the PARTADVANCE is 4 (weeks), the first weekly partition (P20100808) can be
created.
Note: It is often the case that if the PARTTYPE is changed, then the PARTADVANCE and
PARTRETENTIONPERIOD are changed in order to keep the ranges for advance
partitioning and partition retention the same. In other words, a change from monthly to
weekly would mean multiplying by 4 (as a PARTADVANCE of 4 months = 16 weeks), or a
change from hourly to daily would mean dividing by 24 (as a PARTRETENTION of 48
hours = 2 days).
• The OSS Maintenance application cannot delete partitions until they have wholly expired
and moved outside the PARTRETENTIONPERIOD date.
If you are using monthly partitioning, the old partitions kept are P20100301,
P20100201 and P20100101.
However, if you have changed from monthly partitioning to weekly partitioning, then the
OSS Maintenance application cannot start to delete partitions until the periods no longer
overlap - in other words, only complete partitions can be deleted.
In this example, monthly partitioning has created a number of partitions from February to June, but
then the OSS maintenance application has not been run for three months. If the next time that it is
run is the start of September (using weekly partitions), then there will be an unpartitioned 'gap' of
data between the start of June and the end of the first week in September that has to be filled.
To do this, it starts to create filler partitions using the current PARTTYPE, starting from the date of
the last partition created and ending at the date of the last partition that will be created according to
the PARTADVANCE. In our example above, if the PARTADVANCE is 3, when the OSS
Maintenance application is restarted on Sept 1st, weekly filler partitions will be created from 1st
June to 22nd September.
This table should contain one row for every partitioned PM table (raw or summary) in the OPTIMA
database.
Important:
• Before configuring partition maintenance, ensure that the COMMON_LOGS table is added to
the MAINTAIN_TABLE table
• If a partitioned table is not configured in MAINTAIN_TABLE then it will not have statistics
gathered for it
Important: When you run the OSS maintenance package for the first time, then the columns not
populated automatically by the OIT should be left NULL. You can then use these columns later for
tuning - for more information, see Tuning the OSS Maintenance Package on page 389.
To run partition maintenance, execute the following procedure:
AIRCOM.OSS_MAINTENANCE.MAINTAIN_TABLE_PARTITIONS;
Notes:
• This is normally done inside the AIRCOM schema
• No parameters are passed to the procedure
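For example, a minimal sketch of a daily schedule for this procedure (the job name and run time
are illustrative):
begin
  dbms_scheduler.create_job(
    job_name        => 'AIRCOM.MAINTAIN_PARTITIONS_DAILY'
   ,job_type        => 'PLSQL_BLOCK'
   ,job_action      => 'BEGIN AIRCOM.OSS_MAINTENANCE.MAINTAIN_TABLE_PARTITIONS; END;'
   ,start_date      => SYSDATE
   ,repeat_interval => 'FREQ=DAILY;BYHOUR=01'
   ,enabled         => TRUE
   ,comments        => 'OSS Maintenance - partition maintenance');
end;
/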
Tip: If you want to run partition maintenance as part of the entire OSS Maintenance package, see
Scheduling the OSS Maintenance Package on page 388.
Maintaining Tablespaces
You can use the OSS Maintenance package to maintain the database’s tablespaces.
The tablespace maintenance procedures add new datafiles to tablespaces whose space is running
out. Datafiles are created with an initial size of 100MB and extend by 100MB at a time to a
maximum size of 2000MB. New datafiles are added before the maximum datafile size of 2000MB
is reached. The location of the directories where the datafiles are added is stored in the
DD_LUNS table.
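For illustration, a datafile created with these parameters corresponds to a statement such as the
following (the tablespace and file names are hypothetical):
alter tablespace DATA_TS
  add datafile 'C:\optima_data\lun1\data_ts_02.dbf'
  size 100M autoextend on next 100M maxsize 2000M;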
There are four different tablespace maintenance procedures, which are described in the following
table:
Procedure   Description
Maintain_Tablespaces_AllLUNS   Adds a datafile for every LUN defined when a tablespace is running out of space (Oracle striping).
Maintain_Tablespaces_By_Size   Adds one new datafile on the least occupied LUN of the correct type.
Maintain_Tablespaces_SingleLUN   Creates one datafile at a time on a single LUN. Future datafiles for the tablespace will remain on the same LUN.
Maintain_Tablespaces   Processes all tablespaces using the method configured for each tablespace. The above three procedures are then called within this procedure.
Important: Tablespace maintenance is only required for OPTIMA databases using non-ASM
(LUNs) storage. For OPTIMA databases using ASM, it is important to ensure that the datafiles
for your tablespaces are created as 'max size unlimited', because there is no need to manage the
tablespaces when using ASM.
When using ASM, the ASM Disk group views (V$ASM_DISKGROUP/V$ASM_DISK) should be
regularly monitored by an Oracle DBA to view all available storage.
Note: Type one row per location on disk where a tablespace can be created.
LUN_TYPE   Type a single letter defining the type of data which the LUN will store. The available LUN types are described in a separate table below. Example: D
LUN_PATH   Type the path (normally a location on disk) where the datafiles will be put. Example: C:\optima_data\lun1\
Important: Ensure the last character in the path is the directory separator, that is, "\" in Windows and "/" in UNIX.
FULL   This column is reserved for future use. Type "N" for this column. Example: N
CAPACITY_MB   Type the amount of space (in MB) on the LUN disk location which is available to be used by datafiles. Example: 8000
Notes:
• Do not include the entire disk space.
• Ensure that the specified amount of disk space will be available and will not be used by other
processes.
Tip: As datafiles have a maximum size of 2000 MB, TEOCO recommends defining this capacity as
a multiple of 2000.
The available LUN types, and whether each is maintained by the package, are:
S   System tablespaces   Maintained: N
D   Data tablespaces   Maintained: Y
I   Index tablespaces   Maintained: Y
T   Temporary tablespaces   Maintained: N
Important: Only LUN types D and I are maintained by the OSS Maintenance package.
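For example, a row describing a data LUN could be inserted as follows (a sketch, assuming the
DD_LUNS table is owned by the AIRCOM schema):
insert into AIRCOM.DD_LUNS (lun_type, lun_path, full, capacity_mb)
values ('D', 'C:\optima_data\lun1\', 'N', 8000);
commit;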
Tip: You can use the following query to populate the TABLESPACE_NAME column from the data
dictionary:
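A sketch of such a query, assuming TABLESPACE_NAME is the only column that must be supplied
(review the inserted rows afterwards, as recommended below):
insert into AIRCOM.MAINTAIN_TABLESPACE (tablespace_name)
select tablespace_name from dba_tablespaces;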
TEOCO recommends that system and temporary tablespaces are not included in the
MAINTAIN_TABLESPACE table and that their rows are deleted after they have been populated.
To run tablespace maintenance, execute the following procedure:
AIRCOM.OSS_MAINTENANCE.MAINTAIN_TABLESPACES;
Notes:
• This is normally done inside the AIRCOM schema
• No parameters are passed to the procedure
Tip: If you want to run tablespace maintenance as part of the entire OSS Maintenance package,
see Scheduling the OSS Maintenance Package on page 388.
Gathering Statistics
Gathering schema statistics in the database is important in ensuring that queries run quickly and
efficiently.
Statistics are gathered using Oracle's DBMS_STATS package. The OSS Maintenance package
provides the following wrapper procedures:
Item   Description
GATHER_SC_STATS   Gathers statistics for the schema name that is passed as a parameter to the procedure.
GATHER_TB_STATS   Gathers statistics for the table name that is passed as a parameter to the procedure.
Note: All gather schema parameters can be altered to allow flexibility in the schema statistics that
are gathered. The defaults allow Oracle to determine what to gather and how much. The default
options are:
Option Value
ESTIMATE_PERCENT 15
METHOD_OPT FOR ALL INDEXED COLUMNS SIZE 254
DEGREE 2
CASCADE TRUE
OPTIONS GATHER AUTO
GATHER_ALL_STATS   AIRCOM.OSS_MAINTENANCE.GATHER_ALL_STATS;
GATHER_SCHEMA_STATS   AIRCOM.OSS_MAINTENANCE.GATHER_SCHEMA_STATS('AIRCOM');
GATHER_TABLE_STATS   AIRCOM.OSS_MAINTENANCE.GATHER_TABLE_STATS('AIRCOM', 'TABLENAME');
COPY_TABLE_STATS   AIRCOM.OSS_MAINTENANCE.COPY_TABLE_STATS('AIRCOM', 'TABLENAME');
GATHER_SC_STATS   AIRCOM.OSS_MAINTENANCE.GATHER_SC_STATS('AIRCOM');
GATHER_TB_STATS   AIRCOM.OSS_MAINTENANCE.GATHER_TB_STATS('AIRCOM', 'TABLENAME');
GATHER_DICTIONARY_STATS   AIRCOM.OSS_MAINTENANCE.GATHER_DICTIONARY_STATS;
Note: This should be scheduled to run once per month, unless advised otherwise.
GATHER_SYSTEM_STATS   Does not need to be scheduled to be run regularly. This procedure is for use by DBAs and System Engineers only.
Important: If you need to change the defaults for GATHER_SC_STATS or GATHER_TB_STATS, then
you must use the relevant parameters.
A job needs to be set up for each schema that you want to gather statistics for, replacing 'AIRCOM'
with the schema name. In most cases it is expected that GATHER_ALL_STATS will be used instead
of GATHER_SCHEMA_STATS.
Tip: If you want to run statistics gathering as part of the entire OSS Maintenance package, see
Scheduling the OSS Maintenance Package on page 388.
To run the entire OSS Maintenance package daily for a schema, schedule the following, replacing
NOKIA_GPRS with the required schema name:
begin
oss_maintenance.run_oss_maintenance_daily('NOKIA_GPRS');
end;
You should schedule this for the following schemas:
• AIRCOM
Important: This schema must be maintained before any of the others, because it contains
the COMMON_LOGS table. A valid partition for the current day must exist before the
OSS Maintenance package can run, as it needs to be able to log messages to the
COMMON_LOGS table.
• GLOBAL
• OSSBACKEND (if you are using the Data Quality module)
• All vendor schemas
Tip: You can create several concurrent DBMS_SCHEDULER jobs, one for each schema that you
want to maintain.
If your database is very small, it is possible (but not recommended) to maintain all of the vendor
schemas contained in the GLOBAL.VENDOR table at once. To do this, use the following function:
begin
oss_maintenance.run_oss_maintenance_daily('ALL');
end;
When the OSS Maintenance package is run, a message will appear explaining that the
rebuilding indexes component will not run due to this setting.
When the OSS Maintenance package is run, a message will appear explaining that the
statistics gathering component will not run due to this setting.
It is recommended that you usually run the package with this default, but you can override this
value by specifying a value for this parameter of the run_oss_maintenance_daily function.
For example, to force the OSS Maintenance package to use an estimate percentage of 1% for the
ERICSSON_UTRAN schema, you should schedule the following:
begin
oss_maintenance.run_oss_maintenance_daily('ERICSSON_UTRAN',1);
end;
However, if you want to do this at any other time, you should force the OSS Maintenance package
to execute it. To do this, pass 'FORCE' as the third parameter to the
run_oss_maintenance_daily function, for example:
begin
oss_maintenance.run_oss_maintenance_daily(p_schema
=>'ERICSSON_UTRAN', p_processing_type => 'FORCE');
end;
However, after this initial run, you can modify this value.
Note: For more information on the MAINTAIN_TABLE table, see Configuring Partition Maintenance
on page 381.
Setting this parameter to either NULL or STALE will produce the same result - the table statistics
will be calculated by gather_stale_partition_stats. However, if you set the parameter to COPY, then
the table statistics will be calculated by copy_partition_stats and gather_copy_partition_stats.
Important: NULL or STALE should be used for as many tables as possible. In particular, monthly
and yearly tables will always use these values.
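For example, to switch a single table to COPY statistics handling, you would update its row in the
MAINTAIN_TABLE configuration table. The column name used below (STATISTICS_TYPE) is
illustrative only; check the actual column name in your MAINTAIN_TABLE definition before running
anything similar:
update aircom.maintain_table
set    statistics_type = 'COPY'      -- illustrative column name
where  table_name = 'TABLENAME';
commit;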
The processing times are logged by this procedure in two different messages:
Maintenance
TEOCO recommends the following basic maintenance checks are carried out to ensure the OSS
Maintenance package is functioning correctly:

Check for broken / failed OSS Maintenance package jobs (Daily). Checking that jobs are executing
correctly ensures that maintenance is processing and that partitions and tablespaces are allowing
new data to be inserted.

Check the COMMON_LOGS tables for errors (Daily / Weekly). Any errors in the COMMON_LOGS
table relating to the OSS Maintenance package (PRID 000827001) must be investigated. You can
use the following query to retrieve all important (Warning level and above) messages logged in the
last week:
select * from common_logs
where severity > 2
and PRID = '000827001'
and datetime > sysdate - 7
order by datetime desc

Check the Loader and summary loading for tablespace and partition errors, and check that
partitions are available for tables (Weekly). If partitions or tablespaces are not maintained, errors
will be found in the Loader and OPTIMA Summary when data is inserted into the database. Check
that there are partitions available and that the tablespace has space remaining. Check for any
"Unable to extend tablespace/datafile" errors in the loading / summarizing, which indicate that
tablespace maintenance is not working correctly. Checking the max dates of raw and summary
tables ensures that the tables are still loading.

Check that the LAST_ANALYSED date for tables and indexes is recent and that table statistics
have been gathered (Monthly). Checking that statistics have been recently generated ensures that
queries on the tables are optimized correctly. If you are using TOAD, you can find this information
in the Schema Browser for Tables/Indexes on the Stats/Size tab.
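For example, the max-date check described above can be performed with a simple query; the
DATETIME column name follows the convention used by COMMON_LOGS, so substitute the
column and table names used by your own raw or summary tables:
select max(datetime) from aircom.tablename;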
OSS Maintenance procedures not running.
The Oracle job may not be running the scheduled jobs; check that the Oracle job is configured
correctly and has not broken.
The COMMON_LOGS table and OSS_LOGGING package may not be installed correctly; check
that COMMON_LOGS has a public synonym and that the OSS_LOGGING package is installed
(this is a requirement for the OSS Maintenance package).
The DD_LUNS, MAINTAIN_TABLESPACE and MAINTAIN_TABLE tables may not be installed or
configured correctly; check that these tables contain the correct configuration.
COMMON_LOGS table log entries contain error messages.
There are a number of possible causes, which should be logged in the error message. However, if
you are unable to resolve the problem from the information in the error message, please contact
TEOCO Support.

Raw or Summary tables fail to insert data with an "ORA-14400: inserted partition key does not map
to any partition" error.
The partition maintenance is not creating new partitions correctly. To resolve this:
• Check that partitions exist for the table
• Check that the table is included in the MAINTAIN_TABLE table
• Check that MAINTAIN_TABLE_PARTITIONS is running correctly in the job

Raw or Summary tables fail to insert data with an "Unable to extend tablespace/data file" error.
The tablespace maintenance is not creating new datafiles correctly. To resolve this:
• Check if the tablespace is full
• Check if the tablespace is configured correctly in the MAINTAIN_TABLESPACE table
• Check that the Maintain_Tablespaces procedure is running correctly in the job

No statistics are produced for any partitions of a table.
If all of the partitions for a table have no statistics, then the copy_partition_stats procedure will not
be able to copy valid statistics. To solve this, run the OSS Maintenance package with all of the
tables initially configured for gather_stale_partition_stats.

The statistics for partitioned tables are not being gathered.
Check that the table is configured in the AIRCOM.MAINTAIN_TABLE table. The OSS Maintenance
package cannot gather statistics for any partitioned tables which are not in the MAINTAIN_TABLE
table.

The COPY option is not gathering statistics.
Check that the table has 3 days' worth of partitions retained (or 2 weeks for weekly partitioned
tables). This is required for gather_copy_partition_stats to run correctly. If this is not the case,
configure the table for STALE statistics gathering.
Performance Reporting
The procedures listed in the following table give timing statistics, which enable the administrator to
monitor the time taken to process the various OSS maintenance procedures for the current
schema:
About the OPTIMA Report Scheduler
The Report Scheduler enables the Administrator to configure the OPTIMA Report Scheduling
System. The Report Scheduler comprises a configuration utility and an executable application
(Windows NT service or stand-alone).
Note: The OPTIMA Report Scheduler will be withdrawn at the next release.
2. Install one of the following Report Scheduler applications to the backend binary directory:
o Windows NT service application (OptimaReportScheduler.exe)
- or -
o Stand-alone application (OptimaReportSchedulerGUI.exe)
Warning: If you install the Report Scheduler to a different directory, you must also copy
the crypter.dll to that directory.
3. If you are scheduling Microsoft Excel 2003 reports, ensure that you have the following
folder on the machine(s) on which the Report Scheduler is running:
C:\Windows\SysWOW64\config\systemprofile\Desktop
This folder is required for Excel 2003 to function properly when interacting with OPTIMA.
OptimaReportSchedulerConfig.exe
The Report Scheduler configuration utility appears. This picture shows an example:
3. On the Storage Type page that appears, choose the storage type you require by selecting
the appropriate option, and then click Next.
Username Type the username the Report Scheduler will use to connect to the database.
Password Type the password the Report Scheduler will use to connect to the database.
Tip: Click Test Connection to test the database connection before proceeding.
5. Click Next.
Allow Export to email   Select this checkbox if you want to export reports to email.
SMTP Server   Type the name of the SMTP Server.
Port Number   Type the port number of the SMTP Server.
SMTP authentication required   Select this checkbox if the SMTP Server that you have defined
requires authentication.
SMTP User Name/Password   If you have selected the 'SMTP authentication' option, type the
SMTP username and password.
Report "From" address field   Type the email address of the report sender.
Tip: Click Test Connection to test the email connection before proceeding.
7. Click Next.
Event Log checkbox   Select this checkbox if you want debug-level information to be included in
the Event Viewer Application log.
Note: This option only applies if you are running the Report Scheduler Windows NT service
application.
Backup Email Attachments checkbox   Select this checkbox if you want to back up email
attachments in a separate directory. Select the location of the backup directory in the Debug
Directory field.
9. Click Next.
10. If you want the Report Scheduler to run continuously, select the Run Continuous
checkbox. Otherwise, you will need to schedule the Report Scheduler in the Windows
scheduler.
12. On the PID Settings page, click the PID File Settings button to configure PID File settings.
For more information, see Configuring PID File Settings on page 400.
14. On the Log Settings page, click the Log Settings button to configure log file settings. For
more information, see Configuring Log File Settings on page 401.
16. On the Temp Directory page you can specify the Temp Directory to be used by the Report
Scheduler. If you do not specify a path, the Windows Temp directory is used by default.
If your network spans across multiple time zones, and you have configured the Report Scheduler to
use a specific time zone, when the scheduler is started, it will search for and run:
• All report schedules set on the same time zone as the Report Scheduler, where the next
run date is equal to or less than the database local time (for example, the Oracle
SYSDATE) adjusted by the time zone
• Any other schedules without a specified time zone, where the next run date is equal to or
less than the database local time (SYSDATE)
[Diagram: an example network spanning multiple time zones, showing the Scheduler, the
database and a client located in different countries.]

Take an example network, where the following report schedules have been created:

Schedule   Time Zone   Next Run Time
1          GREECE      14:00
2          GREECE      12:00
3          GMT         15:00
4          None Set    12:00
The Report Scheduler has been configured to use the GREECE time zone.
If the Report Scheduler is set running, then the database time (SYSDATE) is converted according
to the GREECE time zone, giving an actual runtime of 14:00. Therefore, the Scheduler will run:
• All report schedules set on the GREECE time zone, where the next run date is equal to or
less than 14:00
• Any other schedules without a specified time zone, where the next run date is equal to or
less than 12:00
This means that it deals with each example schedule record as follows:
If your network spans across multiple time zones, and you have configured the Report Scheduler to
use a specific time zone, when the scheduler is started, it will search for and run all report
schedules where the next run date is equal to or less than the database local time (for example, the
Oracle SYSDATE) adjusted by the specific time zone for each schedule.
Note: If no time zone has been set for a schedule, it will just compare the database local time with
the next run time.
Take an example network, where the following report schedules have been created:

Schedule   Time Zone   Next Run Time
1          GREECE      14:00
2          GREECE      12:00
3          GMT         15:00
4          None Set    12:00
If the Report Scheduler is set running, then it will treat each schedule record as follows:
This table describes the information to complete the PID File Settings dialog box:
Item Description
For more information on log files, see About Log Files on page 33.
2. Click Next.
3. On the Completing the Wizard page, check your settings in the Settings Summary pane.
4. Click Finish to save your settings and close the Log Settings Wizard.
OptimaReportSchedulerGUI.exe
If you want to run multiple instances on the same machine, then you must ensure that each
instance uses different INI files, PID files and log files, otherwise the PID file will prevent any other
instance from running.
1. Using the Configuration Wizard, create a separate ini file for each instance, storing them in
different directories. For example:
o C:\Optima Backend\Optima Report Scheduler\Instance1\OptRepSchedulerConfig.ini
o C:\Optima Backend\Optima Report Scheduler\Instance2\OptRepSchedulerConfig.ini
Important: You must use the same ini file name - OptRepSchedulerConfig.ini - for each
instance that you define.
3. Set a different PRID and PID file directory for each instance, using the Program ID for the
Report Scheduler, 816. For example:
o Instance1: PRID = 001816001; PID file directory = C:\Optima Backend\Optima Report
Scheduler\Instance1\PID
o Instance2: PRID = 001816002; PID file directory = C:\Optima Backend\Optima Report
Scheduler\Instance2\PID
5. Schedule each instance in the Windows Scheduler, including the ini file location as a
parameter:
Important: This will override the default ini file location set using the Configuration Wizard.
o "C:\Optima Backend\Optima Report Scheduler\OptimaReportSchedulerGUI.exe"
ini="C:\Optima Backend\Optima Report Scheduler\Instance1"
o "C:\Optima Backend\Optima Report Scheduler\OptimaReportSchedulerGUI.exe"
ini="C:\Optima Backend\Optima Report Scheduler\Instance2"
However, TEOCO recommends the following basic maintenance checks are carried out for the
Report Scheduler:
Check the log messages for errors (Weekly). In particular, any Warning, Minor, Major and Critical
messages should be investigated.
You can choose to create a new log file every day. The information level required in the log file is
also defined in the Log Settings Wizard and will be one of the following:
• Debug
• Information
• Warning
• Minor
• Major
• Critical
These levels help the user to restrict low level severity logging if required. For example, if Minor is
selected then only Minor, Major and Critical logging will occur.
Note: Utilities are provided for searching and filtering log messages and also for loading the
messages into the database. For more information, see Checking Log Files on page 41.
If the application is run in continuous mode, it monitors for schedules continuously; in this case,
stop it by terminating the application.
Troubleshooting
The following table shows troubleshooting tips for the Report Scheduler:
Application exits immediately.
Possible causes: Another instance is running; the configuration (INI) file is invalid or corrupt.
Solution: Use Process Monitor to check which instances are running.

New configuration settings are not being used by the application.
Possible causes: Settings are not saved to the configuration (INI) file; the file was created in the
wrong location; the Report Scheduler processing (service and standalone) applications have not
been restarted to pick up the new settings.
Solution: Check the settings and the location of the file. Restart the Report Scheduler processing
(service and standalone) applications.

Error emailing reports: 10053: Software caused connection abort.
Possible cause: Anti-virus software is running.
Solution: Deactivate the anti-virus software.

SMTP Authentication Error.
Possible causes: Invalid SMTP Username setting; port 25 is being blocked.
Solution: Use the configuration utility to delete the SMTP Username setting on the Email page of
the configuration. For information about how to do this, see Configuring the Report Scheduler on
page 396. If the problem persists, for example users outside the domain are still unable to receive
emails, then request that the customer's IT department enable relaying without authentication for
the OPTIMA Server. Check that no anti-virus software is blocking port 25.

Failure to export report to email address due to the following error: 421 4.3.5 Unable to create data
file: Input/Output error (Processing time is…)
Possible cause: The user has insufficient privileges on the Debug Directory, or the Debug Directory
does not exist.
Solution: Enable the permissions and create the Debug Directory.
Printer is unavailable.
Possible cause: The default or specified printer has not been installed on the Report Scheduler
server.
Solution: Assign a logon account to the Report Scheduler service and then add the assigned user
to the printer.
To assign a logon account:
1. In the Services window, right-click the Report Scheduler service and, from the menu that
appears, click Properties.
2. In the Report Scheduler Properties dialog box, click the Log On tab and select the This Account
radio button.
3. Complete the user account and password fields.
4. Click Apply and then close the Services window.
To add a user for an installed printer:
1. In the Printers and Faxes window, select the printer that you want to use to send reports.
2. In the Printer Tasks list, click Share This Printer.
3. In the Printer Properties dialog box, select the Security tab and click Add.
4. In the Select Users, Computers or Groups dialog box, type the username and click OK.
5. Click Apply and then close the Printer Properties dialog box.

Report Scheduler creates ghost sessions in the database, requiring DBA intervention to kill inactive
sessions and leaving Excel processes behind.
Possible cause: Ghost sessions are not mapped to OS processes.
Solution:
1. Locate the Oracle sqlnet.ora file on the machine on which the OPTIMA database is installed.
This is normally in ..Oracle\product\(version number)\dbhome_1\NETWORK\ADMIN
2. Open the sqlnet.ora file and check that this line exists in it:
SQLNET.EXPIRE_TIME=10
3. If the above line is not present, add it and save the file.
4. Restart the Oracle listener and restart the OPTIMA database.
Note: You need to run the Report Scheduler with a network user so it can access all
network file locations and the customer's email server.
• The email From address you are using is a valid email address
If "Invalid email address…" appears in the schedule history in the Schedule Explorer, it can
mean that the From email address (rather than the email address of the user to whom the
email is sent) is invalid. Ensure that you use a valid form of email address, for example,
[email protected].
• All anti-virus software has been disabled on the Report Scheduler machine.
Note: If reports are to be emailed externally, you may need to ask the customer's IT
department to open port 25 on the firewall for the Report Scheduler application.
• The version numbers of the Report Scheduler configuration utility and GUI are compatible.
• The Report Scheduler GUI is scheduled, for example, at 10 minute intervals, in the
Windows Task Scheduler.
Ensure the Report Scheduler is not set to run in continuous mode. The Report Scheduler
should be launched from the Windows Scheduler. In this way, it will connect to the
database, process the reports, then close down until re-launched by the Windows
Scheduler.
• The PID file and log file settings are correctly set.
To test the Process Monitor, kill the Report Scheduler during testing and check that the
Process Monitor removes the PID after the specified time.
To receive more detailed log messages (for testing purposes), set the severity level of the
log file to Debug.
• The database user can connect to the database.
• You are using an appropriate storage type. TEOCO recommends using the configuration
(INI) file rather than saving to registry.
• You are using the Report Scheduler stand-alone executable application and not the older
Windows NT service application.
You can check this in the Windows Services window. If the service is installed, uninstall it
by typing the following at the command prompt:
OptimaReportScheduler /uninstall
You specify the port number using the Report Scheduler configuration utility. For more
information, see Configuring the Report Scheduler on page 396.
• If an authenticated SMTP username and password are required for the Report Scheduler
to be able to send email.
If you are having problems, try typing a valid username into the Report Scheduler
configuration utility or try leaving it blank. If "SMTP Authentication Error" occurs, it means
the Report Scheduler is using an invalid username. In this case, you should use the
configuration utility to either remove the SMTP username or add a valid one. For more
information, see Configuring the Report Scheduler on page 396.
Note: If you remove the username, users outside the domain may not be able to receive
the emails. In this case, request that the Customer’s IT department enable relaying without
authentication for the OPTIMA server.
When using the Report Scheduler, you can also consult the following sources of information:
• For additional security restrictions on the mail server. For example, an error message of
"Connection closed gracefully" when using the telnet command indicates that the mail
server is closing the connection. Contact the customer's IT department for more
information about security restrictions on the mail server.
• The log files in the common log directory for additional error messages. To receive more
detailed log messages, set the severity level of the log file to Debug. TEOCO recommends
reducing the logging level again, once the Report Scheduler is working correctly.
[LogCompIniSection]
LogMaint_SuccessFul=0
LogMaint_UnSuccessFul=1
LogMaint_LogSize=512
LogMaint_OverrideEventsAuto=0
LogMaint_ClearManually=0
LogMaint_ClearLogbySize=1
LogMaint_Days=7
EnableLogMaintenance=0
LogMaint_TimeInterval=60
GenerateDailyLogs=1
PRID=789456123
LogLevel=WARNING
EnableLogging=1
LogDir=/OPTIMA_DIR/<application_name>/log
[Database]
UserName=test1
Password=ENC(Gev\wPrn)ENC
DBService=TEST
[Email]
SMTPServer=test.server
SMTPUserName=test.user
PortNumber=25
[email protected]
AllowExportToEmail=1
[Debug]
EnableDebugInEventViewer=1
EnableAttachmentDir=1
AttachmentDir=/OPTIMA_DIR/<application_name>/Debug
[StandAlone]
RunContinuous=1
[LogSettings]
LogDirectory=/OPTIMA_DIR/<application_name>/Log
[TIMEZONE]
UseTimeZone=1
OptimaAbbrev=opt_1014
TimeZoneName=Australia/NSW
TimeZoneAbbrev=LMT
OptimaDescription=Test
SystemBias=0
SystemStandardName=GMT Standard Time
SystemStandardBias=0
SystemDaylightName=GMT Standard Time
SystemDaylightBias=-60
Important: In the [DIR] section (also called [Directory Parameters] in some INI files), you should
replace:
• OPTIMA_DIR with the OPTIMA home directory, which is set as an environment variable,
and will be different depending on whether you are using Windows or UNIX
• <application_name> with the name of the application (minus any file extension)
• Forward slashes with back slashes, if you are using Windows
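For example, on a Windows installation where the OPTIMA home directory is D:\OPTIMA (an
illustrative path, not a default), the log directory line for the stand-alone Report Scheduler would
become:
LogDir=D:\OPTIMA\OptimaReportSchedulerGUI\log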
About OPTIMA Alarms
The alarms defined in the OPTIMA front end are processed by two backend programs:
• The Alarms Processor checks the next schedule date of each alarm and then processes
and updates any alarm whose schedule date is due. For more information, see About the
Alarms Processor on page 409.
• The Alarm Notifier polls the database for recently raised alarms and sends alarm
notifications via email or SMS. For more information, see About the Alarm Notifier on page
414.
Important: When using OPTIMA alarms, it is important to run the Alarms Maintenance
scheduled job periodically, in order to ensure that performance is kept at an optimum level.
For more information, see Maintaining Alarms on page 429.
Important: You can run more than one instance of the Alarms Processor, but to avoid locking
records, you should use the Filter parameter to enable the Alarms Processors to process different
alarm definitions. For more information, see Configuring the Alarms Processor on page 410.
In Windows, type:
opx_ALM_GEN_817.exe opx_ALM_GEN_817.ini
In Unix, type:
opx_ALM_GEN_817 opx_ALM_GEN_817.ini
Note: In usual operation, all applications are scheduled within the data loading architecture.
Parameter Description
LogDir The location of the directory where log files will be stored.
PIDFileDir The location of the directory where monitor (PID) files will be created.
TempDir The location of the directory where temporary files will be stored.
[DIR]
LogDir=/OPTIMA_DIR/<application_name>/log
TempDir=/OPTIMA_DIR/<application_name>/temp
PIDFileDir=/OPTIMA_DIR/<application_name>/prid
[MAIN]
LogGranularity=3
LogLevel=1
RefreshTime=0
RunContinuous=0
StandAlone=0
MachineID=000
ProgramID=817
InstanceID=001
Verbose=1
[OPTIONS]
Database=OPTRAC_VM
UserName=OPTIMA_ALARM_PROC
Password=ENC(l\mlofhY)ENC
Filter=VENDOR=201
8011 Definition ID: <definitionID> current time < next polling time. DEBUG
8012 Definition ID: <definitionID> current time > next polling time. DEBUG
8013 problemTextSet for query set: <problemText>. DEBUG
8014 Error: <currentDateTime>. DEBUG
8015 Procedure Type: <type>. DEBUG
8016 Oracle Error: <oracleErrorCode>. DEBUG
8017 Definition ID: <definitionID>. DEBUG
8018 SQL Query <sqlQuery>. DEBUG
8019 Problem Text <problemText>. DEBUG
8020 MaxRows <numOfRows>. DEBUG
8021 <ExceptionErrorMessage>. CRITICAL
8022 <ExceptionErrorMessage>. CRITICAL
AlarmNotifier.exe
Warning: If you install the Alarm Notifier to a different directory, you must also copy the crypter.dll
to that directory.
Important: You must install the backend components using the OPTIMA combined backend
package, which will install the required components quickly and easily. For more information on
how to use this, see Installing the OPTIMA Combined Backend Package on page 21.
For more information, see Required Tables for the Alarm Notifier.
If you have used the OPTIMA Database Installer, these tables and grants should be generated
automatically.
If you have not used the Database Installer, please contact Product Support to obtain the required
scripts.
The application is started, with the Alarm Notifier dialog box minimised.
- or -
Right-click the Alarm Notifier icon in your system tray and, from the menu that
appears, click AIRCOM OPTIMA Alarm Notifier.
When the Alarm Notifier is disabled, the Disabled Alarm Notifier icon is
displayed in your system tray.
Last Execution The date and time that the processing of alarms was last completed.
Current Action The action that is currently being performed. The possible actions are:
• Waiting
• Executing
• Updating Log
• Error
Status pane
EXECUTE NOW Click this button to force an immediate execution of the Alarm
Notifier.
Note: The EXECUTE NOW button is disabled when the Automatic
Execution Enabled checkbox is selected.
Automatic Execution Enabled Select this checkbox if you want the Alarm Notifier to automatically
execute at a specified polling interval. You set the polling interval on
the Database tab. See Configuring Database Settings on page 421
for more information.
When Automatic Execution is enabled, the time remaining (in
seconds) until the next scheduled execution is shown in a progress
bar.
Execution pane
1. In the Alarm Notifier dialog box, on the Modem Configuration tab, complete the following
information:
Comm Port From the drop-down list, select the port number on the host
machine to which the handset is connected.
Comm Settings This field displays the default baud rate, parity, data bit, and stop
bit settings for the majority of handsets.
Tip: For newer handsets, you may be able to improve performance
by increasing the baud rate.
Note: TEOCO recommends that you:
• Do not change the Comm settings if the Alarm Notifier is
working correctly
• Consult the modem manufacturer for advice if you do need to
change the Comm settings
Use DTR (Data Terminal Ready) Select this checkbox if you are using a modem that is configured to
use DTR.
The RS-232 Data Terminal Ready signal is lowered when the
computer wants the modem to hang up, for example when a
user's login session ends or when they fail to respond to the login
prompt.
Tip: Try using this option if your Comm port is set correctly but
your modem is not responding correctly.
Note: This option is not supported by all handsets.
Allow Priority SMS's to be sent Select this checkbox if you want to send alarm notifications as 16-
(Flash SMS) bit text messages of class 0 (Flash SMS). On phones that support
this feature, alarm notifications will appear as Flash SMS (also
called blinking SMS or alert SMS) messages.
The user does not have to open or delete this message; it
appears immediately on the handset.
Note: The flash SMS feature is not supported by all handsets, so
test before using this as a standard.
Enable Interactive SMS Select this checkbox if you want to use interactive SMS. Enabling
interactive SMS means that the Alarm Notifier can perform certain
actions, such as returning information, by responding to specified
keywords received via SMS. For information about using
interactive SMS, please contact TEOCO Support.
Set Message Centre Number Click this button to set the Short Message Service Centre (SMSC)
number on the attached modem or handset. In the dialog box that
appears, type the Message Service Centre number and click OK.
Your network operator can provide you with this information.
Note: The Message Centre Number can be a maximum of 11
digits long.
Check Phone PIN State Click this button to check whether a phone has a PIN code set
and to unlock its SIM card for use.
You should use this if you are using a modem that has no other
interface.
Test Interactive SMS Click this button if you have configured interactive SMS and you
want to test the response without having to send the modem an
SMS message.
When prompted, enter the phone number that will be used as the
incoming SMS number, and then enter a keyword.
The Alarm Notifier dialog box will return the response that would be
sent to the user, had they SMS'd that keyword from that number to
the modem or mobile handset attached to the host PC.
The results of the test are displayed in the Modem Test Response
window.
Test Modem Settings Click this button to test that you have correctly configured your
modem settings. The results of the test are displayed in the Modem
Test Response window.
1. In the Alarm Notifier dialog box, on the Mail Configuration tab, complete the following
information:
Mail Server IP Address / Name Type the hostname or IP address of the mail server to connect
to.
Mail Sent From (Alarm User Alias) Type the sender's name that will appear in the From field of
received email notifications.
Note: This field is not used for authentication and can be set to
anything.
1. In the Alarm Notifier dialog box, on the Database Configuration tab, complete the
following information:
User Name Type the user name you want to use to connect to the database.
It is recommended that you connect as the same user that owns
the required tables described in Prerequisites for Using the Alarm
Notifier on page 414.
Password Type the password required to connect to the database.
This password is encrypted when it is written to the INI file.
Database Select the SID that identifies the database to connect to from the
drop-down list. You can also type the name into the same box.
Tip: You can find the SID listed in the Oracle tnsnames.ora file.
Database Alarm Polling Interval Type the interval (in seconds) that the program will wait before
checking for new alarms when in automatic execution mode. See
Executing the Alarm Notifier on page 417 for more information.
Verbose Logging (For Debug Select this checkbox if you want to show more detailed information
Purposes) about actions performed and errors encountered in the Current
Actions window. See About the Current Actions Window on page
425 for more information.
Warning: This option is useful for debugging the application, but
can cause slower performance if many alarms are being
processed.
Do not process cleared alarms Select this checkbox if you do not want the Alarm Notifier to send
notifications for cleared alarms (in other words, notifications that
state an alarm condition no longer exists).
Test Database Settings Click this button to test that the Alarm Notifier can successfully
connect to the database with the parameters you have set. The
results of the test are displayed in the Database Test Response
window.
1. In the Alarm Notifier dialog box, on the SMSC Configuration tab, complete the following
information:
Address Type the socket network address of the SMSC (either TCP/IP or X.25),
for example, 192.168.88.1 for TCP/IP connections.
Port Type the port which is used for TCP/IP connections only.
Single port connectivity only Select this checkbox if you are using single port connectivity.
Normally, an SMPP connection requires two socket connections: one for
transmitting and one for receiving SMS messages. However, some
SMSC operators only provide a port for transmitting messages or handle
both operations on a single socket connection.
Important: Only change this setting if instructed to by your network
operator.
System ID Type the System ID provided by your network operator. This setting is
used to identify you or your application.
System Type Type the System Type provided by your network operator.
Notes:
• This setting is used as additional information to identify your
application.
• This setting is optional.
Password Type the password required to connect to the SMSC.
This is encrypted when it is written to the log file.
TON Type the short value representing the Type of Number (TON) of the
address for your application, for example, this could be a TCP/IP
address. If you have not been provided with this information, type 1 in
this field.
This information is often used by the SMSC for internal billing.
NPI Type the short value representing the Numbering Plan Indicator (NPI) of
the address for your application; for example, this could be a TCP/IP
address. If you have not been provided with this information, type 1 in
this field.
This information is often used by the SMSC for internal billing.
SMPP Version Type the long value representing the SMPP Interface version that your
application supports.
Notes:
• Some older SMSC implementations require a one digit value, for
example, 3, whereas more recent implementations expect a two-
digit value, for example, 33 or 34.
• The SMPP Interface version must be sent in hexadecimal format. If
you are using the 3.4 version, a hexadecimal 0x34 value must be
sent. To achieve this, set the SMPP Interface version to 52, which
corresponds to the required 0x34 hexadecimal value. You do not
need to do this if you are using a 3.3 or lower version, as
hexadecimal values from 0x0 to 0x33 are allowed.
Transceiver Select this checkbox if the transmitting and receiving of SMS messages
is to be handled via a single port.
Notes:
• When using this option, ensure you also select the Single port
connectivity only checkbox.
• This option is not required for standard SMPP links.
Important: Only change this setting if instructed to by your network
operator.
Send from Type the sender information which is shown when the message arrives
at the mobile. Usually this is a mobile number in international format or a
short number identifier. Request this information from your network
operator, if you are unsure.
Note: This setting can be an alphanumeric string but TEOCO
recommends testing whether your SMSC operator supports
alphanumeric senders.
Dest TON Type the short value representing the TON for the bstrDestination value.
If you have not been provided with this information, type 1 in this field.
Dest NPI Type the short value representing the NPI for the bstrDestination value.
If you have not been provided with this information, type 1 in this field.
Validity(hr) Type the long value containing the validity period of the SMS message in
hours. The validity period determines how long a message is stored by
the SMSC and how long it tries to deliver it to the mobile if the mobile is
not reachable. The maximum Validity value depends on the SMSC
operator but the range is between 48 and 72 hours.
Important: If your SMSC does not support this setting, type 0 in this
field.
Source TON Type the short value representing the TON for the bstrOriginator value. If
you have not been provided with this information, type 1 in this field.
Source NPI Type the short value representing the NPI for the bstrOriginator value. If
you have not been provided with this information, type 1 in this field.
Option Type the long value representing the SMS message option. The
following options are available:
0 - Normal SMS messages
2 - Delivery notification
4 - Direct display messages
8 - 8bit encoded messages
16 - User Data Header (logo or ringing tone)
32 - Virtual SMSC
64 - Unicode messages
128 - EMS messages
Warning: Do not change this setting unless instructed to by your
network operator. Incorrect use of this option can cause the Alarm
Notifier to fail when attempting to send alarm notifications.
Use SMSC as primary send Select this checkbox if you want notifications to be sent via the SMSC.
mechanism
The Alarm Notifier will first attempt to send a notification via the SMSC. If
this fails, it will then attempt to send the notification via an attached
modem or handset. This method provides a backup send mechanism in
the event of a LAN failure.
Use SMSC Keep Alives Select this checkbox if you want an Enquire Link request to be sent to
the SMSC every thirty seconds to ensure that the connection to the
SMSC does not time out during periods of inactivity.
Tip: Try using this option if errors occur after periods of inactivity but the
connection worked correctly initially.
Note: This setting is not required by all SMSCs.
Test SMSC Settings Click this button to test that your SMSC configuration is set up correctly.
The results of the test are displayed in the Server Test Response
window.
Tip: You can display more detailed information in the Current Actions window by selecting the
Verbose Logging checkbox on the Database Configuration tab. For more information, see
Configuring Database Settings on page 421.
Right-click the Alarm Notifier icon in your system tray and, from the menu that
appears, click Show Status Window.
- or -
In the Alarm Notifier dialog box, click the Show Status Window button
Note: When you open the Current Actions window for the first time, it appears minimized in the
top left-hand corner of your screen:
You should locate and resize the Current Actions window as you require, and it will then open with
same location and dimensions in the future.
Alarm Validity
You need to manually define values in the ALARM_SEVERITY table in the OPTIMA database:
Important: If these options are missing or invalid, the Notifier will assume all notifications are valid
forever, and all notifications will always be sent immediately.
To understand how these parameters work together, consider the following examples:
1 MAJOR 1 2
2 MINOR 3 5
3 CRITICAL 0 0
If for any reason a notification cannot be sent within the specified validity period (or if the Notifier is
disabled and re-enabled at a later date), then when the Notifier processes alarms that have
exceeded their validity period, it will not send any notification, and the event will be logged as
Expired.
You can also define periods when notifications cannot be sent, depending on their severity type.
For example, you may want to avoid sending non-critical notifications to users in the middle of the
night.
To do this, you need to manually define values in the ALARM_SEVERITY_NOT_RULES table in
the OPTIMA database:
The time between the two START_TIME and END_TIME entries on the specified day will be a
'blackout' period, where notifications will be put on hold, and sent at the next available time that falls
outside that period. If the notification should expire during this waiting period, it will never be sent.
You must add as many entries per severity as needed to create all of the blackout periods required
by the customer. While a notification is on hold in this way, the Alarm Notifier will log it as 'On Hold
Due to Severity Time'.
To understand how these parameters work together, consider the following examples:
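As a purely hypothetical illustration (the SEVERITY, DAY, START_TIME and END_TIME column
names and value formats are assumed, not confirmed; check the table definition in your database
first), a blackout period for MINOR notifications between 00:00 and 06:00 on Sundays might be
defined as:
insert into alarm_severity_not_rules      -- column names assumed
  (severity, day, start_time, end_time)
values
  ('MINOR', 'SUNDAY', '00:00', '06:00');
commit;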
You need to manually define values in the ALARMS_INTERACTIVE table in the OPTIMA
database:

SQL   1   Y   VARCHAR2(4000)
The SQL statement that provides the information you want the Alarm Notifier to return when the
specified keyword is received from the user.
Tip: You can create the SQL statement in an SQL editor, and paste it into this field.
When defining this field, you should remember the following:
• You can only send a limited amount of characters via SMS, so your statement should only
return a small amount of information
• Ensure that you name the returned data fields something appropriate, as the headers will be
sent with the information to identify it
Tip: You can use placeholders in your SQL query, which will be substituted at runtime by
parameters sent by the user in the SMS request. Placeholders are defined using the pipe
character "|" and then a number, starting from 2 and going up to as many as required. Each word
that the user sends after the keyword is treated as a new placeholder and associated with its
relative place. For example, if a user sends the text 'TEST CAT DOG', the Alarm Notifier would
read TEST as the keyword, CAT as placeholder |2 and DOG as placeholder |3.

DESCRIPTION   2   Y   VARCHAR2(200)
The information that starts the message sent back to the user, and so should be short and
descriptive.

KEYWORD   3   Y   VARCHAR2(20)
The word the user must SMS to the Alarm Notifier to have it perform a particular query. It must be
stored in the database in uppercase, but the user can SMS it to the Alarm Notifier in any format.
To understand how these parameters work together, consider the following examples:
Note: In the second example, the placeholder '|2' is included, to be substituted at runtime.
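As a hypothetical illustration of such a row (the REGION column in the SQL is invented for this
sketch; use columns that exist in your ALARMS table):

KEYWORD:     COUNT
DESCRIPTION: Alarm count for a region
SQL:         select region, count(*) as alarm_count
             from aircom.alarms
             where region = '|2'
             group by region

A user texting 'COUNT NORTH' would then receive the alarm count for the NORTH region, with
NORTH substituted for the |2 placeholder.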
Maintaining Alarms
When using OPTIMA alarms, it is important to periodically run the Alarms Maintenance scheduled
job (AIRCOM.OPTIMA_ALARMS.Maintain_Alarms_Table), using DBMS_Scheduler. This job will:
• Delete all of the old alarms
• Reduce the size of the AIRCOM.ALARMS table and its primary key to reduce the space
used following the delete
• Gather statistics on the AIRCOM.ALARMS table and its primary key
Tip: It is recommended that this is run once daily per schema, at night-time.
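For example, a nightly maintenance job could be created with DBMS_SCHEDULER as follows
(the job name and run time are illustrative; if your version of Maintain_Alarms_Table takes a
schema parameter, add it to the job_action):
begin
  dbms_scheduler.create_job(
    job_name        => 'MAINTAIN_ALARMS_NIGHTLY',   -- example job name
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'begin AIRCOM.OPTIMA_ALARMS.Maintain_Alarms_Table; end;',
    start_date      => systimestamp,
    repeat_interval => 'FREQ=DAILY; BYHOUR=1',      -- example: run at night
    enabled         => true);
end;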
To configure this scheduled job, ensure that the following parameters are set correctly in the
OPTIMA_Common table:
1. If you are using the Web-based USER ALARM Viewer, then set the
ALARMS_USEACKNOWLEDGE parameter to 1. By default this is 0.
2. Define the number of days for which to keep alarms after they have been cleared,
acknowledged, forwarded or notified (as appropriate), using the
ALARMS_DELETEAFTERDAYS parameter. After this number of days has been exceeded
(the default is 1 day), the next time the Alarms Maintenance scheduled job is run, then the
following rules will be followed:
o If Definition is set for SNMP Forward, then the alarm will only be deleted if both the
SET and CLEAR events have been forwarded
o If an Alarm Handler is Active for an alarm, then the alarm will only be deleted if both the
SET and CLEAR events have been processed by the Alarm Handler
o If the Web-based USER ALARM Viewer is being used, then the alarm will only be
deleted if both the SET and CLEAR events have been acknowledged by the user
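As an illustration only (the OPTIMA_Common table's key/value column names are assumed here
to be NAME and VALUE; verify them against your installation), the retention period could be
changed to 3 days as follows:
update optima_common
set    value = '3'                           -- assumed VALUE column
where  name  = 'ALARMS_DELETEAFTERDAYS';     -- assumed NAME column
commit;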
When trying to test email settings using the Alarm Notifier, the following message is received:
"Authenticated message sent. SMTP session closed". However, the tested email address is not
receiving the email.
Possible causes and solutions:
• If the Notifier says the mail has been sent, then there can be a problem with the SMTP server
or the client receiving the mail. You would have to trace the event through the SMTP server
logs.
• There can be a problem with the 'From' email address. Check if there is a space in the email
address you have provided in the 'Mail Sent From' data; if there is a space, it will not work.
Make sure that there are no spaces in the 'Mail Sent From' data.
• Check if there is an '@' symbol in the 'Mail Sent From' data; it will not work without it. The relay
servers require an email address with the symbol and will not forward the mail without it. For
example, make the from address [email protected] or something similar.

Email sent by the Notifier does not reach the recipient.
Possible causes and solutions:
• The Exchange Server might block the emails for different reasons. Talk to the IT department
and make sure that the emails sent from the Notifier are not blocked.
• If there is third-party anti-spam software installed on the Exchange server, it might scan the
contents of every mail and delete the mail if it is recognised as spam. Ensure that you get the
anti-spam software to exclude OPTIMA emails from being blocked.

When running the Alarms Notifier, an error message pops up: 'NOTIFIED invalid identifier. No
further processing will be done'.
Possible cause: The database was not upgraded properly, or the Alarms table does not have all
the columns used by the Alarms Notifier. Make sure that the database is upgraded properly and
that the Alarms table contains all the necessary columns for the Alarms Notifier.

The user does not want a user login on any of their servers, and does not want an open session
on the Windows NT server.
Possible cause: Not using the appropriate Login option. The Alarms Notifier provides different
authentication types such as None, Pop, Login, Plain etc. (Mail Configuration tab, Authentication
type). If the user chooses the 'None' option, they do not need to provide a username and
password.

Only 'set' alarms are notified; no notification is made for clear alarms.
Possible causes and solutions:
• The option to send notification for Clear Alarms was not selected when configuring the
handler. Make sure that in the Alarm Handler definition the 'Apply Handler on Clear Alarms'
option is checked, and that the 'Do not process cleared alarms' checkbox on the Database
Configuration tab in the Alarms Notifier is not selected.
• The version of the Alarm Notifier used might not have this feature; notification for Clear
Alarms is only made from Alarm Notifier V3.2 onwards. Install Alarms Notifier version 3.2.

Once the option "Use SMSC Keep Alives" is selected, the SMS Notifier disconnects another
application developed by the customer that is also connecting to the SMSC server, because both
applications are using the same account to connect to the SMSC.
Possible cause: Multiple programs cannot use the same account because the Alarms Notifier
keeps its session to the SMSC open permanently. You should configure a dedicated account for
OPTIMA.

Delay in receiving SMS notifications.
Possible cause: The Alarms and Log tables may have grown too large, which causes a delay in
the processing of the alarms. In this case, the Notifier may:
• Send the SMS late
• Send the SMS on time, but with a delay in the SMSC before the notification is delivered
Implement jobs to clean up the log and alarms tables for older events. Investigation needs to be
done by the operator to resolve the delays in the SMSC.

The error 'Send via SMSC failed: CIMD2 Error Code {11}' appears.
Possible cause: This is a standard SMSC status error, which means 'Teleservice not provisioned'.
Check the Send From mobile phone number and verify that it is numeric.
Note: This can be alphabetic, although you must check with your SMSC.

The Alarms Notifier is sending the total backlog of alarms, not just the latest.
Possible cause: The Alarms Notifier queries all alarms where the NOTIFIED field in the ALARMS
table is either NULL, 0 or 2, and then sends the notification accordingly. If you want to stop this,
you will have to manually update the ALARMS table by setting the NOTIFIED field value to 1.
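For example, a one-off cleanup that marks the current backlog as already notified might look like
this (review the WHERE clause carefully before running it, because this suppresses all pending
notifications):
update aircom.alarms
set    notified = 1
where  notified is null
or     notified in (0, 2);
commit;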
About the SNMP Agent
OPTIMA uses the SNMP Agent to provide an outgoing interface for alarms compliant with X733
through SNMP protocols. SNMP clients can request information from the SNMP Agent about
alarms in the database. The SNMP Agent can also send SNMP traps to these SNMP clients.
The SNMP Agent uses a MIB (Management Information Base), which is a virtual database used for
managing the SNMP entities. If you install the SNMP Agent using the OPTIMA Combined Backend
Package (recommended), then the required MIB can be found in \Program Files (x86)\AIRCOM
International\AIRCOM OPTIMA Backend 8.0\Documentation.
Fault management systems can integrate with OPTIMA's SNMP interface, which provides SNMP
trap forwarding to named IP addresses and an SNMP Agent for more granular interaction by the
FM system.
• A coldstart trap is sent to the FMS when the system is first initialised to notify the FMS that
the Agent is active.
• The ability to perform an FMS-initiated re-synchronization. The FMS can set a writable re-
synchronization flag in the MIB via an SNMP SET command. While the flag is set, traps are
not sent but are stored until the re-synchronization flag is reset.
As well as the basic configuration of one agent and one FMS shown above, you can configure the
SNMP interface using a number of other combinations. For more information on configuring
possible scenarios with different numbers of agents and FMSs, see:
• Configuring the SNMP Interface for a Single Agent and Multiple FMSs on page 434
• Configuring the SNMP Interface for Multiple Agents and Multiple FMSs on page 435
Configuring the SNMP Interface for a Single Agent and Multiple FMSs
It is possible to configure the SNMP interface for a variety of scenarios, including a single agent
and different numbers of FMSs.
This diagram shows an example scenario, with one agent and two FMSs:
To define this sort of configuration, you should use the 'IPAddress', 'Port' and 'Community'
parameters in the [TRAP-LISTENER] section of the ini file to specify the location of each FMS that
you want to use.
For more information, see Configuring the SNMP Agent on page 437.
If you are using this sort of configuration, you should consider the following points:
• The tables in the database for alarms are mapped to the X.733-compliant MIB views, which
have dependencies on the columns and the type of data in the column, including size limits
• There is no restriction on the SNMP GET and SET requests that are managed by the agent
Configuring the SNMP Interface for Multiple Agents and Multiple FMSs
It is possible to configure the SNMP interface for a variety of scenarios, including different numbers
of agents and different numbers of FMSs.
This diagram shows an example scenario, with three agents and five FMSs:
If you are using a configuration with multiple agents and multiple FMSs, you should consider the
following points:
• When configuring different Alarm events to go to different FMSs, the Alarm SET events
have to be mutually exclusive. This is because of the method used to record events sent
using SNMP TRAP events.
• The FMSs do not need to map to the Alarm SET events, and an FMS may receive different
alarms for different agents.
In the example scenario above, three Alarm SET events have been created - A, B and C. These
events have their own CUSTOMIZABLE CONTENT MIB views, which are configured in three
separate INI files, one for each Agent.
Note: The standard deployment scripts provide three MIB views for PERFORMANCE, SYSTEM
and TCA alarms respectively.
Type the executable name and a configuration file name into the command prompt. If you are
creating a new configuration file, this is when you choose the file name.
In Windows type:
opx_ALM_GEN_820.exe opx_ALM_GEN_820.ini
In Unix type:
opx_ALM_GEN_820 opx_ALM_GEN_820.ini
Important: If you are creating an SNMP interface with multiple agents, you should create a
separate INI file for each agent, using the 'ExtEnterpriseOid' parameter to differentiate between
agents.
The SNMP Agent configuration (INI) file is divided into seven sections.
Parameter Description
FolderFileLimit The maximum number of output files that can be created in each output (sub)
folder.
This must be in the range of 100-100,000 for Windows, or 100-500,000 on
Sun/UNIX, otherwise the application will not run.
Warning: Depending on the number of files that you are processing, the lower
the file limit, the more output sub-folders that will be created. This can have a
significant impact on performance, so you should ensure that if you do need to
change the default, you do not set the number too low.
The default value is 10,000.
InstanceID The three-character program instance identifier (mandatory).
InterfaceID The three-digit interface identifier (mandatory).
LogGranularity Defines the frequency of logging, the options are:
0 - Continuous
1 - Monthly
2 - Weekly
3 - Daily
Parameter Description
LogLevel (or LogSeverity) Sets the level of information required in the log file. The available
options are:
1 - Debug
2 - Information (Default)
3 - Warning
4 - Minor
5 - Major
6 - Critical
LogOptions 0 - Do not generate a log file.
1 - Generate a log file to the specified directory.
ProgramID The three-character program identifier (mandatory).
UseFolderFileLimit Indicates whether the folder file limit should be used (1) or not (0).
The default value is 0 ('OFF').
Verbose 0 - Run silently. No log messages are displayed on the screen.
1 - Display log messages on the screen.
TestConnectionDelay When attempting to recover from a loss of database connection, the SNMP
Agent will test the database connection each time it queries the database to get
the latest alarms data.
If the connection is lost, the SNMP Agent will attempt to reconnect three times; if
the connection is not restored after this, the SNMP Agent will terminate.
This parameter specifies the number of seconds to delay before each re-
connection attempt. The default value is 30.
Parameter Description
AlarmTableView The database view to query when populating the Alarm table in the MIB. For
more information, see Configuring Views on page 444.
DbPollInterval The database polling interval in minutes.
EnterpriseOid The Enterprise OID used in the MIB.
Note: The Enterprise OID is 23322.
ExtEnterpriseOid The Enterprise OID used when sending traps.
By default this is 0 (not used), but can be specified if you do not want to use the
Enterprise OID when sending traps.
HeartbeatTrapInterval The time in minutes between sending the heartbeat trap.
ObjectTableView The database view to query when populating the Object table in the MIB. For
more information, see Configuring Views on page 444.
Port The port number on which the SNMP Agent listens for incoming requests.
ReadCommunity The community string used in the GET, GETNEXT request.
ResyncTable The database view to query when sending alarm traps due to a
resynchronization. For more information, see Configuring Views on page 444.
ResyncType The resynchronization type:
0 - Agent.
1 - Manager.
SendEndOfResyncTrap 0 - Do not send an end of resynchronization trap.
1 - Send an end of resynchronization trap.
Parameter Description
SysLocation The location where the SNMP Agent is running, for example, a physical location
or a machine name.
SysName The name of the SNMP Agent. The default setting for this parameter is OPTIMA
SNMP Agent.
TrapGuardPeriod The delay time (in milliseconds) after each trap is sent, for example,
TrapGuardPeriod=1000 means a 1 second delay after each trap is sent.
TrapView The database view to query when sending alarm traps. For more information,
see Configuring Views on page 444.
WaitForRequestTimeoutSeconds The time in seconds to wait for incoming requests.
WriteCommunity The community string used in the SET request.
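Drawing these parameters together, an illustrative [SNMP-AGENT] section might read as follows
(the view names and most of the values are examples only, not defaults):
[SNMP-AGENT]
AlarmTableView=ALARM_TABLE_VIEW
ObjectTableView=OBJECT_TABLE_VIEW
TrapView=TRAP_VIEW
ResyncTable=RESYNC_ACTIVE_ALARMS
DbPollInterval=5
Port=161
ReadCommunity=public
WriteCommunity=private
EnterpriseOid=23322
ExtEnterpriseOid=0
HeartbeatTrapInterval=15
TrapGuardPeriod=1000
ResyncType=0
SendEndOfResyncTrap=1
SysName=OPTIMA SNMP Agent
SysLocation=ServerRoom1
WaitForRequestTimeoutSeconds=10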
Important: If you want to send traps to multiple destinations, you should specify the Community,
IPAddress and Port for each destination as separate entities. The first set of parameters should
have no suffix, but the parameters for each additional destination should be suffixed by a number,
starting with 1 for the second destination (Community1, IPAddress1, Port1), 2 for the third
destination (Community2, IPAddress2, Port2) and so on.
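For example, a [TRAP-LISTENER] section sending traps to two FMSs might look like this (the
addresses and community strings are illustrative; 162 is the standard SNMP trap port):
[TRAP-LISTENER]
Community=public
IPAddress=192.168.1.10
Port=162
Community1=public
IPAddress1=192.168.1.11
Port1=162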
Parameter Description
Admin_clear, Clear, Critical, Information_only, Intermediate, Minor, Major, Warning
This section maps the OPTIMA Severity levels onto corresponding MIB perceivedSeverity values.
One of the available options can be mapped to each parameter, where each parameter represents
an OPTIMA Severity level. The available MIB perceivedSeverity options are:
0 - Indeterminate
1 - Critical
2 - Major
3 - Minor
4 - Warning
5 - Cleared
For example, 'Clear=5' indicates that the OPTIMA Severity level 'Clear' corresponds to the MIB
perceivedSeverity value of 5 (Cleared).
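For example, the severity mappings might be set as follows (these assignments are illustrative,
apart from 'Clear=5', which is the mapping described above):
Admin_clear=5
Clear=5
Critical=1
Information_only=0
Intermediate=0
Minor=3
Major=2
Warning=4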
1. The SNMP Manager (at the customer end) starts a resync with an SNMP SET request on
the resyncFlag OID to 1.
3. The Agent queries the database view defined in the ResyncTable parameter in the [SNMP-
AGENT] section of the INI file.
4. The Agent sends a trap for each row from this database view.
5. If the SendEndOfResyncTrap parameter in the [SNMP-AGENT] section of the INI file is set
to 1, then the Agent sends the endOfResyncTrap.
When the ResyncType parameter is set to Manager, the resynchronization proceeds as follows:
1. The SNMP Manager (at the customer end) starts a resynchronization by using an SNMP SET request to set the resyncFlag OID to 1.
2. The Agent stops sending new events as traps, and waits (it does not update the MIB table with new events).
3. The SNMP Manager uses an SNMP GET or SNMP WALK request to obtain all of the trap information from the TRAP MIB.
4. After the SNMP Manager has synchronized, it uses an SNMP SET request to set the resyncFlag OID to 0.
5. The Agent sends all of the new events (that occurred while the resyncFlag OID was set to 1) as traps.
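For illustration, assuming the Net-SNMP command-line tools are available and using a placeholder for the resyncFlag OID (the numeric OID is not listed in this section), the SNMP Manager could trigger a resynchronization as follows:

snmpset -v 2c -c public <agent_host>:161 <resyncFlag_OID> i 1

The community string must match the WriteCommunity value configured in the [SNMP-AGENT] section.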
When the ResyncType parameter is set to Agent, the SNMP Agent operates as follows. The agent reads the INI file, creates the SNMP session, connects to the database, creates the log file, creates the PRID file, and builds the MIB in memory. The agent then queries the AlarmTableView database view and populates the alarmsTable MIB table in the agent memory based on the result set returned from the database. The agent also queries the ObjectTableView database view and populates the objectsTable MIB table in the agent memory.
The agent then enters a main loop and performs the following steps:
1. The agent waits for an incoming request for up to WaitForRequestTimeoutSeconds seconds.
2. The agent responds to the request according to its PDU type:
o The agent responds to a PDU GET message by searching for the value of the OID in the MIB stored in memory
o The agent responds to a PDU GETNEXT message by returning the next OID after the OID received in the message
o The agent responds to a PDU SET message by setting the OID value from the message in the MIB memory
o No database connections are made when responding to these requests
Once the agent has finished responding to a request, or the wait for a request has timed out, it performs the following actions:
1. The agent checks whether a Heartbeat trap should be sent by comparing the pollHeartBeat value with the last time a Heartbeat trap was sent. When it is ready to send the heartbeat trap, the agent builds the heartbeat trap PDU using the values listed in the [HEARTBEAT_TRAP] section of the INI file. For more information on the [HEARTBEAT_TRAP] section, see Configuring the SNMP Agent on page 437. It then sleeps for TrapGuardPeriod milliseconds if the trap was sent without error and then resets the last heartbeat time.
2. The agent checks whether the resyncFlag value is 1. If it is, the agent performs the following actions:
o Queries the database view defined in the ResyncTable parameter
o For each row in the result set, the agent builds the trap PDU for the alarm by reading the 11 column values in the row, sends the alarm trap, sleeps for TrapGuardPeriod milliseconds if the trap was sent without error, and then inserts a row into the SNMP_UPDATE database table
o Once it finishes processing the result set, the agent calls the SNMP_PKG.SET_FWD_IN_ALL_ALARMS_TBL database procedure, which updates the ALARM table fields related to the SNMP Agent, resets the resyncFlag to 0 and, if SendEndOfResyncTrap is set, sends an endOfResyncTrap
3. The agent then checks the database polling time and sends any new alarm traps. The agent compares the last database polling time to the DbPollInterval:
o When database polling is due, the agent queries the TrapView database view
o For each row in the result set, the agent builds the trap PDU for the alarm by reading the 11 column values in the row, sends the alarm trap, sleeps for TrapGuardPeriod milliseconds if the trap was sent without error, and inserts a row into the SNMP_UPDATE database table
4. Once it finishes processing the result set, the agent calls the SNMP_PKG.SET_FWD_IN_ALL_ALARMS_TBL database procedure, which updates the ALARM table fields related to the SNMP Agent. If the TrapView database view is not empty, the agent:
o Clears the current alarmsTable and objectsTable MIB tables
o Queries the AlarmTableView database view and populates the alarmsTable MIB table in the agent memory
o Queries the ObjectTableView database view and populates the objectsTable MIB table in the agent memory
o Resets the last database polling time
When the ResyncType parameter is set to Manager, the SNMP Agent operates as follows. The agent reads the INI file, creates the SNMP session, connects to the database, creates the log file, creates the PRID file and then builds the MIB in memory. The agent then queries the AlarmTableView database view and populates the alarmsTable MIB table in the agent memory based on the result set returned from the database.
The agent also queries the ObjectTableView database view and populates the objectsTable MIB table in the agent memory based on the result set that is returned from the database.
The agent then enters a main loop and performs the following steps:
1. The agent waits for an incoming request for up to WaitForRequestTimeoutSeconds seconds.
2. The agent responds to the request according to its PDU type:
o The agent responds to a PDU GET message by searching for the value of the OID in the MIB stored in memory
o The agent responds to a PDU GETNEXT message by returning the next OID after the OID received in the message
o The agent responds to a PDU SET message by setting the OID value from the message in the MIB memory
o No database connections are made when responding to these requests
Once the agent has finished responding to a request, or the wait for a request has timed out, it performs the following actions:
1. The agent checks whether a Heartbeat trap should be sent by comparing the pollHeartBeat value with the last time a Heartbeat trap was sent.
2. When a Heartbeat trap is due, the Agent builds the heartbeat trap PDU using the values listed under the [HEARTBEAT_TRAP] section of the INI file. For more information on the [HEARTBEAT_TRAP] section, see Configuring the SNMP Agent on page 437. It then sleeps for TrapGuardPeriod milliseconds if the trap was sent without error and then resets the last heartbeat time.
3. The agent then checks the database polling time and sends any new alarm traps. The agent compares the last database polling time to the DbPollInterval:
o When database polling is due, the agent only sends traps if the resyncFlag value is 0
o The agent queries the TrapView database view
o For each row in the result set, the agent builds the trap PDU for the alarm by reading the 11 column values in the row, sends the alarm trap, sleeps for TrapGuardPeriod milliseconds if the trap was sent without error, and inserts a row into the SNMP_UPDATE database table
4. Once it finishes processing the result set, the agent calls the SNMP_PKG.SET_FWD_IN_ALL_ALARMS_TBL database procedure, which updates the ALARM table fields related to the SNMP Agent. If the TrapView database view was not empty and traps were sent, the agent:
o Clears the current alarmsTable and objectsTable MIB tables
o Queries the AlarmTableView database view and populates the alarmsTable MIB table in the agent memory
o Queries the ObjectTableView database view and populates the objectsTable MIB table in the agent memory
o Resets the last database polling time
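The main loop can be summarized in pseudocode. The following Python sketch is illustrative only: all function names and the loop bound are placeholders, and it does not correspond to the actual implementation.

import time

# Placeholder configuration values (in the real agent these come from the INI file).
WAIT_FOR_REQUEST_TIMEOUT_SECONDS = 1   # WaitForRequestTimeoutSeconds
DB_POLL_INTERVAL_MINUTES = 1           # DbPollInterval
HEARTBEAT_TRAP_INTERVAL_MINUTES = 10   # HeartbeatTrapInterval
TRAP_GUARD_PERIOD_MS = 1000            # TrapGuardPeriod

def wait_for_request(timeout):
    """Stub: wait up to `timeout` seconds for an incoming SNMP request."""
    time.sleep(timeout)
    return None

def respond_from_mib(request):
    """Stub: answer a GET, GETNEXT or SET from the in-memory MIB (no database access)."""

def send_trap(kind):
    """Stub: build and send a trap PDU, then observe the guard period."""
    print("sending", kind, "trap")
    time.sleep(TRAP_GUARD_PERIOD_MS / 1000.0)

last_heartbeat = last_poll = time.time()
for _ in range(3):  # the real agent loops until it is stopped
    request = wait_for_request(WAIT_FOR_REQUEST_TIMEOUT_SECONDS)
    if request is not None:
        respond_from_mib(request)
    now = time.time()
    if now - last_heartbeat >= HEARTBEAT_TRAP_INTERVAL_MINUTES * 60:
        send_trap("heartbeat")  # values from the [HEARTBEAT_TRAP] INI section
        last_heartbeat = now
    if now - last_poll >= DB_POLL_INTERVAL_MINUTES * 60:
        send_trap("alarm")      # one trap per row of the TrapView database view
        # ...then rebuild alarmsTable/objectsTable from AlarmTableView/ObjectTableView...
        last_poll = now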
Configuring Views
In the [SNMP-AGENT] section of the configuration (INI) file, several parameters specify database views to query. You can configure these views to control the behavior of the SNMP Agent.
Important: You cannot change the names or number of columns in these views, but you can
change the formulas that provide the data in them.
This view: Alarm Table
Is used for: Active events, that is, SET events that have no corresponding CLEAR event.
Uses these columns: X.735 alarm columns.
And must have this header:
CREATE OR REPLACE FORCE VIEW AIRCOM.SNMP_ALARM_MIB_<ALARM TYPE> (NOTIFICATIONID, ALARM_DATETIME, PERCIEVEDSEVERITY, FIRSTOCCURENCE, OCCURENCE, DEFINITION_ID, ELEMENT_ID, MANAGEDOBJECT, IDEVENTTYPE, IDPROBABLECAUSE, SPECIFICPROBLEM, PROPOSEDREPAIRACTION, ADDITIONALTEXT, TRENDINDICATOR)

This view: Object Table
Is used for: Objects for which alarms are valid.
Uses these columns: Object columns.
And must have this header:
CREATE OR REPLACE FORCE VIEW AIRCOM.SNMP_OBJECTS_<ALARM TYPE> (DEFINITION_ID, ELEMENT_ID, ELEMENT_NAME)

This view: Trap
Is used for: Unforwarded events, that is, events that have not been forwarded by the SNMP Agent.
Uses these columns: X.735 alarm columns.
And must have this header:
CREATE OR REPLACE FORCE VIEW AIRCOM.SNMP_TRAP_MIB_<ALARM TYPE> (NOTIFICATIONID, ALARM_DATETIME, PERCIEVEDSEVERITY, FIRSTOCCURENCE, OCCURENCE, DEFINITION_ID, ELEMENT_ID, MANAGEDOBJECT, IDEVENTTYPE, IDPROBABLECAUSE, SPECIFICPROBLEM, PROPOSEDREPAIRACTION, ADDITIONALTEXT, TRENDINDICATOR)
There are three different alarm types, and the type must be specified at the end of the view name, as follows:
• For performance alarms, use PERFORMANCE
• For system alarms, use SYSTEM
• For TCAs, use TCA
For example, the Alarm Table view for a system alarm is AIRCOM.SNMP_ALARM_MIB_SYSTEM,
and the Trap view for a TCA is AIRCOM.SNMP_TRAP_MIB_TCA.
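For illustration, a Trap view for performance alarms could be created as follows. This is a sketch only: the AIRCOM.ALARMS source table, its column names and the WHERE clause are assumptions; only the view name and the column list are fixed by the table above.

CREATE OR REPLACE FORCE VIEW AIRCOM.SNMP_TRAP_MIB_PERFORMANCE
  (NOTIFICATIONID, ALARM_DATETIME, PERCIEVEDSEVERITY, FIRSTOCCURENCE,
   OCCURENCE, DEFINITION_ID, ELEMENT_ID, MANAGEDOBJECT, IDEVENTTYPE,
   IDPROBABLECAUSE, SPECIFICPROBLEM, PROPOSEDREPAIRACTION,
   ADDITIONALTEXT, TRENDINDICATOR)
AS
SELECT a.notification_id,
       a.alarm_datetime,
       a.severity,                 -- the formulas may be changed
       a.first_occurence,
       a.occurence,
       a.definition_id,
       a.element_id,
       a.managed_object,
       a.id_event_type,
       a.id_probable_cause,
       a.specific_problem,
       a.proposed_repair_action,
       a.additional_text,
       NULL                        -- trendIndicator is not used
FROM   AIRCOM.ALARMS a             -- assumed source table
WHERE  a.forwarded = 0;            -- events not yet forwarded by the SNMP Agent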
Maintenance
In usual operation, the SNMP Agent should not need any special maintenance. However, TEOCO recommends that the following basic maintenance check is carried out for the SNMP Agent:
Log file for error messages. Weekly. In particular, any Warning, Minor, Major and Critical messages should be investigated.
You can either obtain the version details from the log file or you can print the information by typing
the following command at the command prompt:
In Windows:
opx_ALM_GEN_820.exe -v
In Unix:
opx_ALM_GEN_820 -v
For more information about versioning, see About Versioning on page 33.
Troubleshooting
The following table shows troubleshooting tips for the SNMP Agent:
Problem: Cannot save configuration (INI) file.
Possible cause: User has insufficient privileges on the configuration (INI) file or directory. Solution: Enable permissions.
Possible cause: The file is read-only or is being used by another application. Solution: Make the file writable, or close the application that is using it to release the configuration (INI) file.
Problem: Application exits immediately.
Possible cause: Another instance is running. Solution: Use Process Monitor to check the instances that are running.
Possible cause: Invalid or corrupt configuration (INI) file. Solution: Check the configuration (INI) file.
Problem: SNMP session not created.
Possible cause: Network problem. Solution: Report to the system administrator.
ID Message Severity
5005 Read Trap Sequence number from persistent file <SeqNoFileValue>. INFORMATION
5010 Could not access MIB object TrapSequenceNumber. WARNING
5015 Created <SeqNoFileValue> and set Trap Sequence value to 0. INFORMATION
5500 Could not access MIB object PollHeartBeat. WARNING
5505 Overriding INI HeartbeatTrapInterval with value from persistent file <HeartBeatFileValue>. INFORMATION
5510 Could not access MIB object PollHeartBeat. WARNING
5515 Created file <HeartBeatFileValue> and set PollHeartBeat to INI file value. DEBUG
Querying database for traps. INFORMATION
SQL->select * from <tableName> order by <alarmDate>. INFORMATION
5516 Empty result set, no traps to send. INFORMATION
5520 Query failed. Error-> <errorDetails>. MAJOR
5530 Sent alarm trap: Seq{<TrapSequenceNumber>} <ColoumName_ManagedObject> {<ManagedObjectValue>} <ColoumName_AlarmDate> {<AlarmDateValue>} to <TrapListenersAddress> / <TrapListenersPort>. DEBUG
5535 Failed to send alarm trap to <TrapListenersAddress> / <TrapListenerPort>. DEBUG
5540 Error executing SQL <sqlStatement> Error <errorDetails>. WARNING
5541 Inserted row into SNMP_UPDATE table. DEBUG
5550 Successfully updated database. INFORMATION
5555 Error occurred when updating database. WARNING
5560 Error occurred when updating database <errorDetails>. WARNING
6000 Sent coldstart trap to <TrapListenersAddress> / <TrapListenersPort>. INFORMATION
6005 Failed to send coldstart trap to <TrapListenersAddress> / <TrapListenersPort>. WARNING
6100 Sent Heartbeat trap Seq{ <TrapSequenceNumber> } to <TrapListenersAddress> / <TrapListenersPort>. INFORMATION
The following is an example SNMP Agent configuration (INI) file:
[MAIN]
LogGranularity=0
LogLevel=1
LogOptions=1
TrapsOnly=0
FileFolderLimit=0
InterfaceID=001
ProgramID=002
InstanceID=003
Verbose=1
TestConnectionDelay=30
[SNMP-AGENT]
Port=161
ResyncTable=AIRCOM.SNMP_ALARM_MIB_PERFORMANCE
ReadCommunity=public
WriteCommunity=public
DbPollInterval=1
HeartbeatTrapInterval=10
EnterpriseOid=23322
ExtEnterpriseOid=23322.3
AlarmTableView=AIRCOM.SNMP_ALARM_MIB_PERFORMANCE
ObjectTableView=AIRCOM.SNMP_OBJECTS_PERFORMANCE
TrapView=AIRCOM.SNMP_TRAP_MIB_PERFORMANCE
TrapsOnly=0
WaitForRequestTimeoutSeconds=1
ResyncType=manager
TrapGuardPeriod=0
SysName=SNMPAgent
SysLocation=London
SendEndOfResyncTrap=1
[DATABASE]
Database=OPTRAC_VM
UserName=optima_snmpagent_proc
Password=ENC(l|mlofhY)ENC
[TRAP-LISTENERS]
IPAddress=127.0.0.1
Port=162
Community=public
IPAddress1=127.0.0.1
Port1=163
Community1=public
IPAddress2=127.0.0.1
Port2=164
Community2=public
[HEARTBEAT_TRAP]
NotificationID=99999
PerceivedSeverity=4
ProbableCauseID=0
EventTypeID=2
AdditionalText=HEARTBEAT_TRAP
[END_OF_RESYNC]
NotificationID=99999
PerceivedSeverity=4
ProbableCauseID=0
EventTypeID=2
AdditionalText=END_OF_RESYNC
[OPTIMA-SEVERITY-MAPPING]
Intermediate=1
Warning=4
Minor=3
Clear=5
Major=2
Critical=1
Information_only=0
Admin_Clear=5
Important: In the [DIR] section (also called [Directory Parameters] in some INI files), you should
replace:
• OPTIMA_DIR with the OPTIMA home directory, which is set as an environment variable,
and will be different depending on whether you are using Windows or UNIX
• <application_name> with the name of the application (minus any file extension)
• Forward slashes with back slashes, if you are using Windows
Notes:
• The standards for ideventType and idprobableCause can be found in the IANA-ITU-ALARM-TC-MIB
• The standards for perceivedSeverity and trendIndicator (not used) can be found in the ITU-ALARM-TC-MIB
The MIB also contains an alarmTrap, which contains the following objects (which correlate to the definitions in the table above):
trapSequenceNumber,
notificationID,
perceivedSeverity,
firstOccurance,
eventTime,
objectId,
ideventType,
idprobableCause,
specificProblem,
proposedRepairAction,
additionalText,
trendIndicator
455
OPTIMA 8.0 Operations and Maintenance Guide
A coldStart trap signifies that the SNMP entity, supporting a notification originator application, is reinitializing itself and that its configuration may have been altered.
Here is an example of the coldStart trap received by the trap receiver software:
Source: 127.0.0.1
SNMP Version: 2
Trap OID: .iso.org.dod.internet.snmpV2.snmpModules.snmpMIB.snmpMIBObjects.snmpTraps.coldStart
Variable Bindings:
Name: .iso.org.dod.internet.mgmt.mib-2.system.sysUpTime.0
Name: snmpTrapOID
The values for the objects in the alarmTrap are read from the INI values in the
[HEARTBEAT_TRAP] section. For more information, see Configuring the SNMP Agent on page
437.
Here is an example of the HeartBeat trap received by the trap receiver software:
Source: 127.0.0.1
SNMP Version: 2
Variable Bindings:
Name: .iso.org.dod.internet.mgmt.mib-2.system.sysUpTime.0
Name: snmpTrapOID
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.trapSequenceNumber.0
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.notificationID
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.perceivedSeverity
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.firstOccurance
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.eventTime
Name: .iso.org.dod.internet.private.enterprises.aircom.objects.objectsTable.objectsEntry.objectId
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.ideventType
Value: [Integer] 2
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.idprobableCause
Value: [Integer] 0
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.specificProblem
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.proposedRepairAction
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.additionalText
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.trendIndicator
The values for the objects in the alarmTrap are read from the database view named by the 'TrapView' INI parameter, defined in the [SNMP-AGENT] section. For more information, see Configuring the SNMP Agent on page 437.
Here is an example of the Alarm trap received by the trap receiver software:
Source: 127.0.0.1
SNMP Version: 2
Variable Bindings:
Name: .iso.org.dod.internet.mgmt.mib-2.system.sysUpTime.0
Name: snmpTrapOID
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.trapSequenceNumber.0
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.notificationID.7878
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.perceivedSeverity.7878
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.firstOccurance.7878
Value: [OctetString]
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.eventTime.7878
Name: .iso.org.dod.internet.private.enterprises.aircom.objects.objectsTable.objectsEntry.objectId.4.77.83.67.54
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.ideventType.7878
Value: [Integer] 2
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.idprobableCause.7878
Value: [Integer] 3
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.specificProblem.7878
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.proposedRepairAction.7878
Value: [OctetString]
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.additionalText.7878
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.trendIndicator.7878
Value: [OctetString]
The values for the objects in the alarmTrap are read from the INI values in the [END_OF_RESYNC]
section. For more information, see Configuring the SNMP Agent on page 437.
Important: End of Resync traps are only sent if the 'ResyncType' parameter in the [SNMP-AGENT]
section of the INI file is set to 'AGENT'.
Here is an example of the End of Resync trap received by the trap receiver software:
Source: 127.0.0.1
SNMP Version: 2
Variable Bindings:
Name: .iso.org.dod.internet.mgmt.mib-2.system.sysUpTime.0
Name: snmpTrapOID
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.trapSequenceNumber.0
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.notificationID
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.perceivedSeverity
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.firstOccurance
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.eventTime
Name: .iso.org.dod.internet.private.enterprises.aircom.objects.objectsTable.objectsEntry.objectId
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.ideventType
Value: [Integer] 2
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.idprobableCause
Value: [Integer] 0
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.specificProblem
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.proposedRepairAction
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.additionalText
Name: .iso.org.dod.internet.private.enterprises.aircom.alarms.alarmsTable.alarmsEntry.trendIndicator
About the File Splitter
The File Splitter splits a single file that contains a variety of different objects into a number of files containing similar objects. This is done prior to parsing, and makes the parsing of the data easier: rather than trying to parse a variety of objects in a single file, the Parser can more effectively parse similar objects in different files, one file at a time.
In this way, the File Splitter splits data 'horizontally' (that is, object by object, row by row), compared to the Data Validation application, which splits data 'vertically' (that is, column by column).
Important: The File Splitter is generally only used for Siemens files.
For example, the File Splitter can split the following input file:
01929G053 153ZAGA
MF.USMM.CY4
07-09-2801:45 ****2432 0 0 0 0 0 0
0 0 0 11144…
07-09-2801:45 ****3088 0 0 0 0 0 0
0 0 0 0 0 0 547 698 0 253
534…
into 2432_MF_USMM_CY4_MSC1_FR00000.spf:
01929G053 153ZAGA
MF.USMM.CY4
07-09-2801:45 ****2432 0 0 0 0 0 0
0 0 0 11144…
And 3088_MF_USMM_CY4_MSC1_FR00000.spf:
153ZAGA MF.USMM.CY4
07-09-2801:45 ****3088 0 0 0 0 0 0 0 0 0 0
0 0 547 698 0 253 534…
Type the executable file name and the configuration (INI) file name at the command prompt:
In Windows:
opx_SPL_GEN_414.exe opx_SPL_GEN_414.ini
In Unix:
opx_SPL_GEN_414 opx_SPL_GEN_414.ini
The following table describes the relevant parameters in the configuration (INI) file:
Parameter Description
StandAlone 0 - Run the application without a monitor file. Do not select this option if the application is scheduled or the OPTIMA Process Monitor is used.
1 - Run the application with a monitor file.
InterfaceID The three-digit interface identifier (mandatory).
ProgramID The three-character program identifier (mandatory).
InstanceID The three-character program instance identifier (mandatory).
EnableBackup 1 - Back up the original input raw file(s) after they have been successfully processed.
0 - Do not back up; the input raw file(s) are deleted after being successfully processed.
InputFileMask The filter for input files to process, for example, *C*.*
UseFolderFileLimit Indicates whether the folder file limit should be used (1) or not (0).
FolderFileLimit The maximum number of output files that can be created in each output (sub)folder. This must be in the range 100-100,000 on Windows, or 100-500,000 on Sun/UNIX, otherwise the application will not run.
Warning: Depending on the number of files that you are processing, the lower the file limit, the more output sub-folders will be created. This can have a significant impact on performance, so if you do need to change the default, ensure that you do not set the number too low. The default is 10,000.
Parameter Description
Number The number of separate reports you want to create for the file that you are splitting. Each report section defines the criteria that will be used to split the file (for example, the search strings and/or termination strings).
REPORT1, REPORT2 and so on The name of the first report, second report and so on.
Each report that you define also has a separate section in the INI file. The following table describes the parameters for each of these sections:
Parameter Description
TerminationStr Indicates whether you want to split the data into files based on a start string and a termination string (1) or not (0).
Important: If you do this, you cannot also set the SearchStr parameter to 1 and use that method as well.
SearchStr Indicates whether you want to split the data into files based on one or more search strings (1) or not (0).
Important: If you do this, you cannot also set the TerminationStr parameter to 1 and use that method as well.
AddHeader Indicates whether the header is added to each split file (1) or not (0) when SearchStr is used.
SearchStringNumber Stores the number of search strings that you want to use. There should be a corresponding number of search strings defined; for example, if SearchStringNumber=2, then there should be 2 search strings (SearchString1 and SearchString2) defined.
SearchString1, SearchString2 and so on The string on which you want to search. For example, if the SearchString value is *2368, then each row containing the value 2368 is collected into a single file and stored in the C:\Temp\414\out\2368 sub-directory.
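In the configuration file itself, each search string can also name the output directory for the matching rows after a pipe character, as in the example configuration (INI) file at the end of this section:

SearchStr=1
SearchStringNumber=2
SearchString1=*2368|/OPTIMA_DIR/<application_name>/out/2368
SearchString2=*2512|/OPTIMA_DIR/<application_name>/out/2512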
Maintenance
In usual operation, the File Splitter application should not need any special maintenance. During installation, the File Splitter application is configured to maintain the backup and log directories automatically.
However, TEOCO recommends that the following basic maintenance checks are carried out for the File Splitter application:
Input directory, for a backlog of files meeting the maintenance criteria. Weekly. Files meeting the maintenance criteria should not be in the input directory; a backlog indicates a problem with the program.
Log file, for error messages. Weekly. In particular, any Warning, Minor, Major and Critical messages should be investigated.
The log file is expected to contain information related to any error files found in the particular directory. For more information about the log file, see Checking a Log File Message on page 172.
If the File Splitter is run continuously, the input directory is monitored continuously; in this case, the application keeps running until it is terminated.
You can either obtain the version details from the log file, or you can print the information by typing the following command at the command prompt:
In Windows:
opx_SPL_GEN_414.exe -v
In Unix:
opx_SPL_GEN_414 -v
For more information about obtaining version details, see About Versioning on page 33.
Troubleshooting
The following table shows troubleshooting tips for the File Splitter:
Problem: Application not processing input files.
Possible cause: Application has not been scheduled. Solution: Use Process Monitor to check the last run status.
Possible cause: Application has crashed and Process Monitor is not configured. Solution: Check the process list and monitor file. If there is a monitor file and no corresponding process with that PID, then remove the monitor file. Note: The Process Monitor will do this automatically.
Possible cause: Incorrect configuration settings. Solution: Check the configuration settings.
Problem: Files in the error directory.
Possible cause: Incorrect configuration settings. Solution: Check the log file for more information on the problems.
Possible cause: Invalid input files. Solution: Check the error file format.
Problem: Files are not being split.
Possible cause: The search/termination strings are not found. Solution: Check that the strings are defined correctly.
Possible cause: The output mask is incorrect. Solution: Change the output masks.
The following is an example File Splitter configuration (INI) file:
[DIR]
DirFrom=/OPTIMA_DIR/<application_name>/in
DirTo=/OPTIMA_DIR/<application_name>/out
DirBackup=/OPTIMA_DIR/<application_name>/backup
ErrorDir=/OPTIMA_DIR/<application_name>/error
LogDir=/OPTIMA_DIR/<application_name>/log
TempDir=/OPTIMA_DIR/<application_name>/temp
PidFilePath=/OPTIMA_DIR/<application_name>/prid
CombinerDir=/OPTIMA_DIR/<application_name>/combiner
EXEName=<application_name>
EnableBackup=1
EnableCombiner=1
[MAIN]
InputFileNameAsColumn=0
LogGranularity=3
LogLevel=1
RefreshTime=1
TruncateHeader=0
RunContinuous=0
StandAlone=0
UseFolderFileLimit=0
FolderFileLimit=10000
InterfaceID=001
ProgramID=414
InstanceID=001
[REPORTS]
Number=1
REPORT1=split_2352_2496
;REPORT2=split_2368_2512
468
About the File Splitter
[split_2352_2496]
OutputDirectory=/OPTIMA_DIR/<application_name>/out
UseInputFileMask=1
InputFileMask=*.spf
UseOutputFileMask=0
OutputFileMask=.csv
AddHeader=1
TerminationStr=0
StartString=OBJTYPE
TerminationString=END
SearchStr=1
SearchStringNumber=4
SearchString1=*2456|/OPTIMA_DIR/<application_name>/out/2456
SearchString2=*3184|/OPTIMA_DIR/<application_name>/out/3184
SearchString3=* 128|/OPTIMA_DIR/<application_name>/out/128
SearchString4=*2512|/OPTIMA_DIR/<application_name>/out/2512
[split_2368_2512]
OutputDirectory=/OPTIMA_DIR/<application_name>/out
UseInputFileMask=1
InputFileMask=*.spf
UseOutputFileMask=0
OutputFileMask=.csv
AddHeader=1
TerminationStr=0
StartString=OBJTYPE
TerminationString=END
SearchStr=1
SearchStringNumber=2
SearchString1=*2368|/OPTIMA_DIR/<application_name>/out/2368
SearchString2=*2512|/OPTIMA_DIR/<application_name>/out/2512
Important: In the [DIR] section (also called [Directory Parameters] in some INI files), you should
replace:
• OPTIMA_DIR with the OPTIMA home directory, which is set as an environment variable,
and will be different depending on whether you are using Windows or UNIX
• <application_name> with the name of the application (minus any file extension)
• Forward slashes with back slashes, if you are using Windows
Functions, Procedures and Packages
This appendix lists the Functions, Procedures and Packages associated with the various OPTIMA
schemas. The Job Scheduler and Frontend and Backend jobs and schedules are also described.
Scheduling Jobs
There are a number of different ways to schedule jobs used in OPTIMA:
• Cron (used to schedule UNIX backend programs)
• SCHEDTASK (used to schedule the Windows backend programs)
• The Oracle DBMS Scheduler (DBMS_SCHEDULER) (Recommended for scheduling
Oracle jobs)
- or -
• The Oracle Job Scheduler (DBMS_JOBS) (can also be used for scheduling Oracle jobs)
UNIX
UNIX backend programs are scheduled using cron. This is an example of a crontab configuration:
0,15,30,45 * * * * /opt/AIoptima/run/RunProcessMonitor_205.sh
0 1 * * * /opt/AIoptima/run/RunDirectoryMaintenance.sh
0 5 * * * /opt/AIoptima/run/RunLoaders_DirMaintenance.sh
0 * * * * /opt/AIoptima/run/RunOpxLog.sh
20 * * * * /opt/AIoptima/run/RunLoaders_LOGS.sh
WIN OS
Windows backend programs are scheduled using SCHEDTASK.
DBMS_SCHEDULER
DBMS_SCHEDULER enables users to perform resource plan management, which allows them to control the following (see the example after this list):
• The number of concurrent jobs for a particular job_class
• The order of execution of a job or groups of jobs
• Switching jobs from one resource plan to another during the day
• And much more
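For example, assuming a placeholder procedure my_schema.my_procedure, an hourly job could be created like this:

BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'OPTIMA_EXAMPLE_JOB',                  -- placeholder job name
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN my_schema.my_procedure; END;',  -- placeholder action
    start_date      => SYSTIMESTAMP,
    repeat_interval => 'FREQ=HOURLY;INTERVAL=1',
    enabled         => TRUE);
END;
/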
Scheduler Components
The Scheduler uses three basic components to handle the execution of scheduled tasks. An
instance of each component is stored as a separate object in the database when it is created:
Component Description
Program Defines what is executed, for example the metadata about a PL/SQL block, stored procedure or external executable.
Schedule Defines when and how often a task is executed.
Job Combines a program (or an inline action) with a schedule; this is the entity that the Scheduler actually runs.
DBMS_JOBS
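As an illustrative sketch (the procedure name is again a placeholder), the same task could be submitted through the older DBMS_JOB interface:

DECLARE
  l_job BINARY_INTEGER;
BEGIN
  DBMS_JOB.SUBMIT(
    job       => l_job,                                 -- returned job number
    what      => 'BEGIN my_schema.my_procedure; END;',  -- placeholder action
    next_date => SYSDATE,                               -- first run immediately
    interval  => 'SYSDATE + 1/24');                     -- then run every hour
  COMMIT;
END;
/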
Glossary of Terms
A
Agent
In the context of SNMP, this is a software module that performs the network management functions
requested by the network management stations.
An agent module may be implemented in any network element that is to be managed, such as a host, bridge or router.
B
BSC
Base Station Controller. A piece of equipment that controls one or more BTSs.
BTS
Base Transceiver Station. The radio equipment in a cellular network that communicates directly with mobile stations; one or more BTSs are controlled by a BSC.
C
Columnar Object
An object that is part of an SMP table. There is an instance of the columnar object for each row in the table.
CSV
Comma-Separated Values. A type of data format in which each piece of data is separated by a
comma.
F
FTP
File Transfer Protocol. The standard protocol for exchanging files across the Internet.
I
INI
Initialization file. INI files are used to initialize, or set parameters for, the operating system and
certain programs.
IP
Internet Protocol. This defines the format for all data travelling through a TCP/IP network, performs
the routing functions and provides a mechanism for processing unreliable data.
479
OPTIMA 8.0 Operations and Maintenance Guide
K
KPI
Key Performance Indicator. A measurement used to monitor and evaluate the performance of a network or service.
M
MAC
Media Access Control. The data link sublayer that controls how devices gain access to the network medium; a MAC address uniquely identifies a network interface.
MIB
Management Information Base. A type of database used to manage the devices in a network. MIBs
are especially used with SNMP.
MSC
Mobile Switching Centre. In a cellular network, this is a switch or exchange that interworks with
location databases.
P
PDU
Protocol Data Unit. The unit of data exchanged between peer protocol entities; SNMP requests, responses and traps are PDUs. The PDU format is also used to send and receive SMS messages.
R
RFC
Request For Comments. A numbered series of documents in which Internet standards and protocols are defined.
RMON
Remote Monitoring. A standard SNMP MIB that enables the remote monitoring and analysis of network devices.
S
SDCCH
Stand-alone Dedicated Control Channel. This is a channel used in GSM to provide a reliable
connection for signalling and SMS messages.
SMI
Structure of Management Information. The rules that define how management information is named and structured in a MIB.
SMP
480
Glossary of Terms
SMPP
Short Message Peer-to-peer Protocol. The protocol used for exchanging SMS messages between
SMS peer entities such as SMSCs.
SMS
Short Message Service. The text messaging system, enabling messages to be sent to/from GSM
phones and to external systems (for example, email or voicemail). Messages that cannot be
delivered straight away (due to the receiver's mobile being switched off or out of range) are stored,
and delivered as soon as possible.
SMSC
Short Message Service Center. A network element in the mobile telephone network which delivers
SMS messages.
SMTP
Simple Mail Transfer Protocol. A protocol used to send and receive email messages.
SNMP
Simple Network Management Protocol. SNMP is the protocol used for network management and
the monitoring of network devices and their functions.
SQL
Structured Query Language. SQL is an ANSI and ISO standard computer language for getting
information from and updating a database.
T
TCH
Traffic Channel. The GSM channel that carries speech or data traffic.
TCP
Transmission Control Protocol. The protocol used (along with the IP) to ensure reliable and in-order
delivery of data across the Internet.
481
OPTIMA 8.0 Operations and Maintenance Guide
482
Index

A
Alarm Notifier
  about • 414
  configuring • 417, 419, 421, 422
  executing • 417
  installing • 414
Alarms
  about • 409

C
Checking
  an application is running • 83, 150, 177, 188, 208, 236, 263, 275, 404
  error files • 172, 187, 206
  log files • 41, 82, 172, 187, 207, 235, 263, 275, 403
  version details • 82, 149, 177, 188, 207, 236, 263, 275
Combining Process, about • 195
Configuration (INI) file, example • 178, 190, 198, 245, 265, 277, 407, 452
Configuring
  Alarm Notifier • 417, 419, 421, 422
  Data Quality Package • 332
  Data Validation Application • 183
  Directory Maintenance Application • 269
  external programs • 32
  FTP Application • 66
  Loader file mappings • 227
  Loader table mappings • 228
  OPTIMA Parser • 167
  partition maintenance • 381
  Report Scheduler • 396
  reports • 219
  SNMP Agent • 437
  statistics gathering • 386
  tablespace maintenance • 384
Consumer groups, in OPTIMA • 51

D
Data Loading Process
  starting • 40
  stopping • 40
Data Quality Package
  configuring • 332
  configuring period processing • 348
  installing • 331
Data Validation Application
  checking for error files • 187
  checking the application is running • 188
  checking the log file • 187
  checking the version • 188
  configuration (INI) file • 190
  maintaining • 187
  starting • 185
  stopping • 188
  troubleshooting • 188
Data Validation, about • 181
Database Server
  rebooting • 48
  starting • 48
Defining
  monitoring settings • 261
  reports • 186, 197
Direct database loading, using the summary for • 329
Directory Maintenance Application
  about • 267
  checking the application is running • 275
  checking the log file • 275
  checking the version • 275
  configuration (INI) file • 277
  configuring • 269
  installing • 268
  maintaining • 274
  starting • 268
  stopping • 275
  troubleshooting • 276
Directory Maintenance Process, about • 268

E
Error Files, checking • 172, 187, 206
Error tables, loader • 222, 235
Examples
  Data Validation configuration (INI) file • 190
  Parser configuration (INI) file • 178
Executing, Alarm Notifier • 417
External Programs
  about • 41
  configuring • 32
  scheduling • 31

F
File Combiner Application
  about • 193
  checking for error files • 206
  checking the application is running • 208
  checking the log file • 207
  checking the version • 207
  configuration (INI) file • 198
  defining reports • 197
  maintaining • 206
  starting • 196
  stopping • 207
File Combiner Configuration Utility, troubleshooting • 208
File locations and naming, about • 30
FTP Application
  checking the application is running • 82
  checking the log file • 82
  checking the version • 82
  configuration (INI) file parameters • 68
  configuring • 66
  installing • 60
  prerequisites • 63
  stopping • 82

I
Installing
  Alarm Notifier • 414
  Data Quality Package • 331
  Data Validation Application • 182
  Directory Maintenance Application • 268
  File Combiner Application • 195
  FTP Application • 60
  OPTIMA • 20
  Parser • 166
  Parser Configuration Utility • 166

S
SNMP Agent
  maintaining • 445
  troubleshooting • 446
SNMP Poller
  checking the application is running • 150
  checking the version • 149
  maintaining • 149
  stopping • 149
  troubleshooting • 151
Starting
  Data Loading Process • 40
  database server • 40
  Directory Maintenance Application • 268
  Loader • 216
  Loader GUI • 216
  Parser • 166
  Process Monitor • 259
  Report Scheduler • 402
Statistics gathering
  configuring • 386
  scheduling • 387
Stopping
  Data Loading Process • 40
  Data Validation Application • 188
  Directory Maintenance Application • 275
  File Combiner Application • 207
  FTP Application • 82
  Parser • 176
  Process Monitor • 263
  Report Scheduler • 404
  SNMP Agent • 445
  SNMP Poller • 149
Summary
  configuring • 286
  connecting to the database • 285
  using for direct database loading • 329
  viewing log messages • 325
  viewing oracle jobs • 327
Summary reports
  adding • 291
  defining time zones in • 302
  deleting • 320
  editing • 320

T
Table settings, for the Loader • 225
Tablespace maintenance
  configuring • 384
  scheduling • 385
TCAs
  defining • 225, 228
Time zones
  defining in summary reports • 302, 304
  using in report schedules • 396
Troubleshooting
  Data Validation Application • 188
  Directory Maintenance Application • 276
  File Combiner Application • 208
  Loader • 241
  OPTIMA • 43
  OSS Maintenance Package • 391
  Parser • 177
  Parser Configuration Utility • 177
  Process Monitor • 265
  Report Scheduler • 404
  SNMP Agent • 446
  SNMP Poller • 151

U
Upgrading, OPTIMA • 20

V
Validation Process, about • 182
Version details, checking • 82, 177, 188, 207, 236, 263, 275
Versioning
  about • 33
  checking the version • 177