Distribution System Reliability: Default Data and Model Validation
Abstract: Distribution system reliability assessment is able to predict the interruption profile of a distribution system based on system topology and component reliability data. Unfortunately, many utilities do not have enough historical component reliability data to perform such an assessment, and are not confident that other sources of data are representative of their particular system. As a result, these utilities do not incorporate distribution system reliability assessment into their design process and forego its significant advantages. This paper presents a way of gaining confidence in a reliability model by developing a validation method. This method automatically determines appropriate default component reliability data so that predicted reliability indices match historical values. The result is a validated base case from which incremental design improvements can be explored.

PE-870-PWRS-2-06-1997. A paper recommended and approved by the IEEE Power System Planning and Implementation Committee of the IEEE Power Engineering Society for publication in the IEEE Transactions on Power Systems. Manuscript submitted December 27, 1996; made available for printing June 9, 1997.

I. INTRODUCTION

Customer demands for reliable power are quickly changing. Not only is more energy being demanded, but this energy must be provided at increasing levels of service reliability. A sustained interruption can cost certain customers hundreds of thousands of dollars per hour. Even a momentary interruption can cause computer systems to crash and industrial processes to be ruined. To many customers with sensitive electric loads, reliability as well as the cost of energy may drive decisions such as where a new plant is to be located, whether an existing plant will be relocated, or whether a switch to a new energy provider will be pursued.

Since a majority of customer reliability problems stem from distribution systems [1], utilities must focus on distribution systems if substantial improvements in customer reliability are to be gained. Deregulation of the industry has also made it critical for utilities to provide this level of reliability at the lowest possible cost. To do this, predictive reliability assessment methods are needed [2].

Distribution system reliability assessment is a quickly maturing field. It has evolved from the first EPRI program in 1978 [3], to programs developed and used in-house by utilities [4-7], to commercially available software products. These planning tools are able to predict the reliability of a distribution system based on system topology and component reliability data. Unfortunately, these products will never gain widespread use until utilities are confident that available data is representative of their actual system.

Ideally, a utility will have a large amount of historical data from which it can determine the reliability of various components such as lines, protection devices, and switches. Most utilities, however, do not have this information available. Values may be obtained from published data corresponding to other systems, but this data may not be representative of the system under consideration. This data discrepancy is most evident when predicted system reliability indices do not agree with historically computed reliability indices.

Most utilities do not have a substantial amount of historical component reliability data. Nearly all utilities, however, have historical system reliability data in the form of reliability indices (e.g., SAIFI, SAIDI; see Section 2.1). When a system is modeled, the reliability indices predicted by the assessment tool should agree with these historical values. If so, a certain level of confidence in the model is achieved, and more specific reliability results (e.g., the reliability of a specific load point or the impact of a design change) can be trusted to a higher degree. When this confidence has been achieved and predicted results match historical results, the reliability model is said to be validated.
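Section 2.1 of the paper defines these indices; they take the standard customer-weighted forms, written here for reference with generic notation that is not taken from the excerpt:

\[
\text{SAIFI} = \frac{\sum_i \lambda_i N_i}{\sum_i N_i}, \qquad
\text{SAIDI} = \frac{\sum_i U_i N_i}{\sum_i N_i}, \qquad
\text{MAIFI} = \frac{\sum_i \lambda_{M,i} N_i}{\sum_i N_i},
\]

where \(N_i\) is the number of customers served at load point \(i\), \(\lambda_i\) and \(\lambda_{M,i}\) are the sustained and momentary interruption frequencies at that load point, and \(U_i\) is its annual sustained interruption duration.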
This paper presents a new method of distribution system reliability model validation. It first identifies which default component reliability parameters should be modified by performing a sensitivity analysis on a test system. It then presents a method of computing these parameter values so that predicted reliability indices match historical values.
MTTS: Mean Time To Switch - the expected time it will take, after a failure occurs, for a sectionalizing switch to be toggled.

PSS: Probability of Successful Switching - the probability of a switching device (a protection device, a load-break switch, or a no-load break switch) actually switching when it is supposed to switch. The complement of PSS is switch unreliability (1 - PSS).

Underground line sections might be assigned a different failure rate per unit length. In this way, the number of reliability parameters used in a given system is greatly reduced. There are still, however, many more default parameters than can be effectively adjusted with the limited amount of historical data that is typically available. To help decide which default parameters will be adjusted, a sensitivity analysis will be performed.

The sensitivity of a function to a parameter is defined as the partial derivative of the function with respect to that parameter. This is a measure of how much the value of the function will change if the parameter is perturbed. It can be approximated by actually perturbing the parameter (keeping all other parameters fixed) and measuring how much the function changes. For example, if the default overhead line MTTR is increased by 1% and the resulting SAIDI value increases by 0.5%, the sensitivity of SAIDI to default overhead line MTTR is (0.5 / 1) x 100% = 50%. A sensitivity analysis of MAIFI, SAIFI, and SAIDI to all default component parameters has been performed, and the results are summarized in Table 3.
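To make the perturbation procedure concrete, the following sketch estimates one such sensitivity by re-running a reliability assessment with a single default parameter increased by 1%. The predict_indices callable is an assumed stand-in for the assessment tool described in the paper, not part of the original text.

    def index_sensitivity(predict_indices, defaults, param, index, step=0.01):
        # Sensitivity of one reliability index to one default parameter,
        # approximated by perturbing the parameter by `step` (1% here)
        # while keeping all other parameters fixed.
        base = predict_indices(defaults)[index]

        perturbed = dict(defaults)
        perturbed[param] *= (1.0 + step)
        new = predict_indices(perturbed)[index]

        # Percent change in the index divided by percent change in the
        # parameter, expressed in percent: a 0.5% rise in SAIDI for a 1%
        # rise in overhead line MTTR gives (0.5 / 1) x 100% = 50%.
        return ((new - base) / base) / step * 100.0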
will likely be more difficult for overhead lines than for other
types of components.
Since MAIFI and SAIFI are predominantly affected by
overhead line failure rates, model validation will usually re-
sults in representative values for these failure rates. SAIDI,
however, can be significantly influenced by other component
types. For model validation to obtain a representative value of
d Failure Rate line MTTR, it is essential to obtain the best possible values
Mean Time To Repair 0% 0% 6% from other sources for parameters with a high SAIDI sensi-
Unreliability * -1% 7% 12% tivity.
Reclosers
Recloser FX I 0% I 4% I
Recloser MTTR I 0% I 0% I 6% 4.3 Default Component Parameter Multipliers and
Unreliablity * I -8% I 15% I 26%
I Sectionalizing Switches I I I I Customized Component Values
Sustained Failure Rate 0% I 4% I 3%
Mean 'I'ime To Repair 0% I 0% I 3%
Switching Time 0% I 0% I 22% Although a utility may not have enough historical data to
, ... ., -,"
determine the reliability parameters for all components on its
*Unreliability is defined as (1 - PSS). system, it may know the relative reliabilities of certain com-
ponents. For example, it may be known that the overhead line
As can be seen in Table 3, MAIFI is much more sensitive to the default overhead line momentary failure rate (λM) than to any other reliability parameter. This is a good indication that if the predicted value of MAIFI does not agree with the historical value of MAIFI, the default overhead line λM should be adjusted.

Table 3 also shows that SAIFI is much more sensitive to the default overhead line sustained failure rate (λS) than to any other reliability parameter. This implies that the default overhead line λS should be adjusted if the predicted value of SAIFI does not agree with the historical value of SAIFI.

The last set of reliability index sensitivities corresponds to SAIDI. Table 3 shows that SAIDI is predominantly affected by the default overhead line λS and the default overhead line MTTR. It should be pointed out that the sensitivity of SAIDI to component default parameters is more distributed than that of MAIFI and SAIFI. This is reflected in the relatively high sensitivities to recloser unreliability and switching time. Therefore, if predicted SAIDI values do not match historical SAIDI values, any one or all of these parameters could be unrepresentative.

Because system reliability indices are most sensitive to changes in overhead line default values, overhead line λM, overhead line λS, and overhead line MTTR will be the default values that will be adjusted when validating a reliability model. For a given area under consideration, this results in three degrees of freedom in a search space looking for three values: MAIFI, SAIFI, and SAIDI.
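The actual search procedure is developed in Section V; purely as an illustration of the three-degree-of-freedom idea, the sketch below adjusts the three overhead line defaults by iterative rescaling until the predicted indices approach their historical targets. The parameter names and the update rule are assumptions made for this sketch, not the paper's algorithm, and predict_indices again stands in for the assessment tool.

    def fit_overhead_line_defaults(predict_indices, defaults, historical, iterations=20):
        # Iteratively rescale the three overhead line defaults so that the
        # predicted indices move toward the historical values. Each
        # parameter is paired with the index it dominates (see Table 3):
        # momentary failure rate -> MAIFI, sustained failure rate -> SAIFI,
        # and MTTR -> SAIDI (with the failure rates at their current values).
        p = dict(defaults)
        for _ in range(iterations):
            pred = predict_indices(p)
            p["oh_lambda_m"] *= historical["MAIFI"] / pred["MAIFI"]
            p["oh_lambda_s"] *= historical["SAIFI"] / pred["SAIFI"]
            p["oh_mttr"]     *= historical["SAIDI"] / pred["SAIDI"]
        return p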
There is yet another justification for choosing overhead line parameters for adjustment. Since line failure rates and repair rates are largely dependent upon vegetation, tree-trimming, and weather, there will likely be significant variation in line failure rates from utility to utility. This means that finding representative data from previously published data will likely be more difficult for overhead lines than for other types of components.

Since MAIFI and SAIFI are predominantly affected by overhead line failure rates, model validation will usually result in representative values for these failure rates. SAIDI, however, can be significantly influenced by other component types. For model validation to obtain a representative value of line MTTR, it is essential to obtain the best possible values from other sources for parameters with a high SAIDI sensitivity.

4.3 Default Component Parameter Multipliers and Customized Component Values

Although a utility may not have enough historical data to determine the reliability parameters for all components on its system, it may know the relative reliabilities of certain components. For example, it may be known that the overhead line failure rate in a certain heavily treed area is twice as high as in a lightly treed area. Similarly, the MTTR of a component far away from a crew dispatch location may be 50% longer than that of a component close to the dispatch location. This type of information can be incorporated into the reliability model in the form of default component parameter multipliers.

A default component parameter multiplier is simply a number that corresponds to a component parameter. The value that will be assigned to the component parameter is equal to the default component parameter multiplied by the corresponding default component parameter multiplier. For example, if a specific line section has a MTTR multiplier of 1.5 and the default line MTTR is 4 hours, then the line will be assigned a MTTR of (1.5 x 4) = 6 hours.

A utility may also have specific component reliability data for certain components on its system, but not for all components. To accommodate this information, each component parameter is allowed to be customized. If a value is customized, it is not allowed to be changed during the model validation process.
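As a small illustration of how these two mechanisms could fit together, the sketch below resolves the value actually assigned to a component parameter from its default, its multiplier, and an optional customized value. The class and field names are invented for this example and do not come from the paper or any particular tool.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class ComponentParameter:
        # One reliability parameter (e.g. MTTR) attached to one component.
        default: float                  # system-wide default, adjusted during validation
        multiplier: float = 1.0         # relative factor, e.g. 1.5 for a remote section
        custom: Optional[float] = None  # utility-supplied value, if one is known

        def assigned_value(self) -> float:
            # A customized value overrides everything and is frozen during
            # model validation; otherwise the component receives
            # default x multiplier, as in the 1.5 x 4 h = 6 h example above.
            if self.custom is not None:
                return self.custom
            return self.default * self.multiplier

    # Example from the text: MTTR multiplier of 1.5 with a 4-hour default line MTTR.
    remote_section_mttr = ComponentParameter(default=4.0, multiplier=1.5)
    assert remote_section_mttr.assigned_value() == 6.0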
V. DETERMINING DEFAULT PARAMETER VALUES