Abstract
As shown elsewhere, the general risk equation relates the expected throughput capacity of any system to both system resources and the positive risk of loss of throughput capacity. Two risk measures are required: a natural MEL-risk measure and an artificial MEL-risk measure, the latter equivalent to Markowitz's standard-deviation measure. We show that these two apparently distinct risk measures are intimately related, and that which of the two is appropriate depends only on the time period over which the risk is calculated. We show, ultimately by application of the Central Limit Theorem, that if the time period is altered sufficiently, at some point the need for one measure transitions abruptly into the need for the other, without any change in the underlying physical system. This permits a comprehensive MEL-risk measure for use with the general risk equation, one that defaults to either the natural or the artificial MEL-risk measure depending not on the physical system but only on the time period over which the risk is calculated.
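The Central Limit Theorem argument can be illustrated numerically. The sketch below is not the paper's own model: it assumes, purely for illustration, that per-period losses are i.i.d. lognormal draws (a heavily skewed short-horizon loss profile) and compares an empirical 99th-percentile loss against the Gaussian approximation that a standard-deviation (Markowitz-style) measure implies. As the number of aggregated periods grows, the two converge, which is the abrupt transition between measures described above.

```python
import numpy as np

rng = np.random.default_rng(0)

def aggregate_losses(n_periods, n_trials=50_000):
    """Total loss over a horizon of n_periods, for n_trials simulated paths.

    Per-period losses are i.i.d. lognormal -- an assumed, skewed
    short-horizon loss profile chosen only to illustrate the CLT effect.
    """
    total = np.zeros(n_trials)
    for _ in range(n_periods):
        total += rng.lognormal(mean=0.0, sigma=1.0, size=n_trials)
    return total

for n in (1, 10, 100, 1000):
    total = aggregate_losses(n)
    # Empirical 99th-percentile loss vs. the Gaussian approximation
    # mean + 2.326 * std implied by a standard-deviation measure.
    empirical_q99 = np.quantile(total, 0.99)
    gaussian_q99 = total.mean() + 2.326 * total.std()
    print(f"n={n:5d}  empirical q99={empirical_q99:10.2f}  "
          f"gaussian q99={gaussian_q99:10.2f}  "
          f"ratio={empirical_q99 / gaussian_q99:.3f}")
```

For a single period the skewed loss distribution makes the Gaussian approximation understate the tail (ratio well above 1); as the horizon lengthens, the ratio tends to 1 and the standard-deviation measure conveys the same information as the tail-based one, with no change in the underlying per-period loss process.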