Seamless Time Migration between Risk Measures used in the General Risk Equation

Date: 2003-12-19
Abstract
As shown elsewhere, the general risk equation relates the expected throughput capacity of any system to both system resources and the positive risk of loss of throughput capacity. Two risk measures are required: a natural MEL-risk measure and an artificial MEL-risk measure, the latter equivalent to Markowitz's standard-deviation measure. We show that the two apparently distinct risk measures are intimately related, and that which of them is appropriate depends only on the time period over which the risk is calculated. We show, ultimately by application of the Central Limit Theorem, that if the time period is altered sufficiently, at some point the need for one measure transitions abruptly into the need for the other, without any change in the underlying physical system. This allows a single comprehensive MEL-risk measure for use with the general risk equation. The comprehensive measure defaults to either the natural or the artificial MEL-risk measure, depending not on the physical system but only on the time period over which the risk is calculated.
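The Central Limit Theorem argument in the abstract can be illustrated numerically. The following sketch is not taken from the paper: the exponential per-period loss distribution, its scale, the trial count, and the choice of aggregation periods are all illustrative assumptions standing in for an unspecified skewed loss process. It shows that as losses are aggregated over longer time periods, the distribution of total loss becomes approximately normal, which is the regime in which a symmetric, standard-deviation (Markowitz-type) measure becomes appropriate.

```python
import numpy as np

# Illustrative only: a skewed (exponential) per-period capacity-loss process
# stands in for whatever loss process the underlying physical system produces.
rng = np.random.default_rng(0)
n_trials = 20_000

for periods in (1, 5, 50, 500):
    # Total loss accumulated over `periods` successive periods, for each trial.
    total_loss = rng.exponential(scale=1.0, size=(n_trials, periods)).sum(axis=1)
    # Sample skewness of the aggregate loss: values near 0 indicate the
    # distribution is approximately normal (the CLT regime).
    z = (total_loss - total_loss.mean()) / total_loss.std()
    skewness = np.mean(z ** 3)
    print(f"{periods:4d} period(s): skewness of aggregate loss = {skewness:5.2f}")
```

For a single period the aggregate loss is strongly skewed (skewness near 2 for an exponential), so a symmetric standard-deviation measure misrepresents it; by 500 periods the skewness is close to 0 and the normal approximation underlying the Markowitz-style measure holds. In this toy setting the transition depends only on the aggregation period, not on any change in the underlying loss process, which is the behaviour the abstract describes.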
Keywords: Computer Science