Title: Seamless Time Migration between Risk Measures used in the General Risk Equation
Authors: Bradley, James
Keywords: Computer Science
Issue Date: 19-Dec-2003
Abstract: As shown elsewhere, the general risk equation relates the expected throughput capacity of any system to both system resources and the positive risk of loss of throughput capacity. Two risk measures are required: a natural MEL-risk measure, and an artificial MEL-risk measure equivalent to Markowitz's standard-deviation measure. We show that these two apparently distinct risk measures are intimately related, and that which one is appropriate depends merely on the time period over which the risk is calculated. We show, ultimately by application of the Central Limit Theorem, that if the time period is altered sufficiently, at some point the need for one measure abruptly transitions into the need for the other, without any change in the underlying physical system. This permits a comprehensive MEL-risk measure for use with the general risk equation. The comprehensive measure defaults to either the natural MEL-risk measure or the artificial MEL-risk measure, depending not on the physical system but merely on the time period over which the risk is calculated.
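The Central Limit Theorem argument in the abstract can be illustrated with a short simulation: losses aggregated over a longer time period become increasingly close to normally distributed, so a standard-deviation (Markowitz-style) summary becomes adequate at long horizons even when single-period losses are strongly skewed. The exponential loss distribution and the skewness diagnostic below are illustrative assumptions, not taken from the paper.

```python
import random
import statistics

def aggregated_skewness(horizon, samples=2000, seed=1):
    """Sample skewness of total loss over `horizon` periods.

    Single-period losses are drawn from a skewed (exponential)
    distribution purely for illustration; by the Central Limit
    Theorem, the skewness of the horizon-sum shrinks roughly
    like 1/sqrt(horizon), so the sum looks increasingly Gaussian.
    """
    rng = random.Random(seed)
    sums = [sum(rng.expovariate(1.0) for _ in range(horizon))
            for _ in range(samples)]
    m = statistics.mean(sums)
    s = statistics.stdev(sums)
    # third standardized central moment
    return sum((x - m) ** 3 for x in sums) / (len(sums) * s ** 3)

# Short horizon: strongly skewed; long horizon: nearly symmetric,
# so a standard-deviation risk measure becomes appropriate.
short = aggregated_skewness(horizon=1)
long_ = aggregated_skewness(horizon=100)
```

For the exponential distribution the theoretical skewness of a single period is 2, and of a 100-period sum is 0.2, which the simulation reproduces approximately.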
Appears in Collections: Bradley, James

Files in This Item:
File: 2003-733-36.pdf  (1.26 MB, Adobe PDF)
File: 2003-733-36.ps   (10.99 MB, PostScript)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.