Recorded as the largest municipal loss in U.S. history, Orange County suffered a loss of $1.6 billion in December 1994 and went bankrupt shortly thereafter. The County Treasurer, Robert Citron, who was the architect of this financial catastrophe, had built a $20.5 billion portfolio by leveraging $7.5 billion of investor equity. After years of success, the investment strategy exposed the multiplied investor equity to interest-rate risk; interest rates were raised six times in 1994, from 3.45% to 7.14%, which led to the largest municipal failure.
Several tools of financial risk management, such as duration and VaR (value at risk), can be employed to analyse this investment failure. As a characteristic of a bond, duration measures the sensitivity of price changes to changes in interest rates. As another prevalent measure, Value at Risk (VaR) is defined as "a loss that will not be exceeded at some specified confidence level and specified time horizon" (Hull, 2007). According to Jorion (2006), "VaR measures the worst expected loss over a given horizon under normal market conditions at a given level of confidence."
The objective of this paper is to investigate the dilemma the Treasurer faced, using appropriate econometric techniques from the field of financial risk management. The remainder of the paper is organized as follows: in Section 2, I present a review of the relevant literature; Section 3 describes the data and analyses the case questions; the final section discusses the findings and summarizes the conclusions.
2. Literature review
Since Orange County declared bankruptcy on 6 December 1994, it has been treated as a classic case in the field of financial risk management. Following the first review, "County in Crisis" by Richard Irving (1995), Philippe Jorion and Robert Roper published the book "Big Bets Gone Bad: Derivatives and Bankruptcy in Orange County" (1995). Analyzing the bankruptcy in terms of VaR, Philippe Jorion then produced the online case study "Orange County Case: Using Value at Risk to Control Financial Risk".
Markowitz (1959) first introduced "portfolio theory", which formed the basis of modern risk management and laid the groundwork for the value-at-risk (VaR) measure. J.P. Morgan Bank (1994) formally designed VaR as a measure of market risk and freely provided it to institutions through RiskMetrics. Although the concept of VaR has now been incorporated into the Basel II Capital Accord (2003), many articles, such as Bredow (2002) and Yamai and Yoshiba (2002), argue that VaR underestimates the risk of securities with fat-tailed properties and a high potential for large losses, and that it neglects the tail dependence of asset returns.
3. Data and case analysis
To show the portfolio structure, the balance sheet (Table 1) of Orange County as of 01 December 1994 is given in the assignment description. Of all the assets, structured notes account for 38% and fixed-income securities for 57.7%; the remaining assets consist of cash (3.2%) and collateralized mortgage obligations (1.1%). On the liabilities side, it can be seen that $7.5 billion of investor equity is leveraged into $20.5 billion of investments.
Table 1. The balance sheet of Orange County as of 01 December 1994

Assets ($):
- Structured notes (38%)
  - Inverse floating-rate notes (26.1%)
  - Others: dual index notes (0.7%), floating-rate notes (2.9%), index-amortizing notes (8.3%)
- Fixed-income securities (57.7%)
- Cash (3.2%)
- Collateralized mortgage obligations (1.1%)

Liabilities ($):
- Reverse repurchase agreements (63.2%)
- Investor equity (36.8%)
The second set of data is used to measure the volatility of the changes in yields and to compute the monthly portfolio VaR; it contains 5-year yields on current US Treasury issues between 1953 and 1994. This data is obtained from the excel file attached to the assignment description.
3.2 Duration and effective duration (for question 1)
As mentioned above, duration, which usually means the Macaulay duration, is a measure that summarizes the approximate response of bond prices to changes in yields. Unlike Macaulay duration, effective duration is often used to describe the response of the price of a bond with embedded options to a change in yield, and it depends on an option pricing model. Duration can be calculated as:
D = (1/B) Σ_i i · c_i · e^(-y·i)

where:
B is the present value of all payments from the bond,
i indexes the period of the cash flows,
c_i is the amount of the cash flow in each payment period,
e^(-y·i) is the discount factor,
D is the Macaulay duration of the bond.
According to the formula, it can be seen intuitively that a greater yield (y) is associated with a lower present value of the bond, so the direction of the sensitivity between the two is negative. Furthermore, it can be shown that a greater time to maturity generally brings a greater change in price for a given change in interest rates, that is, a greater duration. Besides maturity, leveraging the portfolio can also increase duration.
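As a concrete sketch of the formula above, the short Python function below computes the Macaulay duration of a hypothetical 5-year, 5% annual-coupon bond; the bond data are invented for illustration and are not taken from the case.

```python
import math

def macaulay_duration(cashflows, y):
    """Macaulay duration for cashflows [(t_i, c_i)] at continuously
    compounded yield y: D = sum(t * c * exp(-y*t)) / B."""
    pv = sum(c * math.exp(-y * t) for t, c in cashflows)           # bond price B
    weighted = sum(t * c * math.exp(-y * t) for t, c in cashflows)  # time-weighted PV
    return weighted / pv

# Hypothetical 5-year bond: annual coupons of 5 plus principal 100 at maturity.
bond = [(t, 5.0) for t in range(1, 5)] + [(5, 105.0)]
print(round(macaulay_duration(bond, 0.05), 2))
```

Applied to a zero-coupon bond, the same function returns its maturity, which matches the intuition that duration grows with time to maturity.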
Through reverse repurchase agreements, the investor equity ($7.5 billion) is leveraged into the whole portfolio ($20.5 billion). The leverage ratio is:

leverage = 20.5 / 7.5 ≈ 2.7

This means the effect of any price change on investor equity is magnified 2.7 times. Given that the average duration of the securities in the portfolio is 2.74 years, the effective duration of the fund is calculated as:

effective duration = 2.74 × 2.7 ≈ 7.4 (years)
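The leverage and effective-duration arithmetic can be checked with a one-line calculation using the figures stated in the case:

```python
equity = 7.5          # investor equity, $ billion
portfolio = 20.5      # total portfolio, $ billion
avg_duration = 2.74   # average duration of the securities, years

leverage = portfolio / equity                   # about 2.73
effective_duration = avg_duration * leverage    # duration scaled by leverage
print(round(leverage, 2), round(effective_duration, 1))
```

Note that the text rounds the leverage to 2.7 before multiplying, giving 7.4 years; carrying full precision gives roughly 7.5 years.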
3.3 Convexity: the difference between theoretical and actual loss (for question 2)
For a small change in yields Δy, the corresponding change in bond price can be approximately expressed as:

ΔB = -D* · B · Δy

where:
ΔB is the resulting change in bond price,
B is the initial bond price,
D* is the modified duration of the bond,
Δy is the change in yield.
On the other hand, for large changes in yield, duration is less accurate in describing the resulting change in bond price. For example, two bonds with the same duration can show different changes in portfolio value for large changes in yields. This is essentially because, as a first-order approximation, duration cannot capture the bond's convexity. Convexity, as a second-order concept, is a measure of the sensitivity of a bond's duration to changes in interest rates; it is the weighted average of the squared times at which payments are made. Convexity for a bond is:

C = (1/B) Σ_i i² · c_i · e^(-y·i)
Taking convexity into account, the resulting change in bond price for a given change in yield can be expressed in the following second-order approximation:

ΔB = -D* · B · Δy + (1/2) · C · B · (Δy)²
Based on the above argument, the theoretical loss in this case using the duration approximation is:

loss ≈ D_eff × equity × Δy = 7.4 × $7.5 billion × 3.5% ≈ $1.94 billion

Compared to the actual loss ($1.64 billion), the loss under the duration approximation is much larger. With the considerable change in interest rates (350 bp), the difference between the duration approximation and the actual loss can be partly explained by the extra term in the convexity approximation.
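To illustrate how the second-order term shrinks the first-order loss estimate, the sketch below combines the case figures (effective duration 7.4 years, equity $7.5 billion, a 350 bp yield rise) with a purely hypothetical convexity of 60 years², chosen only for illustration:

```python
d_eff = 7.4        # effective duration, years (case figure)
equity = 7.5       # investor equity, $ billion (case figure)
dy = 0.035         # 350 bp rise in yields
convexity = 60.0   # hypothetical convexity, years^2 (illustrative only)

loss_duration = d_eff * equity * dy                  # first-order (duration) loss
correction = 0.5 * convexity * equity * dy ** 2      # convexity term
loss_convexity = loss_duration - correction          # second-order estimate

print(round(loss_duration, 2), round(loss_convexity, 2))
```

With this illustrative convexity value, the estimate moves from about $1.94 billion toward the actual $1.64 billion loss, showing the direction in which the convexity correction works.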
3.4 Volatility and VaR (for question 3)
As a traditional measure used to quantify the risk of a financial instrument, volatility refers to the standard deviation of the instrument's returns over a specific period. A simple method to measure volatility is the simple-moving-average method, namely:

σ² = (1/T) Σ_{t=1..T} r_t²   (assuming mean return = 0)
Applying the simple-moving-average method, the volatility of the change in yields in December 1994 is calculated as:

σ = 0.402% (assuming mean return = 0)
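A minimal sketch of the simple-moving-average estimator, run on made-up monthly yield changes rather than the Treasury series from the case:

```python
import math

def sma_volatility(changes):
    """Simple-moving-average volatility with the mean assumed to be zero:
    sigma = sqrt(sum(r_t^2) / T)."""
    return math.sqrt(sum(r * r for r in changes) / len(changes))

# Hypothetical monthly yield changes, in percent.
sample = [0.10, -0.25, 0.40, -0.05, 0.30]
print(round(sma_volatility(sample), 4))
```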
As another measure used to quantify the risk of a financial instrument, value at risk (VaR) is nothing but the inverse of the loss distribution function, or the quantile function, which can be denoted by:

Pr(X ≤ x_q) = q
There are several ways to compute VaR, including the variance-covariance method, the historical-simulation method, and the Monte Carlo simulation method. In the variance-covariance method, the returns are assumed to be normally distributed, so the only two factors we need to estimate are the mean (expected) return and the standard deviation. Using the fitted normal distribution curve, we can compute the value of VaR at a given confidence level over a specific period.
Based on the attached data, the average change in the 5-year monthly yield is 0.01% and the standard deviation of the changes in yields is 0.40%. Under the normality assumption, the expected lowest and highest changes in the 5-year monthly yield at the 95% confidence level are estimated as:

μ = 0.01%; σ = 0.40%
highest: 0.01% + 1.65 × 0.40% = 0.67%
lowest: 0.01% - 1.65 × 0.40% = -0.65%

This means that at the 95% confidence level, the expected largest change in the 5-year monthly yield will not exceed 0.67%.
In this case, it is assumed that the portfolio is exposed only to changes in interest rates, so the change in yield can fully explain the change in portfolio value through the concept of duration. Using the duration approximation, the monthly portfolio VaR (by the variance-covariance method) in December 1994 at the 5% cut-off point is:

VaR = D_eff × equity × 1.65σ = 7.4 × $7.5 billion × 1.65 × 0.40% ≈ $0.37 billion
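A sketch of this variance-covariance monthly VaR under the case figures (σ = 0.40% per month, effective duration 7.4 years, equity $7.5 billion, one-sided 95% z ≈ 1.65 as used in the text):

```python
z_95 = 1.65          # one-sided 95% normal quantile
sigma_m = 0.0040     # monthly volatility of yield changes (0.40%)
d_eff = 7.4          # effective duration, years
equity = 7.5         # investor equity, $ billion

# Worst monthly yield rise at 95%, mapped to a dollar loss via duration.
worst_dy = z_95 * sigma_m
var_monthly = d_eff * equity * worst_dy
print(round(var_monthly, 2))   # $ billion
```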
Compared to the variance-covariance method, the historical-simulation method does not make the normality assumption but instead assumes that history will repeat itself. By putting the historical returns in ascending order from lowest to highest, historical simulation builds its own distribution without estimating any variances or covariances. The histogram of the monthly change in yields is drawn in Figure 1.
If we reorganize the monthly yield-change data in a new histogram (Figure 2) that compares the frequency of the returns, it is much clearer to see the density of yield changes within a specific interval. For example, at the highest point of the reorganized histogram (the tallest bar), there are more than 35 months in which the monthly yield change lies between 0.05% and 0.08%. At the far right, we can see a bar at 2.03%; it marks the single month (February 1980) within the 40-year period when the monthly yield change was as high as 2.01%.
From the cumulative density curve, we can see that with 95% confidence, the lowest monthly yield change will not exceed -0.56% and the highest monthly yield change will not exceed 0.66%.
Simply following the steps taken in the variance-covariance method, we can also calculate the monthly portfolio VaR (by the historical-simulation method) in December 1994 at the 95% confidence level:

VaR ≈ D_eff × equity × 0.66% = 7.4 × $7.5 billion × 0.66% ≈ $0.37 billion
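The historical-simulation analogue simply takes an empirical quantile of the observed yield changes. The sketch below uses synthetic data standing in for the 1953-1994 series, which sits in the attached excel file:

```python
def empirical_quantile(data, q):
    """Empirical q-quantile by sorting, nearest-rank style."""
    ordered = sorted(data)
    idx = min(int(q * len(ordered)), len(ordered) - 1)
    return ordered[idx]

# Synthetic monthly yield changes, in percent (illustrative only).
changes = [-0.6, -0.3, -0.1, 0.0, 0.05, 0.1, 0.2, 0.3, 0.5, 0.7]
worst_rise = empirical_quantile(changes, 0.95)   # 95th-percentile yield rise

d_eff, equity = 7.4, 7.5
var_hs = d_eff * equity * worst_rise / 100       # convert percent to decimal
print(round(worst_rise, 2), round(var_hs, 2))
```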
It is interesting to note that the two results are quite similar. But that does not mean the two methods are identical, or that the change in yields is normally distributed. This can be verified by the fact that the lowest yield change at the 95% confidence level is -0.65% under the variance-covariance method but -0.56% under the historical-simulation method. This reminds us of the main difference between the two methods, namely the normality assumption.
3.5 Converting the VaR (for question 4)
If we assume that monthly returns in different periods are normally distributed and uncorrelated with each other, the monthly VaR figures (from the above two methods) can be converted into an annual figure using the 'root-T' rule. At the same confidence level, the monthly VaR is converted as:

annual VaR = monthly VaR × √12
The estimated annual loss obtained from the above calculation is not consistent with the actual loss ($1.64 billion). One explanation is that the normality assumption is not realistic. Specifically, the Durbin-Watson test can be applied to test for autocorrelation in the returns.
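Under the stated independence-and-normality assumption, the root-T scaling is a one-liner. The figure below assumes a monthly VaR of roughly $0.37 billion, as implied by the case numbers (1.65 × 0.40% × 7.4 × $7.5 billion):

```python
import math

var_monthly = 0.37                         # monthly VaR, $ billion (approximate)
var_annual = var_monthly * math.sqrt(12)   # root-T rule with T = 12 months
print(round(var_annual, 2))                # $ billion per year
```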
3.6 EWMA model and time-varying volatility (for question 5)
When we measure volatility by the simple-moving-average method, it is questionable to give all older data the same weight. It is more reasonable to give more recent data a larger weight and older data a smaller weight. The exponentially weighted moving average (EWMA) method fixes this problem by introducing a smoothing parameter (λ). The EWMA formula is:

σ_t² = (1 - λ) Σ_{i≥0} λ^i · r_{t-i}²

or rewritten recursively as:

σ_t² = λ · σ_{t-1}² + (1 - λ) · r_t²

where the smoothing parameter (λ) is normally set to 0.94.
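The recursive form lends itself to a direct implementation. The sketch below uses λ = 0.94 and invented return data, updating the variance one observation at a time:

```python
import math

def ewma_volatility(returns, lam=0.94, sigma0=0.3):
    """EWMA update: sigma_t^2 = lam * sigma_{t-1}^2 + (1 - lam) * r_t^2."""
    var = sigma0 ** 2
    for r in returns:
        var = lam * var + (1 - lam) * r ** 2
    return math.sqrt(var)

# Hypothetical monthly yield changes, in percent.
print(round(ewma_volatility([0.2, -0.5, 0.1, 0.4]), 4))
```

Because each step keeps only λ of the previous variance, the influence of old observations decays geometrically, which is exactly the weighting scheme described above.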
In this case, the EWMA estimate at the start of the forecast window is σ = 0.30588%, and the EWMA forecast for December 1994 is σ = 0.31887%.
The full course of the calculation is shown in the attached calculations excel file. The actual changes in interest rates and the EWMA forecasts are compared in the table below.
The actual changes and the EWMA forecast σ values (%) for each month are given in the attached excel file; the resulting acceptance regions (95%) are:

(-0.4947, 0.5147)
(-0.5103, 0.5303)
(-0.5109, 0.5309)
(-0.5207, 0.5407)
(-0.5011, 0.5211)
(-0.5161, 0.5361)
It can be seen that all the actual changes in interest rates fall within the acceptance region, which is built from the EWMA forecast standard deviation at the 90% (two-sided) confidence level.
If the EWMA forecast volatility for December 1994 is used to calculate the VaR, the greatest annual loss at the 5% cut-off point is:

annual VaR = 1.65 × 0.31887% × 7.4 × $7.5 billion × √12 ≈ $1.01 billion
The new annual VaR is significantly smaller than the earlier figure, mainly because the EWMA-based monthly volatility for December 1994 (0.31887%) is smaller than the simple-moving-average estimate (0.402%) used in Section 3.4.
3.7 Backtesting the EWMA model (for question 6)
The family of VaR models is informative only if they predict risk reasonably well. Backtesting is one of the tools that check whether a model is adequate. Given a confidence level for backtesting, too many exceptions mean that the model underestimates risk. The simplest method of backtesting is to record the failure rate, that is, the proportion of exceptions relative to the expected number for a given VaR level and period.
In this case, we backtest the EWMA model on the last 100 months to see whether, at the 5% left-tail cut-off level (for the normal distribution), there are more than 5 'outliers'. The result is reported in the following figure (Figure 3).
Figure 3. The backtesting plot for the last 100 months
The figure shows two facts: on the one hand, there are two outliers lying below the lower bound built from the EWMA forecast VaR at the 5% cut-off level; on the other hand, there are five outliers lying above the upper bound forecast by the EWMA method. Both counts support the validity of the EWMA model.
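The failure-rate check itself is a simple count. The sketch below uses synthetic returns and a constant VaR forecast (both invented for illustration), flagging months whose loss exceeds the forecast:

```python
def backtest_failure_rate(returns, var_forecasts):
    """Count exceptions where the actual loss exceeds the forecast VaR
    (VaR given as a positive loss magnitude) and return the failure rate."""
    exceptions = sum(1 for r, v in zip(returns, var_forecasts) if -r > v)
    return exceptions, exceptions / len(returns)

# Synthetic data: 10 monthly returns against a constant VaR forecast of 0.5.
rets = [0.1, -0.2, -0.6, 0.3, -0.4, -0.7, 0.2, 0.0, -0.1, 0.4]
n_exc, rate = backtest_failure_rate(rets, [0.5] * len(rets))
print(n_exc, rate)
```

At the 5% level, the expected exception count is roughly 5% of the sample; a materially higher failure rate would suggest the model understates risk.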
In this paper, the Orange County bankruptcy case is reviewed by applying the tools of financial risk management. After studying the balance-sheet data and the monthly yield changes over the last 40 years, we discuss and attempt to answer all the case questions. As one root of the catastrophe, the effective duration of the fund was enlarged by leveraging the investor equity roughly 2.7 times. Exposed to interest-rate risk, the portfolio bore a huge loss, which can be calculated by the duration approximation. The measures of volatility and VaR provide a more precise way to weigh the expected worst loss at a given confidence level. With their different assumptions, the variance-covariance method and the historical-simulation method report different VaR results. Since the portfolio is exposed only to interest-rate risk in this case, we can combine VaR with the duration approximation to estimate the monthly and annual losses of the portfolio at the 5% cut-off point. Finally, the backtest on the last 100 months of data supports the validity of the EWMA method.