Messages from the Academic Literature on Risk Management for the Trading Book

After completing this reading, you should be able to:

  • Explain the following lessons on VaR implementation: time horizon over which VaR is estimated, the recognition of time-varying volatility in VaR risk factors, and VaR backtesting.
  • Describe exogenous and endogenous liquidity risk and explain how they might be integrated into VaR models.
  • Compare VaR, expected shortfall, and other relevant risk measures.
  • Compare unified and compartmentalized risk measurement.
  • Compare the results of research on “top-down” and “bottom-up” risk aggregation methods.
  • Describe the relationship between leverage, the market value of an asset, and VaR within an active balance sheet management framework.

In this chapter, we address fundamental, highly technical issues in current VaR-based approaches to risk management. We examine implementation questions such as whether to incorporate time-varying volatility, the appropriate horizon over which to estimate VaR, and the backtesting of VaR. We also weigh the merits and drawbacks of VaR as a risk metric and consider alternative measures proposed in the literature. Finally, we turn to broader risk management questions, such as how risk should be aggregated across the banking and trading books and how VaR interacts with active balance sheet management.

Selected Lessons on VaR Implementation

Time Horizon for Regulatory VaR

VaR has been used to determine the regulatory capital various institutions need to set aside as a way to mitigate the risks they face. However, one major issue that has been a constant subject of debate ever since the 1998 Market Risk Amendment came into force has much to do with the horizon over which VaR is calculated. The 1998 amendment set this horizon at 10 days. What’s more, it allows firms to calculate the one-day VaR and then use the square root of time rule to estimate the 10-day VaR.

Is the Ten-day Horizon Appropriate?

Most academics and industry experts agree that the appropriate horizon for VaR should depend on the characteristics of the position and the asset involved, and that an across-the-board application of the 10-day horizon is not optimal. For instance, it is well known that credit-related products carry risks that can only be captured over horizons longer than 10 days.

Other opponents of a constant risk horizon that does not vary across positions have put forward the following arguments:

  • Using the 10-day VaR to protect against losses in a liquidity crisis results in a gross probability mismatch. In particular, the ten-day horizon at 99% implies an event that occurs roughly 25 times a decade. In reality, however, a liquidity crisis is unlikely to happen even once over a 10-year period. In other words, a 10-day horizon implies a significantly higher frequency of liquidity crises than is observable in historical data.
  • A constant horizon is inappropriate even for the same financial product because trade execution strategies are subject to parameters that vary across time. Such parameters include transaction costs, expected price volatility, and risk aversion.
  • VaR horizon should depend on the economic purpose of VaR. The argument among risk managers is that while a one-day horizon is appropriate for trading purposes, it falls short when it comes to the management of longer-term solvency and capital risks.

Some of the suggestions put forward to help managers compute VaR over longer horizons include:

  • Calculating the VaR at a shorter horizon and then scaling the result up to the desired horizon using the square root of time rule (a brief numerical sketch of the rule follows this list): $$ T\text{-day VaR} = 1\text{-day VaR} \times \sqrt{T} $$ The problem with scaling is that it is likely to underestimate tail risk.
  • Directly calculating the VaR over the desired horizon, whatever length it may be.
    But the downside of this approach is that there may not be enough data to support such an exercise. For example, there would be very limited P/L data on newly traded products.
  • Incorporating a tool that predicts future trading activity in VaR models.
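
To make the scaling mechanics concrete, here is a minimal Python sketch of the square root of time rule applied to a hypothetical daily P&L series; the data, scale, and confidence level are purely illustrative assumptions and are not drawn from the text.

```python
import numpy as np

# Hypothetical daily P&L series in $ millions (illustrative only).
rng = np.random.default_rng(42)
daily_pnl = rng.normal(loc=0.0, scale=1.5, size=1000)

# 1-day 99% VaR: the loss exceeded on roughly 1% of days, reported as a positive number.
var_1d = -np.percentile(daily_pnl, 1)

# Square-root-of-time scaling to the regulatory 10-day horizon.
var_10d = var_1d * np.sqrt(10)

print(f"1-day 99% VaR:           {var_1d:.2f}")
print(f"10-day 99% VaR (scaled): {var_10d:.2f}")
```

Because the sketch draws i.i.d. normal P&L, the scaling is internally consistent here; with volatility clustering or jumps, the scaled figure would tend to understate true 10-day tail risk, which is exactly the criticism taken up in the next subsection.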

Bottom line: A horizon longer than 10 days is generally recommended, but there is no obvious, one-size-fits-all solution to the horizon problem.

Is Square-Root of Time Scaling a Good Idea?

To estimate long-horizon VaR as short-horizon VaR scaled by the square root of time, we must apply a set of restrictive assumptions that rarely hold in practice. The square root of time rule has been criticized for the following reasons:

  • For starters, the rule assumes that risk factor returns are i.i.d. (independent and identically distributed). In other words, the square root of time rule assumes that price moves in one period are independent of those in other periods. In practice, the distribution of losses changes over time; some periods exhibit clusters of large losses while others are relatively calm.
  • When applied to quantiles, the square root of time rule additionally assumes that returns follow a normal distribution. In reality, returns exhibit excess kurtosis, i.e., they are “fat-tailed.”
  • It has also been established that when risk factors exhibit jumps (sudden, large loss-generating moves), scaling by the square root of time systematically underestimates risk, and the downward bias tends to be amplified at longer horizons. The argument is that the rule does not scale jump risk sufficiently.

Bottom line: Although the rule rests on assumptions that are frequently violated in practice, academics (and industry practitioners by extension) have not been able to come up with a sufficiently strong alternative. The square root of time rule therefore continues to be a useful tool in risk estimation and plays an important role in the calculation of regulatory capital under the Basel Capital Accord.

Time-varying Volatility

Certain asset classes are known to exhibit time-varying volatility: the variability of their prices (and hence of their returns) changes from one period to the next. For example, some assets exhibit low volatility during the summer, when a sizeable proportion of traders are on vacation.

Is It Necessary to Incorporate Time-varying Volatilities and Correlations?

Whether or not to account for time-varying volatility in VaR models has been a contentious question for some time. It is safe to say that the industry has been warming up to risk measures that react quickly to changing conditions, such as exponentially time-weighted measures of volatility. There is a growing realization that historical simulation that ignores time-varying volatility can underestimate the true risk of positions, precisely because the underlying risk factors do exhibit time-varying volatility.

Some scholars oppose incorporating time-varying volatility in VaR. Their argument is that volatility forecastability decays quickly with the time horizon for most equity, fixed income, and foreign exchange assets. The implication is that when estimating VaR over short horizons, time-varying volatility needs to be taken into account, whereas when estimating risk over longer horizons, incorporating time-varying volatility has less of an impact.

Bottom line: Recognizing time-varying volatility is necessary going forward, given that it is a persistent feature of many financial risk factors. Indeed, most models used to price financial instruments such as stocks and swaps incorporate time-varying volatility. However, if regulators recognize time-varying volatility when calculating firms’ regulatory capital needs, that capital may end up being highly volatile and pro-cyclical.

What are the Methods used to Incorporate Time-Varying Volatility in VaR for Large and Complex Portfolios?

One industry standard for incorporating time-varying volatility in VaR is the Exponentially Weighted Moving Average (EWMA) approach, which is a constrained version of an IGARCH(1,1) model with the decay parameter set to 0.97.

An alternative is to use weighted historical data, where the weight attached to a given observation is given by:

$$ w_i=\cfrac {\theta^i (1-\theta)}{1-\theta^n } $$

where \(w_i\) is the weight of an observation made \(i\) days ago, \(n\) is the total number of days in the historical window, and \(\theta\) is a number between zero and one that controls the rate of memory decay.
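
As a quick illustration, the Python snippet below (a minimal sketch, not part of the original text) evaluates these weights; the window length and decay parameter are chosen to match Question 2 at the end of this chapter.

```python
import numpy as np

def age_weights(n: int, theta: float) -> np.ndarray:
    """Weights w_i = theta**i * (1 - theta) / (1 - theta**n) for observations i = 1..n days ago."""
    i = np.arange(1, n + 1)
    return theta**i * (1 - theta) / (1 - theta**n)

# 20-day window with a decay parameter of 0.9 (the inputs used in Question 2 below).
w = age_weights(n=20, theta=0.9)
print(round(w[4], 5))   # weight of the observation made 5 days ago: 0.06722
```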

Filtered Historical Simulation (FHS) is another relatively simple approach: each risk factor is first filtered through a GARCH model so that historical returns are standardized by their estimated volatility and then re-scaled by an updated volatility forecast. For portfolios with large numbers of risk factors, the FHS approach entails separately filtering each risk factor and building a volatility forecast for each factor. Weights can then be applied following the EWMA scheme or the alternative method introduced above, under the assumption that the correlation structure across risk factors stays constant over time. Time-varying volatilities and correlations can also be estimated with multivariate GARCH models, but the estimation becomes increasingly difficult as the number of risk factors grows.

As a result, industry players have resorted to less burdensome alternatives, such as applying simple weighting schemes across observations or shortening the data window used for VaR estimation. These alternatives are less accurate but remain workable for large and complex portfolios.
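
The sketch below illustrates the filtering idea for a single risk factor. For brevity it uses an EWMA volatility filter in place of a fitted GARCH model (an assumption made here, not the method prescribed in the text), and the simulated returns are purely illustrative.

```python
import numpy as np

def ewma_vol(returns: np.ndarray, lam: float = 0.94) -> np.ndarray:
    """Recursive EWMA volatility estimate for each observation (decay parameter is illustrative)."""
    var = np.empty_like(returns)
    var[0] = returns[0] ** 2
    for t in range(1, len(returns)):
        var[t] = lam * var[t - 1] + (1 - lam) * returns[t] ** 2
    return np.sqrt(var)

def filtered_hs_var(returns: np.ndarray, alpha: float = 0.01, lam: float = 0.94) -> float:
    """One-day VaR via filtered historical simulation with an EWMA volatility filter."""
    sigma = ewma_vol(returns, lam)
    z = returns / sigma                                   # devolatilized historical returns
    sigma_now = np.sqrt(lam * sigma[-1] ** 2 + (1 - lam) * returns[-1] ** 2)
    scenarios = z * sigma_now                             # re-scale by today's volatility forecast
    return -np.quantile(scenarios, alpha)                 # VaR reported as a positive loss

# Illustrative use on simulated fat-tailed returns.
rng = np.random.default_rng(0)
r = rng.standard_t(df=5, size=1500) * 0.01
print(f"99% one-day FHS VaR: {filtered_hs_var(r):.4f}")
```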

Backtesting VaR

Backtesting is one of the most popular methods used to validate VaR models. To validate a model, a risk manager assesses whether the model does what it is built to do: estimate VaR with some stated level of accuracy. If there are too many exceptions (days on which the actual loss exceeds the VaR estimate, relative to the number expected at the chosen confidence level), the model is considered flawed and has to be reexamined.

For regulatory capital, a multiplier is imposed on VaR, depending on the number of backtesting exceptions a bank experiences. Several tests have been put forward to statistically establish whether VaR has the correct number of exceptions. One such test is the unconditional coverage likelihood ratio test. Although simple to implement, the test has two major flaws:

  1. When the number of trading days used in VaR evaluation is limited (e.g., 252 days per year), or when the confidence level is high (e.g., 99% as in regulatory VaR), the test has low power. Power, in this context, refers to the probability of making the correct decision, i.e., rejecting the model when, in fact, it is false.
  2. Since the test only counts exceptions, its power may be improved by considering other aspects of the data such as the grouping of exceptions in time.

To remedy the fact that exceptions may cluster in time, a conditional coverage likelihood ratio test has been put forward. This test accounts for both the timing and the number of exceptions; however, it is still susceptible to the power problem. Several other backtesting methods have also been proposed, including tests based on the mean squared error and on the magnitude of the observed exceptions.
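
Below is a minimal sketch of the unconditional coverage (Kupiec) likelihood ratio test described above. The observation and exception counts are hypothetical, and scipy is assumed to be available for the chi-squared p-value.

```python
import numpy as np
from scipy.stats import chi2

def kupiec_lr(n_obs: int, n_exceptions: int, p: float = 0.01) -> tuple[float, float]:
    """Unconditional coverage LR statistic and its p-value (chi-squared, 1 degree of freedom)."""
    T, x = n_obs, n_exceptions
    phat = x / T
    log_l0 = (T - x) * np.log(1 - p) + x * np.log(p)           # likelihood if the model is correct
    log_l1 = ((T - x) * np.log(1 - phat) + x * np.log(phat)    # likelihood at the observed rate
              if 0 < x < T else 0.0)
    lr = -2 * (log_l0 - log_l1)
    return lr, chi2.sf(lr, df=1)

# Example: 252 trading days and 6 exceptions against a 99% VaR (about 2.5 are expected).
lr, p_value = kupiec_lr(n_obs=252, n_exceptions=6)
print(f"LR = {lr:.2f}, p-value = {p_value:.3f}")
```

In this hypothetical example, six exceptions (more than twice the expected 2.5) give an LR of roughly 3.5 and a p-value of about 0.06, so the model would not be rejected at the 5% significance level, which illustrates the low-power problem noted above.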

In conclusion, backtesting has the following weaknesses:

  • It is not effective when the number of VaR exceptions is small.
  • It is less effective over longer time horizons due to portfolio instability.

Exogenous versus Endogenous Liquidity Risk

In broad terms, liquidity is the capacity to transact quickly and at low cost. During a financial crisis, the ease with which a trader can unwind a position can be adversely affected; in particular, the time it would take to unwind a position without a material effect on its price may increase significantly. There are two types of liquidity risk: exogenous and endogenous liquidity risk.

Exogenous liquidity risk refers to liquidity risk that is outside a trader’s control. It corresponds to the variability of bid-ask spreads for average-sized transactions. Endogenous liquidity risk, on the other hand, refers to liquidity fluctuations driven by the trader’s own actions, such as the size of the position taken. This type of risk is under the trader’s control and usually arises when a trader unloads large positions that the market cannot easily absorb. In other words, endogenous liquidity captures the effect of liquidating large positions on market prices. Although both types of risk are important, more emphasis is placed on endogenous risk.

Conventional value-at-risk models ignore liquidity risk, which leads to an underestimation of overall risk and a flawed calculation of capital requirements for institutions.

How Do We Handle Exogenous and Endogenous Liquidity Risk?

Exogenous liquidity risk can be handled by calculating a liquidity-adjusted VaR (LVaR) measure, which incorporates the bid/ask spread by adding a liquidity cost to the initial estimate of VaR.
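
As an illustration, here is a minimal sketch of one common constant-spread style adjustment, in which half of the (mean plus scaled volatility of the) relative bid/ask spread, applied to the position value, is added to VaR. All input values are hypothetical assumptions; the same cost-of-liquidity expression appears in Question 4 at the end of this chapter.

```python
def liquidity_adjusted_var(var: float, position_value: float,
                           mean_spread: float, spread_vol: float,
                           spread_quantile: float = 2.33) -> float:
    """LVaR = VaR + cost of liquidity, with COL = P * (mean_spread + quantile * spread_vol) / 2.

    Spread inputs are relative spreads (spread divided by the mid price).
    """
    cost_of_liquidity = position_value * (mean_spread + spread_quantile * spread_vol) / 2
    return var + cost_of_liquidity

# Hypothetical inputs: a $10m position, 1-day 99% VaR of $0.35m,
# a mean relative spread of 0.2% and a spread volatility of 0.1%.
print(liquidity_adjusted_var(var=0.35, position_value=10.0,
                             mean_spread=0.002, spread_vol=0.001))
```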

Studies have suggested adding an endogenous liquidity cost to a trade’s returns before computing the position’s VaR.

However, there is an overriding argument that liquidity risk can be managed better by adhering to accounting standards and valuation methods: the failure to apply most liquidity risk adjustments to trading books has been largely attributed to non-compliance with existing accounting standards during the valuation process. It is also important to note that some authors have suggested integrating liquidity risk into market and credit risk measurement.

VaR, Expected Shortfall, and Other Risk Measures

VaR estimates the maximum loss that can occur at a specified level of confidence. It is a popular measure of risk, not least because it is easy to calculate and interpret. Despite the significant role VaR plays in risk management, it stops short of telling us the magnitude of the loss when the VaR threshold is breached. If the 95% VaR is, say, $2 million, we expect to lose no more than $2 million with 95% confidence, but we do not know how large the loss would be in the remaining 5% of cases. To get an idea of the magnitude of that loss, we need to compute the expected shortfall.

Expected shortfall (ES) is the expected loss given that the portfolio return already lies below the pre-specified worst-case quantile return, e.g., below the 5th percentile return. Put differently, expected shortfall is the mean loss among the returns falling below the q-quantile (q is usually 5%). It helps answer the question: if we experience a catastrophic event, what is the expected loss on our financial position?

The expected shortfall (ES) provides an estimate of the tail loss by averaging the VaRs for increasing confidence levels in the tail. It is also called the expected tail loss (ETL) or the conditional VaR.
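
The snippet below is a minimal sketch contrasting historical-simulation VaR and ES on a simulated fat-tailed return sample; the data and the 95% level are illustrative assumptions.

```python
import numpy as np

def var_and_es(returns: np.ndarray, alpha: float = 0.05) -> tuple[float, float]:
    """Historical-simulation VaR and expected shortfall at level alpha (losses as positive numbers)."""
    var = -np.quantile(returns, alpha)
    tail_losses = -returns[returns <= -var]   # returns at or beyond the VaR threshold
    return var, tail_losses.mean()

# Illustrative fat-tailed sample of daily returns.
rng = np.random.default_rng(1)
r = rng.standard_t(df=4, size=10_000) * 0.01
var95, es95 = var_and_es(r)
print(f"95% VaR: {var95:.4f}   95% ES: {es95:.4f}")
```

By construction, the ES is at least as large as the VaR at the same confidence level, since it averages the losses beyond the VaR threshold.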

Coherent Risk Measures

It is possible to estimate coherent risk measures by manipulating the “average VaR” method. A coherent risk measure is a weighted average of the quantiles (denoted \(q_p\)) of the loss distribution:

$$ M_\phi =\int _{ 0 }^{ 1 }{ \phi (p)\, q_p\, dp } $$

where the weighting function \(\phi(p)\) is specified by the user, depending on their degree of risk aversion. The ES gives all tail-loss quantiles an equal weight of \(1/(1-\text{cl})\) and all other quantiles a weight of zero; the ES is therefore a special case of \(M_\phi\).

Under the more general coherent risk measure, the entire distribution is divided into equal probability slices weighted by the more general risk aversion (weighting) function.

We can illustrate this procedure for n = 10. The first step is to divide the entire return distribution into nine (10 – 1) equal probability-mass slices (loss quantiles); each breakpoint marks a different quantile (the example values below are standard normal quantiles).

For example, the 10% quantile (confidence level = 10%) is -1.2816, the 30% quantile (confidence level = 30%) is -0.5244, the 50% quantile (confidence level = 50%) is 0.0, and the 90% quantile (confidence level = 90%) is 1.2816. Each quantile is then weighted by the chosen risk-aversion function, and the weighted quantiles are averaged to arrive at the value of the coherent risk measure.
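
The following sketch discretizes this calculation for the n = 10 example, using the standard normal quantiles above together with an exponential risk-aversion weighting function; the weighting function and its parameter are illustrative assumptions rather than part of the original example.

```python
import numpy as np
from scipy.stats import norm

n = 10
p = np.arange(1, n) / n                    # breakpoints 0.1, 0.2, ..., 0.9
quantiles = norm.ppf(p)                    # standard normal quantiles: -1.2816, ..., +1.2816

# Exponential risk-aversion weighting: larger weight on the worst (highest-p) loss quantiles.
gamma = 0.25                               # smaller gamma = more risk-averse (assumed value)
phi = np.exp(-(1 - p) / gamma)

weights = phi / phi.sum()                  # normalize the discretized weights to sum to one
m_phi = np.sum(weights * quantiles)        # weighted average of the loss quantiles
print(f"Discretized coherent risk measure: {m_phi:.4f}")
```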

Unified versus Compartmentalized Risk Measurement

Under a compartmentalized approach, the risks present across a bank’s trading and banking books are measured separately and then summed. Under a unified approach, we explicitly consider the interactions between these risks when modeling the aggregate risk present across the bank.

Most institutions determine their aggregate economic capital needs via a two-step process. First, capital needs are determined for each individual risk type. Then, the stand-alone economic capital requirements are added up to obtain the overall capital requirement for the bank. The Basel regulatory framework, in particular, uses a “building block” approach in which a bank’s regulatory capital requirement is the sum of the capital requirements for the various risk categories, grouped under Pillar 1 and Pillar 2. Pillar 1 covers market, credit, and operational risk. Pillar 2 covers, among other things, concentration risk, liquidity risk, residual risk, and business risk, as well as the results of stress tests.

As such, the Basel regulatory framework essentially embodies a non-integrated, compartmentalized approach to risk measurement. However, simply calculating individual risks and adding them together will not necessarily produce an accurate measure of true risk. Industry experts and scholars have therefore advocated a single-step, unified process for calculating capital needs, informed by the need to take correlations and interrelationships among the various risks into account.

“Top-down” and “Bottom-up” Risk Aggregation Methods

An institution’s risk can be decomposed into credit, market, and operational risk. However, there are linkages among these risks, and the process of separating them can get complicated. For instance, if a bank holds a loan denominated in a foreign currency, it is exposed to both foreign exchange risk and credit risk. It is, therefore, important to consider the interactions among the various risks when measuring and aggregating them. This leads us to two different risk aggregation methods.

A top-down approach to risk management assumes that all major risks of a bank are separable and can be addressed independently. In this case, a firm establishes a strategy to identify the major risks which weigh on it.

A bottom-up risk aggregation method attempts to establish the interactions among various risk factors. The starting point is a census of a company’s processes so as to isolate all the risks facing it.

Which One is More Appropriate?

To assess which approach is more appropriate, academics have examined the ratio of unified (integrated) capital to compartmentalized (summed stand-alone) capital. Multiple top-down studies have estimated ratios below one, suggesting that a unified approach indeed captures diversification benefits. Some bottom-up studies have also found ratios below one, but the research is not conclusive; evidence of risk compounding, which produces ratios greater than one, has also been found, raising questions about the presumed diversification benefits.

Bottom line: Although most academic studies advocate evaluating credit and market risk jointly, a conservative approach is to evaluate each risk separately. Because the joint approach captures diversification benefits, independent assessment of the two risks results in capital requirements that exceed those calculated under the joint approach. As such, separate evaluation provides an upper bound on the integrated capital level.

Relationship between Leverage, Market Value of Asset, and VaR Within an Active Balance Sheet Management Framework

Definitions:

  • Leverage refers to the ratio of total assets to total equity.
  • Economic capital is the amount of capital that a bank needs to ensure that it stays solvent given its risk profile.

For a bank that actively manages its balance sheet, leverage becomes procyclical, i.e., positively correlated with the state of the market. The change in leverage occurs because shifting market prices feed into risk models and the resulting capital requirements, which in turn force changes to the balance sheet. If, for example, markets move such that a bank’s measured risk exposure increases, the bank is forced to hold more capital (or shed exposure). Capital requirements therefore tend to amplify boom/bust cycles, and active risk management leads to frequent changes in the balance sheet.

For a given amount of debt, leverage is inversely related to the market value of total assets: when net worth rises, leverage decreases, and when net worth declines, leverage increases. A bank that targets a given level of leverage therefore buys assets when asset prices are rising and sells assets when asset prices are declining, producing a procyclical feedback loop.

A bank’s level of economic capital goes hand in hand with VaR. Banks often target a given ratio of VaR to economic capital.
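
As a minimal worked relation (the same one applied in Question 3 below): if the bank holds economic capital \(K\) equal to a multiple \(\lambda\) of its VaR, then

$$ K=\lambda \times \text{VaR} \quad \Rightarrow \quad L=\frac { A }{ K } =\frac { 1 }{ \lambda } \times \frac { A }{ \text{VaR} } $$

so, for a given asset value \(A\), a higher measured VaR mechanically lowers the leverage the bank can sustain, and vice versa.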

Question 1

An institution’s balance sheet shows $1.2 billion of debt and $82 million of equity on the liabilities side. Calculate the institution’s leverage.

  1. 14.63.
  2. 0.014.
  3. 15.63.
  4. 0.016.

The correct answer is C.

Note that leverage is the total assets to equity ratio. This is given by the expression

$$ L=\frac { \text{Assets} }{ \text{Equity} } $$

But from the accounting equation:

$$\begin{align*} \text{Assets}& = \text{Total Liabilities} + \text{Total Equity}\\ &=\$\text{1,200 million} + \$\text{82 million} = \$\text{1,282 million}\\ \Rightarrow L&=\frac { 1,282 }{ 82 } =15.63 \end{align*}$$

Question 2

Find the weight of an observation made 5 days ago if the total number of days in the historical window is 20 and the memory decay parameter \(\theta\) is 0.9.

  1. 0.2969.
  2. 0.2198.
  3. 0.1216.
  4. 0.0672.

The correct answer is D.

The weight of observation \(i\)-days ago is given by:

$$ W\left( i \right) =\frac { { \theta }^{ i }\left( 1-\theta \right) }{ 1-{ \theta }^{ n } } $$

where \(n\) is the number of days in the historical window and \(\theta\) is the memory decay parameter:

$$ \therefore W\left( 5\right) =\frac { { 0.9 }^{ 5 }\left( 1-0.9 \right) }{ 1-{ 0.9 }^{ 20 } } =0.06722$$

Question 3

Albatros Bank holds capital equal to 4.3 times its total VaR, and the future value of its assets is $87 million. If the Value at Risk is $0.8 million, the leverage of Albatros Bank is closest to:

  1. 25.29.
  2. 467.63.
  3. 0.0021.
  4. 0.0395.

The correct answer is A.

Recall that leverage \(L\) is given by:

$$ L=\frac { A }{ K } =\frac { 1 }{ \lambda } \times \frac { A }{ VaR } $$

where \(A\) is the future value of assets and \(\lambda\) is the proportion of capital to be held per unit of total VaR.

$$ \Rightarrow L=\frac { 1 }{ 4.3 } \times \frac { 87 }{ 0.8 } =25.29 $$

Question 4

The relative spread \(S\) of a portfolio has a sample mean of 20.6 and a variance of 40.2. If the 99% quantile of the spread distribution is 0.50, find the expression for the cost of liquidity of the portfolio, where \({ P }_{ t }\) is today’s position value.

  1. \(23.77\,{ P }_{ t }\).
  2. \(11.89\,{ P }_{ t }\).
  3. \(20.35\,{ P }_{ t }\).
  4. \(40.7\,{ P }_{ t }\).

The correct answer is B.

Remember that:

$$ { COL }_{ t }={ P }_{ t }\left( \frac { \hat { \mu } +\widehat { { a }_{ 0.99 } } \hat { \sigma } }{ 2 } \right) $$

Since the variance is \(\hat { \sigma }^{ 2 } = 40.2\), we have \(\hat { \sigma } =\sqrt { 40.2 } = 6.3403\).

Therefore:

$$ { COL }_{ t }={ P }_{ t }\left( \frac { 20.6+0.50\times 6.3403 }{ 2 } \right) ={ P }_{ t }\left( \frac { 23.77 }{ 2 } \right) =11.89\,{ P }_{ t } $$
