The chart above shows an often-overlooked measure for determining the risk of owning equities. The graphic depicts the Ulcer Index as applied to the Dow Jones Industrial Average (DJIA) since October 1928, which is when easily accessible archived records begin.

The index is sometimes applied as a metric for gauging the strength or weakness of a portfolio manager's performance. Most pertinently, it addresses how much the portfolio has suffered in terms of its maximum draw-down. The notion of draw-down is useful for determining the risk and volatility of the returns that one would actually experience in holding a portfolio (in this case the 30 stocks of the DJIA). This differs from the more commonly used Sharpe Ratio, which is based on the standard deviation of the returns.

The standard deviation measures the amount of variation around the average and is probably the most widely used measure of portfolio performance and investment risk.

But the standard deviation suffers from two major weaknesses as a gauge of the real risk in holding financial assets. The first is that it treats variability above the average (which for a long investor is desirable) exactly the same as variability below the average, which is clearly undesirable for an investor holding a long portfolio.

The second and more fundamental problem is that the standard deviation fails to register the sequencing of returns. When calculating the Sharpe Ratio, one tabulates, say, the weekly or monthly returns from a portfolio, measures the standard deviation of those returns, converts it to an annualized rate, and then divides the annualized return of the portfolio by that annualized standard deviation. The result is the Sharpe Ratio, popularized by Nobel laureate William Sharpe, which is one of the most widely followed benchmarks for assessing the skill (and inherent risk) of a portfolio management strategy.
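The calculation just described can be sketched in a few lines of Python. The weekly returns below are hypothetical, and the risk-free rate is omitted for simplicity:

```python
import math

def sharpe_ratio(weekly_returns, periods_per_year=52):
    """Annualized mean return divided by annualized standard deviation
    of the periodic returns (risk-free rate omitted for simplicity)."""
    n = len(weekly_returns)
    mean = sum(weekly_returns) / n
    variance = sum((r - mean) ** 2 for r in weekly_returns) / (n - 1)
    ann_return = mean * periods_per_year
    ann_std = math.sqrt(variance) * math.sqrt(periods_per_year)
    return ann_return / ann_std

# Hypothetical weekly returns as fractions (0.012 = 1.2%)
returns = [0.012, -0.008, 0.005, 0.021, -0.015, 0.003, 0.009, -0.004]
print(round(sharpe_ratio(returns), 3))
```

Note that the function receives the returns as an unordered bag of numbers: nothing in the arithmetic depends on which week came first.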

The problem with this approach can be glimpsed by imagining the returns randomly shuffled, or sorted in ascending or descending order. The returns actually experienced by an investor under these reorderings would be vastly different, with sequences of losses and gains bearing entirely different characteristics from the original series. And yet the standard deviation of the returns would be identical.
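A small sketch with made-up returns makes the point concrete: reordering the same set of returns leaves the standard deviation untouched, while the maximum draw-down changes substantially once the losses cluster together.

```python
import statistics

def max_drawdown(returns):
    """Worst peak-to-trough decline of cumulative equity, as a fraction."""
    equity, peak, worst = 1.0, 1.0, 0.0
    for r in returns:
        equity *= 1 + r
        peak = max(peak, equity)
        worst = max(worst, (peak - equity) / peak)
    return worst

returns = [0.05, -0.04, 0.03, -0.06, 0.02, -0.01]
descending = sorted(returns, reverse=True)  # gains first, losses clustered at the end

# Same dispersion either way...
print(statistics.stdev(returns) == statistics.stdev(descending))
# ...but clustering the losses produces a deeper draw-down.
print(max_drawdown(returns), max_drawdown(descending))
```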

What is missing from the standard deviation metric is any sense of how much risk and discomfort is experienced by someone holding a portfolio as a result of the actual sequence of the returns. Because losses often cluster, this can only be properly captured by the notion of a draw-down, not by variability or deviation from the average return of the series.

**Draw-downs**

In essence, the draw-down is the amount a portfolio has lost, tracked on a periodic basis, from its current level relative to the high-water mark of the portfolio's value. The high-water mark is itself a moving quantity; for the DJIA it can be seen as a continuous record of the maximum value of the index achieved as of the date of measurement. From this high-water mark one can calculate, for each snapshot in time, the percentage change of the portfolio with respect to the maximum value the index has attained so far.
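The running high-water mark and the per-period draw-down can be computed in a few lines; the index values below are arbitrary illustrative numbers:

```python
def drawdown_series(prices):
    """Percentage decline of each observation from the running maximum."""
    high_water = float("-inf")
    drawdowns = []
    for p in prices:
        high_water = max(high_water, p)  # the moving high-water mark
        drawdowns.append(100.0 * (p - high_water) / high_water)
    return drawdowns

prices = [100, 105, 103, 98, 110, 104]
print(drawdown_series(prices))
```

Each entry is zero when the index is at a new high and negative otherwise, which is the "peak to present valley" distance described below.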

Sometimes this is depicted metaphorically as measuring the distance from the peak to the present valley as the time series develops. The maximum draw-down simply keeps track of the deepest valley reached (assuming the index is not currently making a new high) relative to the highest peak.

The chart above uses the technique described by Peter Martin, who developed the Ulcer Index and who describes its construction in some detail here. The Ulcer Index is described as follows:

> Ulcer Index measures the depth and duration of percentage draw-downs in price from earlier highs. Technically, it is the square root of the mean of the squared percentage drops in value. The greater a draw-down in value, and the longer it takes to recover to earlier highs, the higher the UI. The squaring effect penalizes large draw-downs proportionately more than small draw-downs.
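That definition translates directly into code. The sketch below follows the quoted description — squared percentage drops from the running high, averaged, then square-rooted — with an arbitrary illustrative price series:

```python
import math

def ulcer_index(prices):
    """Square root of the mean of squared percentage drops
    from the running high-water mark."""
    high_water = prices[0]
    sum_sq = 0.0
    for p in prices:
        high_water = max(high_water, p)
        drop = 100.0 * (high_water - p) / high_water
        sum_sq += drop ** 2
    return math.sqrt(sum_sq / len(prices))

prices = [100, 105, 103, 98, 110, 104]
print(round(ulcer_index(prices), 3))
```

A series that only ever makes new highs scores exactly zero, since every drop term vanishes.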

The simple adaptation I have introduced is to re-calculate the Ulcer Index for every trailing 52-week period of the extended DJIA time series. The high-water mark is calculated from the very beginning of the series, but the squared percentage drops in value are summed over the trailing 52 weeks only. This sum is then divided by 52 and the square root of the result is taken. Each of the values obtained forms a data point in the series displayed in the graph above.
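That adaptation can be sketched as follows. The high-water mark accumulates from the start of the full series, while only the most recent 52 squared drops enter each value; the `weekly` series below is a hypothetical stand-in for the DJIA closes:

```python
import math

def rolling_ulcer_index(prices, window=52):
    """Ulcer Index recomputed over each trailing `window` of observations.
    The high-water mark runs from the very start of the series, but only
    the last `window` squared percentage drops enter each value."""
    high_water = float("-inf")
    sq_drops = []
    for p in prices:
        high_water = max(high_water, p)
        drop = 100.0 * (high_water - p) / high_water
        sq_drops.append(drop ** 2)
    return [
        math.sqrt(sum(sq_drops[i - window + 1 : i + 1]) / window)
        for i in range(window - 1, len(sq_drops))
    ]

# Hypothetical weekly closes: flat, then a sustained 10% drop
weekly = [100.0] * 40 + [90.0] * 20
print(rolling_ulcer_index(weekly, window=52)[:3])
```

Each element of the result corresponds to one weekly data point in the chart above.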

The Ulcer Index is far more useful as a measurement of investor discomfort (which is why its originator decided to call it the *ulcer* index), since it captures the actual re-tracements in one's account equity. There is no escaping the actual sequencing of returns, as there is when measuring simple variability via the standard deviation and the Sharpe Ratio.

Also evident in the graphic is the time it takes for the portfolio to regain its value relative to its historic highs at the point the measurement is made.

Just from observing the data, it can be seen that the discomfort of the 1930s still far exceeds anything an investor would have experienced over the past year.

However, it is also significant that the recent values registered on the Ulcer Index are in excess of anything seen since the 1930s, surpassing the market drops of the 1970s and those of the early 2000s.
