Thirty years after he was awarded the Nobel Memorial Prize in Economic Sciences, Harry Markowitz’s groundbreaking work from the 1950s still powers financial innovation
Precisely 30 years ago, in December 1990, Harry Markowitz was awarded the Nobel Memorial Prize in Economic Sciences, officially the Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel, alongside Merton H Miller and William F Sharpe.
Markowitz is referred to, quite rightfully, as the father of Modern Portfolio Theory (MPT). MPT provides the structural framework for today’s investment markets. At one extreme, it has led to the drive towards passive investment through indexation, while at the other it has underpinned the most sophisticated risk-controlled approaches to active management.
The topic that Markowitz was asked to explore for his doctoral thesis at the University of Chicago was how best to structure a portfolio of equities. His thesis covered the computation of efficient sets for large numbers of securities, and how to incorporate mean-variance analysis into the theory of rational behaviour under uncertainty.
According to Markowitz, Milton Friedman argued that while the dissertation contained no mistakes, he could not award a PhD in economics for a dissertation that was not about economics. It is ironic that, 38 years later, Markowitz was awarded the Nobel Memorial Prize in Economic Sciences for precisely the work described in that thesis, first published in 1952.
His revolutionary approach was to recognise that while equities may offer dividend streams in the future, they also carry risk in the form of volatile share prices, which should be taken into account in a systematic manner. The way to do this, he postulated, was by defining risk as the standard deviation of total equity returns and recognising that the standard deviation of the returns of a portfolio of equities, or assets in general, is reduced to the extent that the securities are uncorrelated with each other.
Combining stocks which tend to move in opposite directions will reduce the overall standard deviation in the portfolio to levels below that of individual stocks. It is then possible to produce an ‘efficient frontier’ of portfolios, each of which maximises future return expectations for a given level of overall portfolio risk. That insight, obvious today, was a revelation when first proposed.
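The diversification arithmetic can be sketched in a few lines of Python, using purely illustrative volatility figures: for two assets with weights w1 and w2, portfolio variance is w1²σ1² + w2²σ2² + 2·w1·w2·ρ·σ1·σ2, so the lower the correlation ρ, the lower the combined risk.

```python
import math

def portfolio_vol(w1, sigma1, sigma2, rho):
    """Standard deviation of a two-asset portfolio with weights w1 and 1 - w1."""
    w2 = 1.0 - w1
    var = (w1 * sigma1) ** 2 + (w2 * sigma2) ** 2 \
        + 2.0 * w1 * w2 * rho * sigma1 * sigma2
    return math.sqrt(var)

# Two stocks, each with 20% annual volatility, held 50/50.
print(portfolio_vol(0.5, 0.20, 0.20, 1.0))    # perfectly correlated: 0.20
print(portfolio_vol(0.5, 0.20, 0.20, 0.0))    # uncorrelated: ~0.141
print(portfolio_vol(0.5, 0.20, 0.20, -1.0))   # perfectly opposed: 0.0
```

With perfect correlation, diversification achieves nothing; with zero correlation, risk falls by a factor of √2; with perfect negative correlation, it can be eliminated entirely.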
The mathematical problem that Markowitz solved was, given a set of stocks with expected returns, standard deviations and correlations with one another, how do you determine the set of efficient portfolios that maximises the expected return for a given level of risk?
Investing everything into the single stock with the highest expected return represents the ‘portfolio’ with the maximum expected return, but of course that portfolio solution would also have a very high risk level attached. Adding more stocks reduces the overall portfolio risk. Deciding which ones to add to create the maximum expected return for a given level of overall portfolio risk was solved by Markowitz in 1956 with the creation of what is known as the ‘critical line algorithm’. This produces a set of efficient portfolios that maximises returns for a given level of risk, tracing the now famous efficient frontier.
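A minimal sketch of the underlying optimisation, with invented figures, can be written in a few lines of Python. It solves only the equality-constrained problem (weights sum to one and hit a target return, short positions allowed); Markowitz's full critical line algorithm also handles inequality constraints such as no short selling, which is where the real difficulty lies.

```python
import numpy as np

def min_variance_weights(mu, cov, target_return):
    """Weights of the minimum-variance portfolio achieving a target expected
    return, with weights summing to 1 (short positions allowed). Derived from
    the Lagrangian conditions: w = inv(cov) A' (A inv(cov) A')^-1 b."""
    n = len(mu)
    A = np.vstack([mu, np.ones(n)])        # constraints: w'mu = r and w'1 = 1
    b = np.array([target_return, 1.0])
    cov_inv = np.linalg.inv(cov)
    return cov_inv @ A.T @ np.linalg.solve(A @ cov_inv @ A.T, b)

# Hypothetical inputs: three assets with assumed returns and covariances.
mu = np.array([0.05, 0.08, 0.12])
cov = np.array([[0.040, 0.006, 0.010],
                [0.006, 0.090, 0.020],
                [0.010, 0.020, 0.160]])
w = min_variance_weights(mu, cov, 0.08)
print(w, w @ mu)   # weights and the achieved expected return
```

Sweeping the target return through a range of values and plotting the resulting portfolio risk against return traces out the efficient frontier.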
Since Markowitz’s ground-breaking work, a whole field, known as financial economics, has arisen, and numerous papers developing further insights are published each year. However, it is fair to say that Markowitz’s original approach has stood the test of time and is still the benchmark against which alternatives are compared.
Markowitz was initially looking at the problem of risk and return in a portfolio of equities, although the theory is valid for any number of asset classes as well as for individual components within an asset class. The calculation of efficient portfolios of equities using the critical line algorithm requires complex mathematical manipulation, including inversion of the covariance matrix of the returns of the securities under investigation.
Analysing a complete stock market of 1,000 securities would require inverting a matrix of a thousand rows and a thousand columns, and calculating the correlation of each stock with every one of the other 999 – some 499,500 distinct pairwise correlations in all.
Such calculations were not economically feasible using the computing power available when Markowitz’s work first came out. That may have provided the spur to further theoretical developments in the form of the single index model developed by William Sharpe. This simplifies the problem of trying to assess the correlations of one stock with hundreds of others by making an approximation: namely, that a large amount of the behaviour of an individual stock can be explained by its correlation with the behaviour of the average of all the other stocks – that is, with the market as a whole, which can be represented by a capitalisation-weighted index.
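A rough sketch of the single index model's simplification, using simulated rather than real return data: each stock's beta to the market is estimated by a simple regression, and the full covariance matrix is then approximated from the betas plus the residual variances – a handful of parameters in place of hundreds of thousands of pairwise correlations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated monthly returns (hypothetical): a market index and four stocks
# driven by the market plus independent stock-specific noise.
T, n = 120, 4
market = rng.normal(0.01, 0.04, T)
betas_true = np.array([0.8, 1.0, 1.2, 1.5])
stocks = market[:, None] * betas_true + rng.normal(0.0, 0.03, (T, n))

# Estimate each stock's beta as the OLS slope of its returns on the market.
X = market - market.mean()
centred = stocks - stocks.mean(axis=0)
betas = centred.T @ X / (X @ X)

# Single index approximation of the covariance matrix:
#   Sigma ~= beta beta' * var(market) + diag(residual variances)
residuals = centred - np.outer(X, betas)
sigma_approx = np.outer(betas, betas) * market.var() \
    + np.diag(residuals.var(axis=0))

print(np.round(betas, 2))
```

For n stocks this needs only n betas and n residual variances, instead of the n(n-1)/2 correlations of the full Markowitz treatment.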
This approach led to the Capital Asset Pricing Model (CAPM) and subsequently, to ideas of efficient markets. (It should, of course, be evident that efficient markets are not prerequisites for the applicability of Markowitz’s original work on producing optimal portfolios.)
Using the mean-variance approach at the asset allocation level has proved to be an easier problem in computing terms, since the number of variables is reduced from the many hundreds required at the stock level by at least an order of magnitude – to at most 10 to 20 different asset classes. But, while the mean-variance approach has proved popular at the intellectual level, and for providing conceptual frameworks for asset-allocation decisions, it has proved difficult to implement as a practical methodology.
The crux of the problem is that the methodology assumes the input numbers carry high reliability or confidence. In other words, if the sterling/dollar exchange rate has an expected appreciation of 5% and the euro/dollar exchange rate of 5.1%, and the assumed correlation between them is high, then that 0.1-point difference is treated as meaningful and is exploited to the maximum in the production of efficient portfolios.
The award of the 1990 Nobel Memorial Prize in Economic Sciences to Harry Markowitz was a recognition of the influence his work had had over the previous decades on the academic development of MPT and its impact on the world of investment. Indeed, my own first exposure to MPT was as an analyst working for the Shell UK pension fund in the mid-1980s.
At that time, software programs were just becoming available that allowed practitioners to construct efficient portfolios on computers. But, while the idea of efficient frontiers for asset allocation sounded good, the practice was riddled with complications: how valid is past data on return correlations for predicting future risk? Where should estimates of future returns come from? And, most frustrating of all, the results were highly sensitive to small changes in return forecasts. Without the ability to incorporate confidence or error margins in return forecasts, portfolio allocations swung wildly on small changes across assets with similar figures.
Later that decade, working for an investment bank, I ended up programming the critical line algorithm, and we used it primarily to explore asset allocation ideas, attempting a forward-looking analysis by combining broker return forecasts with volatility data from options markets. One variant application produced an efficient frontier for debt issuers able to issue in different currencies, where the objective was to produce portfolios with minimum cost for a given level of risk rather than maximum return. That efficient frontier was the bottom half of a parabola, in contrast to the top half usually seen.
It was a pleasant surprise in the mid-1990s, when I was a director of the investment division of Commercial Union (now part of Aviva), to be asked by Daiwa Securities whether we would be interested in the work of a Nobel Memorial Prize winner they had working with a team in New Jersey on active quantitative equity strategies. That person was Harry Markowitz himself, who was heading a group that had launched a US and Japanese equity strategy. The strategy used cross-sectional regressions of equity markets to identify anomalies, and applied Markowitz’s mean-variance optimisation to construct portfolios.
My reply was yes, if we had exclusive rights outside the US. A few months later, discussions led to us launching three funds in the UK based on the work of Harry’s team. We hired a very active marketing manager, the late William Battersby, met investment consultants and pension funds, and organised seminars together with Harry.
These meetings would usually start with me telling the audience that the most remarkable aspect of Harry’s Nobel Memorial Prize was that it was given for work essentially done for his PhD thesis. That was astounding since the average number of people who read the average PhD thesis is said to be 2.8.
Harry would then come on stage and recount a few stories of his own on his selection of a topic for his PhD thesis and the thought processes that led to the creation of the efficient frontier. The quant funds that Harry’s team produced had a good performance but in the mid to late 1990s, I was told by one, now eminent, investment consultant: “Joseph, no UK pension fund would ever invest in a quant fund”.
Times have changed, of course, and today successful quant fund managers see their strategies as a continuation of a theoretical approach first enunciated by Harry almost 70 years ago.
Some 25 years later, in June 2019, I spent a stimulating few days with Harry in his San Diego office together with Yves Choueifaty, CEO of TOBAM. I experienced once again Harry’s charm and humility despite his great achievements. Perhaps the most amazing thing about Harry is that at the age of 93, he is still working hard in his office near the beach. He is currently writing another book and, as he related, intends to continue working until he is 105.
Altering such small differences can, and often does, produce huge swings in the relative proportions of different asset classes along the efficient frontier. The reality is that, at the asset allocation level, differences of that order are meaningless and, indeed, the return estimates themselves are subject to continuous revision. Portfolios need to be robust against small differences in return estimates as well as small changes in relative returns. The popular approach of risk parity applied to asset allocation is a simple way of avoiding this issue: it ignores return estimates altogether and allocates to assets in inverse proportion to their risk.
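The sensitivity problem, and the risk parity escape from it, can be illustrated with two hypothetical assets (all figures invented): swapping two nearly identical return forecasts flips the unconstrained mean-variance allocation, while inverse-volatility weights are untouched.

```python
import numpy as np

def tangency_weights(mu, cov):
    """Unconstrained mean-variance optimal weights, normalised to sum to 1:
    w is proportional to inverse(cov) @ mu (short positions allowed)."""
    w = np.linalg.solve(cov, mu)
    return w / w.sum()

# Two highly correlated assets: 10% volatility each, correlation 0.95,
# expected returns of 5.0% and 5.1% (all hypothetical).
vol = np.array([0.10, 0.10])
cov = np.outer(vol, vol) * np.array([[1.0, 0.95], [0.95, 1.0]])

w_a = tangency_weights(np.array([0.050, 0.051]), cov)
w_b = tangency_weights(np.array([0.051, 0.050]), cov)
print(w_a)  # roughly [0.31, 0.69]
print(w_b)  # the 0.1-point swap flips it to roughly [0.69, 0.31]

# Inverse-volatility ('risk parity') weights ignore return forecasts
# entirely and are therefore unaffected by such revisions.
rp = (1.0 / vol) / (1.0 / vol).sum()
print(rp)   # [0.5, 0.5]
```

A 0.1-point revision that is well inside any realistic forecasting error moves nearly 40 points of allocation between the two assets, which is precisely the fragility described above.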
Perhaps where MPT has had most influence is in setting an idealised mathematical framework against which the real world can be compared. It led to Sharpe’s creation of the idea of portfolio beta and the CAPM, and to Eugene Fama’s 1970 formulation of three progressively stronger hypotheses concerning efficient markets – the weak, semi-strong and strong forms.
The rise of passive investing, driven by matching market-cap-weighted indices, has brought the cost of gaining exposure to investment beta down to almost zero for institutional investors. It is worth noting, though, that market-cap-weighted portfolios may not lie on the efficient frontier at all, nor represent the lowest-risk portfolios. Maximising diversification – as firms such as TOBAM practise – can produce portfolios with lower overall risk than the market.
Separating alpha, the excess performance over market returns, from market beta is the framework behind the structure of today’s fund management industry. Passive funds now not only match market-capitalisation-weighted indices but also exploit what used to be active management and even hedge fund approaches, built on semi-persistent excess returns associated with factors such as company size, value, momentum and quality, which can be structured as ‘smart beta’ ETFs with low management fees.
Today there is a tsunami of interest in ESG and sustainability. How does that relate to MPT? Analysing sustainability in a quantitative manner requires the calculation of the total impact each company makes. This encompasses environmental capital, human capital and social capital alongside the more traditional financial capital so beloved of shareholders.
It is then possible to calculate the total impact of a company in euros per year per million euros invested in it. That then enables an efficient frontier to be produced of these impact intensity figures versus portfolio risks, analogous to Markowitz’s efficient frontier of portfolio returns versus portfolio risks. Conceptually, MPT can then be extended to produce a ‘Sustainable Portfolio Theory’ with a three-dimensional efficient surface representing portfolio impacts as well as portfolio returns plotted against portfolio risks. The mathematics of this have not yet been defined, and perhaps there is another Nobel Memorial Prize awaiting the person who succeeds in creating it.