Over 30 years ago, the US economist Benjamin Graham distinguished two basic approaches to investing – qualitative and quantitative.
In his book ‘The Intelligent Investor’, he wrote: “The first or predictive approach could also be called the qualitative approach, since it emphasises prospects, management and other non-measurable, although highly important, factors that go under the heading of quality.
“The second or protective approach may be called the quantitative or statistical approach, since it emphasises the measurable relationships between selling price and earnings, assets, dividends and so forth.”
Interest in the quantitative or ‘quant’ approach to investing developed rapidly in the 1970s. This was backed by a growing body of academic research suggesting that, by feeding factors such as price-to-earnings or price-to-book multiples into models, investors could construct portfolios that would systematically outperform the market.
Traditional, fundamental investment managers made use of this research by building screens into their investment process that would generate a list of the best stocks according to a series of criteria. Analysts would then pick from the top of this list.
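At its simplest, such a screen is no more than a ranking exercise. The sketch below, written in Python with an invented five-stock universe, ranks the names on two value multiples, combines the ranks and hands the cheapest back for analyst review – an illustration only, not any particular manager’s screen.

```python
# Illustrative value screen: rank a universe on price/earnings and price/book,
# combine the ranks and surface the cheapest names for analyst review.
# The universe below is invented for illustration.

universe = [
    {"ticker": "AAA", "pe": 9.0,  "pb": 0.8},
    {"ticker": "BBB", "pe": 14.0, "pb": 1.2},
    {"ticker": "CCC", "pe": 22.0, "pb": 3.5},
    {"ticker": "DDD", "pe": 11.0, "pb": 0.9},
    {"ticker": "EEE", "pe": 30.0, "pb": 4.1},
]

def rank(stocks, field):
    """Return a dict mapping ticker -> rank (1 = cheapest on this measure)."""
    ordered = sorted(stocks, key=lambda s: s[field])
    return {s["ticker"]: i + 1 for i, s in enumerate(ordered)}

pe_rank = rank(universe, "pe")
pb_rank = rank(universe, "pb")

# Composite score: simple average of the two ranks, lowest is best.
scored = sorted(universe,
                key=lambda s: (pe_rank[s["ticker"]] + pb_rank[s["ticker"]]) / 2)

shortlist = [s["ticker"] for s in scored[:3]]   # analysts pick from the top of this list
print(shortlist)
```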
The initial use of quant techniques, therefore, was to enable analysts to narrow their search for winning companies. Since then, their use has expanded to include new factors, such as momentum, which takes account of market sentiment, or profitability.
Today, quant techniques are used in two distinct ways, depending on whether the manager is affiliated to a quant or a fundamental style of investment management.
Quant managers will use them to establish relationships between various factors, such as price-to-book, and returns. They will then attempt, in a systematic and disciplined way, to capture those factor returns.
Fundamental managers, on the other hand, will use quant techniques simply to control risk or to screen their investment universe. The ultimate responsibility for picking stocks remains with the analyst or portfolio manager.
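The quant side of that division of labour can be illustrated with a short sketch: estimate how strongly a factor exposure, such as book-to-price, is related to subsequent returns using a cross-sectional regression, then tilt systematically towards the rewarded exposure. The data below are randomly generated and the ‘true’ factor reward is an assumption made purely for illustration.

```python
import numpy as np

# Sketch of a one-factor cross-sectional regression: estimate how much of next
# period's return is explained by book-to-price, then tilt towards the exposure.
# All data here are randomly generated; the "true" factor reward is invented.
rng = np.random.default_rng(0)

n_stocks = 500
book_to_price = rng.normal(1.0, 0.4, n_stocks)        # factor exposures
true_factor_return = 0.02                             # assumed reward per unit of exposure
next_returns = true_factor_return * book_to_price + rng.normal(0.0, 0.10, n_stocks)

# Regress realised returns on exposures (with an intercept) to recover the factor return.
X = np.column_stack([np.ones(n_stocks), book_to_price])
coef, *_ = np.linalg.lstsq(X, next_returns, rcond=None)

# A disciplined way to "capture" the factor: a dollar-neutral tilt towards high exposure.
weights = book_to_price - book_to_price.mean()
weights /= np.abs(weights).sum()

print(f"estimated factor return per unit of exposure: {coef[1]:.3f}")
```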
The idea that quantitative and qualitative techniques are mutually exclusive is now discredited. Bob Litterman, head of quantitative equity research at Goldman Sachs Asset Management, insists that the search for alpha should involve both. “We believe a strong quantitative process can and should be based on fundamental investment principles.”
Yet quantitative and fundamental investment approaches are polarised in respect of their strengths and weaknesses. Laurie Weston, partner, client services, at Advanced Investment Partners (AIP), a Florida-based quant boutique, says that what distinguishes the quant’s valuation analysis is its breadth, in contrast to the depth of the fundamental manager’s analysis:
“Quantitative investors will rely upon computer-based analyses of large datasets of financial, macro, price and volume data. The incredible expansion in the affordability of computing resources, plus the availability of vast databases, provides the quant with the foundation for evaluating thousands of stocks within hours.”
The limitations of computers – and humans – are in many respects the limitations of quant and fundamental investment techniques.
Axa Rosenberg Asset Management, for example, uses a so-called ‘expert system’ to replicate the thinking of an expert – in this case, the way an equity analyst will dissect a company’s balance sheet and income statement and try to arrive at an estimate of next year’s earnings and a true estimate of the company’s worth.
Yet the system has its limitations, says Agustin Sevilla, chief investment officer of Axa Rosenberg in London. “You can only program what a computer can understand. Computers can do some things very well, and other things they have a very tough time dealing with.
“For example, when we started covering European companies back in the early 1990s we had great difficulty in dealing with price data. And the reason is that price data from Europe back in those days could come in one of a number of currencies. So we could get a price of an Italian stock in lira or sterling depending on where it traded.
“These are things that a person will convert automatically. An analyst who is used to this information will make the conversion without even blinking, but it’s very hard to do that on a computer.
“So subtlety is very difficult for us to deal with and we spend probably most of our research resources dealing with issues of subtlety.”
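The currency example is mundane once it is made explicit. A minimal sketch of the normalisation an analyst performs ‘without even blinking’, assuming a hypothetical exchange-rate table and invented tickers and quotes, might look like this:

```python
# Minimal sketch of the normalisation an analyst makes "without even blinking":
# the same stock quoted in different currencies must be converted to a common
# base before the prices can be compared. The tickers, prices and exchange
# rates below are all hypothetical.

fx_to_usd = {"ITL": 0.00055, "GBP": 1.50, "USD": 1.0}   # hypothetical rates

quotes = [
    {"ticker": "XYZ IM", "price": 14500.0, "currency": "ITL"},  # Milan line
    {"ticker": "XYZ LN", "price": 5.10,    "currency": "GBP"},  # London line
]

def to_usd(quote):
    """Convert a quote to the common base currency."""
    return quote["price"] * fx_to_usd[quote["currency"]]

for q in quotes:
    print(f'{q["ticker"]}: {to_usd(q):.2f} USD')
```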
Sevilla says that the use of quant techniques within an investment process must take account of their limitations. “Embedded in the way in which we pick stocks is the fact that there are model limitations. A computer cannot talk with management, so it’s hard for a computer to make sense of a management forecast about future profitability.”
As a result, the manager using a quant-based model is likely to make more cautious forecasts of outperformance than the fundamental manager.
“A very big difference in the way quantitative managers pick stocks and the way fundamental managers pick stocks is that if you were to ask a fundamental manager what they think the probability is that the stocks in their portfolio will outperform the market over the next six months, they will always say something north of 90%. That’s a very high probability rate. Whereas we would expect a typical holding in our portfolio to outperform with a probability of about 60%.”
Some of the limitations of computers when analysing information could, theoretically, be removed with the development of artificial intelligence (AI), an attempt to automate human tasks that require intelligent, non-linear behaviour. Asset managers are already using AI in a limited way to aid their investment processes. AIP, for example, uses non-linear computational techniques such as neural networks and genetic algorithms to confirm stock pricing forecasts.
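AIP does not publish its models, but the confirmation idea can be sketched in outline: fit a small neural network on the same factor data as a linear forecast and only act where the two agree on direction. Everything below – the factors, the returns and the network settings – is invented for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Illustrative "confirmation" step: a small neural network acts as a
# non-linear second opinion on a linear return forecast. All data are
# randomly generated and the model settings are arbitrary.
rng = np.random.default_rng(1)

n, k = 1000, 5
factors = rng.normal(size=(n, k))                 # factor exposures
returns = 0.03 * factors[:, 0] - 0.02 * factors[:, 1] ** 2 + rng.normal(0, 0.05, n)

# Linear forecast (ordinary least squares with an intercept).
X = np.column_stack([np.ones(n), factors])
beta, *_ = np.linalg.lstsq(X, returns, rcond=None)
linear_forecast = X @ beta

# Non-linear "second opinion" from a small neural network.
net = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(factors, returns)
nn_forecast = net.predict(factors)

# Only act where the two forecasts agree on direction.
confirmed = np.sign(linear_forecast) == np.sign(nn_forecast)
print(f"forecasts agree on direction for {confirmed.mean():.0%} of stocks")
```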
Another AI technique that is attracting attention is genetic programming – the use of computers to solve problems without telling them what to do. State Street Global Advisors’ Advanced Research Centre is using this technique to design a model for the biotech sector.
Mark Hooker, director of the ARC, says some sectors, such as biotechnology, do not fit the normal models. “If you use book to price ratio, price to earnings ratio, analysts’ estimates, momentum, all the standard tools of the trade for equity quant, you end up with a less useful model for biotechnology than you do in many other sectors.
“So we need different ways of coming at this. One way of doing this is to pull out a bunch of things that we can measure from a number of sources – a biotech company’s balance sheet, analysts’ reports on biotech companies, anything we think might be useful – and throw them into an AI process.
“We then let that run for a while on a subset of the data, keeping some out for verification.
“It then develops a collection of hypothetical models through its trial and error mechanism, producing a shortlist of the most important factors. Some will be very strange and they’ll get thrown away. But some of them might be a non-linear combination of two factors that we had never thought of combining.
“What is useful about that technique is that it gets us out of our normal thought patterns and suggests new things.”
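A full genetic-programming system evolves whole expression trees through mutation and crossover; the stripped-down sketch below keeps only the flavour Hooker describes – randomly combining candidate measures with non-linear operators, scoring each combination on a training subset and verifying the survivors on held-out data. The measure names and the returns are invented for illustration.

```python
import numpy as np

# Stripped-down sketch of a trial-and-error search for factor combinations.
# A real genetic-programming system would also mutate and cross-breed whole
# expression trees; here we only sample random pairwise combinations.
rng = np.random.default_rng(2)

n = 800
measures = {                                   # hypothetical biotech measures
    "cash_burn":      rng.normal(size=n),
    "trial_count":    rng.normal(size=n),
    "analyst_disp":   rng.normal(size=n),
    "insider_buying": rng.normal(size=n),
}
# Simulated forward returns driven by a non-linear combination of two measures.
fwd_ret = np.tanh(measures["cash_burn"] * measures["trial_count"]) + rng.normal(0, 1.0, n)

operators = {
    "mul":      lambda a, b: a * b,
    "diff":     lambda a, b: a - b,
    "tanh_mul": lambda a, b: np.tanh(a * b),
}

train = slice(0, 600)          # fit on a subset of the data ...
test = slice(600, n)           # ... keep some out for verification

candidates = []
names = list(measures)
for _ in range(200):           # the "trial and error" loop
    a, b = rng.choice(names, size=2, replace=False)
    op_name = rng.choice(list(operators))
    signal = operators[op_name](measures[a], measures[b])
    fitness = abs(np.corrcoef(signal[train], fwd_ret[train])[0, 1])
    candidates.append((fitness, f"{op_name}({a}, {b})", signal))

# Shortlist the fittest expressions and check them out of sample.
for fitness, expr, signal in sorted(candidates, key=lambda c: c[0], reverse=True)[:3]:
    oos = np.corrcoef(signal[test], fwd_ret[test])[0, 1]
    print(f"{expr}: in-sample |corr| = {fitness:.2f}, out-of-sample corr = {oos:.2f}")
```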
Another form of AI is being used in research that SSgA is sponsoring at MIT Sloan School of Management’s Laboratory for Financial Engineering. The aim is to develop natural language programs that can scan databases and documents, such as analysts’ reports and annual reports, to identify and analyse key financial information about companies, says Hooker.
“Lots of people make use of analysts’ reports for the numerical content – what’s the price target, what’s the recommendation change, what’s the earnings estimate. But the analysts also produce paragraphs of commentary. So it’s trying to go beyond the top level of information and find useful patterns there.
“One of the motivations for this is that there tends to be quite an asymmetry in the provision of news by firms. Companies with positive news tend to put it out fairly quickly, so you’ll see a steady news flow from them.
“But there’s often an incentive to conceal negative information and that means that there’s going to be an asymmetry. Most of the news is going to be positive and of fairly small significance. More infrequently you’re going to have bad news that has larger content to it.
“An analyst producing a quarterly report on a company will spend a lot of time trying to discover whether there’s negative information about that company, but may not receive it. We think there might be patterns in the text that fill that gap.”
The machine reader will look for keywords and key phrases that analysts use in their reports when they have suspicions but no hard evidence about a company’s performance.
“We think that might be a productive line, and we are hoping to use a version of it in models,” Hooker says.
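The MIT research itself is not described in detail, but the basic mechanics of such a ‘machine reader’ can be illustrated in a few lines: count the hedging words and phrases in a report’s commentary and flag reports that score highly. The phrase list and the sample sentence below are invented.

```python
import re

# Illustrative "machine reader": scan analyst commentary for hedging phrases
# that may signal suspicions without hard evidence. The phrase list and the
# sample text are invented for illustration.
SUSPICION_PHRASES = [
    "lack of visibility", "unable to confirm", "management declined",
    "accounting treatment", "aggressive assumptions", "we remain cautious",
]

def suspicion_score(text: str) -> int:
    """Count occurrences of hedging phrases in a report, case-insensitively."""
    text = text.lower()
    return sum(len(re.findall(re.escape(p), text)) for p in SUSPICION_PHRASES)

report = (
    "Revenue was broadly in line, although management declined to give "
    "guidance and we remain cautious on the accounting treatment of "
    "capitalised development costs."
)
print(suspicion_score(report))   # three phrases flagged in this sample
```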
Yet there is some scepticism about some of the more exotic tools that quant managers use. Brian Bruce, director and head of equity investments at PanAgora Asset Management in Boston, says that the latest quant techniques can encourage some quant managers to take their eye off the ball.
“The problem with a lot of quantitative managers is that they have these huge data sets and new tools and they just can’t wait to use them no matter what the end result.
“Using quant techniques is like building a house. You can have the latest and greatest hammer but if you don’t know what sort of house you’re going to build who knows what you’re going to end up with,” says Bruce.
“With artificial intelligence you feed in datasets, you build a model and then it learns and makes decisions. But I’m not quite sure what it’s learned and I’m not quite sure why it’s making the decision it’s making. As an investor I would be nervous trusting my money to something I can’t look inside and see exactly what it’s doing. Using artificial intelligence is a much more difficult problem than people had hoped it would be.”
Bruce suggests that the way forward for quantitative managers is not through developing new tools but by making better use of the tools they have.
The new generation of quant investing, he says, is about looking for information that other people have missed or cannot find. One way of doing this is through the use of ‘conditioning’ – a process whereby one factor conditions another in a model.
One example is the extent to which a company’s market capitalisation will condition the importance of its debt equity ratio, he says.
“Everybody throws the debt/equity ratio in some form into their models because, across the range of small- and large-cap securities, it gives a little bit of power. It adds a little to the forecast. If you have higher debt relative to equity it’s kind of bad, although it’s not really bad.”
Yet the picture changes dramatically once a model is designed to identify extremes or clusters among company sizes, Bruce says. “We found that if you look at the very smallest companies, the debt/equity ratio is really bad.”
On the face of it, this is illogical because banks do not normally lend large amounts to small companies. “The reason is, they didn’t. These small companies were larger companies at some point, and the reason their debt/equity ratio is so big is that things have gone wrong. That’s a very powerful forecaster of future negative returns.”
For the mass of stocks, all but the largest 100 in the universe, the debt/equity ratio is unimportant, he suggests. “For everything in the middle, if you regressed debt/equity ratio against future price returns, it had absolutely no power whatever. It shouldn’t be in the model, because there’s nothing there.”
At the level of the largest companies, the effect is the reverse of that found in the smaller companies. This is because the very largest companies almost never go out of business, he says. “They have diversified business lines, they have great access to the capital markets. In some cases the government won’t let them go out of business.”
Therefore, it is sensible for such companies to take on as much debt as they can to leverage their equity, he says. “So the effect is that when you’re big, the more debt you have the better.”
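The three regimes Bruce describes – leverage as a distress signal among the smallest names, noise in the middle, an advantage among the giants – amount to running the same regression within market-cap buckets rather than across the whole universe. The sketch below simulates data with exactly those properties, so the coefficients it prints are illustrative rather than empirical.

```python
import numpy as np

# Sketch of "conditioning": regress debt/equity against future returns within
# market-cap buckets rather than across the whole universe. The data are
# simulated so that the effect matches the description above - negative among
# the smallest names, absent in the middle, positive among the very largest.
rng = np.random.default_rng(3)

def simulate(n, slope):
    de = rng.uniform(0.0, 3.0, n)                   # debt/equity ratio
    ret = slope * de + rng.normal(0.0, 0.05, n)     # future return
    return de, ret

buckets = {
    "smallest": simulate(300, -0.04),   # high leverage as a distress signal
    "middle":   simulate(300,  0.00),   # no information content
    "largest":  simulate(100, +0.02),   # leverage works for the giants
}

for name, (de, ret) in buckets.items():
    X = np.column_stack([np.ones(len(de)), de])
    coef, *_ = np.linalg.lstsq(X, ret, rcond=None)
    print(f"{name:>8}: debt/equity coefficient on future returns = {coef[1]:+.3f}")
```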
“Not too many people look at debt/equity ratio and break it up into those terms. We’ve taught ourselves to think that way.
“Everybody is looking at larger and larger datasets, but it is in the tail, in the very extremes of the data, that you can find the most interesting insights,” says Bruce.