As investors show renewed interest in value strategies, quant managers look for better uses of alternative data

Key points

  • The value factor has started to outperform
  • The use of alternative data to improve quant strategies is growing
  • From text-based data to satellite imaging, managers are finding additional sources of alpha and sustainability information
  • The cost of using alternative data can be extremely high

It took a global pandemic, a war in Europe and the ensuing economic damage, but in the end, the long-awaited comeback of the value factor finally happened. The uncertain prospects for the global economy, caused by two years of extraordinary difficulties, have led investors to shed growth stocks and bet on value, ending over a decade of underperformance of the value factor. 

Equities have performed badly across the board this year, but value stocks have contained losses. Gross returns for the MSCI World Value index, for instance, were -5.38% from January to April, compared with -12.89% for the MSCI World index. 

Investment Metrics, the analytics provider formerly known as Style Analytics, reports that in April factors such as book to price, earnings yield, cash-flow yield, sales to price and EBITDA to EV recorded positive performances in the US and Europe, as did stocks with a high dividend yield and dividend growth over five years. Growth-oriented factors took a beating, while ‘quality’ factors such as return on equity, net profit margin and earnings growth stability – as defined by Investment Metrics – also performed well. 

One can but hope, says Gideon Smith, co-chief investment officer of AXA Investment Managers’ Equity-QI, that this is good news for quant managers, and that the present environment is one in which they can thrive. 

While it is true that institutional investors often choose multi-factor strategies designed to outperform in all market conditions, it would be fair to say that the underperformance of the value factor has been a powerful headwind for quants. 

Smith says: “Investors choose quant strategies because they want alpha and outperformance, as well as attractively-priced products, and sometimes those things can go hand in hand in quant strategies.”

He adds: “Until a few years ago, investors were looking at multi-factor systematic strategies as a cheap source of alpha. It turns out that it’s not as easy as that. One of the big challenges to many quantitative managers has been the headwinds to value investing, as value was often a core factor. Any strategy with value at its heart would have struggled for a number of years.”

After all, while there is disagreement over its precise definition, the value factor is perhaps best understood from an economic perspective, and quant investors had long relied on it to generate outperformance, even under difficult market conditions. 

However, while a reversal of that trend would be positive for quant managers, competition within this market will remain strong, or even grow, as managers fine-tune strategies by integrating ever greater amounts of alternative data.

Smith says: “I think the advantage of quantitative management comes from being able to move beyond simple or naive factors. Over the long run, investing in attractively valued, high-quality companies that have demonstrated earnings growth or earnings momentum, is a winning strategy, but you’ll be exposed to the whims of the market. 

“I think we can do better. Whether that’s by building more proprietary signals, or processing the data better up front, these are some of the ways we think we can add value in the process.”

The whole premise of companies like AXA IM Equity-QI is to use data and computing power to turn information into alpha. Until a few years ago, the main type of data used related to prices, financial ratios, accounting, broker recommendations and the like. 

Data big bang

Smith says that in recent years there has been an explosion of alternative data of two broad types. One relates to ESG criteria in their various forms; the other is text-based data, including regulatory filings, earnings reports, transcripts of company earnings calls with analysts, and even news sources and social media chatter.

At the same time, computing power has grown exponentially as has the availability of machine learning techniques. In 2017, AXA IM Equity-QI released its first neural network model – a type of computer programme that simulates the human brain – in its Sustainable Equity strategy, with the aim of improving stock selection. The company has also begun using natural language processing models to process the huge amounts of text-based data available. 

When it comes to text-based data, managers can buy either raw or pre-processed data from vendors; each requires a different depth of manipulation and analysis to build models and extract signals, most of which are sentiment signals. 

Dictionaries of positive and negative terms used in corporate communications, financial media or even social media, can be customised and fed into models to refine them and look for stronger signals predicting stock price changes. 
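Such dictionary-based scoring can be sketched in a few lines. The word lists below are illustrative stand-ins, not any vendor's actual lexicon, and real models would handle negation, weighting and far larger vocabularies:

```python
# Minimal sketch of dictionary-based sentiment scoring.
# POSITIVE/NEGATIVE are illustrative word lists, not a vendor dictionary.
POSITIVE = {"growth", "strong", "record", "beat", "improve"}
NEGATIVE = {"decline", "weak", "impairment", "miss", "restructuring"}

def sentiment_score(text: str) -> float:
    """Return (pos - neg) / matched terms, in [-1, 1]; 0.0 if no matches."""
    words = [w.strip(".,") for w in text.lower().split()]
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return (pos - neg) / total if total else 0.0

print(sentiment_score("Record growth, strong margins despite one impairment."))  # → 0.5
```

A production model would then test whether such scores, aggregated over a firm's communications, have any predictive power for subsequent returns.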

At AXA IM, says Smith, the quant analysis team has also worked on the hypothesis that companies that are more specific with figures, for instance when referring to targets, in their external communications demonstrate more confidence. Natural language processing models can be built accordingly. 

Smith says: “The strongest signal that we were able to build using 10-K filings on US stock markets, for instance, was related to language change. We have measured the extent to which a company’s language in those official documents changes from one period to the next, and built time series data, showing that it is possible to associate language changes with a negative outlook for companies.”
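One simple way to operationalise such a language-change measure, purely as an illustration (AXA IM's actual model is proprietary), is one minus the cosine similarity of bag-of-words term-frequency vectors for consecutive filings:

```python
# Hedged sketch: quantifying period-over-period language change in filings
# as 1 - cosine similarity of word-count vectors. Illustrative only; the
# signal described in the article is proprietary and more sophisticated.
from collections import Counter
from math import sqrt

def language_change(prev_text: str, curr_text: str) -> float:
    """Return 1 - cosine similarity; 0.0 = identical language, 1.0 = disjoint."""
    a = Counter(prev_text.lower().split())
    b = Counter(curr_text.lower().split())
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return 1 - dot / norm if norm else 1.0

# Compare this year's risk-factor section with last year's:
print(language_change("risk factors broadly unchanged",
                      "new litigation and supply risk factors"))
```

A time series of these scores per company could then be tested against subsequent outcomes, as the article describes.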

He adds: “Lots of the sentiment models are not telling us anything that we wouldn’t have picked up from analysts, price momentum or other signals. So this is not the silver bullet we are looking for with the models we are building. It’s generally very hard to find something that is genuinely new and additive. This language change model, instead, turned out to be additive and uncorrelated with other signals. It gave us a bit of additional alpha.”


“Investors choose quant strategies because they want alpha and outperformance, as well as attractively-priced products”

Gideon Smith

Around half of the new datasets that AXA IM Equity-QI works with are related to ESG, says Smith. “This often consists of genuinely new types of data we can work with to integrate in our signals. Both the quantity and quality of this data has increased massively,” he says. One of the uses of such data is to estimate a cost of carbon, which is then integrated into the company’s valuation models.

From carbon emissions to satellite imaging

At BNP Paribas Asset Management (BNPP AM), too, alternative data and machine learning techniques are used in the field of sustainability, which is a core component of the company’s investment capabilities, according to Raul Leote de Carvalho, deputy head of the quantitative research group.

BNPP AM’s proprietary estimation model and dataset of corporate carbon emissions was constructed using machine learning. Reported data covers only about 20% of the company’s investment universe, so it pays to build a proprietary model to estimate Scope 1, 2 and 3 emissions, says Leote de Carvalho.
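BNPP AM's model itself is proprietary, but a common baseline for filling such a coverage gap is to apply the median carbon intensity (emissions per unit of revenue) of reporting peers in the same sector to non-reporting firms. A minimal sketch, with made-up numbers:

```python
# Hedged sketch of a peer-intensity baseline for estimating emissions of
# non-reporting firms. Sector names and figures are illustrative, and this
# is a simple stand-in for the machine-learning model the article mentions.
from statistics import median

def estimate_emissions(reported, target_sector, target_revenue):
    """reported: list of (sector, revenue, emissions) tuples for disclosing firms.
    Returns estimated emissions = median sector intensity * target revenue."""
    intensities = [e / r for s, r, e in reported if s == target_sector and r > 0]
    if not intensities:
        raise ValueError(f"no reporting peers in sector {target_sector!r}")
    return median(intensities) * target_revenue

reported = [("steel", 100.0, 250.0), ("steel", 200.0, 440.0), ("steel", 50.0, 140.0)]
print(estimate_emissions(reported, "steel", 80.0))  # → 200.0
```

A machine-learning approach generalises this idea, using more features than sector and revenue to predict emissions for the roughly 80% of firms that do not report.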

But carbon emissions are just one component of BNPP AM’s ESG scoring methodology, which combines proprietary and externally-provided data, allowing the company to score nearly all of the 15,000 companies in its investment universe.

BNPP AM’s sustainability scoring methodology has recently undergone a deep revision and update. Backtesting from 2008 shows the company can reduce carbon emissions from its equity portfolios by 50% and improve the portfolio’s score by 20% compared with the benchmark with minimal impact on factor tilts, which are towards value, low volatility and high-quality stocks.  

Perhaps the most exciting work concerning alternative sources of data is with satellite imagery.

The company is working on integrating data from the European Union’s Copernicus Earth Observation Programmes, particularly a dataset known as ERA5, which provides hourly estimates of a large number of atmospheric, land and climate variables. The dataset is put together by the European Centre for Medium-Range Weather Forecasts (ECMWF).

Leote de Carvalho says: “It is a dataset that can provide information at the intersection of ESG and financial risk. We use factors such as snow cover, reflectivity, humidity, rainfall and temperature. By linking this data with supply-chain data that we are using to estimate Scope 3 emissions, and another dataset on natural disasters, we can potentially build a better picture of risk for our stocks.”

The use of alternative data such as satellite imaging, transaction data or Google Trends to estimate sales, for instance, is less helpful for stock picking than for improving asset allocation models, says Leote de Carvalho. In general, he sees at least three issues with alternative data. 

He says: “The coverage is not always good. Alternative data requires investment in order to understand whether it can help or not. If we are going to use it, it has to have some explanatory power, meaning some efficacy in forecasting returns. But even if it does, it has to add value on top of what we do already. Otherwise, it is just an additional cost, which can be extremely high.”

One type of alternative data that passed all three tests is news. “We even considered gathering the data and building the natural language processing tools ourselves, but in the end that did not prove financially viable, so we decided to source the data and the NLP [natural language processing]. 

“However, the model that we have built does help to improve stock selection based on news sentiment. It is somewhat correlated with momentum and expensive to run, but in the end we decided it is worth it, as it adds sufficient diversification to the momentum style, particularly in certain geographies,” says Leote de Carvalho.

Traditional data, however, still goes a long way towards helping quant managers build robust strategies based on well-known factors, including value.  

Leote de Carvalho argues that the long-lasting underperformance of the value factor depends largely on how the factor is traditionally defined. When a broader range of indicators is used to measure value, and sector effects are taken into account, the behaviour of value stocks appears markedly different. 

He says: “We use a set of diversified indicators based on cash flows to define value and, differently from what fundamental investors generally do, we neutralise sectors, because we want to invest in companies that are very cheap relative to the fundamentals. But in order to do that, we need to take into account the different levels of expected growth. 

“Neutralising sectors in that way massively changes the performance of the style, and indeed our value style only underperformed in 2019 and 2020. The path was actually similar to what happened before and after the tech bubble, in that a relatively small number of tech stocks massively outperformed and then peaked suddenly.”
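Sector neutralisation of this kind is typically done by standardising the value indicator within each sector, so that stocks are ranked against sector peers rather than the whole market. A minimal sketch, with an illustrative indicator and made-up numbers rather than BNPP AM's actual methodology:

```python
# Hedged sketch of sector-neutral value scoring: z-score a value indicator
# (e.g. cash flow to price) within each sector. Tickers, sectors and values
# below are illustrative only.
from statistics import mean, pstdev

def sector_neutral_scores(stocks):
    """stocks: list of (ticker, sector, value_indicator) -> {ticker: z-score}."""
    by_sector = {}
    for ticker, sector, v in stocks:
        by_sector.setdefault(sector, []).append((ticker, v))
    scores = {}
    for sector, members in by_sector.items():
        vals = [v for _, v in members]
        mu, sd = mean(vals), pstdev(vals)
        for ticker, v in members:
            scores[ticker] = (v - mu) / sd if sd else 0.0
    return scores

stocks = [("A", "tech", 0.02), ("B", "tech", 0.06),
          ("C", "bank", 0.10), ("D", "bank", 0.14)]
print(sector_neutral_scores(stocks))
```

Here stock B scores as highly as stock D even though banks are cheaper in absolute terms, which is the point: the strategy buys companies that are cheap relative to sector peers, rather than loading up on structurally cheap sectors.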

Quant managers: back with new weapons