Of Slopes and Flops in Investments


The investment world and the financial media are abuzz with the USD yield curve and its progressive flattening/inversion, and for good reason. Here is why.

Portfolio modeling and information selection

Any investment decision should be grounded in solid market or economic information, not in the investor’s latest emotion. This is true no matter which investment segment the portfolio manager works in:

• In global macro / asset allocation, CIOs use econometric information (GDP, growth, balances, unemployment, PPP…), as well as market information (FX rates, interest rate curves, index P/Es...) to decide their asset-class allocation.

• In discretionary equities, portfolio managers ground their analysis in corporate fundamental information (cash-flow models, ratios, balance sheet metrics and their growth), more qualitative information (business strategy, management quality, relative positioning, supplier and client data, new products) and many types of market & economic information.

• Statistical arbitrageurs use technical information (momentum, acceleration, volatility, oscillators…), fundamental information (ratios, cash-flows, balance sheet statements...) and pretty much any data source they can get their hands on.

What differs between these various investors is the type of information they use, how they convert that information into trading positions, the frequencies & horizons they consider, how they monitor their portfolios, and how they feed the resulting performance analysis back into the decision process.

In reality, this feedback process is mostly driven by risk tolerance and confidence, while the trading frequency follows the information frequency: econometric and fundamental equity information is usually quarterly, so the trading decisions are monthly/quarterly/yearly as a result; statistical arbitrageurs use daily if not intra-day information, and their positions are often adjusted daily if not hourly; momentum traders (most CTAs) tend to use week-to-month information and readjust their holdings on a weekly frequency. The transformation of the input data into the portfolio construction is the critical substance of an investment process.

All these investors, discretionary or quantitative, fast or slow, face the same challenge of converting public information into trading decisions. And all of them look back in time to find which information was the most relevant predictor, and most likely to remain relevant going forward.

PCAs – Principal Component Analysis

The main metric to assess the relevance of a given piece of information is its correlation with the returns of the portfolio’s assets. A more systematic approach to this correlation review is to use Principal Component Analysis (PCA).
Here is how a PCA works:

• If you have a cloud of points in your universe (returns vs information data points), the data points, or combinations of data points, that best explain returns can be guessed at with correlations. The higher the correlation, the more they explain returns and the better the set. This combination can actually be found precisely with algebraic projections. The combination of data points you have detected becomes your most important decision criterion.

• Now, if you eliminate the influence of this combination of data points, you obtain a new cloud of points, and you can again detect which data combination is the most relevant direction. This second data combination becomes your second most-important decision criterion.

• By repeating the process, you recursively end up with a list of 'directions' or 'factors' which best explain/describe how your assets grow over time.

That is the PCA process: explaining returns step by step, discovering a new descriptive factor at each stage.

Mathematical models and coded libraries can implement this process in the blink of an eye nowadays. As a result, many econometric, quantitative or risk-management systems are using this systematic process or one of its variants. PCAs are ubiquitous in investment research.

Interestingly enough, if you do this exercise the robust mathematical way, not only are the factors you discover different sources of performance... they are also uncorrelated! Instead of one information signal, you now have two, three or maybe more independent signals, which are independently valuable and variable (that list actually reaches a limit quickly, as noise and the time-stability of these combinations become problematic).
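
To make the mechanics concrete, here is a minimal sketch in Python of this factor extraction on a panel of asset returns, using scikit-learn's PCA. The data is random placeholder data and the variable names are illustrative assumptions, not the specific models discussed in this article.

```python
# Minimal PCA sketch: extract uncorrelated factors from a panel of asset returns.
# The returns below are random placeholder data, purely for illustration.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_days, n_assets = 500, 10
returns = rng.normal(0.0, 0.01, size=(n_days, n_assets))   # daily returns panel

pca = PCA(n_components=3)                    # keep the first three factors
factor_returns = pca.fit_transform(returns)  # factor time series (one per column)

# How much of the return variance each factor explains, in decreasing order:
print(pca.explained_variance_ratio_)
# The factor time series are uncorrelated by construction (near-identity matrix):
print(np.round(np.corrcoef(factor_returns, rowvar=False), 4))
```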

Portfolio construction

The main benefit of this analysis is that you can now recombine these factors, with different weights at different times, and therefore stabilize and increase your initial portfolio’s overall performance.

To do this,

• You create synthetic portfolios in line with these (eigen-)factors, and weight these portfolios according to your views. In the StatArb space, these synthetic portfolios are L/S equity portfolios which you can use as a base, hence the extensive research on 'value', 'momentum', 'carry', etc., by the risk-premia providers and the broker-dealers.

• The weights are typically based on recent performance, as factor performance is slow-changing, but you could also weight based on your own data input: if you have a view, a forecast or a better factor calculation than the market's, your portfolios will have a performance edge over the market.

• If you are in the Artificial Intelligence or Machine Learning investment spaces, you can actually let your model discover the weights. A simple approach (like mean-variance optimization) works well, but you can also use a 'big-data' discovery process, sometimes at the risk of over-fitting.

• Before trading, you should also reduce the overall volatility of your portfolio by factor-neutralizing, i.e., eliminating all the exposures to the unwanted factors/synthetic portfolios. This process is usually run with a factor model (Northfield, Barra...), i.e., PCA-based portfolio analysis models; a simplified sketch follows this list.
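
As a rough illustration of that last step, here is a toy factor-neutralization sketch in Python: it regresses a portfolio's returns on the factor returns and removes the estimated exposures. The data and the function name are illustrative assumptions; a production desk would rely on a commercial factor model (Barra, Northfield...) rather than this simple regression.

```python
# Toy factor-neutralization sketch: strip a portfolio's exposures to unwanted
# factors by regressing its returns on the factor returns (OLS) and keeping
# the residual. All data below is random and purely illustrative.
import numpy as np

def neutralize(portfolio_returns, factor_returns):
    """Return the portfolio return series with its factor exposures (betas) removed."""
    X = np.column_stack([np.ones(len(factor_returns)), factor_returns])  # add intercept
    betas, *_ = np.linalg.lstsq(X, portfolio_returns, rcond=None)        # OLS exposures
    return portfolio_returns - factor_returns @ betas[1:]                # residual P&L

rng = np.random.default_rng(1)
factors = rng.normal(0, 0.01, size=(500, 3))                 # three factor return series
port = 0.5 * factors[:, 0] + rng.normal(0, 0.005, size=500)  # portfolio loaded on factor 0
clean = neutralize(port, factors)
print(round(float(np.corrcoef(clean, factors[:, 0])[0, 1]), 4))  # ~0 after neutralization
```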

PCAs allow you to discover and recombine multiple sources of performance to improve your portfolio!

PCAs in the interest rate space

Now, an interest rate curve is a trickier source of information to manage than a discrete list of data points - it is continuous. Should you consider the 30-day rate as information for your analysis, or would the 60-day rate be better? Why not the 6-month rate, the 2-year rate, the 10-year rate? Why not the 35-day or the 36-day rates? How about the forward rates between these maturities, or their accelerations? There is an infinite number of maturities or signals that you can consider as inputs when you analyze interest rate curves or use them as model inputs.

This problem has been studied at length. When PCAs are applied to the description of yield curves and to their use as predictors, they show that the most relevant sources of information come from a small number of variables.

These factors are:

• The general level: i.e. where the 1Y in USD stands in absolute terms (2.6% now, vs 1.8% a year ago), or in relative terms between curves (the 1Y in German govies is -0.6%, while Italy is at 0.25%). Once the maturity is set, you have a stable anchor to compare curves across time and countries.

• The slope of the curve: in USD, the slope is usually measured with the 2Y-10Y spread. It currently stands around 2.65% - 2.50% = 0.15%, but that number is more volatile (and decreasing) at the moment.

• And its curvature, although this third factor is less relevant/influential and sometimes left aside: the difference between the 1Y-5Y spread and the 5Y-15Y spread (for instance) indicates how far upward or downward the curve is bent. In USD, with this set of maturities, the numerical curvature would be (2.75% - 2.50%) - (2.50% - 2.60%) = 0.35%. The higher moments are barely relevant.

Now, the maturity choice, the slope and the curvature vary from one curve to another, and their definitions vary from one description/investment model to another and from one frequency of analysis to another (the exact definitions of ‘slope’ and ‘curvature’ may actually be maturity-dependent), but the qualitative result usually remains the same: the slope is one of, if not the, most critical pieces of information coming from an interest rate curve.
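
As a small numerical illustration, here is a Python sketch that reproduces the three factors from the indicative quotes used above; the levels simply mirror the bullets (each of which uses its own illustrative numbers) and are not live market data.

```python
# Level / slope / curvature computed from the illustrative USD quotes used in the
# bullets above (each bullet quotes its own indicative numbers; not live data).

level = 2.60                               # general level: the 1Y USD rate, in %
slope = 2.65 - 2.50                        # the 2Y-10Y spread from the bullet above, 0.15%
curvature = (2.75 - 2.50) - (2.50 - 2.60)  # (1Y-5Y spread) - (5Y-15Y spread), 0.35%

print(f"level = {level:.2f}%  slope = {slope:.2f}%  curvature = {curvature:.2f}%")
```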

The interest rate slope in econometric models

Even more interesting (and we are talking here about global macroeconomic allocation models), the slope of the yield curve is a leading indicator of future economic growth. In other words, the slope doesn't tell you what is happening now, but what will happen in an economy in the next couple of years - a bit less than 18 months ahead on average.

You can sense this in non-mathematical terms by linking slopes to forward interest rates. As an example: if the 2Y is at 3% and the 5Y is at 2.5%, a back-of-the-envelope calculation will show that the rate between 2Y and 5Y (the "forward rate") is only about 2.15%, i.e. much lower than today's rates [leaving aside the difference between zero-coupon and swap rates].
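
For the curious, here is that back-of-the-envelope calculation in Python, treating the two quotes as annually compounded zero-coupon rates (the same simplification the article makes by setting aside zero-coupon vs swap conventions):

```python
# Back-of-the-envelope 2Y-into-5Y forward rate from the example above, treating
# the quotes as annually compounded zero-coupon rates (a simplifying assumption).
r2, r5 = 0.03, 0.025   # 2Y at 3%, 5Y at 2.5%

# (1 + r5)^5 = (1 + r2)^2 * (1 + fwd)^3  =>  solve for the 3Y rate starting in 2Y
fwd = ((1 + r5) ** 5 / (1 + r2) ** 2) ** (1 / 3) - 1
print(f"forward rate between 2Y and 5Y ~ {fwd:.2%}")   # ~2.17%, in line with the ~2.15% above
```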

Now, interest rates are an indication of the health of an economy. If there is growth (and inflation), rates tend to be higher. If there is a deceleration, rates are lower. So a lower forward rate indicates that the economy will cool down when we reach that inflection point.

Corporations are organizations whose function is to generate future earnings. They are ‘imprecise perpetual bonds’. Because of the unpredictability of these cash-flows, stock prices include a "risk premium" over rates - their earnings yields must be higher than their reference rates.

But corporations also depend on the country’s overall performance as a class (some may fare better or worse but, as a group, they are the economy), and so their yields can’t fluctuate too far away from rates. Rates and equity earnings yields are therefore correlated, which helps explain why rates and economic health are linked.

A negative or flattening slope therefore forecasts lower equity yields. And since stock valuations are based on these yields (if Price-to-Earnings is a well-monitored multiple, then Earnings-to-Price is a well-monitored yield), stock prices should also move lower.

The inescapable conclusion

The critical importance of slopes in curve modeling and econometric modeling, combined with their status as a leading indicator, leads to an inescapable conclusion: a flattening of the curve is a strong signal of a coming recession and bear market, a fact well documented in the economic and investment literature.

Hence the focus on that inversion. Depending on the causes, slopes often foretell flops.



By Navesink International
Securities, Derivatives, Portfolio Management Consultant and Expert Witness
ABOUT THE AUTHOR: Gontran de Quillacq - Derivatives & Investment Expert
Gontran de Quillacq has over 20 years of experience in portfolio management, derivatives trading, proprietary trading, structured products and investment research. He has worked with top-tier banks and hedge funds in both London and New York.

Litigation Support - Mr. de Quillacq's own investment experience and his cross-sectional review of other professionals give him a unique perspective on what can be done, what should be done, what should not be done, and the grey areas in-between. During a personal case, his legal team was so impressed by his wide and thorough knowledge of finance, his capacity to explain complicated ideas in simple terms, and his strong performance on the stand, that they strongly recommended he expand into litigation support services. Mr. de Quillacq is now a FINRA/NFA arbitrator, a member of the Securities Expert Roundtable and an IMS Elite Expert. He has consulting affiliations with Barrington Financial Consulting Group and Ankura (Navigant).

Copyright Navesink International

Disclaimer: While every effort has been made to ensure the accuracy of this publication, it is not intended to provide legal advice as individual situations will differ and should be discussed with an expert and/or lawyer. For specific technical or legal advice on the information provided and related topics, please contact the author.
