The Future Of Indexing
Written by ETF.com Staff  -  April 19, 2010

Paul Amery, editor of IndexUniverse.eu, interviews Andrew Clark, chief index strategist at Thomson Reuters.

IU.eu: What impact has the financial crisis had on the indexing business?

Clark: It’s benefited from it, in the sense that there’s been a general move away from trying to find alpha, the returns produced from manager skill. People were startled to see that in some cases so-called alpha indices did worse than those offering beta. We and other index providers have seen a large increase in requests to produce specialised beta indices since the crisis. It saddens me to say this, but we’ve benefited from others’ misfortune.

IU.eu: What about the impact of the crisis on obtaining reliable prices? There were several areas of the asset markets where people complained during the crisis that price quotes were unreliable – corporate debt, for example.

Clark: There clearly still are some problems in obtaining reliable prices in certain areas of the market – mortgage- and asset-backed securities, for example, where you can say that markets have not completely “cleared”. In all other areas I think things have returned to normal. And in some areas where you might expect there to have been pricing problems, they haven’t really occurred. For example, we’ve been creating bespoke indices on credit default swaps for some of the exchanges, and in this area of the market liquidity has held up well.

IU.eu: Thomson Reuters has produced some “optimal” indices, based on modern portfolio theory (MPT). But in your own published research work you’ve concentrated on phenomena which suggest that the assumptions of MPT don’t apply: turbulence, time-dependency of returns, for example. Is it appropriate that most people still use MPT as the basis for index design, or is there a better way of doing things?

Clark: This question is linked to the ongoing debate about the use of capitalisation-weighted indices. Rob Arnott, for example, argues in favour of fundamental indexation, while Jeremy Bernstein says that equal weighting of index components is better, to name just two of the proponents of alternative index methodologies.

In fact, if you go back more than 30 years, you can find research by a famous American economist, Richard Roll, which makes the point that none of these methods is optimal, given that there is some serial dependence in stock market prices. When constructing variants of capitalisation-weighted indices, most people up to now have concentrated on alternative weighting methodologies. Perhaps, though, we ought to go back to what Roll first described, as we have tried to do in our optimal indices.

IU.eu: But aren’t your indices also based on the assumptions of MPT?

Clark: We do use mean-variance optimisation, but we do this in such a way as to allow price movements to be non-normally distributed (in other words, returns don’t have to follow the “bell” curve). The technique we use is to transform the distribution of returns – which can include the outliers, or extreme values – into a distribution which is closer to the normal one, and then we do the optimisation. This technique – which is less than 10 years old – is called GCAPM (Generalised Capital Asset Pricing Model). It’s based on the research of Didier Sornette at the Swiss Federal Institute of Technology in Zurich.
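The approach Clark outlines can be illustrated with a minimal sketch. This is not Thomson Reuters' actual GCAPM methodology, and the rank-based Gaussianising transform and the minimum-variance objective used here are illustrative assumptions; the idea is simply that fat-tailed returns are first mapped to an approximately normal distribution, and the standard mean-variance machinery is then applied to the transformed data:

```python
import numpy as np
from scipy.stats import norm, rankdata

def gaussianise(returns):
    """Map each column of returns to ~N(0,1) via its empirical ranks.

    This rank-based transform is one simple way to normalise a
    fat-tailed return distribution before optimising; it is an
    illustrative choice, not the method Clark's team uses.
    """
    n = returns.shape[0]
    ranks = np.apply_along_axis(rankdata, 0, returns)
    return norm.ppf(ranks / (n + 1))  # (n + 1) avoids infinities at the tails

def min_variance_weights(returns):
    """Closed-form minimum-variance portfolio (weights sum to 1)."""
    cov = np.cov(returns, rowvar=False)
    inv = np.linalg.inv(cov)
    ones = np.ones(cov.shape[0])
    w = inv @ ones
    return w / w.sum()

# Simulate fat-tailed returns for 4 hypothetical assets
# (Student-t with 3 degrees of freedom, so extreme outliers occur).
rng = np.random.default_rng(42)
raw = rng.standard_t(df=3, size=(1000, 4))

# Transform first, then optimise on the normalised data.
transformed = gaussianise(raw)
weights = min_variance_weights(transformed)
print(weights)        # index weights derived from the transformed returns
print(weights.sum())  # sums to 1 (up to floating point)
```

Optimising on the transformed returns means the outliers inform the ranking without dominating the covariance estimate, which is the practical point of normalising before applying mean-variance analysis.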

IU.eu: What’s the take-up been like?

Clark: It’s been a little slow; most people still prefer to use market capitalisation-based indices. But it’s growing. We have two clients in the US, and are currently talking to a third. Although the optimal indices came out three years ago, we’ve only been promoting the index business more actively for about six months.


