Document Type
Honors Thesis
Publication Date
Winter 2018
Abstract
Levered and inverse Exchange-Traded Funds (LETFs) are a recent and controversial innovation in financial engineering. These ETFs set out to achieve daily returns that are a multiple (2x, 3x) or negative multiple (-1x, -2x, -3x) of an underlying index. Since their inception in 2006, research has overwhelmingly concluded that these funds fail to meet their stated objectives over long holding periods. However, there has been debate over the causes of this tracking error and the holding period at which tracking begins to break down.
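To illustrate the compounding effect behind this divergence, consider a hypothetical two-day path for a 2x LETF (the figures are illustrative and not drawn from the thesis sample). If the underlying index returns +10% on day one and -10% on day two, its cumulative return is
\[ (1.10)(0.90) - 1 = -1\%, \]
so a naive 2x target over the two days would be -2%. A fund that doubles each daily return instead compounds to
\[ (1.20)(0.80) - 1 = -4\%, \]
diverging from the naive target even before any expense drag.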
This thesis analyzes the relationship between the expense ratios of LETFs and their tracking error. Influenced by the methods of Bansal and Marshall (2015) as well as Lu, Wang, and Zang (2012), I calculate the tracking error of LETFs and use regression analysis to estimate the change in tracking error attributable to a change in expense ratio. The sample is analyzed separately for each target multiple, and the analysis is repeated for holding periods of 1, 5, 10, 21, 63, and 126 days.
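The following is a minimal sketch of how a tracking-error-on-expense-ratio regression of this kind might be set up; the tracking-error definition, the simulated data, and the fee-drag assumption are illustrative assumptions, not the thesis's actual data or specification.

import numpy as np
import pandas as pd
import statsmodels.api as sm

def tracking_error(fund_prices, index_prices, multiple, holding_period):
    """Tracking error over rolling holding-period windows: the fund's
    compounded return minus the target multiple of the index's
    compounded return (one of several possible definitions)."""
    fund_ret = fund_prices.pct_change()
    index_ret = index_prices.pct_change()
    fund_hp = (1 + fund_ret).rolling(holding_period).apply(np.prod, raw=True) - 1
    index_hp = (1 + index_ret).rolling(holding_period).apply(np.prod, raw=True) - 1
    return fund_hp - multiple * index_hp

# Synthetic illustration: three hypothetical 2x funds with different expense ratios.
rng = np.random.default_rng(0)
index_prices = pd.Series(100 * np.cumprod(1 + rng.normal(0, 0.01, 500)))
rows = []
for expense_ratio in (0.0090, 0.0095, 0.0105):
    # Simulated fund: 2x the daily index return minus a daily expense drag.
    daily_fee = expense_ratio / 252
    fund_ret = 2 * index_prices.pct_change().fillna(0) - daily_fee
    fund_prices = 100 * np.cumprod(1 + fund_ret)
    te = tracking_error(fund_prices, index_prices, multiple=2, holding_period=21).dropna()
    rows.append(pd.DataFrame({"tracking_error": te, "expense_ratio": expense_ratio}))
panel = pd.concat(rows, ignore_index=True)

# OLS of tracking error on expense ratio; the slope estimates the change in
# tracking error associated with a change in expense ratio.
X = sm.add_constant(panel["expense_ratio"])
print(sm.OLS(panel["tracking_error"], X).fit().summary())

In the thesis design this regression would be run on observed LETF and index returns, separately by target multiple and holding period, rather than on simulated series.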
I find that for -1x, -3x (at the 126-day holding period), and 2x LETFs, paying a higher expense ratio can produce lower tracking error. The data also support previous research finding that LETF tracking error increases as the holding period lengthens. Results varied for some holding periods and target multiples, likely because of the effects of compounding on LETF returns and market conditions such as volatility and the direction of returns.
Recommended Citation
Carrier, Grant, "Pay for Performance: Do Higher Expense Ratios of Levered and Inverse ETFs Lead to Lower Tracking Error?" (2018). Honors College. 559.
https://digitalcommons.library.umaine.edu/honors/559