How to Learn Algorithmic Trading

January 10, 2010

Readers seeking to learn quantitative / algorithmic trading often ask where to begin. As with any discipline, the best approach is to be mentored by an expert. Short of that, read the seminal works, ideally several times each.

The following reading list is intended for retail traders; it introduces standard terminology and introductory topics, with a bias toward equities, exchange-traded derivatives, and FX. One caveat, acknowledging the old adage that “there are no good books”: none of these texts captures anything near the current state of the art. That said, some books are better than none. This list favors texts which build intuition over mathematical rigor. Each of the following is recommended, and a valued member of the Quantivity library.

Read more…

Causality, Exogeny & Regimes

January 10, 2010

An assumption of modern financial economics is fundamental causality: effects in electronic financial markets are primarily due to causes in the “real world”. Classic examples are earnings announcements and GDP releases, for micro (equities) and macro (FX) respectively. For many, this assumption weaves the very fabric of their financial worldview.

Much of investing and trading similarly bows to this assumption, ranging from the premise of fundamental analysis to many modern hedge fund strategies (from event-driven to global macro and merger arbitrage).

What if this assumption is wrong? A growing body of evidence is beginning to raise this question.

Read more…

Why Moving Averages

January 8, 2010

Moving averages are a ubiquitous tool in financial econometrics, especially dominant in both technical analysis and high-frequency trading. Given the sophistication of both disciplines, one is inclined to ask why such a seemingly trivial statistical technique as the moving average forms part of their foundation.

One explanation is due to a beautiful mathematical result, which has significant implications for building trading systems.
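The post does not reveal the result here, but one well-known property worth keeping in mind: an exponentially weighted moving average can be updated with O(1) state per tick, since it satisfies the recursion ema_t = α·x_t + (1 − α)·ema_{t−1}. A minimal sketch (my own illustration, not the post's derivation):

```python
# Sketch: the EMA recursion needs only the previous average, not a
# window of history -- one reason moving averages suit streaming data.

def ema(prices, alpha):
    """Exponentially weighted moving average, seeded with the first price."""
    out = []
    avg = prices[0]                      # seed with the first observation
    for p in prices:
        avg = alpha * p + (1 - alpha) * avg   # O(1) incremental update
        out.append(avg)
    return out

prices = [100.0, 101.0, 102.0, 101.5]
print(ema(prices, alpha=0.5))
```

By contrast, a simple moving average over a window of length n must retain n observations; the EMA's constant-memory update is what makes it attractive in high-frequency settings.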

Read more…

Algorithmic Lingua Franca

January 2, 2010

Readers often privately ask what tools and platforms Quantivity uses for trading, ranging from exploratory analysis to post-trade. One tool has become influential enough to warrant highlighting, and will serve as the algorithmic lingua franca in subsequent Quantivity posts, in particular the forthcoming extended series on Market Regime Trading. Coming from a veteran hard-core coder, this is no small compliment.

This tool is R. Readers likely know of R from its origin: a programming language for statistical computing and graphics, similar to S-PLUS or SAS. Over recent years, dedicated quant contributors have quietly evolved R beyond its statistical roots into a fledgling platform for quantitative finance analysis. Given the historical fracture amongst commercial analytic tools, R holds the potential to become the algorithmic standard for quantitative finance.

Read more…

Universal Portfolio Optimization & Trading Frequency

January 2, 2010

Readers with background in portfolio selection and optimization may be interested in a recent working paper by Hazan and Kale, presented at NIPS 2009: On Stochastic and Worst-case Models for Investing.

This paper builds upon Agarwal et al., Algorithms for Portfolio Management Based on the Newton Method (2006); and Cover, Universal Portfolios (1991). Notably, this research theme is beginning to intersect with reality, suggesting potential applicability to practical trading.

Read more…

Market Regime Trading Redux

December 31, 2009

One topic of avid interest to Quantivity readers is market regime trading: systematic trading via a heterogeneous portfolio of algorithmic trading strategies, whose dynamic selection and optimization is driven by quantitative analysis of the current market regime.

This topic was originally introduced in two previous blog entries: Market Regime Dashboard and Trade Using Market Regimes?. By any measure of reader interest (active or passive), the concept of regime trading is unexpectedly popular. Several subsequent months have been spent in literature review, discussion with colleagues, and quiet reflection.

Such effort confirms that market regime analysis is indeed a worthy topic, as it squarely acknowledges an admittedly unpleasant financial postulate: there is no single low-frequency trading strategy that works consistently under all market regimes. Or, in trader-speak: “there is no holy grail”. This postulate also underlies why naïve backtesting is bogus.

Quantivity will take up market regime trading as a central theme in 2010, albeit not exclusively.

Posts will seek to combine profitable trading strategies, computational implementation, and mathematical rigor. Collaboration is sought with interested readers and fellow bloggers, whether in public or private.

CDOs & Computational Intractability

December 29, 2009

Readers with backgrounds in derivatives and theoretical computer science may find amusement in the recent working paper from Arora et al.: Computational Complexity and Information Asymmetry in Financial Products.

The article provides delightful juxtaposition of cross-discipline buzzword bingo: computational complexity (theoretical computer science), asymmetric information / lemon costs (information economics), and densest subgraph problem (graph theory).

In short, the authors posit that detecting fraudulent CDOs/CDSs is essentially computationally intractable (p. 12). Needless to say, the press is already having a field day using the article to blame Goldman (arguably rightfully so, to the demise of their counterparties), despite what I suspect is a limited capacity to fully comprehend its subtle theoretical details.

Read more…

Statistical Arbitrage & Optimality

December 28, 2009

Readers interested in statistical arbitrage (statarb) strategies may enjoy brief review of two recent articles: Statistical Arbitrage in the U.S. Equities Market by Avellaneda and Lee (15 Jun 2009 draft) and Analytic Solutions for Optimal Statistical Arbitrage Trading by Bertram (12 Nov 2009 draft).

Read more…

Quote Arrival Frequency Distribution for Tick Data

December 27, 2009

High-frequency systems development is built upon the analysis of tick data. A classic example is statistically characterizing the frequency and arrival times of intra-day quotes, useful for building systems which exploit market microstructure effects.

Yet, the temporal character of such analysis fundamentally differs from traditional quantitative analysis: ticks arrive at irregularly spaced times (sometimes several at the same time), with time intervals ranging from zero to a few seconds (or even minutes). The irregular arrival of ticks conflicts with the regular-spacing assumption of classical statistical time-series methods and corresponding computational tools.

Recent analysis bore out this challenge.

Read more…

Directional Prediction via Residual Logistic Regression

December 27, 2009

Paul Teetor wrote a nice study on Predicting Swap Spreads. Worth noting is his use of logistic regression to predict the probabilistic direction of next-period spreads based upon residuals from a linear model of the historical spread. In other words, Paul is evaluating the following hypothesis (synthesized from p. 5):

Can the future direction of swap spreads be predicted using their historical mispricings from fair value, as estimated from influential market data (treasury notes, interest rate swaps, LIBOR, BIX, SPX)?

Provided robust directional prediction, a profitable trading strategy can be defined as (p. 22):

   Long: if P(up) > 0.5, go long today
   Short: if P(down) > 0.5, go short today

This methodology is particularly interesting as it is equally applicable to many other financial instruments, provided suitable notions of “mispricings” and value estimators. For example, directional prediction of co-integrated pairs can be similarly estimated, extending the analysis introduced in Mean Reversion.
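The two-stage structure can be sketched as follows. This is a minimal illustration on synthetic data, with variable names and the gradient-ascent fit entirely my own, not Paul's code: stage one fits a linear "fair value" model and takes residuals as the mispricing; stage two fits a logistic regression of next-period direction on that mispricing.

```python
# Sketch of residual logistic regression on synthetic data:
# (1) OLS fair value of a spread from one market factor,
# (2) logistic regression: P(spread up next period | current residual).
import math
import random

random.seed(42)

# --- Synthetic spread driven by one market factor (illustrative) --------
n = 400
factor = [random.gauss(0.0, 1.0) for _ in range(n)]
spread = [2.0 + 0.8 * f + random.gauss(0.0, 0.3) for f in factor]

# --- Stage 1: OLS fair value; residual = observed - fitted --------------
fbar, sbar = sum(factor) / n, sum(spread) / n
beta = (sum((f - fbar) * (s - sbar) for f, s in zip(factor, spread))
        / sum((f - fbar) ** 2 for f in factor))
alpha = sbar - beta * fbar
resid = [s - (alpha + beta * f) for f, s in zip(factor, spread)]

# --- Stage 2: logistic regression of next-period direction on residual --
x = resid[:-1]
y = [1.0 if spread[t + 1] > spread[t] else 0.0 for t in range(n - 1)]

w0 = w1 = 0.0                        # intercept and residual coefficient
for _ in range(2000):                # plain batch gradient ascent
    g0 = g1 = 0.0
    for xi, yi in zip(x, y):
        p = 1.0 / (1.0 + math.exp(-(w0 + w1 * xi)))
        g0 += yi - p
        g1 += (yi - p) * xi
    w0 += 0.05 * g0 / len(x)
    w1 += 0.05 * g1 / len(x)

def p_up(mispricing):
    """P(spread rises next period | current mispricing)."""
    return 1.0 / (1.0 + math.exp(-(w0 + w1 * mispricing)))

# Trading rule from the post: long if P(up) > 0.5, short if P(down) > 0.5.
signal = "long" if p_up(resid[-1]) > 0.5 else "short"
print(round(w1, 3), signal)
```

In this synthetic setup a positive residual (spread rich to fair value) tends to precede a down move, so the fitted residual coefficient comes out negative, which is exactly the mean-reversion logic that would carry over to co-integrated pairs.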

Read more…
