Optimal Order Execution: Do You Know What Your Broker is Doing?

Cheriton Symposium


Friday, September 21, 3:00 - 4:00pm, DC1302

Lecturer: Peter Forsyth, Cheriton School of Computer Science, University of Waterloo


Algorithmic trade execution has become a standard technique for institutional market players in recent years, particularly in the equity market where electronic trading is most prevalent. A trade execution algorithm typically seeks to execute a trade decision optimally upon receiving inputs from a human trader.

A common optimality criterion seeks to strike a balance between minimizing price impact and minimizing timing risk. For example, when selling a large number of shares, a fast liquidation will push the share price down, whereas a slow liquidation will expose the seller to timing risk due to the stochastic nature of the share price.

A desirable strategy can be defined in terms of a Pareto optimal solution. We seek to determine the strategy which, for a given expected revenue from selling a block of shares, minimizes the risk (i.e. the variance of the revenue).
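In a standard formulation (the notation here is ours, not taken from the abstract), with R denoting the revenue from liquidating the block and the trading rate v(.) the control, the Pareto optimal strategies solve

    \min_{v(\cdot)} \operatorname{Var}[R] \quad \text{subject to} \quad \mathbb{E}[R] = \mathcal{E},

and sweeping the revenue target \mathcal{E} traces out the mean-variance efficient frontier.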

We compare optimal liquidation policies in continuous time in the presence of trading impact using numerical solutions of Hamilton-Jacobi-Bellman (HJB) partial differential equations (PDEs). The industry standard approach (the Almgren and Chriss strategy) is based on an approximate solution to the HJB equation.
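For readers unfamiliar with the benchmark, the sketch below computes the classical Almgren-Chriss liquidation schedule in its continuous-time closed form (arithmetic price dynamics, linear temporary impact, constant volatility); the parameter values are purely illustrative and are not taken from the talk.

    import numpy as np

    def almgren_chriss_holdings(X, T, sigma, eta, lam, n_steps=100):
        # Shares still held at each time under the classical Almgren-Chriss
        # schedule: x(t) = X * sinh(kappa*(T - t)) / sinh(kappa*T), where
        # kappa = sqrt(lam * sigma**2 / eta), lam is the risk aversion and
        # eta the linear temporary impact coefficient.
        kappa = np.sqrt(lam * sigma**2 / eta)
        t = np.linspace(0.0, T, n_steps + 1)
        return t, X * np.sinh(kappa * (T - t)) / np.sinh(kappa * T)

    # Illustrative only: sell one million shares over one trading day.
    t, holdings = almgren_chriss_holdings(X=1.0e6, T=1.0, sigma=0.02,
                                          eta=2.5e-6, lam=1.0e-6)

This schedule is static (fixed at the initial time), whereas the talk contrasts it with the dynamic strategy obtained by solving the HJB equation numerically.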

In terms of the mean-variance efficient frontier, the original Almgren/Chriss strategy is significantly sub-optimal compared to the solution obtained by solving the HJB equation numerically.

This is joint work with Stephen Tse, Heath Windcliff, Shannon Kennedy, and Yuying Li.


Benefits and Pitfalls of Interdisciplinary Research on Light and Matter Interactions

Cheriton Symposium


Friday, September 21, 4:00 - 5:00pm, DC1302

Lecturer: Gladimir Baranoski, Cheriton School of Computer Science, University of Waterloo


Models of light and matter interactions employ computer simulations to describe how different materials absorb and scatter light. These models are extensively used in a wide range of fields, from computer graphics (e.g., realistic image synthesis) and biomedical optics (e.g., noninvasive diagnosis of medical conditions) to systems biology (e.g., prediction of plant responses to environmental stress) and remote sensing (e.g., early detection of diseases in vegetation).

More recently, models of light and matter interactions are also being used to accelerate the hypothesis generation and validation cycles of theoretical research in these fields. The development of simulation frameworks that can be effectively employed in these different disciplines requires a sound scientific methodology and is rarely linear. For example, a model should be carefully evaluated by comparing its results with actual measured data, a basic requirement in the physical and life sciences.

However, often such data either does not exist or it is not readily available. Although significant progress has been achieved on the predictive modeling of light and matter interactions, several technical and political pitfalls severely hinder further advances in this area and limit the applicability of existing models. In this talk, we present a broad discussion of these issues. This discussion is illustrated with examples derived from openly accessible models developed by the Natural Phenomena Simulation Group (NPSG) at the University of Waterloo.