This course is taught as a regular Ph.D. course within DTU course 029?? "Advanced Topics in Image Analysis, Computer Graphics and Geoinformatics". The first lecture will be given on

**Monday, February 28th, 1 pm to 3 pm.**

At this first meeting the participants will decide the schedule for the remainder of the course.

Deformable template modelling is an important part of understanding complex patterns in images. Ulf Grenander's seminal work on 2D deformable template modelling (Grenander, Chow & Keenan, 1991) was hugely popularised in the late 1990s by the work of Cootes and Taylor (Cootes, Taylor, Cooper & Graham, 1995; Cootes, Edwards & Taylor, 2001; Stegmann, Ersbøll & Larsen, 2003), who formulated linear models of shape variability estimated from annotated training data. Ramsay and Silverman (Ramsay & Silverman, 1997) presented seminal work on functional representations of curves.

However, in many situations linear models are likely to fail in
accurately modelling natural phenomena. For instance, Kendall's
shape space for triangles in a plane - the simplest shape
imaginable - is a sphere in *R^{3}*. Linear approximations
to the triangle shape space are therefore only valid for low
variability in triangle shape. To handle the non-linear,
large-scale variations that occur in nature, new models are
required. Developments in statistics and machine learning in the
new millennium have led to methods for parameterizing
low-dimensional manifolds in high-dimensional spaces. These
developments form the basis for formulating non-linear shape
space models, such as the principal curves of Hastie and Stuetzle
(Hastie & Stuetzle, 1989); the ISOMAP procedure of Tenenbaum et
al. (Tenenbaum, Silva & Langford, 2000); the Local Linear
Embedding (LLE) of Roweis and Saul (Roweis & Saul, 2000); the
Laplacian Eigenmaps of Belkin and Niyogi (Belkin & Niyogi, 2002);
and the Hessian Eigenmaps of Donoho and Grimes (Donoho & Grimes,
2003).
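As an illustration of what these manifold-learning methods achieve, the sketch below unrolls a 2D surface embedded in 3D with ISOMAP. It uses scikit-learn's implementation rather than the original authors' code, and the dataset and parameter choices are arbitrary:

```python
import numpy as np
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# 1000 points lying on a 2D "swiss roll" surface embedded in R^3.
X, t = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)

# ISOMAP: build a k-nearest-neighbour graph, approximate geodesic
# distances by shortest paths in the graph, then embed the points
# with classical multidimensional scaling.
embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(X)

# The leading recovered coordinate varies monotonically along the
# roll, so it is strongly correlated (up to sign) with the roll
# parameter t that generated the data.
corr = np.corrcoef(embedding[:, 0], t)[0, 1]
print(embedding.shape, abs(corr))
```

A linear method such as PCA projects the roll onto a plane and mixes distant parts of the surface; the graph-based geodesic distances are what let ISOMAP recover the underlying 2D parameterization.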

Furthermore, natural phenomena can often be explained by a set of
few underlying parameters. This property has been exploited for
many years in statistics, e.g. in factor rotation (Harman, 1967)
for easier interpretation. In recent years, sparsity has been
used as a design criterion to overcome the problem of the
dimensionality of measurements vastly exceeding the number of
observations available. Mathematically, sparsity is invoked by
putting an *L_{0}* penalty on the parameters. However, this is
computationally intractable. Fortunately, in many situations an
*L_{1}* penalty yields similarly sparse solutions while remaining
computationally tractable, as in the lasso (Tibshirani, 1996) and
least angle regression (Efron, Johnstone, Hastie & Tibshirani,
2003).

In this course we will read and discuss a series of articles and book chapters concerning the above-mentioned subjects.

- Representing functional data as smooth functions -
(Ramsay & Silverman, 1997, Chap. 2)

- The registration and display of functional data -
(Ramsay & Silverman, 1997, Chap. 5)

- Principal components analysis for functional data I -
(Ramsay & Silverman, 1997, Chap. 6)

- Principal components analysis for functional data II -
(Ramsay & Silverman, 1997, Chap. 7)

- Principal curves - (Hastie & Stuetzle, 1989)

- ISOMAP - (Tenenbaum et al., 2000)

- Local Linear Embedding (LLE) - (Roweis & Saul, 2000)

- Laplacian Eigenmap - (Belkin & Niyogi, 2002)

- Hessian Eigenmaps - (Donoho & Grimes, 2003)

- Representing data (presented by Søren Erbou) - Hands.m, Plothands.m

Course evaluation is based on completion of mandatory exercises using the Matlab, S-Plus, and R software available at the following internet addresses:

- ISOMAP - http://isomap.stanford.edu/
- LLE - http://www.cs.toronto.edu/~roweis/lle/
- HLLE - http://basis.stanford.edu/WWW/HLLE/frontdoc.htm
- Principal curves - http://www.r-project.org
- FDA - http://www.psych.mcgill.ca/faculty/ramsay/software.html

- Belkin, M., & Niyogi, P. (2002). Laplacian eigenmaps and spectral techniques for embedding
and clustering. In T. G. Dietterich, S. Becker, & Z. Ghahramani (Eds.),
*Advances in neural information processing systems* (Vol. 14). Cambridge, MA: MIT Press.

- Cootes, T. F., Edwards, G. J., & Taylor, C. J. (2001). Active appearance models.
*IEEE Transactions on Pattern Analysis and Machine Intelligence, 23*(6), 681–685.

- Cootes, T. F., Taylor, C. J., Cooper, D. H., & Graham, J. (1995, January). Active shape
models – their training and application.
*Computer Vision and Image Understanding, 61*(1), 38–59.

- Donoho, D. L., & Grimes, C. (2003). Hessian eigenmaps: locally linear embedding techniques
for high-dimensional data.
*Proceedings of the National Academy of Sciences, 100*(10), 5591–5596.

- Efron, B., Johnstone, I., Hastie, T., & Tibshirani, R. (2003). Least angle regression.
*Annals of Statistics*.

- Grenander, U., Chow, Y., & Keenan, D. M. (1991).
*Hands: A pattern theoretic study of biological shape*. Springer Verlag. (128 pp.)

- Harman, H. H. (1967).
*Modern factor analysis* (2nd ed.). Chicago: The University of Chicago Press. (474 pp.)

- Hastie, T., & Stuetzle, W. (1989). Principal curves.
*Journal of the American Statistical Association, 84*, 502–516.

- Ramsay, J. O., & Silverman, B. W. (1997).
*Functional data analysis*. New York: Springer Verlag.

- Roweis, S. T., & Saul, L. K. (2000). Nonlinear dimensionality reduction by locally linear
embedding.
*Science, 290*, 2323–2326.

- Stegmann, M. B., Ersbøll, B. K., & Larsen, R. (2003, May). FAME – a flexible appearance
modelling environment.
*IEEE Transactions on Medical Imaging, 22*(10), 1319–1331.

- Tenenbaum, J. B., Silva, V. de, & Langford, J. C. (2000). A global geometric framework
for nonlinear dimensionality reduction.
*Science, 290*, 2319–2323.

- Tibshirani, R. (1996). Regression shrinkage and selection via the lasso.
*Journal of the Royal Statistical Society, Series B, 58*(1), 267–288.