In this talk I will address the problem of adapting complexity in classification. We will relate this problem to the curse of dimensionality, which we shall demonstrate through the bias-variance dilemma. Examples of Bayes normal-based classifiers, the Fisher classifier, support vector machines, and neural networks will be presented in this context. Adaptive complexity schemes may be achieved through regularization, which avoids overfitting and improves generalization; such approaches appear in ridge regression, normal-based classifiers, and other settings. I will also discuss the problem of estimating complexity in relation to the VC dimension, the trade-off between optimality and simplicity, and future directions for ensemble classification in advanced pattern recognition systems.
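The regularization idea mentioned above can be sketched with ridge regression, where a single penalty parameter directly controls effective model complexity. The following is a minimal illustrative example, not material from the talk itself; the data, dimensions, and penalty values are hypothetical choices made for demonstration.

```python
import numpy as np

def ridge_fit(X, y, lam):
    """Closed-form ridge regression: w = (X'X + lam*I)^{-1} X'y.
    The penalty lam shrinks the weights, trading variance for bias."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Hypothetical toy data: fewer samples than features, a regime where
# the curse of dimensionality makes unregularized fits unstable.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 50))
w_true = np.zeros(50)
w_true[:5] = 1.0
y = X @ w_true + 0.1 * rng.normal(size=20)

w_small = ridge_fit(X, y, lam=1e-6)  # nearly unregularized: high variance
w_large = ridge_fit(X, y, lam=10.0)  # heavy shrinkage: higher bias
print(np.linalg.norm(w_small), np.linalg.norm(w_large))
```

Increasing the penalty shrinks the weight vector, which is the complexity-control mechanism the bias-variance dilemma describes: too little regularization overfits, too much underfits.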