Face recognition systems are typically required to work under highly varying illumination conditions. This variation imposes complex effects on the acquired face image that pertain little to the actual identity. Consequently, illumination normalization is required to reach acceptable recognition rates in face recognition systems. In this paper, we propose an approach that estimates the direction of illumination. This estimate can subsequently be used to normalize and re-light face images, thereby effectively filtering out effects stemming from illumination variation. We further propose to embed this approach into the widely used Active Appearance Model framework as an augmentation of the texture model, in order to obtain illumination-invariant localization of faces.
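To make the idea of estimating an illumination direction concrete, the following is a minimal illustrative sketch, not the method proposed in the paper: under a Lambertian reflectance assumption with known surface normals and roughly constant albedo, a dominant light direction can be recovered from pixel intensities by least squares. The function name `estimate_light_direction` and the synthetic data are hypothetical.

```python
import numpy as np

# Illustrative sketch (not the paper's method): under a Lambertian model,
# pixel intensity is I_i ≈ albedo * max(n_i · s, 0) for unit surface normal
# n_i and light direction s. With known normals and constant albedo, the
# dominant light direction can be fit by least squares on lit pixels.

def estimate_light_direction(normals, intensities):
    """Least-squares fit of the light vector s from N @ s ≈ I.

    normals:     (n, 3) array of unit surface normals
    intensities: (n,)   array of observed pixel intensities
    Returns the unit light direction (the albedo scale is discarded).
    """
    s, *_ = np.linalg.lstsq(normals, intensities, rcond=None)
    return s / np.linalg.norm(s)

# Synthetic check: render intensities from a known light, then recover it.
rng = np.random.default_rng(0)
normals = rng.normal(size=(500, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
true_light = np.array([0.3, 0.5, 0.8])
true_light /= np.linalg.norm(true_light)
intensities = np.clip(normals @ true_light, 0.0, None)  # albedo = 1
lit = intensities > 0  # restrict the fit to illuminated pixels
estimate = estimate_light_direction(normals[lit], intensities[lit])
```

An estimated direction of this kind is what enables re-lighting: once the lighting component of the image is explained, it can be removed or replaced with a canonical illumination before recognition.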