Support vector machines (SVMs) are among the most useful techniques for classification problems; face recognition is one clear example. However, SVMs cannot be applied when the feature vectors defining our samples have missing entries. This is clearly the case in face recognition when occlusions are present in the training and/or testing sets. When k features are missing from a sample vector of class 1, that sample defines a k-dimensional affine subspace. The goal of the SVM is to maximize the margin between the vectors of class 1 and class 2 on those dimensions with no missing elements and, at the same time, to maximize the margin between the vectors of class 2 and the affine subspace of class 1. This second term of the SVM criterion minimizes the overlap between the classification hyperplane and the subspace of solutions in class 1, because we do not know which values in this subspace a test vector may take. The hyperplane minimizing this overlap is the one parallel to the missing dimensions. However, this condition is too restrictive, because its solution will generally contradict the one obtained when maximizing the margin of the visible data. To resolve this conflict, we define a criterion that minimizes the probability of overlap. The resulting optimization problem can be solved efficiently, and we show that the global minimum of the error term is guaranteed under mild conditions. We provide extensive experimental results, demonstrating the superiority of the proposed approach over the state of the art.
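The geometry behind the second margin term can be sketched in a few lines. The helper below is an illustrative assumption, not the paper's implementation: it computes the distance from a linear classification hyperplane to the affine subspace defined by a sample with missing entries, showing why the margin collapses to zero unless the hyperplane is parallel to the missing dimensions (i.e., its weight vector is zero on them).

```python
import numpy as np

def subspace_margin(w, b, x_visible, visible_idx, n_features, tol=1e-9):
    """Distance from the hyperplane w.x + b = 0 to the affine subspace
    of a sample whose features outside `visible_idx` are missing.

    Hypothetical helper for illustration only. If w has any nonzero
    component along a missing dimension, the hyperplane intersects the
    affine subspace and the margin is 0; otherwise the distance reduces
    to the usual point-to-hyperplane formula on the visible coordinates.
    """
    w = np.asarray(w, dtype=float)
    mask = np.zeros(n_features, dtype=bool)
    mask[list(visible_idx)] = True
    w_vis, w_miss = w[mask], w[~mask]
    if np.linalg.norm(w_miss) > tol:
        # Not parallel to the missing dimensions: subspace is cut, margin 0.
        return 0.0
    return abs(w_vis @ np.asarray(x_visible, dtype=float) + b) / np.linalg.norm(w_vis)
```

For example, with w = (1, 0, 2) and feature 1 missing, the hyperplane is parallel to the missing dimension and the margin is positive; with w = (1, 1, 2) it is not, and the margin is zero. This is the overlap the paper's probabilistic criterion relaxes, since forcing exact parallelism for every missing pattern would conflict with maximizing the margin on the visible data.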