We propose a multiple incremental decremental algorithm for support vector machines (SVMs). In online learning, the trained model must be updated when new observations arrive and/or existing observations become obsolete. To add or remove a single data point, the conventional single incremental decremental algorithm can update the model efficiently. However, when multiple data points must be added and/or removed, the computational cost of the current update algorithm becomes prohibitive because it must be applied repeatedly, once for each data point. In this paper, we develop an extension of the incremental decremental algorithm that efficiently handles the simultaneous update of multiple data points. Our analyses and experimental results show that the proposed algorithm can substantially reduce the computational cost. This approach is especially useful for online SVM learning, in which old data points must be removed and new data points added within a short amount of time.