Online learning algorithms, when equipped with kernel functions, often incur a significant computational burden in both space and time. In this paper, we present a family of budgeted online learning algorithms for multi-class classification with constant space and time complexity per update. Our approach is based on the multi-class version of the popular Pegasos algorithm. To keep the model size constant, only a fixed number of support vectors is maintained through a budget maintenance step. By treating budget maintenance as a source of gradient error, we prove that the gap between the budgeted Pegasos solution and the optimal solution depends directly on the average model degradation caused by budget maintenance. Guided by this analysis, we study greedy multi-class budget maintenance methods based on removal, projection, and merging of support vectors, designed to minimize the model degradation. Empirical results show that the proposed online algorithm achieves accuracy comparable to the non-budgeted multi-class kernelized Pegasos while efficiently learning from, and predicting on, high-throughput data streams.
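As a rough illustration of the idea described above (a sketch under stated assumptions, not the paper's implementation), the following code combines a multi-class kernelized Pegasos-style update with the simplest budget maintenance strategy mentioned in the abstract, removal: whenever the support set exceeds the budget, the support vector whose coefficient vector has the smallest norm is discarded. The class name `BudgetedMCPegasos` and the parameters `lam`, `budget`, and `gamma` are illustrative choices, not names from the paper.

```python
import numpy as np

def rbf(x, z, gamma=1.0):
    # Gaussian RBF kernel between two input vectors.
    return np.exp(-gamma * np.sum((x - z) ** 2))

class BudgetedMCPegasos:
    """Sketch of multi-class kernelized Pegasos with a fixed budget.

    Budget maintenance: remove the support vector with the smallest
    coefficient norm (under the uniform Pegasos scaling used here,
    stored norms coincide, so ties resolve to the oldest SV).
    """
    def __init__(self, n_classes, lam=1.0, budget=20, gamma=1.0):
        self.n_classes = n_classes
        self.lam = lam          # regularization parameter
        self.budget = budget    # maximum number of support vectors
        self.gamma = gamma
        self.sv = []            # support vector inputs
        self.alpha = []         # per-SV coefficient vectors, shape (n_classes,)
        self.t = 0              # update counter

    def scores(self, x):
        s = np.zeros(self.n_classes)
        for z, a in zip(self.sv, self.alpha):
            s += a * rbf(x, z, self.gamma)
        return s

    def predict(self, x):
        if not self.sv:
            return 0
        return int(np.argmax(self.scores(x)))

    def partial_fit(self, x, y):
        self.t += 1
        eta = 1.0 / (self.lam * self.t)
        # Pegasos shrinkage: scale all coefficients by (1 - eta*lam) = 1 - 1/t.
        for a in self.alpha:
            a *= (1.0 - 1.0 / self.t)
        s = self.scores(x)
        s_y = s[y]
        s[y] = -np.inf
        r = int(np.argmax(s))          # most violating wrong class
        if s_y - s[r] < 1.0:           # multi-class hinge loss is positive
            a = np.zeros(self.n_classes)
            a[y] = eta                 # push the true class up
            a[r] = -eta                # push the violating class down
            self.sv.append(x.copy())
            self.alpha.append(a)
            if len(self.sv) > self.budget:
                # Budget maintenance step: drop the smallest-norm SV.
                j = int(np.argmin([np.linalg.norm(a) for a in self.alpha]))
                del self.sv[j], self.alpha[j]
```

On a simple two-cluster stream, the model size stays bounded by the budget regardless of stream length, which is the constant space-per-update property the abstract claims; the projection and merging strategies from the paper would replace the removal step with a less destructive update of the surviving coefficients.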