This paper proposes an optimized learning method for large feature sets that uses AdaBoost to produce hardware-efficient ensembles of boosted decision stumps, together with a method for training the decision stumps that form the ensemble. At each round, AdaBoost searches the pool of weak classifiers for the one that performs best on the weighted training samples and adds it to the ensemble. In the proposed method, Particle Swarm Optimization accelerates this selection of decision stumps. Experiments show that the optimized method is more than 60% faster than exhaustive search. © Springer-Verlag 2013.
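The per-round weak-learner search described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses a standard global-best PSO with assumed hyperparameters (inertia 0.7, acceleration coefficients 1.5, particle and iteration counts chosen arbitrarily) to pick a stump's feature index and threshold under weighted samples, alongside the exhaustive baseline it replaces. All function names are illustrative.

```python
import random

random.seed(0)

def weighted_error(X, y, w, feat, thresh):
    """Weighted error of the stump 'x[feat] >= thresh -> +1',
    taking the better of the two polarities."""
    err_pos = sum(wi for xi, yi, wi in zip(X, y, w)
                  if (1 if xi[feat] >= thresh else -1) != yi)
    return min(err_pos, sum(w) - err_pos)

def exhaustive_stump(X, y, w):
    """Baseline: try every feature and every observed threshold."""
    best = (0, 0.0, float("inf"))
    for feat in range(len(X[0])):
        for thresh in sorted({row[feat] for row in X}):
            e = weighted_error(X, y, w, feat, thresh)
            if e < best[2]:
                best = (feat, thresh, e)
    return best

def pso_stump(X, y, w, n_particles=20, n_iters=40):
    """PSO search over (feature index, threshold); the feature
    coordinate is continuous and rounded at evaluation time."""
    n_feats = len(X[0])
    lo = min(min(row) for row in X)
    hi = max(max(row) for row in X)
    pos = [[random.uniform(0, n_feats - 1), random.uniform(lo, hi)]
           for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_err = [weighted_error(X, y, w, round(p[0]), p[1]) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_err[i])
    gbest, gbest_err = pbest[g][:], pbest_err[g]
    for _ in range(n_iters):
        for i in range(n_particles):
            for d in range(2):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            pos[i][0] = min(max(pos[i][0], 0), n_feats - 1)  # keep feature valid
            e = weighted_error(X, y, w, round(pos[i][0]), pos[i][1])
            if e < pbest_err[i]:
                pbest[i], pbest_err[i] = pos[i][:], e
                if e < gbest_err:
                    gbest, gbest_err = pos[i][:], e
    return round(gbest[0]), gbest[1], gbest_err

# Toy data: feature 1 separates the classes; weights are uniform,
# as at AdaBoost's first round.
X = [[0.9, 0.1], [0.2, 0.2], [0.8, 0.9], [0.1, 0.8]]
y = [-1, -1, 1, 1]
w = [0.25] * 4
print(exhaustive_stump(X, y, w))  # -> (1, 0.8, 0.0)
print(pso_stump(X, y, w))
```

In a full AdaBoost loop, this search would run once per boosting round on the current sample weights; the speed-up reported in the paper comes from PSO evaluating far fewer candidate stumps than the exhaustive feature-by-threshold scan.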