In this paper, we propose a new Quadratic Threshold activation function for multilayer perceptron neural networks and discuss the capability of such networks on two-class classification problems. By using the Quadratic Threshold activation function in each neuron, we prove that the upper bound on the number of hidden neurons required to solve a given two-class classification problem can be reduced by half compared with conventional multilayer perceptrons that use the Threshold function. To allow various optimization techniques to be employed in designing the learning algorithm of the new multilayer perceptron, a differentiable Quadratic Sigmoid function is also proposed to approximate the non-differentiable Quadratic Threshold function. Based on the Quadratic Sigmoid function, we design the learning algorithm of the new multilayer perceptron neural network in a manner similar to the derivation of the backpropagation learning algorithm. Simulation results are presented to demonstrate the effectiveness of the learning algorithm.
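As an illustration of the idea, the following is a minimal sketch of what a quadratic threshold unit and its differentiable surrogate might look like. The specific forms here (an indicator of x² < θ, and a logistic function of gain·(θ − x²)) are our own assumptions for exposition, since the abstract does not define the functions; the paper's exact formulations may differ. The intuition is that such a unit fires inside a bounded region rather than on a half-line, which is why a single neuron can do the work of two conventional threshold neurons.

```python
import math

def quadratic_threshold(x, theta=1.0):
    """Hypothetical quadratic threshold unit (assumed form, not the
    paper's exact definition): outputs 1 inside the bounded region
    x^2 < theta and 0 outside, so one neuron separates a slab of
    input space instead of a half-space."""
    return 1 if x * x < theta else 0

def quadratic_sigmoid(x, theta=1.0, gain=1.0):
    """Hypothetical differentiable surrogate: a logistic function of
    gain * (theta - x^2). It is smooth everywhere, so gradient-based
    learning rules (as in backpropagation) apply; as gain grows it
    approaches quadratic_threshold."""
    return 1.0 / (1.0 + math.exp(-gain * (theta - x * x)))
```

With a large gain the surrogate is close to the hard threshold: for example, `quadratic_sigmoid(0.0, gain=50.0)` is nearly 1 while `quadratic_sigmoid(2.0, gain=50.0)` is nearly 0, matching `quadratic_threshold(0.0)` and `quadratic_threshold(2.0)`.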