When some explanatory variables in a regression model are linearly dependent, the estimated regression coefficients become unstable, which makes the interpretation of the estimation results unreliable. To eliminate such multicollinearity, we propose a high-performance method for selecting the best subset of explanatory variables for linear and logistic regression models. Specifically, we first derive a bilevel reformulation of the optimization problem for best subset selection under a multicollinearity constraint. We then develop a two-way cutting-plane algorithm that uses cutting planes in two ways: one type approximates the upper-level nonlinear objective function, and the other removes subsets that involve multicollinearity. We prove that this algorithm delivers a solution with guaranteed global optimality within a finite number of iterations. Computational results on synthetic and public datasets demonstrate the effectiveness of our method in comparison with L1-regularized estimation and a previous cutting-plane algorithm for eliminating multicollinearity. Our method not only provides a fast computational framework for best subset selection but also improves the reliability of regression analysis by eliminating multicollinearity.
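To make the cutting-plane idea concrete, the following is a minimal sketch of outer-approximation cuts for best subset selection, under stated assumptions: a ridge-regularized least-squares lower-level problem, subgradient cuts derived from its dual, and a brute-force master problem over size-k supports in place of a mixed-integer solver. All function names are illustrative; this is not the paper's actual implementation, and it omits the second type of cuts that remove multicollinear subsets.

```python
# Sketch: outer-approximation cutting planes for best subset selection.
# Assumptions (illustrative, not the paper's method): lower level is
# ridge-regularized least squares; master problem is brute-force enumeration.
import itertools
import numpy as np

def subset_value_and_subgrad(X, y, support, gamma):
    """Evaluate f(z) = min_w 0.5*||y - X_S w||^2 + ||w||^2/(2*gamma)
    via its dual, and return a subgradient of f w.r.t. the binary z."""
    n, _ = X.shape
    Xs = X[:, support]
    # dual optimum: alpha = (I + gamma * Xs Xs^T)^{-1} y
    alpha = np.linalg.solve(np.eye(n) + gamma * (Xs @ Xs.T), y)
    value = 0.5 * float(y @ alpha)
    grad = -0.5 * gamma * (X.T @ alpha) ** 2  # one entry per candidate variable
    return value, grad

def cutting_plane_best_subset(X, y, k, gamma=10.0, max_iter=100, tol=1e-8):
    p = X.shape[1]
    cuts = []  # each cut: (f(z_t), subgradient_t, z_t)
    # warm start: the k variables most correlated with y
    z = np.zeros(p)
    z[np.argsort(-np.abs(X.T @ y))[:k]] = 1.0
    best_ub, best_z = np.inf, z.copy()
    for _ in range(max_iter):
        val, grad = subset_value_and_subgrad(X, y, np.flatnonzero(z), gamma)
        if val < best_ub:
            best_ub, best_z = val, z.copy()
        cuts.append((val, grad, z.copy()))
        # master: minimize the piecewise-linear underestimator built from the
        # cuts over all size-k supports (a MIP solver would do this in practice)
        best_lb, z_next = np.inf, None
        for S in itertools.combinations(range(p), k):
            zc = np.zeros(p)
            zc[list(S)] = 1.0
            lb = max(v + g @ (zc - zt) for v, g, zt in cuts)
            if lb < best_lb:
                best_lb, z_next = lb, zc
        if best_ub - best_lb <= tol:  # bounds meet: certified optimal
            break
        z = z_next
    return np.flatnonzero(best_z), best_ub
```

Because the supports form a finite set and a revisited support forces the lower and upper bounds to meet, the loop terminates finitely with a certified optimum, mirroring the finite-convergence argument sketched in the abstract.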