Recently, optimization problems involving nonsmooth, locally Lipschitz functions have been the subject of investigation, and an innovative method known as Gradient Sampling has gained attention. Although the method has shown good results on important practical problems, some of its drawbacks remain unaddressed. This study proposes modifications to the gradient sampling class of methods in order to overcome those issues. We present an alternative procedure that suppresses the differentiability test without affecting convergence, and we also exhibit a nonmonotone line search that can improve the robustness of these methods. Finally, we report numerical results that support our approach.
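To fix ideas, the following is a minimal, simplified sketch of a gradient sampling iteration, not the modified algorithm proposed in this work: at each iterate, gradients are sampled in a small ball, the smallest-norm element of their convex hull gives a search direction, and a monotone backtracking line search updates the point. All names, the test function f(x, y) = |x| + 2|y|, and the parameter values are illustrative assumptions.

```python
# Hypothetical, simplified gradient sampling sketch (not the authors' method).
# Minimizes the nonsmooth convex function f(x, y) = |x| + 2|y| (minimizer: origin).
import math
import random

def f(p):
    return abs(p[0]) + 2.0 * abs(p[1])

def grad(p):
    # Gradient of f wherever f is differentiable (both coordinates nonzero).
    return [math.copysign(1.0, p[0]), math.copysign(2.0, p[1])]

def min_norm_in_hull(G, iters=200):
    # Frank-Wolfe on min ||sum_i w_i g_i||^2 over the unit simplex:
    # returns an approximate smallest-norm convex combination of the
    # sampled gradients in G (a list of 2-vectors).
    m = len(G)
    w = [1.0 / m] * m
    for _ in range(iters):
        d = [sum(w[i] * G[i][j] for i in range(m)) for j in range(2)]
        # Vertex minimizing the linearized objective <g_i, d>.
        i_star = min(range(m), key=lambda i: sum(G[i][j] * d[j] for j in range(2)))
        diff = [G[i_star][j] - d[j] for j in range(2)]
        denom = sum(t * t for t in diff)
        if denom < 1e-16:
            break
        # Exact line search for min ||d + gamma * diff||^2 on [0, 1].
        gamma = max(0.0, min(1.0, -sum(diff[j] * d[j] for j in range(2)) / denom))
        w = [(1.0 - gamma) * wi for wi in w]
        w[i_star] += gamma
    return [sum(w[i] * G[i][j] for i in range(m)) for j in range(2)]

def gradient_sampling(x, eps=0.1, m=10, max_iter=100, seed=0):
    rng = random.Random(seed)
    for _ in range(max_iter):
        # Sample gradients in an eps-ball around x (x itself included).
        pts = [x] + [[xi + rng.uniform(-eps, eps) for xi in x] for _ in range(m)]
        g = min_norm_in_hull([grad(p) for p in pts])
        norm_g = math.sqrt(sum(t * t for t in g))
        if norm_g < 1e-8:          # approximate stationarity: shrink the radius
            eps *= 0.5
            continue
        d = [-t / norm_g for t in g]
        # Monotone backtracking (Armijo-type) line search.
        t_step = 1.0
        while t_step > 1e-12 and \
                f([x[j] + t_step * d[j] for j in range(2)]) > f(x) - 1e-4 * t_step * norm_g:
            t_step *= 0.5
        x = [x[j] + t_step * d[j] for j in range(2)]
    return x

x_star = gradient_sampling([3.0, -2.0])
print(x_star)  # the iterates approach the origin
```

Note that this sketch keeps the classical monotone line search; one contribution of the paper is precisely to relax it with a nonmonotone rule, and to remove the differentiability test (omitted above) from the sampling step.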
Submitted 03/2015, IMECC - Univ. Campinas and ICMC - Univ. São Paulo, Brazil.