The fast iterative shrinkage-thresholding algorithm (FISTA) is one of the most widely used first-order optimization schemes, and its stepsize, which plays an important role both in the theoretical analysis and in numerical experiments, is typically determined by a constant related to the Lipschitz constant or by a backtracking strategy. In this paper, we design a new adaptive non-monotone stepsize strategy (NMS), which allows the stepsize to increase monotonically after finitely many iterations. Notably, NMS can be implemented without knowledge of the Lipschitz constant and without backtracking, and its additional computational cost is lower than that of some existing backtracking strategies. When NMS is applied to the original FISTA (FISTA\_NMS) and to the modified FISTA (MFISTA\_NMS), we show that the established convergence results are preserved. Moreover, under the error bound condition, we show that FISTA\_NMS achieves a convergence rate of $o(1/k^6)$ and that MFISTA\_NMS enjoys a convergence rate depending on the parameter $a$ in $t_k$, namely $o(1/k^{2(a+1)})$; in addition, the sequences of iterates generated by these two algorithms converge strongly. Furthermore, by employing the restart technique to accelerate the two methods, we establish linear convergence of both the function values and the iterates under the error bound condition. Similar results cannot be obtained with backtracking schemes. We conduct numerical experiments to examine the effectiveness of the proposed algorithms.
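
For reference, the sketch below shows the classical constant-stepsize FISTA applied to a LASSO problem, where the stepsize is fixed at $1/L$ with $L$ the Lipschitz constant of the gradient of the smooth part; this is the baseline that the proposed NMS rule replaces. The details of NMS are not given in the abstract, so the function names and the example problem here are illustrative assumptions, not the paper's method.

\begin{verbatim}
import numpy as np

def soft_threshold(x, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def fista_lasso(A, b, lam, num_iters=500):
    # Classical FISTA with constant stepsize 1/L for
    #   min_x 0.5*||Ax - b||^2 + lam*||x||_1.
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth gradient
    step = 1.0 / L

    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(num_iters):
        grad = A.T @ (A @ y - b)                              # gradient of smooth part at y
        x_new = soft_threshold(y - step * grad, step * lam)   # proximal gradient step
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t ** 2)) / 2.0     # momentum parameter t_k update
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)         # extrapolation (momentum) step
        x, t = x_new, t_new
    return x

if __name__ == "__main__":
    # Small synthetic example (illustrative only).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((40, 100))
    x_true = np.zeros(100); x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(40)
    x_hat = fista_lasso(A, b, lam=0.1)
    obj = 0.5 * np.linalg.norm(A @ x_hat - b) ** 2 + 0.1 * np.abs(x_hat).sum()
    print("objective:", obj)
\end{verbatim}

In this baseline the stepsize $1/L$ requires computing (or bounding) the Lipschitz constant up front, which is precisely the requirement that the adaptive NMS strategy is designed to avoid.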