In this manuscript, we introduce a novel projected gradient algorithm for solving quasiconvex optimization problems over closed convex sets. The key innovation is an adaptive, parameter-free stepsize rule that requires no line search and avoids estimating problem constants such as the Lipschitz modulus. Unlike the recent self-adaptive approach of [17], which typically produces monotonically non-increasing stepsizes, we propose a rule under which the stepsize sequence is provably non-decreasing and converges to a positive limit after finitely many iterations. This property enables consistently longer steps, which can significantly accelerate convergence. We establish convergence guarantees for the algorithm across several classes of objective functions, including quasiconvex, pseudoconvex, convex, and strongly convex ones. Numerical results on numerous test instances demonstrate the efficiency of the proposed method, showing competitive and often superior convergence speed compared with state-of-the-art alternatives.
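To fix ideas, the sketch below shows the classical projected-gradient template that the proposed method builds on, namely the iteration x_{k+1} = P_C(x_k - alpha_k * grad f(x_k)). Since this abstract does not restate the paper's stepsize rule, the sketch uses a fixed stepsize 1/L as a stand-in for the adaptive, parameter-free rule; the quadratic objective, the box constraint, and all identifiers here are illustrative assumptions, not the authors' code or test instances.

```python
import numpy as np

def projected_gradient(grad, project, x0, alpha, max_iter=1000, tol=1e-8):
    """Classical projected-gradient template: x_{k+1} = P_C(x_k - alpha * grad(x_k)).

    Here alpha is a fixed stepsize 1/L. The paper's contribution is an
    adaptive, parameter-free rule for alpha_k (not reproduced here) that
    requires neither L nor a line search.
    """
    x = np.asarray(x0, dtype=float)
    for k in range(max_iter):
        x_new = project(x - alpha * grad(x))
        # Fixed-point residual as a simple stopping test.
        if np.linalg.norm(x_new - x) <= tol:
            return x_new, k
        x = x_new
    return x, max_iter

# Illustrative instance: minimize f(x) = 0.5 x^T A x - b^T x over the box [0, 1]^n.
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((n, n))
A = M.T @ M + np.eye(n)          # symmetric positive definite Hessian
b = rng.standard_normal(n)
L = np.linalg.eigvalsh(A)[-1]    # Lipschitz constant of the gradient

x_star, iters = projected_gradient(
    grad=lambda x: A @ x - b,
    project=lambda z: np.clip(z, 0.0, 1.0),  # Euclidean projection onto the box
    x0=np.zeros(n),
    alpha=1.0 / L,
)
print(f"converged in {iters} iterations")
```

In this template, replacing the fixed 1/L step with a rule whose stepsizes are non-decreasing and bounded away from zero, as the abstract describes, allows longer steps than Lipschitz-based choices while retaining convergence guarantees.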