Optimization problems with sparsity arise in many areas of science and engineering, such as compressed sensing, image processing, statistical learning and sparse data approximation. In this paper, we study dual-density-based reweighted $\ell_{1}$-algorithms for a class of $\ell_{0}$-minimization models that can model a wide range of practical problems. This class of algorithms is based on certain convex relaxations of a reformulation of the underlying $\ell_{0}$-minimization model. The reformulation is a special bilevel optimization problem which, in theory, is equivalent to the underlying $\ell_{0}$-minimization problem under the assumption of strict complementarity. Basic properties of these algorithms are discussed, and numerical experiments are carried out to demonstrate their efficiency. The numerical performance of the proposed methods is also compared with that of the classic reweighted $\ell_{1}$-algorithms.
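For orientation, a prototypical instance of such models is the search for a sparsest solution of an underdetermined linear system, and the classic reweighted $\ell_{1}$-algorithms referred to above iteratively solve weighted $\ell_{1}$-relaxations of this problem. The display below is only an illustrative sketch of that standard scheme (with a hypothetical small parameter $\varepsilon>0$); the dual-density-based weights studied in this paper are constructed differently, via the dual information of the relaxed problems.
\begin{align*}
  &\text{($\ell_{0}$-model)}\quad \min_{x\in\mathbb{R}^{n}} \ \|x\|_{0} \ \ \text{s.t.}\ \ Ax=b,\\[2pt]
  &\text{(classic reweighted $\ell_{1}$ step)}\quad
  x^{k+1}\in \arg\min_{x\in\mathbb{R}^{n}} \ \sum_{i=1}^{n} w_i^{k}\,|x_i| \ \ \text{s.t.}\ \ Ax=b,
  \qquad w_i^{k}=\frac{1}{|x_i^{k}|+\varepsilon}.
\end{align*}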