We consider a game-theoretic variant of maximizing a monotone increasing submodular function under a cardinality constraint. Initially, a solution to this classical problem is determined. Subsequently, a predetermined number of elements from the ground set, not necessarily contained in the initial solution, are deleted, potentially reducing the solution's cardinality.
If any deleted elements were part of the initial solution, they are replaced with a set of at most equal cardinality.
The objective is to maximize the value of the final solution, where the deletion is adversarial, i.e., chosen to be maximally disadvantageous to that solution.
When the submodular function is \(\text{M}^\natural\)-concave, we prove that a simple greedy algorithm computes an optimal solution.
When only one element may be deleted, we propose a polynomial-time algorithm with an approximation factor of at least \(\frac{1}{3}\).
When the number of deletions may be as large as the cardinality parameter, we present a polynomial-time algorithm whose approximation factor for an optimal final solution depends on the curvature of the submodular function.
Furthermore, assuming that the number of allowed deletions is upper bounded by a term of the order of \(\frac{k}{\log_2^2(k)}\), where \(k\) is the cardinality parameter, we adapt an algorithm from Bogunovic et al. (2017) and show that its approximation factor is at least \(0.108\).
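As background for the greedy approach mentioned above, the following is a minimal sketch (not from the paper) of the classical greedy algorithm for maximizing a monotone submodular function under a cardinality constraint \(k\). The coverage function used here is an illustrative example of a monotone submodular function; the paper's setting additionally involves adversarial deletions, which this sketch omits.

```python
def greedy_max_coverage(sets, k):
    """Greedily pick at most k sets by marginal coverage gain.

    Coverage (number of distinct elements covered) is a monotone
    submodular function, so greedy selection applies.
    """
    chosen = []        # indices of selected sets
    covered = set()    # elements covered so far
    for _ in range(k):
        best_i, best_gain = None, 0
        for i, s in enumerate(sets):
            if i in chosen:
                continue
            gain = len(s - covered)  # marginal gain of adding set i
            if gain > best_gain:
                best_i, best_gain = i, gain
        if best_i is None:  # no remaining set has positive marginal gain
            break
        chosen.append(best_i)
        covered |= sets[best_i]
    return chosen, covered

# Example: four candidate sets over elements 1..6, budget k = 2.
sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]
chosen, covered = greedy_max_coverage(sets, k=2)
# chosen == [0, 2]; all six elements are covered
```

For general monotone submodular functions this greedy rule achieves the well-known \(1 - \frac{1}{e}\) approximation guarantee; the abstract's result for \(\text{M}^\natural\)-concave functions states that greedy is in fact optimal in that special case.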