
Optimization Online


Weak convergence on Douglas-Rachford method

Published: 2010/07/13
  • Benar Fux Svaiter
  • Categories: Complementarity and Variational Inequalities, Convex and Nonsmooth Optimization, Infinite Dimensional Optimization
  • Short URL: https://optimization-online.org/?p=11203

    We prove that the sequences generated by the Douglas-Rachford method converge weakly to a solution of the inclusion problem.
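    The method in question solves monotone inclusions of the form 0 ∈ A(x) + B(x). As an illustrative sketch only (not the paper's construction), the standard Douglas-Rachford iteration can be written via the proximal maps (resolvents) of two functions; the toy objective, step size, and all names below are my own choices for demonstration.

    ```python
    import numpy as np

    def douglas_rachford(prox_f, prox_g, z0, iters=200):
        """Standard Douglas-Rachford splitting for 0 in A(x) + B(x),
        expressed through the proximal maps (resolvents) of f and g."""
        z = z0
        for _ in range(iters):
            x = prox_g(z)               # resolvent step for B
            y = prox_f(2.0 * x - z)     # resolvent for A at the reflected point
            z = z + y - x               # update of the governing sequence
        return prox_g(z)                # shadow sequence; converges (weakly) to a solution

    # Toy problem (my own choice): minimize |x| + 0.5*(x - 3)**2, whose solution is x = 2.
    soft = lambda w: np.sign(w) * max(abs(w) - 1.0, 0.0)  # prox of |.| with step 1
    prox_quad = lambda z: (z + 3.0) / 2.0                 # prox of 0.5*(x - 3)^2 with step 1
    sol = douglas_rachford(soft, prox_quad, z0=0.0)
    ```

    In this one-dimensional example the iteration converges linearly; the weak-convergence question the paper settles only arises in infinite-dimensional Hilbert spaces, where the governing sequence need not converge strongly.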




