Abstract : Variational Analysis lies at the heart of modern optimization and underlies the convergence analysis of many algorithms. The purpose of this session is to bring together selected experts from the worldwide optimization and analysis communities to exchange ideas and present new results. We will strike a balance between early-career researchers and experts.
[01802] Fixed point strategies for sparsity aware inverse problems and hierarchical convex optimization
Format : Talk at Waseda University
Author(s) :
Isao Yamada (Tokyo Institute of Technology)
Masao Yamagishi (Tokyo Institute of Technology)
Abstract : We present central ideas behind the recently developed fixed point strategies for a nonconvexly regularized sparse least squares model and a hierarchical convex optimization problem. Related advancements for nonconvex optimization and signal processing will also be introduced briefly.
[02724] The splitting algorithms by Ryu, by Malitsky-Tam, and by Campoy applied to normal cones of linear subspaces converge strongly to the projection onto the intersection
Format : Talk at Waseda University
Author(s) :
Heinz H Bauschke (University of British Columbia Okanagan)
Shambhavi Singh (University of British Columbia Okanagan)
Shawn Xianfu Wang (University of British Columbia Okanagan)
Abstract : Finding a zero of a sum of maximally monotone operators is a fundamental problem in modern optimization and nonsmooth analysis. Assuming that the resolvents of the operators are available, this problem can be tackled with the Douglas-Rachford algorithm. However, when dealing with three or more operators, one must work in a product space with as many factors as there are operators. In groundbreaking recent work by Ryu and by Malitsky and Tam, it was shown that the number of factors can be reduced by one. A similar reduction was achieved recently by Campoy through a clever reformulation originally proposed by Kruger. All three splitting methods guarantee weak convergence to some solution of the underlying sum problem; strong convergence holds in the presence of uniform monotonicity.
In this paper, we provide a case study in which the operators involved are normal cone operators of linear subspaces, so that the solution set is the intersection of the subspaces. Even though these operators lack uniform monotonicity, we show that striking conclusions are available in this case: convergence is strong (instead of weak), and the limit obtained is not an arbitrary solution but the projection onto the intersection. Numerical experiments illustrating our results are also provided.
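As a minimal illustration of the phenomenon described above (not the three-or-more-operator schemes of Ryu, Malitsky-Tam, or Campoy), the classical two-operator Douglas-Rachford iteration for two linear subspaces can be sketched in Python with NumPy; the subspaces, starting point, and iteration count below are chosen purely for illustration.

import numpy as np

def projector(U):
    # Orthogonal projector onto the column space of U (assumed full column rank).
    Q, _ = np.linalg.qr(U)
    return Q @ Q.T

# Two illustrative subspaces of R^4 whose intersection is span{e1}.
A = np.array([[1., 0.], [0., 1.], [0., 0.], [0., 0.]])   # span{e1, e2}
B = np.array([[1., 0.], [0., 0.], [0., 1.], [0., 0.]])   # span{e1, e3}
PA, PB = projector(A), projector(B)

x = np.array([1.0, -2.0, 3.0, 4.0])        # starting point x_0
target = np.array([x[0], 0.0, 0.0, 0.0])   # projection of x_0 onto A ∩ B

# Douglas-Rachford: x_{k+1} = x_k - P_A x_k + P_B(2 P_A x_k - x_k).
# For subspaces, the shadow sequence P_A x_k converges to P_{A ∩ B} x_0.
for _ in range(100):
    x = x - PA @ x + PB @ (2 * PA @ x - x)

print(np.allclose(PA @ x, target))   # True up to numerical tolerance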
[02977] Fixed Point Algorithms: Convergence, stability and data dependence results
Format : Talk at Waseda University
Author(s) :
Javid Ali (Aligarh Muslim University, Aligarh)
Abstract : In this talk, we discuss a newly introduced two-step fixed point iterative algorithm. We prove a strong convergence result for weak contractions. We also prove stability and data dependence results for the proposed iterative algorithm. Furthermore, we utilize our main result to approximate the solution of a nonlinear functional Volterra integral equation. Some numerical examples are also furnished. If time permits, we will discuss an image recovery problem as well.
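The new two-step scheme itself is not specified in the abstract; purely as a schematic illustration of the two-step structure (an Ishikawa-type iteration with fixed parameters, not the speaker's algorithm), a Python sketch for a simple contraction reads:

import math

def two_step_iteration(T, x0, a=0.7, b=0.5, n_iter=50):
    # Generic Ishikawa-type two-step scheme: an auxiliary step, then the update step.
    x = x0
    for _ in range(n_iter):
        y = (1 - b) * x + b * T(x)   # first (auxiliary) step
        x = (1 - a) * x + a * T(y)   # second (update) step
    return x

# T(x) = cos(x) is a contraction near its unique fixed point x* ≈ 0.739085.
print(two_step_iteration(math.cos, 1.0))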
[01896] Adaptive proximal algorithms for convex optimization under local Lipschitz continuity of the gradient
Format : Talk at Waseda University
Author(s) :
Puya Latafat (KU Leuven)
Andreas Themelis (Kyushu University)
Lorenzo Stella (Amazon Berlin)
Panagiotis Patrinos (KU Leuven)
Abstract : Gradient-based proximal algorithms have traditionally been bound to global Lipschitz differentiability requirements. Attempts to widen their applicability or reduce conservatism typically involve wasteful trial-and-error backtracking routines. Extending recent advancements in the smooth setting, we show how, for convex problems, it is possible to avoid backtracking altogether and retrieve stepsizes adaptively without function evaluations. We demonstrate this with an adaptive primal-dual three-term splitting method that includes the proximal gradient method as a special case.
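A rough Python sketch of the general idea (stepsizes driven by local curvature estimates instead of backtracking) on a synthetic lasso instance is given below; the growth factor and the 1/(2 L_k) cap are schematic stand-ins, not the authors' adaptive primal-dual three-term splitting.

import numpy as np

def soft_threshold(v, t):
    # Proximal map of t * ||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))    # synthetic lasso data (illustrative only)
b = rng.standard_normal(40)
lam = 0.1
grad = lambda x: A.T @ (A @ x - b)    # gradient of f(x) = 0.5 ||Ax - b||^2

x_prev = np.zeros(100)
g_prev = grad(x_prev)
gamma = 1e-3                          # initial stepsize guess
x = soft_threshold(x_prev - gamma * g_prev, gamma * lam)

for _ in range(500):
    g = grad(x)
    # Local curvature estimate L_k ≈ ||∇f(x_k) - ∇f(x_{k-1})|| / ||x_k - x_{k-1}||;
    # the stepsize grows geometrically but is capped by 1/(2 L_k), with no line search.
    L = np.linalg.norm(g - g_prev) / (np.linalg.norm(x - x_prev) + 1e-16)
    gamma = min(gamma * np.sqrt(1.5), 0.5 / (L + 1e-16))
    x_prev, g_prev = x, g
    x = soft_threshold(x - gamma * g, gamma * lam)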
[02427] Level proximal subdifferential and its resolvent
Format : Talk at Waseda University
Author(s) :
Ziyuan Wang (University of British Columbia)
Shawn Xianfu Wang (University of British Columbia)
Abstract : In this talk, we introduce a new subdifferential whose resolvent completely represents the associated proximal operator. After illustrating that the usual limiting subdifferential representation is valid only when the given function is weakly convex, we propose the level proximal subdifferential, a careful refinement of the well-known proximal subdifferential. As such, its resolvent always coincides with the associated proximal operator, regardless of weak convexity. Besides this pleasant identity, we also investigate several useful properties of the level proximal subdifferential. Finally, numerous examples are given to further illustrate our results.
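For background, the proximal operator prox_{\lambda f}(x) = argmin_y { f(y) + (1/(2\lambda)) (y - x)^2 } of a one-dimensional function can be approximated by brute-force search; the Python sketch below uses an illustrative nonconvex example with a concave kink at the origin (hence not weakly convex) that is not taken from the talk.

import numpy as np

def prox_numeric(f, x, lam=1.0, grid=np.linspace(-5.0, 5.0, 200001)):
    # Brute-force approximation of prox_{lam f}(x) on a fine grid.
    vals = f(grid) + (grid - x) ** 2 / (2 * lam)
    return grid[np.argmin(vals)]

# Illustrative nonconvex f with a concave kink at 0 (not weakly convex).
f = lambda y: np.minimum(np.abs(y - 1.0), np.abs(y + 1.0))
print([prox_numeric(f, x) for x in (-2.0, -0.1, 0.1, 2.0)])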
[02993] On quasidifferentiability and optimization problems
Format : Talk at Waseda University
Author(s) :
Vivek Laha (Banaras Hindu University)
Abstract : The talk presents suitable optimality conditions, based on some recent works in fractional programming and variational inequalities, in terms of quasidifferentials. The presentation deals with Fritz John and Karush-Kuhn-Tucker type necessary optimality conditions at an optimal point in the framework of quasidifferentiable analysis. Further, several other applications of the results are investigated in different areas of optimization, such as mathematical programs with equilibrium constraints and/or vanishing constraints.