Registered Data

[01197] Numerical linear algebra in convex and nonconvex optimization

  • Session Time & Room : 3D (Aug.23, 15:30-17:10) @F311
  • Type : Proposal of Minisymposium
  • Abstract : Convex optimization has been instrumental in significant progress across science and technology. Nonconvex optimization methods are an exciting area of active research driven by modern applications. The efficiency and effectiveness of most optimization algorithms hinge on the numerical linear algebra algorithms that they utilize. Furthermore, optimization applications have motivated fundamental advances in numerical linear algebra. This minisymposium aims to bring together experts in both optimization and numerical linear algebra to discuss new developments and leading challenges in both areas.
  • Organizer(s) : Alexander Strang, Ming Gu
  • Classification : 65K10, 65Fxx, 90C26
  • Minisymposium Program :
    • 01197 (1/1) : 3D @F311 [Chair: Alexander Strang]
      • [01875] Nonconvex accelerated gradient descent without parameter tuning
        • Format : Talk at Waseda University
        • Author(s) :
          • Naoki Marumo (University of Tokyo)
          • Akiko Takeda (University of Tokyo)
        • Abstract : We propose a new first-order method for minimizing nonconvex functions with a Lipschitz continuous gradient and Hessian. The proposed algorithm is an accelerated gradient descent method with two restart mechanisms. It finds a solution where the gradient norm is less than $\varepsilon$ in $O(\varepsilon^{-7/4})$ function and gradient evaluations. Unlike existing algorithms with similar complexity bounds, our method requires no prior knowledge of problem-dependent parameters. Several numerical results illustrate that the proposed method is promising.
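The abstract does not spell out the two restart mechanisms, so the following is only a generic sketch of accelerated gradient descent with a function-value restart, written in Python/NumPy. The function name `restarted_agd`, the step size, and the double-well test problem are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def restarted_agd(f, grad, x0, lr=0.02, iters=2000):
    """Nesterov-style accelerated gradient descent with a simple
    function-value restart (a generic sketch, not the paper's method)."""
    x, y, theta = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_new = y - lr * grad(y)                       # gradient step at the extrapolated point
        theta_new = (1.0 + np.sqrt(1.0 + 4.0 * theta**2)) / 2.0
        y_new = x_new + (theta - 1.0) / theta_new * (x_new - x)
        if f(x_new) > f(x):                            # momentum increased f: reset it
            theta_new, y_new = 1.0, x_new.copy()
        x, y, theta = x_new, y_new, theta_new
    return x

# Nonconvex double-well test: minima at x = +-1 with value -0.25.
f = lambda x: float(x[0]**4 / 4 - x[0]**2 / 2)
grad = lambda x: x**3 - x
x = restarted_agd(f, grad, np.array([2.5]))
```

The function-value restart discards momentum whenever it stops producing descent, which is one common way to make acceleration robust on nonconvex objectives.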
      • [05658] Algorithm NCL for constrained optimization: Overcoming LICQ
        • Format : Talk at Waseda University
        • Author(s) :
          • Michael Alan Saunders (Stanford University)
          • Ding Ma (City University of Hong Kong)
          • Dominique Orban (Polytechnique Montreal)
        • Abstract : We consider optimization problems whose constraints do not satisfy LICQ at a solution. LANCELOT is not troubled by the failure of LICQ because it solves a short sequence of bound-constrained subproblems. We therefore call it a BCL method (bound-constrained augmented Lagrangian). Algorithm NCL solves an equivalent sequence of nonlinearly constrained subproblems that are suitable for interior methods such as IPOPT and KNITRO. We explain why the NCL transformation helps IPOPT and KNITRO.
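Algorithm NCL and LANCELOT are specific solvers not reproduced here; purely as background, the sketch below shows a minimal first-order augmented Lagrangian loop of the general BCL flavor on an equality-constrained toy problem. All names, parameters, and the toy problem are illustrative assumptions.

```python
import numpy as np

def aug_lagrangian(f_grad, c, c_jac, x0, rho=10.0, outer=20, lr=0.01):
    """Generic first-order augmented Lagrangian loop (illustration only):
    each outer iteration approximately minimizes
        L_rho(x, y) = f(x) + y^T c(x) + (rho/2) ||c(x)||^2
    over x, then updates the multiplier estimate y."""
    x = np.asarray(x0, dtype=float)
    y = np.zeros_like(c(x))
    for _ in range(outer):
        for _ in range(500):                          # inner subproblem solve
            g = f_grad(x) + c_jac(x).T @ (y + rho * c(x))
            x = x - lr * g
        y = y + rho * c(x)                            # first-order multiplier update
    return x, y

# Toy problem: min ||x||^2  s.t.  x1 + x2 = 1;  solution (0.5, 0.5), y* = -1.
f_grad = lambda x: 2.0 * x
c = lambda x: np.array([x[0] + x[1] - 1.0])
c_jac = lambda x: np.array([[1.0, 1.0]])
x, y = aug_lagrangian(f_grad, c, c_jac, np.array([0.0, 0.0]))
```

In a BCL method the inner minimization is bound-constrained rather than the plain gradient loop used here; the point of the sketch is only that the subproblems carry no general nonlinear constraints, which is why constraint qualifications like LICQ do not enter.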
      • [05344] Efficient and numerically stable interior-point algorithms for convex optimization
        • Format : Online Talk on Zoom
        • Author(s) :
          • Ming Gu (UC Berkeley)
        • Abstract : We present efficient and numerically stable interior-point algorithms for linear programs, quadratic programs, and second-order cone programs (SOCP), with supporting numerical results. In particular, our stable algorithms for SOCP achieve full machine-precision accuracy, whereas existing interior-point methods for SOCP are known to be highly unstable.
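The stabilized algorithms themselves are not described in the abstract; for context only, here is the textbook primal log-barrier interior-point iteration for a standard-form LP, min c^T x s.t. Ax = b, x >= 0. Every name, tolerance, and the tiny LP below are illustrative assumptions, not the talk's method.

```python
import numpy as np

def barrier_lp(c, A, b, x0, mu=1.0, shrink=0.2, mu_min=1e-8):
    """Classical primal log-barrier method for min c^T x, Ax = b, x >= 0
    (a textbook illustration, not the stabilized algorithms of the talk).
    x0 must be strictly feasible: A x0 = b and x0 > 0."""
    x = np.asarray(x0, dtype=float)
    m, n = A.shape
    while mu > mu_min:
        for _ in range(50):                           # Newton steps at fixed mu
            g = c - mu / x                            # gradient of barrier objective
            H = np.diag(mu / x**2)                    # its Hessian
            KKT = np.block([[H, A.T], [A, np.zeros((m, m))]])
            rhs = np.concatenate([-g, np.zeros(m)])
            dx = np.linalg.solve(KKT, rhs)[:n]        # step keeps A x = b
            step = 1.0                                # fraction-to-boundary rule
            neg = dx < 0
            if neg.any():
                step = min(1.0, 0.99 * np.min(-x[neg] / dx[neg]))
            x = x + step * dx
            if np.linalg.norm(dx) < 1e-12:
                break
        mu *= shrink                                  # follow the central path inward
    return x

# Tiny LP: min x1 + 2*x2  s.t.  x1 + x2 = 1, x >= 0;  optimum x = (1, 0).
c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]]); b = np.array([1.0])
x = barrier_lp(c, A, b, np.array([0.5, 0.5]))
```

The explicitly formed KKT system in this sketch is exactly where numerical linear algebra enters: as mu shrinks, the diagonal of H becomes badly scaled, which is the instability that stabilized interior-point formulations target.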
      • [02180] Low Rank Tensor Decompositions and Approximations
        • Format : Online Talk on Zoom
        • Author(s) :
          • Jiawang Nie (University of California, San Diego)
          • Li Wang (University of Texas, Arlington)
          • Zequn Zheng (University of California, San Diego)
        • Abstract : There exist linear relations among tensor entries of low rank tensors. These linear relations can be expressed by multi-linear polynomials, which are called generating polynomials. We use generating polynomials to compute tensor rank decompositions and low rank tensor approximations. We prove that this gives a quasi-optimal low rank tensor approximation if the given tensor is sufficiently close to a low rank one.
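The claim that low-rank tensor entries satisfy linear relations can be checked numerically: every unfolding of a rank-r tensor has matrix rank at most r, so its columns obey linear dependencies. A small NumPy demonstration follows; the random rank-2 construction is illustrative only, not the generating-polynomial algorithm of the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 5, 2

# Rank-2 third-order tensor: T[i,j,l] = sum_k A[i,k] * B[j,k] * C[l,k],
# i.e. a sum of r outer products a_k (x) b_k (x) c_k.
A = rng.standard_normal((n, r))
B = rng.standard_normal((n, r))
C = rng.standard_normal((n, r))
T = np.einsum('ik,jk,lk->ijl', A, B, C)

# The mode-1 unfolding of T has matrix rank at most r (generically exactly r),
# so the tensor entries satisfy linear relations -- the structure that
# generating polynomials encode.
unfold = T.reshape(n, n * n)
print(np.linalg.matrix_rank(unfold))   # 2
```

The same rank bound holds for the other two unfoldings (`T.transpose(1, 0, 2).reshape(n, -1)` and `T.transpose(2, 0, 1).reshape(n, -1)`), which is why low-rank structure constrains the entries so strongly.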