Registered Data

[01197] Numerical linear algebra in convex and nonconvex optimization

  • Session Date & Time : 5B (Aug. 25, 10:40-12:20)
  • Type : Proposal of Minisymposium
  • Abstract : Convex optimization has been instrumental in significant progress across science and technology. Nonconvex optimization methods are an exciting area of active research driven by modern applications. The efficiency and effectiveness of most optimization algorithms hinge on the numerical linear algebra algorithms that they utilize. Furthermore, optimization applications have motivated fundamental advances in numerical linear algebra. This minisymposium aims to bring together experts in both optimization and numerical linear algebra to discuss new developments and leading challenges in both areas.
  • Organizer(s) : Alexander Strang, Ming Gu
  • Classification : 65K10, 65Fxx, 90C26
  • Speakers Info :
    • Michael Saunders (Stanford)
    • Luyining Gan (University of Nevada Reno)
    • Naoki Marumo (University of Tokyo)
    • Jiawang Nie (University of California San Diego)
  • Talks in Minisymposium :
    • [01875] Nonconvex accelerated gradient descent without parameter tuning
      • Author(s) :
        • Naoki Marumo (University of Tokyo)
        • Akiko Takeda (University of Tokyo)
      • Abstract : We propose a new first-order method for minimizing nonconvex functions with a Lipschitz continuous gradient and Hessian. The proposed algorithm is an accelerated gradient descent method with two restart mechanisms. It finds a solution where the gradient norm is less than $\varepsilon$ in $O(\varepsilon^{-7/4})$ function and gradient evaluations. Unlike existing algorithms with similar complexity bounds, our method requires no prior knowledge of problem-dependent parameters. Several numerical results illustrate that the proposed method is promising.
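The restart idea in this abstract can be illustrated with a generic function-value-restarted accelerated gradient sketch. This is a textbook scheme, not the authors' parameter-free algorithm; the test function, step size, and tolerances below are illustrative assumptions.

```python
import math

# Illustrative nonconvex test function with minima at x = -1 and x = 1;
# the function and all parameters are assumptions for this sketch.
def f(x):
    return 0.25 * x**4 - 0.5 * x**2

def df(x):
    return x**3 - x

def agd_restart(f, df, x0, lr=0.05, eps=1e-8, max_iter=10000):
    """Accelerated gradient descent with a function-value restart:
    whenever the objective increases, the momentum is discarded."""
    x, y, t = x0, x0, 1.0
    for _ in range(max_iter):
        x_new = y - lr * df(y)            # gradient step from extrapolated point
        if abs(df(x_new)) < eps:          # approximate first-order stationarity
            return x_new
        t_new = (1 + math.sqrt(1 + 4 * t * t)) / 2
        y_new = x_new + (t - 1) / t_new * (x_new - x)   # Nesterov extrapolation
        if f(x_new) > f(x):               # restart: reset momentum
            t_new, y_new = 1.0, x_new
        x, y, t = x_new, y_new, t_new
    return x

# Starting at x0 = 2, the iterates descend toward the minimizer x = 1.
x_star = agd_restart(f, df, x0=2.0)
```

The restart keeps the accelerated iterates monotone in function value, which is one simple way to make momentum methods robust on nonconvex landscapes.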
    • [02180] Low Rank Tensor Decompositions and Approximations
      • Author(s) :
        • Jiawang Nie (University of California, San Diego)
        • Li Wang (University of Texas, Arlington)
        • Zequn Zheng (University of California, San Diego)
      • Abstract : There exist linear relations among tensor entries of low rank tensors. These linear relations can be expressed by multi-linear polynomials, which are called generating polynomials. We use generating polynomials to compute tensor rank decompositions and low rank tensor approximations. We prove that this gives a quasi-optimal low rank tensor approximation if the given tensor is sufficiently close to a low rank one.
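For context, a standard way to compute a CP (rank) decomposition is alternating least squares over the mode unfoldings. This is a textbook sketch, not the generating-polynomial method of the talk; the example tensor is an assumption.

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Kronecker product of A (I x r) and B (J x r) -> (I*J x r)."""
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, A.shape[1])

def cp_als(T, rank, iters=10, seed=0):
    """Alternating least squares for T ~ sum_r a_r (x) b_r (x) c_r:
    each factor is the least-squares fit against a mode unfolding of T."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    B = rng.standard_normal((J, rank))
    C = rng.standard_normal((K, rank))
    T1 = T.reshape(I, J * K)                     # mode-1 unfolding
    T2 = np.moveaxis(T, 1, 0).reshape(J, I * K)  # mode-2 unfolding
    T3 = np.moveaxis(T, 2, 0).reshape(K, I * J)  # mode-3 unfolding
    for _ in range(iters):
        A = T1 @ np.linalg.pinv(khatri_rao(B, C).T)
        B = T2 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = T3 @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C

# Exact rank-1 example: ALS recovers the factors up to scaling.
T = np.einsum('i,j,k->ijk', np.array([1., 2., 3.]),
              np.array([4., 5.]), np.array([6., 7., 8., 9.]))
A, B, C = cp_als(T, rank=1, iters=5)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
```

Each ALS subproblem is an ordinary least-squares solve because fixing all but one factor makes the model linear in the remaining one.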
    • [05344] Efficient and numerically stable interior-point algorithms for convex optimization
      • Author(s) :
        • Ming Gu (UC Berkeley)
      • Abstract : We present efficient and numerically stable interior-point algorithms for linear programs, quadratic programs, and second-order cone programs (SOCP), with supporting numerical results. In particular, our stable algorithms for SOCP achieve full machine precision, whereas existing interior-point methods for SOCP are known to be highly unstable.
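A minimal log-barrier interior-point sketch for a linear program conveys the flavor of such methods. This is a generic textbook barrier scheme, not the stable algorithms of the talk; the toy box-constrained problem and all tolerances are assumptions.

```python
import numpy as np

def barrier_lp(c, G, h, x0, t0=1.0, mu=10.0, eps=1e-7):
    """Log-barrier interior-point method for  min c@x  s.t.  G@x <= h.
    Newton centering on t*c@x - sum(log(h - G@x)), then t <- mu*t."""
    x, t = np.asarray(x0, float).copy(), t0
    m = len(h)

    def phi(z, t):
        s = h - G @ z
        return t * c @ z - np.log(s).sum() if np.all(s > 0) else np.inf

    while True:
        for _ in range(100):                   # Newton centering steps
            s = h - G @ x
            grad = t * c + G.T @ (1.0 / s)
            H = G.T @ ((1.0 / s**2)[:, None] * G)
            dx = np.linalg.solve(H, -grad)
            if -grad @ dx / 2 < 1e-10:         # Newton decrement test
                break
            step = 1.0                         # backtracking line search;
            while step > 1e-12 and \
                  phi(x + step * dx, t) > phi(x, t) + 0.25 * step * grad @ dx:
                step *= 0.5                    # phi = inf keeps x strictly feasible
            x = x + step * dx
        if m / t < eps:                        # duality-gap bound m/t
            return x
        t *= mu

# Toy problem: min x1 + x2 over the box 0 <= x <= 1; optimum is x = (0, 0).
c = np.array([1.0, 1.0])
G = np.array([[-1., 0.], [0., -1.], [1., 0.], [0., 1.]])
h = np.array([0., 0., 1., 1.])
x = barrier_lp(c, G, h, x0=[0.5, 0.5])
```

The iterates stay strictly inside the feasible region, and the gap bound m/t drives the stopping rule; the linear algebra in each Newton step (forming and solving with H) is exactly where stability considerations of the kind discussed in this talk enter.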