KOÇ UNIVERSITY

GRADUATE SCHOOL OF SCIENCES & ENGINEERING

MATHEMATICS

PhD THESIS DEFENSE BY FATİH KANGAL

 

Title: Large-Scale and Nonconvex Eigenvalue Optimization

 

Speaker: Fatih Kangal

 

Time: September 19, 2018, 16:00

 

Place: ENG 208

Koç University

Rumeli Feneri Yolu

Sariyer, Istanbul

 

Thesis Committee Members:

Assoc. Prof. Emre Mengi (Advisor, Koç University)

Prof. Varga Kalantarov (Koç University)

Prof. Emre Alper Yıldırım (Koç University)

Asst. Prof. Fatih Ecevit (Boğaziçi University)

Prof. Bor Plestenjak (University of Ljubljana)

 

Abstract:

We deal with the optimization of a prescribed eigenvalue of a Hermitian matrix that depends analytically on several parameters. This setting is motivated by classical problems such as linear semidefinite programs under a constant-trace assumption, shape optimization and structural design problems, and robust stability measures in control theory. Our primary interest is in problems involving large Hermitian matrices, in particular in reducing their sizes by means of orthogonal projections onto appropriate subspaces, though some attention is also paid to the nonconvex nature of these problems.

We start by devising subspace frameworks that yield good subspaces for orthogonal projections. At every step, a projected eigenvalue optimization problem is solved, and the subspaces are then expanded so that Hermite interpolation properties hold between the projected problem and the original large-scale problem at the optimizer of the projected problem. Convergence analyses are conducted in the infinite-dimensional setting, where the eigenvalues of a parameter-dependent self-adjoint compact operator are optimized: (i) convergence to a global minimizer is proven for the minimization of the Jth largest eigenvalue as the subspace dimension goes to infinity, and (ii) a superlinear rate-of-convergence result with respect to the subspace dimension is deduced in the smooth setting. We observe the dramatic effects of the latter result in practice; problems involving matrices of sizes on the order of tens of thousands are approximated accurately by projected problems involving matrices of sizes on the order of tens.
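The project-then-expand loop described above can be sketched for a one-parameter family A(ω) = A0 + ω·A1. This is only an illustrative toy, not the framework of the thesis: the projected problems are solved here by brute-force grid search, and all names and iteration limits are my own assumptions.

```python
import numpy as np

def lam_max(H):
    # Largest eigenvalue and a corresponding unit eigenvector of Hermitian H.
    w, V = np.linalg.eigh(H)  # eigenvalues in ascending order
    return w[-1], V[:, -1]

def subspace_framework(A0, A1, omegas, max_iter=40):
    """Toy subspace sketch for min over omegas of lam_max(A0 + w*A1).

    Each step solves the projected problem min_w lam_max(V* A(w) V) on the
    grid, then expands V with an eigenvector of the full matrix at the
    projected optimizer, which enforces interpolation of lam_max there.
    """
    n = A0.shape[0]
    # Seed the subspace with the eigenvector at the first grid point.
    _, v = lam_max(A0 + omegas[0] * A1)
    V = v.reshape(n, 1)
    w_opt = omegas[0]
    for _ in range(max_iter):
        B0 = V.conj().T @ A0 @ V
        B1 = V.conj().T @ A1 @ V
        # Solve the small projected problem by brute force on the grid.
        vals = [lam_max(B0 + w * B1)[0] for w in omegas]
        w_new = omegas[int(np.argmin(vals))]
        if w_new == w_opt:
            # Interpolation at w_opt plus the interlacing lower bound
            # imply w_opt minimizes the full problem on the grid.
            break
        w_opt = w_new
        # Expand the subspace with the full eigenvector at the optimizer.
        _, v = lam_max(A0 + w_opt * A1)
        V = np.linalg.qr(np.column_stack([V, v]))[0]
    return w_opt
```

Since λ_max of the projected matrix never exceeds λ_max of the full matrix (Cauchy interlacing) and the two agree at every visited optimizer, stalling certifies a global grid minimizer.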

The second part is devoted to rate-of-convergence analyses in the nonsmooth setting, when the eigenvalue is not simple at the optimal parameter value. The analyses here are restricted to the minimization of the largest eigenvalue of a Hermitian matrix dependent on one parameter. We establish a quadratic rate of convergence for a generalized version of the subspace framework from the first part. Additionally, for an algorithm to solve the projected problems that is based on piecewise quadratic model functions and guaranteed to converge to globally optimal solutions, a rapid convergence result is proven in this nonsmooth setting. The rate-of-convergence results are illustrated on several nonsmooth examples arising from the computation of the inner numerical radius, the modulus of the point on the boundary of the field of values closest to the origin.
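The one-parameter family here is H(θ) = (e^{iθ}A + e^{-iθ}A*)/2, whose largest eigenvalue is the support function of the field of values in direction θ. Assuming the origin lies inside the field of values, the inner numerical radius equals min over θ of λ_max(H(θ)). A naive grid-search sketch of this quantity (not the thesis algorithm, which is far more sophisticated):

```python
import numpy as np

def inner_numerical_radius(A, n_grid=2000):
    """Grid sketch of min_theta lam_max((e^{i theta} A + e^{-i theta} A*)/2).

    Valid as the distance from the origin to the boundary of the field of
    values when the origin lies inside it; n_grid controls the accuracy.
    """
    thetas = np.linspace(0.0, 2 * np.pi, n_grid, endpoint=False)
    vals = [np.linalg.eigvalsh(
                (np.exp(1j * t) * A + np.exp(-1j * t) * A.conj().T) / 2)[-1]
            for t in thetas]
    return float(min(vals))
```

For a 2x2 Jordan block scaled so its field of values is the unit disk, the value is 1; the nonsmoothness in the thesis arises when λ_max(H(θ)) is not simple at the minimizing θ.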

The third and final part concerns minimax problems with the largest eigenvalue in the objective. We tailor a globally convergent algorithm for small problems, and a subspace framework inspired by those of the first part for large-scale problems. Once again we prove the rapid convergence of the subspace framework. The theoretical results are confirmed in practice on an application, namely the minimization of the numerical radius, the modulus of the point in the field of values farthest from the origin.
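With the same family H(θ) = (e^{iθ}A + e^{-iθ}A*)/2, the numerical radius is max over θ of λ_max(H(θ)), so minimizing it over a matrix family yields a minimax problem in the largest eigenvalue. A brute-force sketch; the one-parameter family A(ν) = A0 + ν·A1 and the grids are hypothetical choices of mine, not the thesis setup:

```python
import numpy as np

def numerical_radius(A, n_grid=720):
    """Grid sketch of r(A) = max_theta lam_max((e^{i theta} A + e^{-i theta} A*)/2)."""
    thetas = np.linspace(0.0, 2 * np.pi, n_grid, endpoint=False)
    return float(max(
        np.linalg.eigvalsh(
            (np.exp(1j * t) * A + np.exp(-1j * t) * A.conj().T) / 2)[-1]
        for t in thetas))

def minimize_numerical_radius(A0, A1, nus, n_grid=720):
    """Minimax sketch: inner max over theta, outer min over the parameter grid nus."""
    vals = [numerical_radius(A0 + nu * A1, n_grid) for nu in nus]
    i = int(np.argmin(vals))
    return nus[i], vals[i]
```

For A0 the scaled Jordan block above and A1 the identity, the field of values of A0 + ν·A1 is the unit disk centered at ν, so the numerical radius 1 + |ν| is minimized at ν = 0.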

 
