Colloquium Lectures

New matrix perturbation bounds via “skewness”: theory and applications.
Speaker: Prof. Vũ Hà Văn, Yale University

Time: 9:30 AM, Friday, December 20, 2024

Venue: Hoàng Tụy Hall, 2nd Floor, Building A6, Institute of Mathematics

Abstract: Matrix perturbation bounds play an essential role in mathematics. Results such as Weyl's inequality on spectral norms, ||A+E|| - ||A|| ≤ ||E||, have been used frequently in many traditional areas of mathematics.
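A minimal numerical check of Weyl's inequality as stated above; the matrix size and the Gaussian matrices are illustrative assumptions, not taken from the talk.

```python
# Check | ||A+E|| - ||A|| | <= ||E|| for the spectral (operator) norm.
import numpy as np

rng = np.random.default_rng(0)
n = 200
A = rng.standard_normal((n, n))            # illustrative "signal" matrix
E = rng.standard_normal((n, n)) / 10.0     # illustrative "noise" matrix

spec = lambda M: np.linalg.norm(M, 2)      # spectral norm = largest singular value
lhs = abs(spec(A + E) - spec(A))
rhs = spec(E)
print(f"| ||A+E|| - ||A|| | = {lhs:.4f}  <=  ||E|| = {rhs:.4f}")
assert lhs <= rhs + 1e-12                  # Weyl's inequality always holds
```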

Recently, matrix perturbation has also come to play an essential role in analyzing data and algorithms in data science. Let A be the ground-truth matrix and E the noise. We can only observe A+E, the noisy data. Usually, we are interested in a spectral parameter f(A) of the ground-truth matrix A (for instance, its leading eigenvalues or eigenvectors). In reality, we can only compute the noisy version f(A+E). It is of great importance to estimate the error f(A) - f(A+E).
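A small sketch of this setup, with f taken to be the leading eigenvalue; the symmetric Gaussian model for A and E is an illustrative assumption. By Weyl's inequality, the error |f(A) - f(A+E)| is at most ||E||.

```python
# Only f(A+E) is computable; compare it with the "truth" f(A).
import numpy as np

rng = np.random.default_rng(1)
n = 300
A = rng.standard_normal((n, n)); A = (A + A.T) / 2    # illustrative ground truth
E = rng.standard_normal((n, n)); E = (E + E.T) / 20   # illustrative noise

f = lambda M: np.linalg.eigvalsh(M)[-1]               # f = leading eigenvalue
err = abs(f(A) - f(A + E))
print(f"f(A) = {f(A):.3f}, f(A+E) = {f(A + E):.3f}, error = {err:.3f}")
print(f"worst-case bound ||E|| = {np.linalg.norm(E, 2):.3f}")
```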

We have observed that in modern applications, A and E often have extra properties. For instance, it is frequently assumed that A is essentially low rank (it has a few dominating eigenvalues, while the rest are small) and that E is random. This has motivated us to work out a new theory under such assumptions.

The basic idea behind our study is the following. Results such as Weyl's inequality above are optimal in worst-case analysis. However, when the inequality becomes an equality, the eigenvectors of A have to be strongly correlated with E. (For instance, ||A+E|| = ||A|| + ||E|| if A and E share the same leading eigenvector.) This situation does not happen when E is random. Our plan is to exploit the skewness between E and the eigenvectors of A to make an improvement. Quantitatively, we aim to replace ||E|| by the skewness parameter x := max |u^T E v|, where u, v run over the set of eigenvectors of A.
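A numerical sketch of the contrast between the skewness parameter and ||E||. Restricting u, v to the leading eigenvectors of a planted low-rank A, and taking E to be a scaled symmetric Gaussian matrix, are illustrative assumptions.

```python
# Compare x = max |u^T E v| (u, v leading eigenvectors of A) with ||E||.
import numpy as np

rng = np.random.default_rng(2)
n, r = 500, 3
U = np.linalg.qr(rng.standard_normal((n, r)))[0]      # orthonormal columns
A = U @ np.diag([50.0, 30.0, 20.0]) @ U.T             # rank-3 ground truth;
                                                      # its leading eigenvectors are the columns of U
E = rng.standard_normal((n, n))
E = (E + E.T) / np.sqrt(2 * n)                        # symmetric noise, ||E|| = O(1)

skewness = np.abs(U.T @ E @ U).max()                  # x = max_{u,v} |u^T E v|
print(f"skewness x = {skewness:.4f}  vs  ||E|| = {np.linalg.norm(E, 2):.4f}")
# In this scaling, x is typically of order n^{-1/2}, far below ||E||,
# which is the gap the improved bounds aim to exploit.
```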

I have been working on this project for the past 15 years, with various coauthors. These works have culminated in the last few years in what we believe to be the foundation of a new theory.

In this talk, I am going to discuss our findings, the main technique behind the proofs, and a number of applications in various areas.
