Description

When, and at what cost, can we solve linear algebra problems, or even just evaluate multivariate polynomials, with high relative accuracy? Suppose we only know that we can perform the 4 basic operations (+, -, *, /) with high relative accuracy, i.e. rnd(a op b) = (a op b)*(1+delta) with |delta| <= epsilon << 1. Which 2 of the following 3 problems can we solve with high relative accuracy?
  (1) Evaluate the Motzkin polynomial p(x,y,z) = z^6 + x^2*y^2*(x^2 + y^2 - 3*z^2)
  (2) Compute the eigenvalues of the Cauchy matrix C(i,j) = 1/(x(i) + x(j)), given 0 < x(1) < x(2) < ...; the Hilbert matrix H(i,j) = 1/(i + j - 1) is an example (take x(i) = i - 1/2).
  (3) Evaluate x+y+z.
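As a small illustration (not taken from the talk itself) of how properties of floating point beyond the bare (1+delta) model expand the set of basic operations: in IEEE arithmetic the rounding error of a single addition is itself a floating-point number that can be recovered exactly (Knuth's TwoSum transformation), which lets a sum like x+y+z be evaluated to high relative accuracy even under massive cancellation, where naive left-to-right summation fails completely. A minimal Python sketch:

```python
# Sketch: error-free transformation of addition (Knuth's TwoSum) and its
# use to evaluate x + y + z accurately. Assumes IEEE double precision
# with round-to-nearest, which CPython floats provide.

def two_sum(a, b):
    """Return (s, e) with s = rnd(a + b) and a + b = s + e exactly."""
    s = a + b
    b_virtual = s - a
    e = (a - (s - b_virtual)) + (b - b_virtual)
    return s, e

def accurate_sum3(x, y, z):
    """Evaluate x + y + z to high relative accuracy via compensation."""
    s1, e1 = two_sum(x, y)
    s2, e2 = two_sum(s1, z)
    return s2 + (e1 + e2)

# Massive cancellation: the true sum is 1.0, but the naive sum
# rounds 1e16 + 1.0 to 1e16 and loses the answer entirely.
x, y, z = 1e16, 1.0, -1e16
print((x + y) + z)             # 0.0 -- naive, 100% relative error
print(accurate_sum3(x, y, z))  # 1.0 -- exact
```

No such rescue is available for the Motzkin polynomial in (1), which is the kind of distinction the decision procedure below is meant to capture.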
We will answer this question, and many related ones, and discuss the open problem of creating a decision procedure that takes a polynomial and decides whether an accurate algorithm for evaluating it exists. Properties of the variety of the polynomial will play a central role. We will also discuss some properties of floating point arithmetic that let us expand the list of basic operations above, and so expand the set of problems we can solve accurately.

This seminar is part of the Recent Progress and Open Directions in Matrix Computations series.

All scheduled dates:


Past
When Is Accurate and Efficient Expression Evaluation and Linear Algebra Possible?