Abstract

We study the problem of computing the largest root of a real rooted polynomial p(x) to within error \eps given only black box access to it, i.e., for any x \in \R, the algorithm can query an oracle for the value of p(x), but it is not given access to the coefficients of p(x). A folklore result for this problem is that the largest root of a polynomial of degree n can be computed using O(n \log(1/\eps)) queries to the oracle via the Newton iteration. We give a simple algorithm that queries the oracle at only 2^{O(\sqrt{\log n})} \log(1/\eps) points. It is based on accelerating the Newton method using higher derivatives of the polynomial. As a special case, we consider the problem of computing the top eigenvalue of a symmetric matrix to within error \eps in time polynomial in the input description. Well-known methods such as the power iteration and the Lanczos iteration have running time polynomial in 1/\eps, while Gaussian elimination takes more than n^4 bit operations. As a corollary of our main result, we obtain an algorithm for the top eigenvalue with bit complexity nearly n^\omega, where \omega is the matrix multiplication exponent.
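To see where the folklore O(n \log(1/\eps)) bound comes from: for a real rooted p with roots \lambda_1 \ge \dots \ge \lambda_n and any x > \lambda_1, the Newton step equals 1 / \sum_i 1/(x - \lambda_i), which lies between (x - \lambda_1)/n and x - \lambda_1; hence the iteration never overshoots and shrinks the gap to \lambda_1 by a factor of at least (1 - 1/n) per step. The sketch below (Python, illustrative only) runs this baseline Newton iteration from a point to the right of the largest root using only black box evaluations of p. The function name largest_root_newton, the finite-difference estimate of p'(x), and the stopping rule are assumptions of this sketch rather than details from the work; the accelerated 2^{O(\sqrt{\log n})} \log(1/\eps)-query method based on higher derivatives is not reproduced here.

```python
def largest_root_newton(p, x0, eps, h=1e-8, max_iter=100000):
    """Folklore baseline: Newton's iteration for the largest root of a
    real rooted polynomial, using only black-box evaluations of p.

    Assumes x0 lies strictly to the right of the largest root, so the
    iterates decrease monotonically toward it. The oracle does not
    provide p'(x), so it is approximated by a central finite difference
    with step h (a simplification; a more careful derivative estimate
    would be needed for very small eps).
    """
    x = x0
    for _ in range(max_iter):
        # Black-box evaluations only.
        px = p(x)
        dp = (p(x + h) - p(x - h)) / (2.0 * h)
        if dp == 0.0:
            break
        step = px / dp
        x = x - step
        if abs(step) < eps:
            break
    return x


if __name__ == "__main__":
    # Example: p(x) = (x - 1)(x - 2)(x - 5), largest root 5.
    p = lambda x: (x - 1.0) * (x - 2.0) * (x - 5.0)
    print(largest_root_newton(p, x0=100.0, eps=1e-9))  # approximately 5.0
```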
 
This is joint work with Anand Louis.