This thesis deals with Krylov subspace methods for computing the action of a matrix function on a vector. The main result of the thesis is a new integral representation for the error in Arnoldi's method, which is valid for large classes of functions, including holomorphic functions represented via the Cauchy integral formula and Stieltjes functions.
It is then shown how this error representation can be used to improve upon the standard Arnoldi method. First, a restarted Arnoldi method that evaluates the error function via numerical quadrature is developed; numerical experiments on various standard model problems show it to be an improvement over existing restarted methods. The method is then analyzed theoretically, and convergence for Stieltjes functions of Hermitian positive definite matrices is proven, independently of the restart length. The second main use of the error representation is the efficient computation of error estimates in Arnoldi's method. An algorithm is presented which computes retrospective error estimates essentially for free (at a cost independent of the matrix size), and it is shown that guaranteed lower and upper bounds on the error norm can be computed for Stieltjes functions of Hermitian positive definite matrices.
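To fix ideas, the standard Arnoldi approximation that the thesis builds on can be sketched as follows: an orthonormal basis V_m of the Krylov subspace K_m(A, b) is built together with the small upper Hessenberg matrix H_m = V_m* A V_m, and f(A)b is approximated by ||b|| V_m f(H_m) e_1. This is a minimal illustrative sketch, not the thesis's restarted algorithm; the function name `arnoldi_fA_b` and the default choice f = exp are assumptions for the example.

```python
import numpy as np
from scipy.linalg import expm

def arnoldi_fA_b(A, b, m, f=expm):
    """Basic Arnoldi approximation f(A)b ~ ||b|| * V_m f(H_m) e_1.

    A: (n, n) array; b: (n,) vector; m: Krylov subspace dimension;
    f: function acting on small dense matrices (here: scipy's expm).
    Illustrative sketch only -- no restarts, no error estimation.
    """
    n = b.shape[0]
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(b)
    V[:, 0] = b / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):           # modified Gram-Schmidt orthogonalization
            H[i, j] = V[:, i] @ w
            w = w - H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:          # lucky breakdown: Krylov space is invariant
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    e1 = np.zeros(m)
    e1[0] = 1.0
    return beta * (V[:, :m] @ (f(H[:m, :m]) @ e1))

# Example: approximate exp(A) b for a moderately sized Hermitian matrix.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 100))
A = -(A + A.T) / 20.0                    # Hermitian, modest spectral radius
b = rng.standard_normal(100)
approx = arnoldi_fA_b(A, b, 20)
exact = expm(A) @ b
```

For well-behaved spectra, already m much smaller than n yields high accuracy; the thesis's error representation then quantifies (and allows restarting away) the remaining error.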
Finally, it is briefly discussed how most of the results of the thesis generalize to extended Krylov subspace methods, with emphasis on the efficient computation of error estimates.