Multiply accelerated value iteration for affine fixed point problems and application to Markov decision processes
We analyze a modified version of Nesterov's accelerated gradient algorithm that applies to affine fixed point problems with non-self-adjoint matrices, such as those arising in the theory of Markov decision processes with discounted or mean payoff criteria. We characterize the spectra of matrices for which this algorithm converges with an accelerated asymptotic rate. We also introduce a dth-order algorithm, and show that it yields a multiply accelerated rate under more demanding conditions on the spectrum. We subsequently apply these methods to develop accelerated schemes for non-linear fixed point problems arising from Markov decision processes.
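To illustrate the kind of scheme the abstract refers to, the sketch below compares standard value iteration for a discounted Markov decision process with a Nesterov-style accelerated variant that applies the Bellman operator at an extrapolated point. The random MDP data, the momentum coefficient beta, and the stopping tolerances are illustrative assumptions, not the tuned choices analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_actions, gamma = 20, 3, 0.95

# Illustrative random discounted MDP: transition matrices P[a] (row-stochastic)
# and rewards r[a, i]; these are assumed data, not from the paper.
P = rng.random((n_actions, n_states, n_states))
P /= P.sum(axis=2, keepdims=True)
r = rng.random((n_actions, n_states))

def bellman(v):
    # Bellman operator: T(v)_i = max_a [ r(a, i) + gamma * sum_j P(a, i, j) v_j ].
    return np.max(r + gamma * P @ v, axis=0)

# Standard value iteration: v_{k+1} = T(v_k); contracts at asymptotic rate gamma.
v = np.zeros(n_states)
for k_vi in range(5000):
    v_new = bellman(v)
    if np.max(np.abs(v_new - v)) < 1e-10:
        break
    v = v_new

# Nesterov-style accelerated iteration (sketch): extrapolate,
#   y_k = v_k + beta * (v_k - v_{k-1}),
# then apply the operator, v_{k+1} = T(y_k). The coefficient beta below is a
# common heuristic for contraction rate gamma; it is an assumption here.
beta = (1 - np.sqrt(1 - gamma)) / (1 + np.sqrt(1 - gamma))
v_prev = np.zeros(n_states)
v_acc = np.zeros(n_states)
for k_acc in range(5000):
    y = v_acc + beta * (v_acc - v_prev)
    v_new = bellman(y)
    if (np.max(np.abs(v_new - v_acc)) < 1e-10
            and np.max(np.abs(v_acc - v_prev)) < 1e-10):
        break
    v_prev, v_acc = v_acc, v_new

print("value iteration steps:", k_vi, " accelerated steps:", k_acc)
print("gap between the two value functions:", np.max(np.abs(v - v_acc)))
```

On spectra for which acceleration applies (the paper characterizes these in the affine case), the extrapolated scheme improves the asymptotic rate from gamma toward roughly 1 - sqrt(1 - gamma); for non-linear Bellman operators such behavior is only heuristic without the paper's additional conditions.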