Coordinate Descent on the Orthogonal Group for Recurrent Neural Network Training

With Estelle Massart (Oxford)

To address the poor scalability of training algorithms for orthogonal recurrent neural networks, we propose to use a coordinate descent method on the orthogonal group. This algorithm has a per-iteration cost that scales linearly with the number of recurrent states, in contrast with the cubic dependence of typical algorithms such as stochastic Riemannian gradient descent. We show numerically that the Riemannian gradient in recurrent neural network training has an approximately sparse structure. Leveraging this observation, we propose a variant of the algorithm that relies on Gauss-Southwell coordinate selection. Experiments on a benchmark recurrent neural network training problem show that the proposed approach is a promising step towards training orthogonal recurrent neural networks with large architectures.
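For context, the cost claim can be made concrete: a coordinate on the orthogonal group corresponds to a rotation in a single (i, j) plane, so one descent step amounts to applying a Givens rotation, which touches only two rows of the recurrent matrix and costs O(n), versus the O(n^3) matrix exponential or QR retraction of a full Riemannian gradient step. The NumPy sketch below is a minimal illustration based only on the abstract, not the speakers' implementation; the function names, the step-size rule, and the exact form of the Gauss-Southwell selection shown here are assumptions.

    import numpy as np

    def givens_apply(W, i, j, theta):
        # Left-multiply W in place by the Givens rotation G(i, j, theta):
        # only rows i and j change, so the update costs O(n).
        c, s = np.cos(theta), np.sin(theta)
        Wi, Wj = W[i].copy(), W[j].copy()
        W[i] = c * Wi - s * Wj
        W[j] = s * Wi + c * Wj
        return W

    def coordinate_descent_step(W, euclid_grad, lr, rule="gauss-southwell", rng=None):
        # One coordinate descent step on the orthogonal group.
        #   W           : current n x n orthogonal recurrent matrix
        #   euclid_grad : Euclidean gradient dL/dW at W (e.g. from backprop)
        #   lr          : step size for the rotation angle
        # The Riemannian gradient at W, written in the Lie algebra of
        # skew-symmetric matrices, is A = skew(euclid_grad @ W.T); entry
        # A[i, j] is proportional to the derivative of the loss along the
        # Givens direction in the (i, j) plane.
        A = euclid_grad @ W.T
        A = 0.5 * (A - A.T)
        if rule == "gauss-southwell":
            # Greedy Gauss-Southwell rule: rotate in the plane whose
            # gradient component has the largest magnitude.
            idx = np.abs(np.triu(A, k=1)).argmax()
            i, j = np.unravel_index(idx, A.shape)
        else:
            # Uniformly random coordinate selection.
            rng = rng or np.random.default_rng()
            i, j = sorted(rng.choice(W.shape[0], size=2, replace=False))
        # Exact geodesic step along the chosen coordinate: O(n) work,
        # and W stays exactly orthogonal, so no retraction is needed.
        return givens_apply(W, i, j, -lr * A[i, j])

    # Toy usage (hypothetical): descend the quadratic loss
    # L(W) = 0.5 * ||W - T||_F^2, whose Euclidean gradient is W - T.
    rng = np.random.default_rng(0)
    W, _ = np.linalg.qr(rng.standard_normal((64, 64)))
    T, _ = np.linalg.qr(rng.standard_normal((64, 64)))
    for _ in range(2000):
        W = coordinate_descent_step(W, W - T, lr=0.5)

Because each update is a plane rotation applied to an orthogonal matrix, the iterate remains on the orthogonal group to machine precision, which is the property that full-matrix methods must enforce with a costly retraction.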

Join Zoom Meeting
https://maths-cam-ac-uk.zoom.us/j/93776043287?pwd=UDIrNDdkeUU1NmFtZXpNUzd6ZjRrdz09
Meeting ID: 937 7604 3287
Passcode: p1Co4skf
