Christopher Roth is interested in designing neural network architectures that improve sequential learning. This problem is central to artificial intelligence: intelligent beings must learn from data that arrives sequentially rather than all at once. The difficulty is that training a network on a new task tends to overwrite what it has learned on previous tasks, a phenomenon known as catastrophic forgetting. While neural networks often have the capacity to learn a great number of tasks when the data are presented simultaneously, these same networks lose performance on earlier tasks when the data are presented sequentially. Roth is working to create biologically inspired learning rules that improve retention of earlier tasks without sacrificing performance on future ones. Previously, he worked in both condensed-matter and AMO physics.
- The Brain and Computation, Spring 2018. Visiting Graduate Student.