Description

Apocryphal Results, Recent Results, and Open Problems in Neural Network Representations

This talk will start by stating and motivating in broad terms the classical representation question in machine learning: what function class should we use?

The talk will then survey results on the representational power of various function classes, focusing on neural networks. Results will include: fitting continuous functions over various bases, constructing many-layered neural networks that cannot be approximated by subexponential-size shallower networks, and fitting rational functions with neural networks.

Time permitting: the power of recurrent networks will also be discussed, as well as generalization questions (Peter Bartlett permitting).

The secondary goal of this talk is to state many open problems of varying difficulty; these will be woven throughout.

Audience participation is essential, or the speaker will become depressed.
