Abstract

Suppose we have a set of integers S, and we wish to count solutions to a certain system of linear equations in S; in particular, we want to know when we get the expected "random" answer.  A by-now standard approach is to show that the count is controlled by the Gowers uniformity norm U^{s+1} for some s, and to use tools from higher order Fourier analysis to understand when S is pseudorandom by that measure.
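
For instance (a standard special case, not specific to this talk): take S inside Z_N of density alpha, and write f = 1_S - alpha for its balanced function.  The count of three-term arithmetic progressions then satisfies

\[ \sum_{x,d \in \mathbb{Z}_N} 1_S(x)\,1_S(x+d)\,1_S(x+2d) \;=\; \alpha^3 N^2 \;+\; O\bigl(\|f\|_{U^2}\,N^2\bigr), \]

so smallness of the Gowers U^2 norm of f forces the "random" count; for longer progressions and more general systems, U^2 is replaced by U^{s+1}.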
 
One question is: what is the smallest s you can get away with?  In other words, how high up in the higher-order Fourier hierarchy do you have to go to get the right answer?  When can we get away with looking just at linear discrepancies?
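
To make "linear discrepancies" precise, recall the standard identity (with the normalized Fourier transform \(\widehat{f}(\xi) = \mathbb{E}_x f(x)\,e(-x\xi/N)\)):

\[ \|f\|_{U^2}^4 \;=\; \sum_{\xi \in \mathbb{Z}_N} \bigl|\widehat{f}(\xi)\bigr|^4, \]

so the U^2 norm of a bounded function is small exactly when it has no large Fourier coefficient; being able to work at s = 1 means classical Fourier analysis already gives the right count.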
 
This quantity s is called the "true complexity" of the system, and these questions were addressed in work of Gowers--Wolf and Green--Tao.  A striking feature, though, is the significant gap between what elementary methods can show (the "Cauchy--Schwarz complexity") and the truth, which is currently accessible only through very heavy machinery that yields ineffective bounds.
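
As an illustration of the gap (the Gowers--Wolf criterion, stated here informally): for a system of linear forms L_1, ..., L_m, the true complexity is the least s for which

\[ L_1^{\,s+1},\; L_2^{\,s+1},\; \ldots,\; L_m^{\,s+1} \quad \text{are linearly independent as polynomials}, \]

while the Cauchy--Schwarz complexity comes from a purely formal repeated-Cauchy--Schwarz argument and can be strictly larger.  The known proofs of the criterion pass through inverse theorems for the Gowers norms, which is where the ineffectivity enters.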
 
We will explore some questions in this area.
