### Monday, March 6th, 2017

Tentative: There is an interesting connection between hash functions used in data structures and pseudorandom generators for small space. A sequence of results exploited this connection (in both directions) and there are several interesting open problems.

Advancing the sparse regularity method, we prove one-sided and two-sided regularity inheritance lemmas for subgraphs of bijumbled graphs, improving on results of Conlon, Fox and Zhao. These inheritance lemmas also imply improved $H$-counting lemmas for subgraphs of bijumbled graphs, for some $H$.

Let $A$ be a set of integers. For any integer $h \geq 2$, we say that $A$ is a $B_h$-set if the $h$-wise sums $a_1 + \cdots + a_h$ ($a_1, \dots, a_h \in A$; $a_1 \leq \cdots \leq a_h$) are all distinct. Let $[n] = \{1, \dots, n\}$. A natural question is the determination or estimation of the extremal function

$$F_h(n) = \max\{|A| \colon A \subset [n] \text{ is a } B_h\text{-set}\}.$$
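The definitions above can be checked directly by brute force (a sketch of our own, feasible only for tiny $n$ since the search is exponential):

```python
from itertools import combinations, combinations_with_replacement

def is_Bh_set(A, h):
    """Check whether A is a B_h-set: all h-wise sums
    a_1 + ... + a_h with a_1 <= ... <= a_h are distinct."""
    sums = [sum(c) for c in combinations_with_replacement(sorted(A), h)]
    return len(sums) == len(set(sums))

def F(h, n):
    """Brute-force F_h(n): size of a largest B_h-set contained
    in [n] = {1, ..., n}."""
    for size in range(n, 0, -1):
        if any(is_Bh_set(c, h) for c in combinations(range(1, n + 1), size)):
            return size
    return 0
```

For instance, $\{1, 2, 5, 11\}$ is a Sidon set, while $\{1, 2, 3\}$ is not ($1 + 3 = 2 + 2$), and $F_2(7) = 4$, witnessed by $\{1, 2, 5, 7\}$.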

The particular case of this problem in which $h = 2$ was raised by Simon Sidon in the 1930s, and $B_2$-sets are known as Sidon sets. An immediate argument shows that $F_h(n) = o(n)$ and, therefore, this is a so-called 'degenerate' extremal problem. As is often the case with degenerate problems, the structure of the extremal, that is, largest, $B_h$-sets $A \subset [n]$ is not well understood.

We address the simpler problem of estimating the number of $B_h$-sets $A \subset [n]$ of a given cardinality. As a consequence of these bounds, we determine, asymptotically, for any integer $m \leq n$, the cardinality of the largest $B_h$-sets contained in a typical $m$-element subset of $[n]$.

This is joint work with D. Dellamonica Jr, S.J. Lee, V. Rödl and W. Samotij.

### Tuesday, March 7th, 2017

Addressing a problem of Goldreich and of Alon and Fox, we prove new sufficient and necessary criteria, guaranteeing that a graph property admits a removal lemma with a polynomial bound. Although both are simple combinatorial criteria, they imply almost all prior positive and negative results of this type, as well as many new ones. In particular, we will show that every semi-algebraic graph property admits a removal lemma with a polynomial bound. This confirms a conjecture of Alon.

Joint work with L. Gishboliner.

The question of finding an $\epsilon$-biased set with close to optimal support size, or, equivalently, finding an explicit binary code with distance $\frac{1-\epsilon}{2}$ and rate close to the Gilbert-Varshamov bound, attracted a lot of attention in recent decades. In this paper we solve the problem almost optimally and show an explicit $\epsilon$-biased set over $k$ bits with support size $O(\frac{k}{\epsilon^{2+o(1)}})$. This improves upon all previous explicit constructions, which were in the order of $\frac{k^2}{\epsilon^2}$, $\frac{k}{\epsilon^3}$ or $\frac{k^{5/4}}{\epsilon^{5/2}}$. The result is close to the Gilbert-Varshamov bound, which is $O(\frac{k}{\epsilon^2})$, and the lower bound, which is $\Omega(\frac{k}{\epsilon^2 \log(1/\epsilon)})$.

The main technical tool we use is bias amplification with the $s$-wide replacement product. The sum of two independent samples from an $\epsilon$-biased set is $\epsilon^2$-biased. Rozenman and Wigderson showed how to amplify the bias more economically by choosing the two samples using an expander. Based on that, they suggested a recursive construction that achieves sample size $O(\frac{k}{\epsilon^4})$. We show that amplification with a long random walk over the $s$-wide replacement product reduces the bias almost optimally.
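The basic amplification step can be made concrete with a small sketch of our own (not the paper's construction): the bias of a multiset $S \subseteq \{0,1\}^k$ is its maximum correlation with a nonzero linear test, and the coordinatewise XOR of two independent samples from $S$ squares that bias.

```python
from itertools import product

def bias(S, k):
    """Bias of a multiset S of k-bit strings (tuples of 0/1):
    max over nonzero tests alpha of |avg of (-1)^<alpha, s>|."""
    best = 0.0
    for alpha in product([0, 1], repeat=k):
        if not any(alpha):
            continue  # skip the trivial all-zero test
        corr = sum((-1) ** sum(a * b for a, b in zip(alpha, s)) for s in S) / len(S)
        best = max(best, abs(corr))
    return best

def xor_samples(S):
    """Multiset of coordinatewise XORs of two independent samples from S."""
    return [tuple(a ^ b for a, b in zip(s, t)) for s in S for t in S]

# A small example over k = 3 bits: XORing two samples squares the bias,
# at the cost of squaring the support size -- which is why the expander-
# based sampling above is the more economical route.
S = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 1)]
```

Here `bias(S, 3)` is $1/2$ and `bias(xor_samples(S), 3)` is $1/4$, since each test's correlation factors as the product of the two samples' correlations.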

### Wednesday, March 8th, 2017

We introduce the concept of a randomness steward, a tool for saving random bits when executing a randomized estimation algorithm 𝖤𝗌𝗍 on many adaptively chosen inputs. For each execution, the chosen input to 𝖤𝗌𝗍 remains hidden from the steward, but the steward chooses the randomness of 𝖤𝗌𝗍 and, crucially, is allowed to modify the output of 𝖤𝗌𝗍. The notion of a steward is inspired by adaptive data analysis, introduced by Dwork et al. Suppose 𝖤𝗌𝗍 outputs values in ℝ^d, has ℓ∞ error ϵ, has failure probability δ, uses n coins, and will be executed k times. For any γ > 0, we construct a computationally efficient steward with ℓ∞ error O(ϵd), failure probability kδ + γ, and randomness complexity n + O(k log(d+1) + (log k) log(1/γ)). To achieve these parameters, the steward uses a PRG for what we call the block decision tree model, combined with a scheme for shifting and rounding the outputs of 𝖤𝗌𝗍. (The generator is a variant of the INW generator.)
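The shifting-and-rounding idea can be illustrated in one dimension (a toy sketch of our own, not the paper's exact scheme): if two runs of 𝖤𝗌𝗍 return values close to the same truth, rounding both to a coarse grid under a shared random shift makes them agree with high probability, at the cost of a small additional error, and it collapses the set of distinct outputs an adaptive adversary can observe.

```python
import random

def shift_round(y, delta, shift):
    """Round y to the grid of spacing delta, offset by shift.
    Adds at most delta to the error of the estimate y."""
    return round((y - shift) / delta) * delta + shift

# Two estimates of the same quantity, much closer than the grid spacing:
y1, y2 = 3.141, 3.149
delta = 1.0
trials = 10_000
collisions = sum(
    shift_round(y1, delta, s) == shift_round(y2, delta, s)
    for s in (random.uniform(0, delta) for _ in range(trials))
)
# The rounded values differ only when a grid boundary falls between
# y1 and y2, which happens with probability |y1 - y2| / delta = 0.008.
```

Over the trials above, the two estimates collide the vast majority of the time, despite never being exactly equal.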

As applications of our steward, we give time- and randomness-efficient algorithms for estimating the acceptance probabilities of many adaptively chosen Boolean circuits and for simulating any algorithm with an oracle for 𝗉𝗋𝗈𝗆𝗂𝗌𝖾-𝖡𝖯𝖯 or 𝖠𝖯𝖯. We also give a randomness-efficient version of the Goldreich-Levin algorithm; our algorithm finds all Fourier coefficients with absolute value at least θ of a function F: {0,1}^n → {−1,1} using O(n log n)⋅poly(1/θ) queries to F and O(n) random bits, improving previous work by Bshouty et al. Finally, we prove a randomness complexity lower bound of n + Ω(k) − log(δ′/δ) for any steward with failure probability δ′, which nearly matches our upper bounds in the case d ≤ O(1).
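For contrast with the query- and randomness-efficient Goldreich-Levin algorithm above, here is the naive exhaustive computation of the heavy Fourier coefficients (a sketch of our own: it makes 2^n queries per coefficient and uses no randomness, so it is only feasible for tiny n):

```python
from itertools import product

def heavy_fourier_coefficients(F, n, theta):
    """Exhaustively compute all Fourier coefficients of F: {0,1}^n -> {-1,1}
    with absolute value at least theta.  Returns a dict mapping each heavy
    set S (as a frozenset of coordinates) to its coefficient
    F_hat(S) = avg over x of F(x) * (-1)^(sum of x_i for i in S)."""
    points = list(product([0, 1], repeat=n))
    heavy = {}
    for S in product([0, 1], repeat=n):  # indicator vector of the set S
        chi = lambda x: (-1) ** sum(xi for xi, si in zip(x, S) if si)
        coeff = sum(F(x) * chi(x) for x in points) / len(points)
        if abs(coeff) >= theta:
            heavy[frozenset(i for i, si in enumerate(S) if si)] = coeff
    return heavy
```

For example, the parity of a single bit, F(x) = (−1)^{x_0}, has exactly one nonzero coefficient, on S = {0}, with value 1.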

### Thursday, March 9th, 2017

Joint work with Toniann Pitassi and Thomas Watson.

### Friday, March 10th, 2017

Hoppen, Kohayakawa, Moreira, and Sampaio conjectured, and Klimošová and Král' proved, that hereditary permutation properties are strongly testable.

Perhaps surprisingly, for testing with respect to the cut metric, we prove that there is a universal (not depending on the property) query complexity bound, polynomial in $1/\epsilon$, for two-sided testing of hereditary properties of sufficiently large permutations. We further give a nearly linear bound with respect to a closely related metric, which also depends on the smallest forbidden subpermutation for the property. Finally, we show that several different permutation metrics of interest are related to the cut metric, yielding similar results for testing with respect to these metrics.
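For concreteness, one classical metric on permutations is the normalized Kendall's tau distance: the fraction of pairs of elements that the two permutations order differently. (The cut metric and the other metrics in the talk are defined differently; this is an illustrative sketch of our own.)

```python
def kendall_tau_distance(p, q):
    """Normalized Kendall's tau distance between two permutations of
    {0, ..., n-1}, given as lists: the fraction of pairs {a, b} on
    which p and q disagree about the relative order.  O(n^2), for clarity."""
    n = len(p)
    pos_p = {v: i for i, v in enumerate(p)}
    pos_q = {v: i for i, v in enumerate(q)}
    discordant = sum(
        1
        for a in range(n)
        for b in range(a + 1, n)
        if (pos_p[a] < pos_p[b]) != (pos_q[a] < pos_q[b])
    )
    return discordant / (n * (n - 1) / 2)
```

The distance is 0 for identical permutations and 1 for a permutation and its reversal; swapping one adjacent pair of a permutation of length 3 gives distance 1/3.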

This is joint work with Jacob Fox.

This talk surveys problems from graph theory and additive combinatorics where randomness gives an essentially optimal bound. In some instances, pseudorandomness characterizes the optimizers.