Schedule

9:30-10:00     Coffee and refreshments
10:00-11:15    Omer Angel
11:15-11:40    Coffee
11:40-12:30    Steven Heilman
12:30-2:00     Lunch
2:00-2:50      Vilmos Prokaj
2:50-3:15      Coffee
3:15-4:05      Sneha Subramanian
4:10-5:00      Tobias Johnson
6:00           Dinner

Omer Angel:
Title: Planar Unimodular graphs
Abstract: I will discuss unimodularity in the context of planar random graphs, and some of its consequences. In particular, we show that such graphs are local limits of planar graphs if and only if they correspond to a locally finite circle packing, if and only if $p_c=p_u$ for percolation on these graphs. We also relate these properties to expansion and mean degree. I shall also discuss the key motivating examples of random planar maps and random hyperbolic maps. Joint with Tom Hutchcroft, Asaf Nachmias and Gourab Ray.

Steven Heilman:
Title: Strong Contraction and Influences in Tail Spaces.
Abstract: We study contraction under a Markov semigroup and influence bounds for functions all of whose low-level Fourier coefficients vanish. This study is motivated by the explicit construction of 3-regular expander graphs of Mendel and Naor, though our results have no direct implication for the construction of expander graphs. In the positive direction, we prove an $L_{p}$ Poincar\'{e} inequality and moment decay estimates for mean-zero functions and all $1<p<\infty$, proving the degree-one case of a conjecture of Mendel and Naor, as well as the general-degree case of the conjecture when restricted to Boolean functions. In the negative direction, we answer in the negative two questions of Hatami and Kalai concerning extensions of the Kahn-Kalai-Linial and Harper theorems to tail spaces. For example, we construct a function $f\colon\{-1,1\}^{n}\to\{-1,1\}$ whose Fourier coefficients vanish up to level $c \log n$ and all of whose influences are bounded by $C \log n/n$ for some constants $0<c,C<\infty$. That is, the Kahn-Kalai-Linial theorem cannot be improved, even if we assume that the Fourier coefficients of the function vanish up to level $c\log n$. This implies a phase transition in the largest guaranteed influence of functions $f\colon\{-1,1\}^{n}\to\{-1,1\}$ whose Fourier coefficients vanish up to level $g(n)\log n$: the transition occurs according to whether $g(n)\to\infty$ as $n\to\infty$ or $g(n)$ remains bounded.
Joint with Elchanan Mossel and Krzysztof Oleszkiewicz.
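
Not from the talk: the following is a minimal Python sketch of the standard objects the abstract refers to, namely the level-$k$ Fourier weight and the influences of a Boolean function $f\colon\{-1,1\}^{n}\to\{-1,1\}$. The choice of the majority function as the test case is an assumption made here purely for illustration.

import itertools

def fourier_coefficient(f, S, n):
    # \hat{f}(S) = E_x[ f(x) * prod_{i in S} x_i ] over uniform x in {-1,1}^n (exact enumeration).
    total = 0.0
    for x in itertools.product([-1, 1], repeat=n):
        chi = 1
        for i in S:
            chi *= x[i]
        total += f(x) * chi
    return total / 2 ** n

def influence(f, i, n):
    # Inf_i(f) = Pr[ f(x) != f(x with coordinate i flipped) ].
    count = 0
    for x in itertools.product([-1, 1], repeat=n):
        y = list(x)
        y[i] = -y[i]
        if f(x) != f(tuple(y)):
            count += 1
    return count / 2 ** n

if __name__ == "__main__":
    n = 4
    f = lambda x: 1 if sum(x) > 0 else -1   # majority, with ties sent to -1
    for k in range(n + 1):
        # "Fourier coefficients vanish up to level k" means this weight is 0 for all levels below k.
        weight = sum(fourier_coefficient(f, S, n) ** 2
                     for S in itertools.combinations(range(n), k))
        print("level", k, "Fourier weight:", round(weight, 4))
    print("influences:", [round(influence(f, i, n), 4) for i in range(n)])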


Vilmos Prokaj:
Title: On the ergodicity of the L\'evy transformation
Abstract: In the talk I will present an approach to the long-standing open problem of the
ergodicity of the so-called L\'evy transformation of the Wiener space.
This transformation $T$ sends a Brownian motion $B$ into another one by the formula
$(TB)_t=\int_0^t \operatorname{sign}(B_s)\,dB_s$.
The main result is that the question can be answered in the affirmative by investigating the sequence
$\{(T^nB)_1 : n\geq 0\}$.
Roughly speaking, if the expected hitting time of small neighborhoods of zero is inversely proportional to the size of the neighborhood, then $T$ is even exact, that is, $\bigcap_n\sigma(T^nB)$ is trivial. Although the proof of the proposed sufficient condition is still missing, some simulation results supporting it will be presented.
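
Not the speaker's simulation: the sketch below only illustrates how one might iterate a discretized L\'evy transformation and record the sequence $(T^nB)_1$ mentioned above. The Euler-type approximation of the stochastic integral and all numerical parameters are assumptions made here for illustration.

import numpy as np

def levy_transform(path):
    # Discrete analogue of (TB)_t = \int_0^t sign(B_s) dB_s on a uniform time grid:
    # (TB)_{t_{k+1}} - (TB)_{t_k} ~ sign(B_{t_k}) * (B_{t_{k+1}} - B_{t_k}).
    increments = np.diff(path)
    signs = np.sign(path[:-1])
    signs[signs == 0] = 1.0        # convention at zero; almost surely irrelevant in the limit
    return np.concatenate(([0.0], np.cumsum(signs * increments)))

def iterate_levy(n_steps=100_000, n_iter=20, seed=0):
    rng = np.random.default_rng(seed)
    dt = 1.0 / n_steps
    # Brownian path on [0, 1] with B_0 = 0.
    path = np.concatenate(([0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n_steps))))
    values_at_one = []
    for _ in range(n_iter):
        path = levy_transform(path)
        values_at_one.append(path[-1])   # (T^n B)_1 for n = 1, 2, ...
    return values_at_one

if __name__ == "__main__":
    print(iterate_levy())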

Sneha Subramanian:
Title: Zeros of the derivatives of random polynomials and random entire functions
Abstract: For random polynomials and random analytic functions on $\mathbb{C}$ constructed from prescribed random zeros, what can we say about the zeros of the derivative (i.e., the critical points) of these functions? In the first part of this talk we discuss the behavior of the critical points of a random polynomial whose zeros are i.i.d. samples from an arbitrary distribution on the unit circle. In the second part, we study the result of repeatedly differentiating a random entire function whose zeros are the points of a Poisson process of intensity 1 on $\mathbb{R}$. Based on joint work with Robin Pemantle.
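
Not part of the talk: a minimal numerical sketch of the first setting in the abstract, with the zeros drawn uniformly on the unit circle purely as an illustrative assumption; the critical points are computed as the roots of $p'$.

import numpy as np

def critical_points(n=50, seed=0):
    rng = np.random.default_rng(seed)
    zeros = np.exp(2j * np.pi * rng.random(n))   # n i.i.d. zeros, uniform on the unit circle
    coeffs = np.poly(zeros)                      # coefficients of p(z) = prod_k (z - z_k)
    return zeros, np.roots(np.polyder(coeffs))   # the n - 1 critical points, i.e. the zeros of p'

if __name__ == "__main__":
    zeros, crits = critical_points()
    # Rough numerical check of how close the critical points sit to the unit circle.
    print("mean |critical point|:", np.abs(crits).mean())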

Tobias Johnson:
Title: The frog model on trees.
Abstract: Imagine that every vertex of a graph contains a sleeping frog. At time 0, the frog at some designated vertex wakes up and begins a simple random walk. When it lands on a vertex, the sleeping frog there wakes up and begins its own simple random walk, which in turn wakes up any sleeping frogs it lands on, and so on. This process is called the frog model.
I'll talk about a question posed by Serguei Popov in 2003: On an infinite d-ary tree, is the frog model recurrent or transient? That is, is each vertex visited infinitely or finitely often by frogs? The answer is that it depends on d: there's a phase transition between recurrence and transience as d grows.
This is joint work with Christopher Hoffman and Matthew Junge.
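
Not part of the talk: a minimal simulation sketch of the frog model on a d-ary tree truncated at a finite depth. The truncation depth, run length, and the count of visits to the root are assumptions made here only to illustrate the process itself, not the recurrence/transience analysis of the talk.

import random

def frog_model_root_visits(d=3, depth=6, max_steps=500, seed=0):
    # Vertices of the d-ary tree are tuples of child indices along the path from the root ().
    random.seed(seed)
    root = ()
    awake = [root]        # positions of the awake frogs; only the root frog is awake at time 0
    woken = {root}        # vertices whose sleeping frog has already been woken
    root_visits = 0
    for _ in range(max_steps):
        next_awake = []
        for pos in awake:
            # One step of simple random walk on the depth-truncated tree.
            nbrs = []
            if pos:
                nbrs.append(pos[:-1])                       # parent
            if len(pos) < depth:
                nbrs += [pos + (i,) for i in range(d)]      # children
            pos = random.choice(nbrs)
            if pos == root:
                root_visits += 1
            if pos not in woken:                            # landing on a sleeping frog wakes it
                woken.add(pos)
                next_awake.append(pos)
            next_awake.append(pos)
        awake = next_awake
    return root_visits

if __name__ == "__main__":
    for d in (2, 3, 5):
        print("d =", d, "-> visits to the root:", frog_model_root_visits(d=d))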