Phil Sosoe's generals (May 10, 2010)
Topics: Harmonic analysis, Probability
Committee: M. Aizenman (chair), C. Fefferman, A. Golsefidy
Duration: 2 hours
I have forgotten some of the questions I was asked, but the ones I have forgotten were almost certainly standard.
Complex Analysis (Fefferman)
Explain the Riemann Mapping theorem. Can you map a disk minus a point to the disk? If you have a map into the disk, what can
you say about the behaviour at 0? What can you say about a map that behaves like z^{-n} in a neighbourhood of 0? What other
behaviour can occur? Can you prove Picard's theorem? (I sketched a proof using Bloch's theorem).
Algebra (Golsefidy)
What is a nilpotent matrix? Explain what the Jordan canonical form is. What are the invariants associated to the JCF? How do you
determine the number and sizes of the blocks (he wanted a formula in terms of the dimensions of the kernels)? Can you
diagonalize symmetric matrices? To which class of matrices does the argument you gave generalize? (Normal operators). What's
the image of the circle under a linear map? (I managed to get stuck on this for a moment before realizing the image is a conic
and so must be a line segment, or an ellipse centred at 0). Do you know about the singular value decomposition?
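(An aside of mine, not part of the exam answer: the kernel-dimension formula he wanted can be checked numerically. Assuming numpy, and with a function name of my own choosing, the number of Jordan blocks of size at least k for an eigenvalue L is dim ker (A - L I)^k - dim ker (A - L I)^{k-1}.)

```python
import numpy as np

def jordan_block_counts(A, eigval, tol=1e-9):
    """Number of Jordan blocks of each exact size for the given eigenvalue."""
    n = A.shape[0]
    M = A - eigval * np.eye(n)
    # dims[k] = dim ker (A - eigval I)^k, computed as n - rank; dims[0] = 0.
    dims = [0]
    P = np.eye(n)
    for _ in range(n):
        P = P @ M
        dims.append(n - np.linalg.matrix_rank(P, tol=tol))
    # Number of blocks of size >= k is dims[k] - dims[k-1];
    # subtracting consecutive entries gives the count of exact size k.
    at_least = [dims[k] - dims[k - 1] for k in range(1, n + 1)] + [0]
    exact = {k: at_least[k - 1] - at_least[k] for k in range(1, n + 1)}
    return {k: v for k, v in exact.items() if v > 0}

# Nilpotent example: a 3-block and a 1-block for the eigenvalue 0.
N = np.zeros((4, 4))
N[0, 1] = N[1, 2] = 1.0
print(jordan_block_counts(N, 0.0))  # {1: 1, 3: 1}
```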
How would you find the Galois group of x^3+2x+1? Explain what the discriminant is. What can you conclude if you know the
discriminant is in the base field? In general, how do you compute the discriminant from the coefficients of the polynomial?
Adjoin a root of the previous polynomial to Q; can you say something about the roots of x^3+3x+1 over this field?
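(Again an aside, not from the exam: for a depressed cubic x^3 + px + q the discriminant is -4p^3 - 27q^2, and for an irreducible cubic the Galois group is A_3 or S_3 according to whether the discriminant is a square in the base field. A quick check, assuming sympy:)

```python
from sympy import symbols, discriminant

x = symbols('x')
# x^3 + 2x + 1 is irreducible over Q (no rational roots: f(1) = 4, f(-1) = -2).
d = discriminant(x**3 + 2*x + 1, x)  # -4*2^3 - 27*1^2
print(d)  # -59, not a rational square, so the Galois group is S_3
```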
Real analysis (Aizenman)
Conditions for pointwise convergent sequences to have convergent integrals. (I gave dominated convergence). Counterexample?
(function of size n on (0,1/n)). I also mentioned uniform convergence on an interval. Aizenman asked for a counterexample on
the line (I said a sequence with graphs that are wide rectangles of small height). I mentioned that in the finite measure case,
a necessary and sufficient criterion is that the functions be uniformly integrable. This led to a short discussion of the UI
condition.
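(A small check of the counterexample, done afterwards rather than at the board: f_n = n on (0, 1/n) converges pointwise to 0, but the integrals all equal 1, so no integrable dominating function can exist.)

```python
from sympy import symbols, integrate

t, n = symbols('t n', positive=True)
# Integral of f_n over its support (0, 1/n): n * (1/n) = 1 for every n.
In = integrate(n, (t, 0, 1/n))
print(In)  # 1
```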
Conditions for pointwise convergent sequences to have convergent derivatives. At first I couldn't give any, so Aizenman asked
for counterexamples. I said a function that is small but oscillates. It was suggested to look at convex functions. (I guessed
that in this case, the derivatives do converge in the interior of the interval, but there might be trouble at the endpoints). I
didn't conclude the argument before they decided to move on.
Can you find a set of measure zero on the real line? (?! - I said the empty set). What about an uncountable set of zero
measure? (Cantor-type sets where you remove a fixed proportion of the intervals). What are points of density? Why do they
exist? If E has positive measure, does E+E contain an interval? (This last question was asked by Golsefidy).
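(A side computation, mine rather than the examiners': removing a fixed middle proportion r from every remaining interval leaves measure (1 - r)^k after k stages, which tends to 0, while the limit set remains uncountable, being homeomorphic to {0,1}^N.)

```python
def remaining_measure(r, stages):
    """Lebesgue measure left after removing proportion r at each stage."""
    m = 1.0
    for _ in range(stages):
        m *= (1 - r)
    return m

print(remaining_measure(1/3, 50))  # about (2/3)^50, essentially 0
```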
Harmonic analysis (Fefferman)
Write down \Delta u = f, for 'nice' f. How do you recover u? What space does it lie in? (Stated Hardy-Littlewood-Sobolev
fractional integration theorem). What about the derivatives of u? (Use Riesz transforms to show L^p control of the derivatives
by f). Some generalities about Sobolev spaces and embedding. Write down the multipliers for Riesz transforms, and the
corresponding convolution kernels. Prove that u is in W^{2,p} in this case. Why do the L^p bounds not hold for p = 1 or p = \infty?
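(For the record, the formulas in question, as I recall them and up to normalization conventions, are the following.)

```latex
% Riesz transform on R^n: Fourier multiplier and convolution kernel.
\widehat{R_j f}(\xi) = -i\,\frac{\xi_j}{|\xi|}\,\hat f(\xi), \qquad
R_j f(x) = c_n \,\mathrm{p.v.}\!\int_{\mathbb{R}^n}
  \frac{x_j - y_j}{|x - y|^{n+1}}\, f(y)\, dy,
\qquad c_n = \frac{\Gamma\!\left(\frac{n+1}{2}\right)}{\pi^{(n+1)/2}}.
% If \Delta u = f, then on the Fourier side
% \widehat{\partial_j \partial_k u} = -\xi_j \xi_k \hat u
%   = (\xi_j \xi_k / |\xi|^2)\,\hat f, so
\partial_j \partial_k u = -R_j R_k f,
% and L^p boundedness of the Riesz transforms (1 < p < \infty) gives
% \|\partial_j \partial_k u\|_{L^p} \lesssim \|f\|_{L^p}.
```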
There followed a discussion about the Paley-Wiener theorem, during which I mixed up the definitions of "type" and "order" of an
entire function.
Probability (Aizenman, with occasional comments by the others)
Define Markov chain (I asked what generality he wanted for the definition. He said discrete time is fine). Suppose the state
space is finite and the chain is irreducible, what can you say about the distribution of the first hitting time of a given
state? How would you estimate the rate of convergence to equilibrium precisely? I mentioned L^2 methods in the reversible case;
he wanted me to talk about Perron-Frobenius (i.e. spectral theory of general positive matrices) but I had nothing intelligent
to say about that.
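(An illustration I did not manage to give at the exam, assuming numpy: for an irreducible aperiodic finite chain the transition matrix has Perron eigenvalue 1, the stationary distribution is the corresponding left eigenvector, and the second-largest eigenvalue modulus governs the rate of convergence.)

```python
import numpy as np

# Row-stochastic, irreducible, aperiodic transition matrix (my own toy example).
P = np.array([[0.9, 0.1, 0.0],
              [0.2, 0.7, 0.1],
              [0.0, 0.3, 0.7]])

# Stationary distribution: left eigenvector of P for the eigenvalue 1.
vals, vecs = np.linalg.eig(P.T)
i = np.argmin(np.abs(vals - 1.0))
pi = np.real(vecs[:, i])
pi /= pi.sum()

# Distributions mu P^t converge to pi geometrically, at a rate set by the
# second-largest eigenvalue modulus (here the spectrum is {1, 0.8, 0.5}).
mu = np.array([1.0, 0.0, 0.0])
for _ in range(200):
    mu = mu @ P
slem = sorted(np.abs(vals), reverse=True)[1]
print(np.allclose(mu, pi), slem < 1)  # True True
```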
Do you know any zero-one laws? (We ended up discussing only the classical Kolmogorov one). How do you prove it? Then they asked
about dependent random variables with non-trivial tail sigma field. Fefferman had an idea involving first choosing at random
between a fair coin and one that always lands on heads, and this led to an interminable discussion about how to concretely model
probability spaces (Rademacher functions on [0,1], product measures, image measures, etc…). I couldn't understand what they
actually wanted of me.
What do you know about ergodic theorems? (I gave Birkhoff's theorem: the averages converge pointwise to the conditional
expectation w.r.t. the sigma algebra of invariant sets). How does it apply to a "coin flip" situation as previously
discussed: what is the measure-preserving transformation in that case? When is the limit a constant? (In the ergodic case; i.e.
the invariant sigma algebra is trivial).
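(My own gloss on this part: for i.i.d. coin flips the measure-preserving map is the shift on {0,1}^N with the product measure, which is ergodic, so the Birkhoff averages of the first coordinate converge almost surely to the constant p = P(heads). A simulation bears this out.)

```python
import random

random.seed(0)
p = 0.7
n = 200_000
flips = [1 if random.random() < p else 0 for _ in range(n)]
avg = sum(flips) / n
# The running average is within ~sqrt(p(1-p)/n) of p, far inside 0.01.
print(abs(avg - p) < 0.01)  # True with overwhelming probability
```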
Define conditional expectation. State the martingale convergence theorem. (I stated Doob's "forward convergence" theorem.) How
does this apply to the situation previously discussed? (I gave the "reverse martingale" convergence argument, and the exam
ended there.)
Afterwards, Aizenman explained to me what they were trying to get at with the "coin flip" questions in the "zero-one" part. It
was really quite trivial, and it would not be particularly instructive (not to mention incredibly embarrassing) to reproduce
the discussion here.
I thank Aaron Pollack, Shrenik Shah and Béla Racz for helping me prepare this exam.