Brian Street's general exam
My committee: Aizenman, Nelson, Ellenberg
My topics: Stochastic processes and functional analysis
Time: 2 hours.
We began with real analysis:
A: Suppose you have a sequence of functions on the unit disk, converging
pointwise, when do their integrals converge?
(I give the standard answers)
A: When do their derivatives converge?
I had no good answer for this. When he pressed me about conditions on the
second derivative, I said that, by applying the fundamental theorem of
calculus, pointwise convergence of the second derivatives plus some
domination would do it. He also wanted some result about the convergence
of the derivatives of convex functions. This confused me a little, and so
we talked about convex functions for a minute or two. In retrospect the
proof is easy, though I didn't think of it at the time.
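The easy proof I had in mind afterwards, sketched (assuming each f_n is differentiable at x): convexity squeezes the derivative between difference quotients,

```latex
\frac{f_n(x)-f_n(x-h)}{h} \;\le\; f_n'(x) \;\le\; \frac{f_n(x+h)-f_n(x)}{h},
\qquad h > 0 .
```

Letting n go to infinity traps every limit point of f_n'(x) between the difference quotients of the limit f, and then letting h go to 0 forces f_n'(x) to converge to f'(x) wherever f is differentiable.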
N: What can you say about the discontinuities of a function?
(They're an F sigma)
N: Why?
I sketch a proof
N: Can a function be discontinuous at just the irrationals?
(no)
N: Why?
(Baire category [that's really all I had to say])
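Spelled out, the Baire argument: if f were discontinuous exactly at the irrationals, the irrationals would be an F sigma,

```latex
\mathbb{R}\setminus\mathbb{Q} \;=\; \bigcup_{n\ge 1} F_n
\qquad (F_n \text{ closed}),
\qquad\text{whence}\qquad
\mathbb{R} \;=\; \bigcup_{n\ge 1} F_n \;\cup\; \bigcup_{q\in\mathbb{Q}}\{q\} .
```

Each F_n misses the dense set Q, so it contains no interval and is nowhere dense. The second display then writes R as a countable union of nowhere dense sets, contradicting the Baire category theorem.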
A: define the Lebesgue integral
(I give the definition)
A: Give an example of something that's Lebesgue Integrable but not Riemann
integrable.
(I give the characteristic function of the rationals in [0,1])
A: Why isn't it Riemann integrable?
(All the lower sums are 0, and the upper sums >=1)
N: Do the Riemann integrable functions form a vector space?
(Yes)
N: Why?
(A function is Riemann integrable iff its set of discontinuities is Lebesgue
measure 0. So... [Nelson stopped me there])
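The rest of the argument Nelson stopped me on: writing D(f) for the set of discontinuities of f,

```latex
D(f+g) \subseteq D(f)\cup D(g), \qquad D(cf)\subseteq D(f),
```

and a union of two Lebesgue null sets is null, so by the Lebesgue criterion sums and scalar multiples of Riemann integrable functions are again Riemann integrable.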
Now they moved onto complex, but really the rest of the exam was a blur of all
the topics.
A: Suppose you have a sequence of holomorphic functions on the closed unit
disk, converging pointwise on the boundary. What can you say about the
convergence of the derivatives in the disk?
(I say I need some sort of domination or something)
A: How would you prove it if you had that?
(I start to write it down, once I write down the Cauchy integral formula
things move on)
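The computation that was getting started, roughly: differentiating under the Cauchy integral,

```latex
f_n'(z) \;=\; \frac{1}{2\pi i}\oint_{|\zeta|=1}\frac{f_n(\zeta)}{(\zeta-z)^2}\,d\zeta .
```

With pointwise convergence on the boundary plus a dominating bound, dominated convergence lets you pass to the limit inside the integral; and since |zeta - z| is bounded below on compact subsets of the open disk, the derivatives converge locally uniformly there.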
A: Can you conformally map the unit disk to a square?
(of course it's possible, by the Riemann mapping theorem)
A: How would you do it?
(I don't know off the top of my head)
A: How about the upper half plane?
(I write down the conformal map)
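The standard map here:

```latex
z \;\mapsto\; \frac{z-i}{z+i}
```

sends the upper half plane onto the unit disk (it takes i to 0 and the real axis to the unit circle), with inverse w going to i(1+w)/(1-w).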
A: How would you tell if two domains are conformally equivalent?
(I don't know a good general criterion. I say something about annuli and
multiply connected regions, but mess that up a little. I mention the
Dirichlet problem)
A: Talk about the Dirichlet problem.
A: Is there a function on the annulus solving the Dirichlet problem
with (and then he gives some boundary conditions)?
(I answer it using Brownian motion. Aizenman and Nelson laugh.)
Aizenman wants to see the Dirichlet problem as the minimization of a
functional, in terms of Euler-Lagrange equations. I hadn't looked at
physics in a few years and so didn't remember all this right away. They
inform me that the solution is the minimizer of the gradient inner product
(the Dirichlet energy), and that seems pretty obvious. Nelson asks why
the gradient inner product is conformally invariant; I stumble on this a
little and we move on.
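The invariance Nelson asked about comes down to the Jacobian of a holomorphic map phi: the Jacobian determinant is |phi'|^2, and the chain rule gives |grad(u composed with phi)| = |phi'| |(grad u) composed with phi|, so

```latex
\int_{\Omega}\big|\nabla(u\circ\varphi)\big|^2\,dA
\;=\;\int_{\Omega}\big|(\nabla u)\circ\varphi\big|^2\,|\varphi'|^2\,dA
\;=\;\int_{\varphi(\Omega)}|\nabla u|^2\,dA .
```

The two factors of |phi'| from the chain rule exactly cancel the Jacobian in the change of variables; this cancellation is special to two dimensions.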
N: Suppose you have a function holomorphic in a vertical strip, bounded by 1 on
the boundary. What can you say about it?
(Isn't there some sort of growth condition?)
N: you tell me.
(So I explain Phragmén-Lindelöf)
A: Now how would you prove this using Brownian motion, like you talked about
before?
(I mention something about if it's bounded, I can do it)
Aizenman wants something better. They hold my hand through an argument
done by bounding the probability that complex Brownian motion will hit
the edges of a rectangle of width L (L large) and height 1, starting
in the interior. I mess this up horribly. But we get through it, and
get a Phragmén-Lindelöf theorem for harmonic functions: they have to grow
slower than some exponential that we came up with.
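My reconstruction of the estimate (hedged, since this is the part I messed up): the function

```latex
h(x,y) \;=\; e^{-\pi x}\sin(\pi y)
```

is harmonic in the strip 0 < y < 1, and comparing against it via the maximum principle shows that the probability that Brownian motion reaches a short end of the rectangle at distance L, before leaving through the long sides, decays exponentially in L (on the order of e^{-pi L}). That is what forces a harmonic function bounded by 1 on the long sides to grow slower than a matching exponential, which is the harmonic Phragmén-Lindelöf we ended up with.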
Ellenberg's up for algebra now. We talk about the group of automorphisms
of the disk, and SL(2,R) (and their relation).
E: What's the stabilizer of a point in the disk under the group of
automorphisms of a disk?
(S^1)
E: What group theoretic construct relates the stabilizer of two points?
(conjugate)
E: Prove it.
We talk about conjugacy classes of SL(2,R).
E: Consider SL(2,R) acting naturally on R^2, what is the stabilizer of a
point?
(I compute the stabilizer of (1,0))
E: Do you know what sort of subgroup this is?
(Is it a Borel subgroup?)
E: Yes it is.
E: What is a tensor product?
(I ask what category we're in. He says whatever, so I do it for vector
spaces. I mention the universal property, and he asks what the elements look
like, so I do that too.)
E: Now we'll take the tensor product of two abelian groups, i.e. Z-modules:
Z/pZ and Z/qZ, p and q distinct primes. What is it?
(I make a guess, and am wrong.)
E: Well let's take an element from it and look at it.
(So I write down a tensor b)
E: Now multiply it by p, what is it?
(0)
E: Why?
E: Now multiply it by q.
(It has to be zero as well. I think for a minute and say "OH!", and say it has
to be the zero element, and thus the group is trivial, and do a quick proof.)
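The quick proof, written out: since p and q are coprime, pick integers m and n with mp + nq = 1; then for any elementary tensor,

```latex
a\otimes b \;=\; (mp+nq)\,(a\otimes b)
\;=\; m\,(pa\otimes b) + n\,(a\otimes qb) \;=\; 0,
```

because pa = 0 in Z/pZ and qb = 0 in Z/qZ. Elementary tensors span, so the whole tensor product is trivial.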
We talk about discriminants of polynomials. I define the discriminant.
E: Why do you have that square there? (I had defined the discriminant
in terms of the roots of the polynomial)
E: How would you compute the discriminant for a quadratic?
(Well I know that, it's b^2-4ac)
E: Well that's right, but does it always have to be a polynomial in the
coefficients?
(I don't manage to answer this one. I didn't review discriminants.)
E: What does this have to do with symmetric polynomials?
(I'm a little off on this one. Eventually he coaxes out of me what he
wanted me to say. We talk about the symmetric and alternating group
acting on the discriminant.)
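For the record, the computation in the monic quadratic case: with roots r_1, r_2 of x^2 + bx + c,

```latex
\Delta \;=\; (r_1-r_2)^2 \;=\; (r_1+r_2)^2 - 4\,r_1 r_2 \;=\; b^2 - 4c ,
```

matching b^2 - 4ac up to the usual normalization by the leading coefficient. And since the discriminant is a symmetric polynomial in the roots, the fundamental theorem of symmetric polynomials expresses it as a polynomial in the elementary symmetric functions, i.e. in the coefficients; the product of the (r_i - r_j) alone is only alternating, and squaring it is exactly what makes it symmetric. This is, I believe, the connection he was coaxing out of me.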
On to functional analysis.
A: What topologies on the bounded operators on a Banach space are there?
(Now I've forgotten, did he say Banach or Hilbert? I ask, and mention
that I know more on the Hilbert space. I mention strong operator,
weak operator, uniform, weak, and ultraweak (and mention that
I only know ultraweak for Hilbert spaces)).
Aizenman gives me an operator on l^2(Z), namely:
D (a_n) = a_{n+1} - 2a_n + a_{n-1}, the discrete laplacian.
He adds to it a multiplication operator V. Now he takes the projection
onto the coordinates [-n,n], call it P_n, and asks in what topology
P_n(D+V)P_n converges to D+V. Nelson asks if V is bounded. I said I wanted
it to be, so it was.
I start proving away, immediately see it's clear for the weak operator
topology, then keep going and show it for the strong operator topology.
Aizenman is happy.
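A numerical sketch of what happened on the board (my own illustration; the particular potential cos(k) and the test vector are made up): truncate the discrete laplacian plus a bounded potential and watch the error ||P_n(D+V)P_n x - (D+V)x|| shrink for a fixed x, which is the strong operator convergence in question.

```python
# Finite-dimensional stand-in for l^2(Z): coordinates -100..100.
# Illustrates strong convergence P_n (D+V) P_n -> D+V on a fixed vector x.
import numpy as np

N = 201
idx = np.arange(N) - 100

# discrete laplacian (D a)_n = a_{n+1} - 2 a_n + a_{n-1} (tridiagonal)
D = np.eye(N, k=1) - 2 * np.eye(N) + np.eye(N, k=-1)
V = np.diag(np.cos(idx.astype(float)))   # some bounded multiplication operator
A = D + V

x = np.exp(-np.abs(idx) / 5.0)           # fixed vector with decaying tail
x /= np.linalg.norm(x)

errs = []
for n in (5, 20, 50, 90):
    P = np.diag((np.abs(idx) <= n).astype(float))  # projection onto [-n, n]
    errs.append(np.linalg.norm(P @ A @ P @ x - A @ x))
# errs decreases toward 0 as n grows: strong (but not norm) convergence
```

The errors decay with n because x has a decaying tail; uniform (norm) convergence fails, since one can always feed P_n(D+V)P_n - (D+V) a unit vector supported beyond coordinate n.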
Now they start asking about the laplacian.
N: Take the laplacian on the real line, with domain smooth functions
with support away from 0. Is it essentially self adjoint?
(I talk about its deficiency indices, calculate them, mess up a little,
eventually correct myself)
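The calculation, for the record: the deficiency subspaces consist of the L^2 solutions of

```latex
u'' \;=\; \pm i\,u
\quad\text{on } (-\infty,0)\ \text{and}\ (0,\infty)\ \text{separately},
```

and for each sign, each half-line carries exactly one decaying exponential e^{-lambda |x|} with lambda^2 = plus or minus i and Re(lambda) > 0. So the deficiency indices are (2,2), and in particular the operator is not essentially self adjoint.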
We talk about the same problem in other dimensions. Specifically 3.
N: Does the Laplacian with domain smooth functions with support
away from 0 (call it D_0) have the full laplacian as its closure?
(Boy do I mess this up. Eventually he coaxes me into showing that
functions on R^3 that are in L^2 and whose laplacian is in L^2 are
continuous (Sobolev embedding). Since they're continuous, everything
in the domain of the closure of D_0 is 0 at 0, and we're done. This
took a while to coax out of me)
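The Sobolev step, spelled out on the Fourier side: if f and its laplacian are in L^2(R^3), then (1+|xi|^2) times the Fourier transform of f is in L^2, and Cauchy-Schwarz gives

```latex
\int_{\mathbb{R}^3}|\hat f|\,d\xi
\;\le\;\Big(\int \frac{d\xi}{(1+|\xi|^2)^{2}}\Big)^{1/2}
\Big(\int (1+|\xi|^2)^{2}\,|\hat f|^2\,d\xi\Big)^{1/2}\;<\;\infty,
```

where the first factor is finite in three dimensions because the integrand decays like |xi|^{-4}. So the Fourier transform is in L^1, and f is (a.e. equal to) a continuous function, which is what forces everything in the closure's domain to vanish at 0.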
A: What can you say about boundary conditions of the laplacian in terms
of Brownian motion?
(I do the semigroup generated by the Dirichlet laplacian in terms of Brownian
motion)
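The formula in question (with the usual factor of 1/2, since Brownian motion has generator half the laplacian): on a domain Omega with exit time tau,

```latex
\big(e^{t\Delta_D/2}f\big)(x)\;=\;\mathbb{E}_x\big[f(B_t)\,;\ t<\tau\big],
\qquad \tau=\inf\{s: B_s\notin\Omega\},
```

i.e. run Brownian motion and kill it when it hits the boundary; the killing is exactly the Dirichlet boundary condition.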
A: Let's go back to the above, and say how what you just said relates to
the case of R^3\0.
(Brownian motion never hits 0)
A: Let's go back to the case R\0. Use Brownian motion to talk about
different self adjoint extensions of the laplacian, thereby giving an
elementary proof that there is more than one.
(I tell how to get Neumann and Dirichlet boundary conditions by using
Brownian motion [reflect it and kill it respectively])
A: talk about the central limit theorem
(I mention the iid formulation, and then go on to mention Lindeberg's
condition. I can't remember what it is; they seem at least okay with that)
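For completeness, the condition I blanked on: for independent mean-zero X_k with s_n^2 the sum of the variances, Lindeberg's condition is

```latex
\frac{1}{s_n^2}\sum_{k=1}^{n}\mathbb{E}\big[X_k^2\,;\ |X_k|>\varepsilon s_n\big]
\;\longrightarrow\;0
\qquad\text{for every }\varepsilon>0,
```

and it implies that S_n/s_n converges in distribution to a standard normal.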
N: I could never remember it either, until I saw it nonstandardly... read
that section in Radically Elementary Probability Theory, and you'll never
forget it again.
A: What is Ascoli's theorem?
(I state it)
N: State something from stochastic calculus
(I write down Ito's formula)
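The formula, in its simplest form (f twice continuously differentiable, B a standard one-dimensional Brownian motion):

```latex
f(B_t)\;=\;f(B_0)+\int_0^t f'(B_s)\,dB_s+\tfrac12\int_0^t f''(B_s)\,ds .
```

The correction term with the half is exactly what the quadratic variation question later in the exam is about: d<B>_t = dt.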
N: What can you say about the generator of a Markovian semigroup, given
that I'll let you have as nice a domain as you want?
(I talk about a theorem of Dynkin that says when it's a diffusion;
this is not what he wants. They write down some specific transition
function. This doesn't help. Eventually he leads me to say that
it's not a diffusion. I forget where all this led.)
N: Talk about quadratic variation of Brownian motion. What does this mean
pathwise?
(I state that it converges in quadratic mean. He wants an almost sure
result, so I say you get one along the dyadic rationals.)
N: That's right, but why do you have to choose a specific sequence for
convergence pointwise a.e.?
(I don't know. Pretty soon I write down the modulus of continuity for
Brownian motion, and they were happy and sent me out of the room and later
came out and told me that I passed.)
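The modulus of continuity I wrote down is Lévy's:

```latex
\limsup_{h\downarrow 0}\;\sup_{0\le t\le 1-h}\;
\frac{|B_{t+h}-B_t|}{\sqrt{2h\log(1/h)}}\;=\;1\qquad\text{a.s.}
```

In particular, the increments over a mesh-h partition are uniformly of size about the square root of h log(1/h), which is the kind of uniform control that makes the pathwise quadratic variation argument along the dyadics work.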
For how rocky things were at times, I was strangely comfortable. They
were all very nice.
Another comment:
I read somewhere on this page to bring a bottle of water to the exam, so I
did, and I am very happy that I did. My throat was killing me after about
an hour.
Book recommendations:
Stochastic Processes:
"Stochastic differential equations, an introduction with applications"
by Øksendal is an easy read. The problems are easy, too. Working through
this book gave me a nice understanding of stochastic calculus and
stochastic differential equations. If you don't know anything about those
topics, but need to, I highly recommend this book.
Durrett's book. I think it was called "Stochastic Calculus: A
Practical Introduction", and it is essentially the same as his book
"Brownian Motion and Martingales in Analysis". The former is newer
and easier to read, but the latter does a little more in the way
of solutions to PDEs in terms of Brownian motion. These books do
some things in that vein that Øksendal doesn't cover, and have
some nice proofs of properties of Brownian motion. Definitely worth
a look, especially for the section on the Schrödinger equation.
"An introduction to the theory of random processes" by Krylov
has nice and easy proofs of such things as Donsker's invariance principle
and Kolmogorov's criterion for continuity. If one does stochastic
processes as a topic, one should know these proofs, and this book
is an easier place to learn them quickly than Karatzas and Shreve.
"Brownian Motion and Stochastic Calculus" by Karatzas and Shreve
I find this book hard to read, but it has results in it
that I didn't see done elsewhere, and so I am glad I looked at it. For
instance, it is where I studied the modulus of continuity for Brownian
motion, which came up on my exam. Plus it also goes into local time
quite a bit, which is worth looking at.
"Martingales and Stochastic Integrals" by P.E.Kopp This book does lots
of nice things for martingales. The stochastic calculus part is better
found elsewhere, though.
"Introduction to Stochastic Processes" by Cinlar. This book has a nice
easy overview of the Poisson process, if you just want to quickly learn
its basic properties, but not spend too much time with it (which is how
I felt about the Poisson process).
"Functional integration in quantum physics" by Barry Simon was recommended
to me by Aizenman. It's worth looking at. It does a lot of Feynman-Kac
type things from the Trotter-product formula perspective, and such
proof methods are useful to know. Also, it talks a little bit about
the semigroups generated by the laplacian with different boundary conditions
in terms of Brownian motion. It doesn't really talk about this as much as
I wanted, but there are a few tangential theorems on it. I ended up
searching online a bit to learn about such things, which was good
since it came up on my exam. Though, I was never able to find a general
discussion of such things.
"Markov Processes" by E.B. Dynkin. I read the first 6 chapters of vol. 1
and a bit of vol 2, and it helped me a great deal. He does a very formal
introduction to Markov processes, which I enjoyed (though I know some
people who like it much less than I do).
Functional Analysis:
I spent most of my functional analysis time with Rudin's "Functional Analysis"
which served me well. But if I had to do it over again, I would spend
most of my time with Reed and Simon, especially since my examiners had a
physics bent. Either way, one should look at both, I think. Reed and
Simon has some parts of the spectral theorem that don't appear in Rudin,
and so should be looked at (like pure point, abs cont, and singular spectrum,
and multiplicity and such things). Also, self adjoint extensions via
quadratic forms are in Reed and Simon, but not in Rudin. Rudin does some
nice Banach algebra stuff that isn't in Reed and Simon. Also, Reed
and Simon makes some comments on the topologies that appear on bounded
operators on a Hilbert space, which came up on my exam.
Also, I read "A Short Course in Spectral Theory" which was a lot of fun,
and quite useful. It's an easy read, and contains some good functional
analysis, not even just things related to spectral theory.
Dunford and Schwartz, vol II was quite useful to me. It says some nice
things about the spectral theorem in the beginning, and I am very glad
I read the chapters on self-adjoint extensions before my exam.