Steven (J) Miller
May 14, 1997 2pm --> Tea (4pm)
Special Topics:
1. Analytic Number Theory
2. Singular Integrals/Harmonic Analysis
Fefferman (Chair), Sarnak, Katz
I was asked what order I would like, and said: Complex, Number Theory,
Algebra, Real, Singular Integrals (it's a good idea to have thought in
advance about what order you would like, and also what you'd like to
talk about, so that if they're kind enough to give you a choice...)
[COMPLEX ANALYSIS]
(S) Talk about the Riemann Mapping Theorem. Why can't you map the plane
into the unit disk? I gave a sketch of the proof, and told them I could
talk about the proof of when you can extend to the boundary, and they
were happy (and didn't want to see the proof).
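The one-line reason the plane is excluded (a standard argument, my summary rather than a transcript of the board work):

```latex
% A holomorphic map of the plane into the disk is a bounded entire
% function, hence constant:
f : \mathbb{C} \to \mathbb{D} \ \text{holomorphic}
  \;\Longrightarrow\; |f| < 1 \ \text{on all of } \mathbb{C}
  \;\Longrightarrow\; f \ \text{constant (Liouville)},
% so there is no nonconstant map of the plane into the unit disk,
% let alone a biholomorphic one.
```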
(S) What do I know about compact Riemann surfaces? I said not too much,
and just gave the definition and mentioned the sphere and torus, zeros
and poles.
(K) When is a meromorphic function equal to the derivative of a
meromorphic function? After being prodded to write down some candidates,
we came to residues and Mittag-Leffler.
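The criterion we arrived at can be summarized as follows (my summary, not verbatim from the exam):

```latex
% A function f meromorphic on a domain \Omega is the derivative of a
% meromorphic function iff every residue of f vanishes.  Near a pole a,
% f(z) = \sum_{k \ge -n} c_k (z-a)^k; every term except c_{-1}(z-a)^{-1}
% has an obvious single-valued antiderivative, while a small loop \gamma
% around a gives \int_\gamma f = 2\pi i \operatorname{Res}_a f, which
% obstructs any antiderivative when c_{-1} \ne 0.  Conversely, subtract
% the principal parts (Mittag-Leffler) and antidifferentiate term by term:
\exists\, F \ \text{meromorphic with } F' = f
  \quad \Longleftrightarrow \quad
  \operatorname{Res}_a f = 0 \ \text{for every pole } a \in \Omega .
```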
(K) Integrate (Sin[x]/x)^2, and this led into my Real Analysis,
which was basically asked about by everyone.
[REAL ANALYSIS]
Talk about integrating (Sin[x]/x)^2. I said that by integrating by
parts, we can reduce it to the standard problem of Sin[x]/x, but got into
trouble when I did this in a Complex Analysis class at Yale. I asked them
if it was OK to solve the problem this way, as this was the Complex part
of my test, and was told that as long as I get it right, it doesn't matter
how I do it.
Integrated by parts, and then talked about contour integration and the
residue theorem. I mentioned the alternate proof, of examining the
function in blocks of length Pi down the real axis. This led to a
discussion of Riemann vs Lebesgue integration, and then the Fresnel
Integrals (Sin[x^2]). One can show the integral exists by using a similar
method of looking at blocks of now varying size, because of the rapid
oscillation (period of Sin[kx] = 2Pi/k, so the period is rapidly decreasing).
After some prodding for a more exact answer, I multiplied by
x/x, changed variables, and was then left with Sin[u]/Sqrt[u].
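A quick numerical sanity check of the two answers above (my addition, not part of the exam; the closed forms are Integral over R of (Sin[x]/x)^2 dx = Pi and Integral[0,oo] Sin[x^2] dx = Sqrt[Pi/8]):

```python
# Numerical sanity check (my addition): Integral over R of (Sin[x]/x)^2 = Pi,
# and the Fresnel integral Integral[0,oo] Sin[x^2] dx = Sqrt[Pi/8].
import math

def integrate(f, a, b, n):
    """Composite midpoint rule on [a, b] with n subintervals (it never
    evaluates f at the endpoints, so u = 0 below is harmless)."""
    h = (b - a) / n
    return h * sum(f(a + (i + 0.5) * h) for i in range(n))

# (Sin[x]/x)^2 decays like 1/x^2, so truncating at |x| = 2000 costs ~5e-4.
sinc_sq = integrate(lambda x: (math.sin(x) / x) ** 2, -2000.0, 2000.0, 200000)

# Sin[x^2]: substitute u = x^2 to get (1/2) Integral[0,oo] Sin[u]/Sqrt[u] du,
# then sum over the blocks [k*Pi, (k+1)*Pi]: consecutive blocks alternate in
# sign and shrink in size, so the tail of the block sum is controlled --
# exactly the "blocks of varying size" argument above.
fresnel = 0.5 * sum(
    integrate(lambda u: math.sin(u) / math.sqrt(u),
              k * math.pi, (k + 1) * math.pi, 500)
    for k in range(2000))

print(sinc_sq)   # close to Pi
print(fresnel)   # close to Sqrt[Pi/8]
```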
I was then asked what I can say about
Integral[-oo,oo] f(x) exp[i*lambda*g(x)] dx. I said if g(x) = x,
then this is just the Fourier Transform, and talked a little about its
properties. For g'(x) never zero, one can multiply by g'(x)/g'(x). I was
then asked to examine the behaviour of 'nice' functions under this. What
if g'(x) vanishes, say polynomially at just one point?
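The answers being steered toward here are the standard stationary phase estimates (stated from memory, not verbatim from the exam):

```latex
% For I(\lambda) = \int_{-\infty}^{\infty} f(x)\, e^{i\lambda g(x)}\, dx,
% with f smooth and compactly supported:
%
% (i) If g' never vanishes on supp f, write
%     e^{i\lambda g} = \frac{1}{i\lambda g'(x)} \frac{d}{dx} e^{i\lambda g}
%     and integrate by parts repeatedly: I(\lambda) = O(\lambda^{-M})
%     for every M.
%
% (ii) If g'(x_0) = 0 with g''(x_0) \neq 0 (nondegenerate stationary
%      point), the main contribution comes from near x_0:
I(\lambda) \;\sim\; f(x_0)\, e^{i\lambda g(x_0) \pm i\pi/4}
   \sqrt{\frac{2\pi}{\lambda\, |g''(x_0)|}}, \qquad \lambda \to \infty.
% (iii) If instead g^{(j)}(x_0) = 0 for j < k and g^{(k)}(x_0) \neq 0
%       (g' vanishing polynomially at one point), the decay degrades
%       to I(\lambda) = O(\lambda^{-1/k}).
```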
[ANALYTIC NUMBER THEORY]
Sarnak asked me what books I had read. Actually, he asked if I had read
Davenport cover to cover, and I said I had read all but the last few
chapters, where for the three prime theorem, Waring's problem, and sieve
methods I had read Nathanson (Additive Number Theory). He asked what I
wanted to talk about, and I mentioned Three Primes, Reciprocals of Twin
Primes and Brun's Sieve, Zeros on the Critical Line, Waring's Problem. I
was asked why we can write all sufficiently large odd numbers as the sum of
three primes, and why these methods aren't applicable to Goldbach (another
good source is
Ellison and Ellison, Prime Numbers, chapter 9).
Talk about Poisson summation, in particular Sum exp[-h*n^2] (I was
informed that this is how one knows which normalization to use for the
Fourier Transform!). What spaces of functions can we use this on? (I
originally was giving weak conditions, but was allowed to move into the
Schwartz space, where I stayed for quite a while). For what functions is
it useful to use Poisson summation? I was then asked when this can be
extended, namely under what conditions you can have:
Sum A_n f(mu*n) = Sum B_n fhat(lambda*n)
but was let off the hook when I said I really had no idea.
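The special case Sum exp[-h*n^2] can be checked numerically; here is a small sketch (my addition), using the normalization fhat(xi) = Integral f(x) exp[-2 Pi i x xi] dx, for which Poisson summation reads Sum f(n) = Sum fhat(n):

```python
# Numerical check of the theta relation above (my addition).  With
# f(x) = exp[-h x^2] and fhat(xi) = Integral f(x) exp[-2 Pi i x xi] dx,
# one has fhat(xi) = Sqrt[Pi/h] exp[-Pi^2 xi^2 / h], so Poisson gives
#   Sum_n exp[-h n^2]  =  Sqrt[Pi/h] * Sum_n exp[-Pi^2 n^2 / h].
import math

def theta_lhs(h, N=200):
    return sum(math.exp(-h * n * n) for n in range(-N, N + 1))

def theta_rhs(h, N=200):
    return math.sqrt(math.pi / h) * sum(
        math.exp(-math.pi ** 2 * n * n / h) for n in range(-N, N + 1))

# For small h the left side needs many terms while the right side needs
# only a few -- which is exactly why the transformation is useful.
for h in (0.1, 1.0, 10.0):
    print(h, theta_lhs(h), theta_rhs(h))
```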
As a follow-up to the three primes, we talked about Waring's problem, in
particular using Weyl's method for the sum of squares. I was having
trouble applying the difference operators and Weyl's method here, and we
soon left this problem.
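For reference, the differencing step of Weyl's method in the square case (a standard identity; only the idea was wanted at the exam):

```latex
% Weyl differencing for S(\alpha) = \sum_{n=1}^{N} e(\alpha n^2),
% where e(t) = e^{2\pi i t}: squaring and substituting n = m + d gives
|S(\alpha)|^2 \;=\; \sum_{n,m=1}^{N} e\big(\alpha(n^2 - m^2)\big)
   \;=\; \sum_{|d| < N} \ \sum_{m \in I_d} e\big(\alpha(2md + d^2)\big),
% with I_d an interval of at most N integers.  The quadratic phase is now
% linear in m, so each inner sum is a geometric progression, bounded by
\Big| \sum_{m \in I_d} e(2\alpha d\, m) \Big|
   \;\le\; \min\big(N, \ \|2\alpha d\|^{-1}\big),
% where \|t\| denotes the distance from t to the nearest integer.
```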
[ALGEBRA]
(K) Talk about Jordan Canonical Form, and apply the results to operators
p(D), where p is a constant coefficient polynomial in d/dx. I had some
trouble with the applications, and was greatly helped.
(K) Write down the definition of a group. As I commented that it is
sufficient to have just a left inverse and a left identity, I was then
asked if I thought it was enough to have just a left identity and a right
inverse, and ventured the answer was 'no'. (The same professor who didn't
like my integrating
by parts at Yale taught us that if it seems like a nice easy result and
you haven't heard of it/read of it in a book, guess no. I figured if it
was true, it would've been in the 'weaker' definitions in Hungerford/Lang/
Jacobson....). Found a counter-example by just playing with sets of two
(maybe it was three) elements.
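A sketch of the kind of counterexample involved (my reconstruction, not necessarily the one found at the board): take x*y = y on a two-element set.

```python
# A two-element magma with a left identity and right inverses that is not
# a group (my reconstruction of the counterexample described above).
# Define x*y = y on S = {0, 1}.
S = [0, 1]
op = lambda x, y: y                  # x*y = y

# Associative: (x*y)*z = z = x*(y*z).
assoc = all(op(op(x, y), z) == op(x, op(y, z))
            for x in S for y in S for z in S)

# e = 0 is a left identity: e*x = x.
e = 0
left_identity = all(op(e, x) == x for x in S)

# Every x has a right inverse: x*e = e.
right_inverse = all(any(op(x, y) == e for y in S) for x in S)

# But e is not a right identity: 1*0 = 0 != 1, so (S, *) is not a group.
two_sided = all(op(x, e) == x for x in S)

print(assoc, left_identity, right_inverse, two_sided)  # True True True False
```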
(S) Under what conditions on a matrix B can you solve exp[A] = B? I
talked about the diagonalizable case first, and then, after some prodding,
moved into writing out the series for log and examining Jordan Canonical
Form/nilpotent matrices.
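A small sketch of the unipotent case (my own example, not the matrices used at the board): for B = I + N with N nilpotent, the log series terminates, and exponentiating the result recovers B.

```python
# Sketch (my own example): for B = I + N with N nilpotent, the series
# log(I + N) = N - N^2/2 + N^3/3 - ... terminates, and exp of the
# resulting A recovers B.
import math
import numpy as np

B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])     # single 3x3 Jordan block, eigenvalue 1
N = B - np.eye(3)                   # nilpotent: N^3 = 0

# log(I + N): the series stops once the powers of N vanish.
A = N - (N @ N) / 2                 # N^3 = 0 kills all remaining terms

# exp(A) by its power series; A is nilpotent, so the extra terms are zero.
expA = sum(np.linalg.matrix_power(A, k) / math.factorial(k)
           for k in range(6))

print(np.allclose(expA, B))         # True
```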
(K) State the structure theorem for finitely generated abelian groups.
[SINGULAR INTEGRALS]
(F) What is your favorite singular integral? I mentioned the Hilbert
Transform. Why does one care about this operator? I talked about pointwise
a.e. convergence of Fourier Series of L2 functions (Carleson's Theorem),
and that the nth partial sum, Snf = f*Dn (convolution with the Dirichlet
kernel Dn), is basically the sum of two Hilbert transforms plus bounded
terms.
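The multiplier behind this can be illustrated numerically (my addition): on the Fourier side, (Hf)^(xi) = -i*sign(xi)*fhat(xi), so H(Cos[kx]) = Sin[kx]; below the same multiplier is applied to periodic samples via the FFT.

```python
# Illustration (my addition) of the Fourier multiplier defining the
# Hilbert transform: (Hf)^(xi) = -i*sign(xi)*fhat(xi), which sends
# Cos[kx] to Sin[kx].  Here the multiplier acts on periodic samples.
import numpy as np

n = 256
x = 2 * np.pi * np.arange(n) / n
f = np.cos(3 * x)

k = np.fft.fftfreq(n, d=1.0 / n)    # integer frequencies 0..n/2-1, -n/2..-1
Hf = np.fft.ifft(-1j * np.sign(k) * np.fft.fft(f)).real

print(np.max(np.abs(Hf - np.sin(3 * x))))   # machine-precision small
```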
----------------------------------------------------------------------------
General Comments:
1. Was a very friendly committee, and they kept the pressure low. Often I
was asked/told to answer their question using only two or three words.
2. I'm the 'other' Steve Miller, the one born in Rochester NY in 1974 who
is studying under Sarnak at Princeton after going to Yale, not the one born
in Rochester NY in 1974 who studied under Sarnak at Princeton who is now
teaching at Yale.
Below is a list of the books I used:
[ANALYTIC NUMBER THEORY]
Below are the main books I used to study Analytic Number Theory. KNOW the
Three Prime Theorem cold, as well as why the methods there do not apply to
Goldbach (proving every even number is the sum of two primes).
Nathanson, Additive Number Theory: The Classical Bases
Springer-Verlag, Graduate Texts in Mathematics 164
An extremely well written exposition on Waring's Problem,
representing numbers as the sum of a fixed number of kth
powers, and the Three Prime Problem, representing all
sufficiently large odd numbers as the sum of three primes.
Ellison and Ellison, Prime Numbers
A slightly more general form of the Three Prime Problem.
Like Nathanson, very readable. A source for interesting special
topics not covered in Davenport; for example:
1. proof that there are infinitely many zeros, not just in the
critical strip, but on the line Re(s) = 1/2
2. oscillation theorems (for example, concerning Li(x)
and Pi(x)).
3. estimates for the size of Zeta and related functions.
Their proofs of the Prime Number Theorem and other results are at times
different from Davenport's, relying on obtaining better bounds on
the Zeta and related functions, and using these explicit estimates.
I preferred their proof of Dirichlet's theorem to Davenport's (chapters 7, 8).
Davenport, Multiplicative Number Theory
A must read, though I found certain books to be better for
Three Primes and proving Dirichlet. Well written in the middle
sections, with functions of finite order, functional equations,
and the prime number theorem. Numerous details and justifications in
going from line to line are left for the reader, unlike the two
books above.
Rose, ???????
Possibly Rosen or Penrose. More basic than the other books, but
has the clearest exposition on the Class Number Formula that is
often used in the proof of Dirichlet, and in obtaining estimates
as to the size of L(1,Chi), which, combined with our knowledge
of the size of its derivative, gives information on the location
of zeros close to s = 1.
Davenport, The Higher Arithmetic
A fast read, nice proof of Quadratic Reciprocity.
[SINGULAR INTEGRALS/HARMONIC ANALYSIS]
Singular Integrals (Stein), Introduction to Fourier Analysis (Stein and
Weiss), Fefferman's proof of Carleson (class notes, paper). For more
introductory level, there was a really good series of papers in 'Studies In
Harmonic Analysis'
[REAL ANALYSIS]
Rudin (R+C), Folland (Real Analysis), Lang (Real and Functional Analysis)
[COMPLEX ANALYSIS]
Rudin (R+C), Ahlfors, Lang (Complex Analysis)
[ALGEBRA]
Hungerford (Algebra), Jacobson (Basic Algebra), the last half of Lang's
Undergraduate Algebra (for Galois Theory)