Michael Barz's Generals Committee: Bhargav Bhatt (chair), Akshay Venkatesh, Bjoern Bringmann
Special topics: algebraic geometry and representation theory of compact Lie groups
23 April 2025, 2:00 P.M. - 4:30 P.M.

PRELUDE

[my responses in brackets]

Bhatt: How should we start? [Complex analysis]
Bringmann: If we start with analysis, let's start with real. [OK]

REAL ANALYSIS

Bringmann: Define Banach space.
Bringmann: What does weak convergence mean? [For every continuous linear functional f, f(v_n) converges to f(v).]
Bringmann: Let's get more specific. Let's work in L^p. What are the continuous functionals on L^p? [L^q; I said the word Holder and he became happy.]
Bringmann: OK, now look at L^4(R) and the sequence f_n(x) = sin(n^4 x)/(n^3 x). Let's prove it weakly converges to 0 but doesn't converge to 0. First show it doesn't converge to 0. [If you substitute y = n^4 x you can rewrite the L^4 norm as \int (sin(y)/y)^4 dy, which is some constant. I was fortunately told I didn't need to evaluate the constant or prove it was nonzero.]
Bringmann: OK, now show it weakly converges to 0. [I started by finding that if p = 4 then q = 4/3. Then I was stuck looking at why \int f_n g converges to 0. I tried changing variables again and using Holder's.]
Bringmann: Holder's is scaling invariant, so changing variables will not help. However, sometimes Holder *will* help. When is f_n in L^p? [I waxed poetic for a little about inclusions of L^p spaces.]
Bringmann: No no no, do it concretely. Also your philosophy is wrong; this is the correct one.
Venkatesh: Oh, my one analysis question was about inclusions of L^p spaces, now you spoiled it, Bringmann.
[Finally I just wrote down the same change of variables and saw my function was in L^p for any 1 < p < 4.]
Bringmann: OK, set p = 3. [Huh, so if instead I had an L^(3/2) function I would get convergence, but idk.]
Bringmann: Use the word `density'. [OH, compactly supported smooth functions are dense in both.]
Bringmann: Why does density work? Use something already on the board. [The L^4 norms of the f_n are all the same, and in particular bounded, so if g - g_m goes to 0 in L^(4/3) then \int f_n (g - g_m) will be bounded by ||f_n||_{L^4} times something going to 0, which will still go to 0.]
Bringmann: OK, let me change questions.
Venkatesh: Wait, wait, draw a graph of this function f_n in very schematic terms. [I drew some oscillating thingy that peaked at height n at x = 0.]
Venkatesh: OK, you said something about the vertical scaling; now tell me about the horizontal scaling. When does it have norm less than 1? [I fiddled around before saying n^(-3), which made Akshay happy, but Bringmann said n^(-4) was a better horizontal scaling to use, and they talked for a bit about this.]
Bringmann: OK, now let's change questions. State Plancherel's theorem for S^1. [I wrote it using \sum_n |a_n|^2 = \int |f|^2, but he told me to use the notation \hat{f}(n) for Fourier coefficients instead.]
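For the record, here is roughly how the density argument above fits together (a sketch, in the notation of the problem). By the substitution y = n^4 x,

||f_n||_{L^p}^p = \int |sin(n^4 x)/(n^3 x)|^p dx = n^{p-4} \int |sin(y)/y|^p dy,

so ||f_n||_{L^4} is a fixed nonzero constant while ||f_n||_{L^3} -> 0. Given g in L^{4/3}, choose smooth compactly supported g_m -> g in L^{4/3}; then

|\int f_n g| <= ||f_n||_{L^4} ||g - g_m||_{L^{4/3}} + ||f_n||_{L^3} ||g_m||_{L^{3/2}},

where the first term is small uniformly in n and the second goes to 0 as n -> \infty for each fixed m, so f_n -> 0 weakly in L^4.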
Bringmann: A closed and bounded subset of R^n is compact. Imagine you have a closed and bounded subset F of L^2(S^1) [when he first said this I didn't hear the word `subset,' and neither did Akshay, and we were both very confused until I asked Bringmann to repeat the question]. It is not always compact. But suppose it obeys the following property: \lim_{N \to \infty} \sup_{f \in F} \sum_{|n| > N} |\hat{f}(n)|^2 = 0. Prove then that it is compact. [I said at first that Arzela-Ascoli is proved using a diagonal argument, so maybe I should do that with Fourier coefficients.]
Bringmann: It works, but it is easier to instead show that for any epsilon I can cover it with finitely many epsilon balls; it is the same trick, but notationally better to do this than to find a convergent subsequence. [OK, if epsilon is given there is exactly one thing to do: pick N large so that for all f \in F, \sum_{|n| > N} |\hat{f}(n)|^2 < \epsilon. I got confused here for a little and eventually said wait, let me work in \ell^2(Z); then it all made sense. I added a /2 to the epsilon above, covered the bounded set of coefficients with |n| \le N by finitely many epsilon/2 balls, and then the bound above takes care of the rest of the Fourier coefficients.]
Bringmann: Good! What if I replaced S^1 with R? [The suspicious part is where I used finitely many balls to cover the first N coefficients, since now R has continuously many Fourier modes.]
Bringmann: Yes. Bonus question: can you come up with a concrete counterexample? [Uhh]
Bringmann: Think of some functions whose Fourier coefficients decay rapidly and uniformly. [Uhh, well, to get rapid decay I can use a C^\infty function by integrating by parts.]
Bringmann: Use that R is very long. [Uhh, wait, maybe I make functions of compact support with larger and larger support?]
Bringmann: You can also keep the support the same size.... [Oh! I take one function on [0, 1] and translate it around!]
Bringmann: Yes! [Oh! Translation multiplies the Fourier coefficients by a thing of norm 1, so we'll get uniform decay.]

COMPLEX ANALYSIS

Bringmann: Should I start?
Venkatesh: Ooh, I have a question! You said that if you have a C^\infty function on S^1 its Fourier coefficients decay faster than any polynomial. What can you say about a real analytic function? [Uhhh]
Venkatesh: Define real analytic function. [I did; he corrected my definition to remind me the power series depends on the point, and then I drew a little neighborhood of the circle.]
Venkatesh: Hint: this is the complex analysis section. [OK, interpret it as a holomorphic function on this little ring region, given by the same power series. Then the Fourier coefficients are \int_{S^1} f(x) e^{-inx} dx.]
Venkatesh: OK, rewrite that by viewing S^1 inside of C. [OK, so \int_{|z| = 1} f(z)/z^n dz.]
Venkatesh: Move the contour outside a little. [OK, so if you move to |z| = 1 + \epsilon then you go down by a factor of (1 + \epsilon)^n. I stared, confused, and then he told me I had proven what he wanted: now instead of decaying faster than any polynomial, the coefficients decay exponentially.]
Bringmann: This was a very nice segue between real and complex. Now my question: prove Liouville's theorem. [I proved it; I hesitated a little but then fixed my mistakes and wrote it down.]
Bringmann: OK, surely you know the residue theorem, right? [Yes]
Bringmann: OK, good. Now using the residue theorem prove the fundamental theorem of algebra in the following way. Write p(z) a polynomial of degree n, and look at the integral over the circle of radius R of z^(n-1)/p(z). [I wrote down the integral, said that if p(z) had no zeroes then |p(z)| > \epsilon for all z, but then struggled to bound the integral. With some prodding from Akshay and Bringmann I wrote down z^(n-1)/p(z) = 1/(az + b + o(1)) and then integrated this successfully.]
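Spelled out, that last residue-theorem argument is roughly the following sketch. Write p(z) = a z^n + b z^{n-1} + ... with a \ne 0. If p has no zeroes, then z^(n-1)/p(z) is holomorphic on all of C, so \oint_{|z| = R} z^(n-1)/p(z) dz = 0 for every R. On the other hand, z^(n-1)/p(z) = 1/(az + b + o(1)) as |z| -> \infty, so as R -> \infty the integral tends to \oint_{|z| = R} dz/(az) = 2\pi i/a \ne 0, a contradiction.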
Bringmann: Good, now we do something number theorists like. Write f of... I think number theorists like s.
Venkatesh: Yes we do. [Knowing what was coming, I wrote down f(s) = \sum_{n=1}^{\infty} a_n/n^s.]
Bringmann: OK, so you know it. Change infinity to an N to make it easier. Prove that if x is not an integer, then a_1 + a_2 + ... + a_{\lfloor x \rfloor} = \lim_{t \to \infty} (2\pi i)^{-1} \int_{B - it}^{B + it} f(s) x^s/s \, ds. [I wrote dx at first but Akshay corrected me. I was stuck, but Bringmann suggested I move the contour of integration. I wrote out a_n/s \cdot (x/n)^s and said OK, in the regimes x > n and x < n, (x/n)^s has different behaviors, so maybe I should move the contour to places where Re(s) is very positive or very negative. This is basically the idea, but I struggled a little on showing the other two sides of the rectangle have negligible contributions and on remembering that there was a pole at s = 0. Eventually I got it with lots of help.]

This was the last analysis question. Bhargav asked if I wanted to break now; I said maybe we do algebra first and break before the special topics. The committee approved my request.

ALGEBRA

Venkatesh: Write down a Z/3 extension of Q. Actually, don't do it concretely, just tell me how you'd do it. [I mention that the Galois group of a cubic is either A_3 = Z/3 or S_3, and you can determine which it is by looking at the discriminant.]
Venkatesh: What is the discriminant? [I wrote it down in terms of the roots and mentioned how even/odd permutations affect it.]
Venkatesh: How do you compute it for a polynomial without factoring the polynomial? [I said Vieta's formulas, and then he was pleased and said I need not go further.]
Bhatt: Tell me about GL_2(Z/p^n). [Uhh, let me do n = 1 first.]
Bhatt: OK, what is its order? [(p^2 - 1)(p^2 - p)]
Bhatt: Write down a p-Sylow subgroup. [It has size p, so it must be Z/p; oh, so the upper triangular unipotent matrices (1 t; 0 1).]
Bhatt: OK, now GL_2(Z/p^2), what is its order? [Reduce mod p; OK, so p^4 (p^2 - 1)(p^2 - p). At first I wrote (p - 1)^4, but then Bhargav pointed out to me that there are p numbers in the range 0, 1, ..., p - 1.]
Bhatt: OK, let's find the p-Sylow subgroup. Write down what you just did with reduction conceptually. [I wrote a short exact sequence 1 -> K -> GL_2(Z/p^2) -> GL_2(F_p) -> 1.]
Bhatt: What's the kernel K? [I described it concretely and showed it was (Z/p)^4.]
Bhatt: What is the resulting 4-dimensional representation of GL_2(F_p) you produced? [I was confused.]
Bhatt: OK, more conceptually: let S be a smooth scheme over Z/p^2. What can you say? [I was confused.]
Bhatt: OK, now try S smooth over F_p[epsilon]. [OH! The set of points living over a given F_p point is the tangent space!]
Bhatt: Good. [OH! So this 4-dimensional representation is the adjoint representation of GL_2(F_p) on gl_2(Z/p^2).]
Bhatt: No, it's gl_2(F_p). OK, now what's the p-Sylow? [Um, it has order p^5 and contains (Z/p)^4, but idk if it's split.]
Venkatesh: I don't think he's asking for it as an abstract group.
Bhatt: Yeah, just say what it is. [OK, the preimage of the p-Sylow from GL_2(F_p).]
Bhatt: Good. For p > 5 it's not a split extension of Z/p by (Z/p)^4. Akshay, do you know a good reason why?
Venkatesh: Some reason.
Venkatesh: OK, suppose I have a real symmetric matrix. Tell me about diagonalizing it. [I tell him the eigenspaces are orthogonal and you can unitarily diagonalize it.]
Venkatesh: Prove it. [Reduce to the case where A is nilpotent; show nilpotent + self-adjoint implies zero.]
Venkatesh: How would you diagonalize this using a computer? [Uhhh]
Venkatesh: It's OK if you don't know.
Bhatt: Ask ChatGPT.
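For what it's worth, a standard answer to the computer question is to call a symmetric eigensolver from a numerical linear algebra library (LAPACK, here via numpy); a minimal sketch, with a made-up example matrix:

import numpy as np

# A made-up real symmetric matrix.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is the routine for symmetric/Hermitian matrices: it returns real
# eigenvalues (in ascending order) and an orthogonal matrix of eigenvectors.
eigenvalues, Q = np.linalg.eigh(A)

# Check the diagonalization: Q^T A Q = diag(eigenvalues) and Q^T Q = I.
assert np.allclose(Q.T @ A @ Q, np.diag(eigenvalues))
assert np.allclose(Q.T @ Q, np.eye(3))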
Bhatt: Do you know Cayley-Hamilton? [Yes]
(Silence.)
Bhatt: OK, tell us it! [Oh: the characteristic polynomial of a matrix A is p(t); Cayley-Hamilton says p(A) = 0.]
Bhatt: Prove it. [You can compute directly, but that's hard for me. For diagonalizable matrices you can compute easily, and diagonalizable matrices are Zariski dense, so we're OK.]
Venkatesh: Why are they Zariski dense? [A^(n^2) is irreducible, so it suffices to show they contain a nonempty open set; they contain the locus where the discriminant of the characteristic polynomial does not vanish, which is open and of course nonempty.]

== We take a break for five minutes. Akshay and Bringmann leave the room; Bhargav asks how I think it's going. I say I think I should have done better on complex; he says I did fine on the analysis, and we talk about Matt Emerton and Plein Air for a bit. ==

Venkatesh and Bringmann return. Special topics! [Can we do algebraic geometry first?]

ALGEBRAIC GEOMETRY

Bhatt: Embed a genus 1 curve in projective space. [If it has a point then I can embed it in P^2; I do the whole O(2P_0), O(3P_0), ... discussion and at some point he is satisfied.]
Venkatesh: Why is it called an elliptic curve? [I mention elliptic integrals and Abel-Jacobi.]
Bhatt: What if it's over R and has no R-points? Then can you embed it in P^2? [Uhh... wait, no, because then it would have a degree 3 divisor by Riemann-Roch, but every divisor has even degree.]
Bhatt: OK, now over a finite field. This isn't actually a question. [??? I say a few things and tell him idk.]
Bhatt: This isn't a question, but you can always find an F_p point on a genus 1 curve. [I start trying to figure out how to do this, get stuck, and then we move on.]
Bhatt: Can I embed a genus 1 curve in a higher P^N as a complete intersection? [idk, but adjunction seems relevant, let me do that. I write down the adjunction formula, find that d_1 + ... + d_(N-1) = N + 1, say the only solution is setting every d_i = 1, get confused because for N = 2 this is false, and then Bhargav corrects me and I realize I solved this equation poorly.]
Bhatt: Yeah, so you have a ton of 1's, but if you have a 1 it was really an embedding into a smaller P^N, so N = 2 and N = 3 are the only `real' cases.
Bhatt: OK, now give an example of a smooth affine curve which has nontrivial Picard group. [Huh, so I'm looking for a Dedekind domain which is not a UFD. I wrote down a random guess of an elliptic curve, and he tells me to compute the class group geometrically. I write down the right exact sequence relating it to the class group of the projective closure.]
Bhatt: OK. Can Pic(X) be finitely generated? [I struggled for a while and mentioned H^1_et(X, mu_n) = n-torsion in Pic(X). He says work over C; I do, and prove no, because it's uncountable but finitely generated abelian groups are countable. He says OK, good; for countable fields you can make the same theorem true if you use the etale cohomology a bit more judiciously.]
Bhatt: OK, good, so you have an example now. [REMARK: After the exam he told me he was originally going to ask if I could get a subset of P^1 as an example. I don't know.]
Bhatt: Surfaces. If I blow up X at a point, how does the cohomology change? [It doesn't, because R\pi_* O_{\tilde{X}} = O_X.]
Bhatt: What assumptions are you making? [Um, nonsingular projective.]
Bhatt: Good. But this is local, so you really only need nonsingular. Prove it. [I recite the proof from Hartshorne, with the key points being E^2 = -1 and that E has genus 0.]
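The key step of that Hartshorne computation, roughly (a sketch from memory of the argument in the surfaces chapter): the sheaf R^1\pi_* O_{\tilde{X}} is supported at the blown-up point, and by the theorem on formal functions it vanishes provided H^1(nE, O_{nE}) = 0 for all n. The exact sequences

0 -> O_E(-nE) -> O_{(n+1)E} -> O_{nE} -> 0

reduce this to line bundles on E: O_E(-nE) has degree -n E^2 >= 0, and E has genus 0, so H^1 of O_E and of each O_E(-nE) vanishes, and inductively H^1(nE, O_{nE}) = 0. (This is also why the cone example below, with E^2 = -2 and E of genus 0, still satisfies the vanishing, and why an exceptional curve of positive genus is how you break it.)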
Bhatt: OK, now come up with an example where this fails. [Uh, let me write something down. I write x^2 + y^2 + z^2 = 0. He asks me what it is in the blowup, I compute, and (rats) the vanishing still holds for this example, because E^2 = -2 and E has genus 0.]
Bhatt: In fact the only way to make trouble is to have an exceptional divisor of nonzero genus. How do you do that? [Uhh]
Bhatt: Was it a coincidence that you got the same equation for the blowup and the exceptional divisor? [Uhh]
Bhatt: Cone. [OH, this is the affine cone over something, so when we blow up the cone point of course we get our projective thing. I write down the cone over an elliptic curve and Bhatt is happy.]

REPRESENTATION THEORY

Venkatesh: What are the irreps of SU(2)? [I list them and their characters.]
Venkatesh: Why are they self-dual? [chi = chi bar]
Venkatesh: Why is chi bar the character of the dual? What property of SU(2) are you using? [I write down the contragredient representation, say its character is chi(g^{-1}), and mention SU(2) compact --> G-invariant Hermitian form --> eigenvalues are roots of unity. Akshay corrects me: a root of unity is a complex number which is torsion; I should have said something on the unit circle.]
Venkatesh: OK. Now which of these irreps admit symmetric bilinear G-invariant forms? Skew-symmetric? [I mention the Frobenius-Schur indicator, Sym^2 and Alt^2, and with some prodding he gets me to write down a skew-symmetric form on the standard representation (the symplectic form) and says the important thing is the S in SU(2). Then we write down a symmetric form on Sym^2(std) and I mention polarization and Schur-Weyl duality.]
Venkatesh: Did you study higher rank groups? (Yes.) How does SO(4) relate to SU(2)? [SU(2) \times SU(2) is a double cover of SO(4); spin.]
Venkatesh: Let's do this concretely. Write down a 4-dimensional representation of SU(2) \times SU(2). [I do std \otimes std; we relate it to the bilinear form discussion from before to show the resulting matrices are orthogonal.]
Venkatesh: OK, cool. Tell me about U(n). [I say the weights and the Weyl character formula.]
Venkatesh: Decompose U(n) acting on its Lie algebra. [I do the case of U(3) first as an example, then I see what's going on with some help from Akshay; when I say the magic words `Weyl character formula' he decides it's OK to stop.]
Venkatesh: Do you know the Haar measure on SO(3)? [No]
Venkatesh: Do you know about orthogonality of matrix coefficients? [No]
Venkatesh: OK. Rep theory over!
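To spell out the std \otimes std discussion a little (a sketch; I am glossing over picking out the real points): on V = C^2 \otimes C^2, SU(2) \times SU(2) acts by (g, h)(v \otimes w) = gv \otimes hw. Each factor carries the skew form \omega(v, w) given by the determinant of the 2x2 matrix with columns v and w, which is invariant precisely because of the S in SU(2), and B = \omega \otimes \omega is then a symmetric, nondegenerate, invariant bilinear form on V (skew times skew is symmetric), so the image of SU(2) \times SU(2) consists of matrices orthogonal for B. The kernel consists of pairs (g, h) acting trivially on every v \otimes w, which forces g and h to be the scalar matrices \pm 1 with equal sign, so the kernel is {\pm(1, 1)}. Since both SU(2) \times SU(2) and SO(4) are 6-dimensional and SO(4) is connected, this realizes SU(2) \times SU(2) as a double cover of SO(4).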
== They discussed for about 30 seconds, then Bhargav shook my hand and said I passed. I drank the cup of water I brought with me (which I barely touched during the exam) in one big pour. Bringmann shook my hand too. Akshay asked me to rate him on Yelp; he said his daughter rates him after he helps her with math homework, and he started bad and got better. Bhargav said Beilinson's daughter once explained to him how Beilinson was not very helpful at aiding in high school math homework. Bringmann and Akshay leave, and I talk to Bhargav about our next meeting. He tells me that after he passed his generals, he played Mario Kart. I tell him I will go to New York City over the weekend to celebrate.

THOUGHTS:

The committee was very nice and didn't want details if they thought I understood something. They gave lots of hints. I studied a ton of general theory which in the end was not asked about at all; every question felt like some specific example. Especially if you're doing algebraic geometry as a special topic, I'd say just read chapters 4 and 5 of Hartshorne and see all the examples in a lot of detail.

I was disappointed that rep theory didn't go beyond SU(2), as I spent the month rehearsing rank 2 groups over and over again. It did help me realize immediately that SU(2) \times SU(2) is a double cover of SO(4) when Akshay asked, though. I spent a lot of time rehearsing all the proofs of the basic facts about representation theory from Adams' book, but before the exam Akshay told me he didn't remember how to prove that the Weyl group is generated by reflections, and during the exam he never asked me about anything beyond these examples involving SU(2). I think in the end I should have just spent all my time on examples.

I did much better on the real analysis section than I expected! I am happy I remembered Chicago's analysis sequence; Holder's inequality saved me.

Also, the best part is getting to study with other people, so talk to the other first years about generals!