representation theory


Apologies for the lack of posts here lately; I’ve been meaning to say many things that I simply have not gotten around to. I’ve been taking a course on infinite-dimensional Lie algebras this semester. There are a number of important examples here, most of which I’ve never seen before. This post will set down two of the most basic.

1. The Heisenberg algebra

The Heisenberg Lie algebra {\mathcal{A}} is the Lie algebra with generators {\left\{a_j, j \in \mathbb{Z}\right\}} and another generator {K}. The commutation relations are

\displaystyle [a_j, a_k] = \begin{cases} 0 & \text{ if } j + k \neq 0 \\ j K & \text{ if } j = -k, \end{cases}

and we require {K} to be central. This is a graded Lie algebra with {a_j} in degree {j} and {K} in degree zero.

The Heisenberg algebra is a simple example of a nilpotent Lie algebra: in fact, its commutator subalgebra is the one-dimensional space {\mathbb{C} K}, which is contained in the center.

The factor of {j} in the relation for {[a_j, a_k]} is merely a choice of normalization: by rescaling the {a_j} with {j \neq 0}, we could arrange instead that {[a_j, a_{-j}] = K} for {j > 0}. (The exception is {a_0}: that has to stay central, since it commutes with everything.) However, there is a geometric interpretation of {\mathcal{A}} with the current normalization. We have

\displaystyle [a_j, a_k] = \mathrm{Res}_{t = 0}( t^k \, d t^j)\, K.

Here {\mathrm{Res}_{t=0}} denotes the residue of a differential form at {t = 0}, i.e. the coefficient of {t^{-1}\, dt}; this makes the formula easy to check.
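Explicitly, since {d t^j = j\, t^{j-1}\, dt}, the residue computation reads

\displaystyle \mathrm{Res}_{t=0}( t^k \, d t^j) = \mathrm{Res}_{t=0}\big( j\, t^{j+k-1} \, dt \big) = j\, \delta_{j, -k},

which is exactly the commutation relation {[a_j, a_k] = j\, \delta_{j,-k} K} from above.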

As a result, we can think of {\mathcal{A}} as the Lie algebra of Laurent polynomials plus a one-dimensional space:

\displaystyle \mathcal{A}= \mathbb{C}[t, t^{-1}] \oplus \mathbb{C}K

where {K} is central, and where the Lie bracket of Laurent polynomials {f, g } is

\displaystyle \mathrm{Res}_{t = 0} (g\, df)\, K.

Note that any exact form has residue zero, so {\mathrm{Res}_{t = 0}(fdg) = -\mathrm{Res}_{t=0}(g df)} (by comparing with {d(gf)}). This explains the antisymmetry of the above form. (more…)

A little earlier, we studied invariant theory for the general linear group {GL(V)} for a finite-dimensional vector space {V} over {\mathbb{C}}. We considered the canonical representation on {V^{\otimes p} \otimes V^{* \otimes q}} and studied “invariant polynomials” on this space: that is, polynomials {P: V^{\otimes p} \otimes V^{* \otimes q} \rightarrow \mathbb{C}} constant on orbits. We showed that these formed a finitely generated {\mathbb{C}}-algebra, and indeed gave a set of generators: these were given by pairing a factor of {V} with a factor of {V^*} with respect to the evaluation pairing. This is not, of course, a linear map, but it is a well-defined polynomial map of {p} vector and {q} covector variables.
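Concretely (writing the argument as a tuple of vectors and covectors, as in the last sentence), the generating invariants are the pairings

\displaystyle P_{ij}(v_1, \dots, v_p; \phi_1, \dots, \phi_q) = \phi_j(v_i), \qquad 1 \leq i \leq p, \ 1 \leq j \leq q,

each of which is visibly {GL(V)}-invariant and polynomial (indeed bilinear) in its arguments.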

1. Introduction

Now we want to consider a more general question. Let {G} be an (affine) algebraic group over {\mathbb{C}}, acting on the finite-dimensional vector space {V}. We’d like to ask what the invariant polynomials on {V} are, or in other words what {(\mathrm{Sym} V^*)^G} is. Hilbert’s fourteenth problem asked (in somewhat greater generality) whether this “ring of invariants” is always finitely generated. The general answer turns out to be no, by a counterexample of Nagata, but we will show that the answer is yes when {G} is reductive.
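A familiar example to keep in mind (here {G} is finite, hence reductive): the symmetric group {S_n} acting on {\mathbb{C}^n} by permuting coordinates has

\displaystyle (\mathrm{Sym}\, (\mathbb{C}^n)^*)^{S_n} = \mathbb{C}[x_1, \dots, x_n]^{S_n} = \mathbb{C}[e_1, \dots, e_n],

the polynomial algebra on the elementary symmetric functions, which is certainly finitely generated.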

What is a reductive group? For our purposes, a reductive group over {\mathbb{C}} is an algebraic group {G} such that the category {\mathrm{Rep}(G)} of (algebraic) finite-dimensional representations is semisimple. In other words, the analog of Maschke’s theorem is true for {G}. The “classical groups” (the general linear, special linear, orthogonal, and symplectic groups) are all reductive. There is a geometric definition (which works in characteristic {p} too), but we will just take this semisimplicity as the definition.

The semisimplicity is quite a surprising phenomenon, because the method of proof of Maschke’s theorem, the averaging process, fails here: a positive-dimensional reductive group is never compact in the complex topology (as then it would not be an affine variety). However, it turns out that a reductive group {G} over {\mathbb{C}} contains a maximal compact Lie subgroup {K} (which is not algebraic, e.g. the unitary group in {GL_n}), and the category of algebraic representations of {G} is equivalent (in the natural way) to the category of continuous representations of {K}. The latter category is semisimple by the same averaging idea as in Maschke’s theorem, with integration against a Haar measure on {K} in place of the finite sum; hence {\mathrm{Rep}(G)} is semisimple as well.

Anyway, here’s what we wish to prove:

Theorem 1 Let {G} be a reductive group over {\mathbb{C}} acting on the finite-dimensional vector space {V}. Then the algebra of invariant polynomials on {V} is finitely generated. (more…)

With the semester about to start, I have been trying to catch up on more classical material. In this post, I’d like to discuss a foundational result on the ring of invariants of the general linear group acting on polynomial rings: that is, a description of generators for the ring of invariants.

1. The Aronhold method

Let {G} be a group acting on a finite-dimensional vector space {V} over an algebraically closed field {k} of characteristic zero. We are interested in studying the invariants of the ring of polynomial functions on {V}. That is, we consider the algebra {\mathrm{Sym} V^*}, which has a natural {G}-action, and the subalgebra {(\mathrm{Sym} V^*)^G}. Clearly, we can reduce to considering homogeneous polynomials, because the action of {G} on polynomials preserves degree.

Proposition 1 (Aronhold method) There is a natural {G}-isomorphism between homogeneous polynomial functions of degree {m} on {V} and symmetric, multilinear maps {V \times \dots \times V \rightarrow k} (where there are {m} factors).

Proof: It is clear that, given a multilinear, symmetric map {g: V \times \dots \times V \rightarrow k}, we can get a homogeneous polynomial of degree {m} on {V} via {v \mapsto g(v, v, \dots, v)}, by restriction along the diagonal imbedding. The inverse operation is called polarization. I don’t much feel like writing it out, so here’s a hand-wavy argument.
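For completeness, here is one standard formula for the polarization (this is where characteristic zero is used): given a homogeneous polynomial {P} of degree {m}, set

\displaystyle g(v_1, \dots, v_m) = \frac{1}{m!} \sum_{\emptyset \neq S \subseteq \{1, \dots, m\}} (-1)^{m - |S|}\, P\Big( \sum_{i \in S} v_i \Big),

which one checks is symmetric and multilinear, and which recovers {P} on the diagonal.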

Or we can think of it more functorially. Symmetric, multilinear maps {V \times \dots \times V \rightarrow k} are the same thing as symmetric {k}-linear maps {V^{\otimes m} \rightarrow k}; these are naturally identified with maps {\mathrm{Sym}^m V \rightarrow k}. So what this proposition amounts to saying is that we have a natural isomorphism

\displaystyle \mathrm{Sym}^m V^* \simeq (\mathrm{Sym}^m V)^*.

But this is eminently reasonable, since there is a functorial isomorphism {(V^{\otimes m})^* \simeq (V^{*})^{\otimes m}}, and passing to the {m}th symmetric power on either side can be interpreted either as taking invariants or as taking coinvariants for the symmetric group action (these agree in characteristic zero). Now, if we are given the {G}-action on {V}, one can check that the polarization and diagonal imbeddings are {G}-equivariant. \Box

2. Schur-Weyl duality

Let {V} be a vector space. Now we take {G = GL(V)} acting on a tensor power {V^{\otimes m}}; this is the {m}th tensor power of the tautological representation on {V}. However, we have on {V^{\otimes m}} not only the natural action of {GL(V)}, but also the action of {S_m}, given by permuting the factors. These in fact commute with each other, since {GL(V)} acts by operators of the form { A\otimes A \otimes \dots \otimes A} and {S_m} acts by permuting the factors.

Now the representations of these two groups {GL(V)} and {S_m} on {V^{\otimes m}} are both semisimple. For {S_m}, it is because the group is finite, and we can invoke Maschke’s theorem. For {GL(V)}, it is because the group is reductive, although we won’t need this fact. In fact, the two representations are complementary to each other in some sense.
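A minimal example to keep in mind is {m = 2} (with {\dim V \geq 2}): here

\displaystyle V^{\otimes 2} = \mathrm{Sym}^2 V \oplus \wedge^2 V,

where {S_2} acts by {+1} on the first summand and by {-1} on the second, while each summand is an irreducible representation of {GL(V)}. The two actions pick out complementary pieces of the same decomposition.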

Proposition 2 Let {A \subset \mathrm{End}(V^{\otimes m})} be the algebra generated by {GL(V)}, and let {B \subset \mathrm{End}(V^{\otimes m})} be the subalgebra generated by {S_m}. Then {A, B} are the centralizers of each other in the endomorphism algebra. (more…)

[Minor corrections made, 6/21]

I’d now like to discuss my paper “Categories parametrized by schemes and representation theory in complex rank.” My RSI project was rather open-ended: to investigate the categories of representation theory in complex rank. Pavel Etingof told me that it would be expected that they would behave similarly in some ways to the integral case (at least if “there was justice in the world”). For instance, we know that Deligne’s {\mathrm{Rep}(S_t)} has a combinatorial parametrization of simple objects similar to the classical case. However, as I discovered when I got there, I don’t actually know representation theory. I had looked through some material on finite groups, and knew (in outline, not usually proofs) the basic facts about the symmetric group. I certainly didn’t know anything about Hecke algebras (the literature on which seems rather inaccessible to beginners), and I don’t think I could define a semisimple Lie algebra. So what I did, therefore, was the easy case: representation theory in transcendental rank. I sort of ended up stumbling into this by accident, so I’ll try to reconstruct the story below, somewhat. I apologize in advance to readers who know algebraic geometry and will probably find this post rather slow-moving (it’s really written with a younger version of myself in mind). Readers may wish, however, to review my earlier posts on this topic.

(more…)

We have now discussed some of the basic properties of Deligne’s categories {\mathrm{Rep}(S_t)}, and some of the rich structure that they have. It turns out, as I have already mentioned, that Deligne did the same for representation categories of the other classical groups.

Knop described how to do it for the wreath products, obtaining categories {\mathrm{Rep}(S_t \ltimes G^t)} for {t \in \mathbb{C}}; here the central object is the “standard representation” {\mathfrak{h}_G} of {\mathbb{C}}-valued functions on {\coprod_{i=1}^n G} (a disjoint union of {n} copies of {G}), which has a natural action of {S_n \ltimes G^n}. The representation {\mathfrak{h}_G} is faithful, and again one uses its tensor powers and a combinatorial parametrization of its morphisms to interpolate. For the details in much more generality, see Knop’s paper; he actually constructs tensor categories via the calculus of relations out of arbitrary “regular categories.” (My paper has a brief exposition of how things play out in the special case of {\mathrm{Rep}(S_t \ltimes G^t)}.) These categories, like Deligne’s, are semisimple symmetric tensor categories.

It turns out, however, that many families of algebraic objects of interest in representation theory depend on a parameter {n \in  \mathbb{Z}_{\geq 0}}, and are built out of the corresponding (i.e., depending on {n}) classical groups (i.e. symmetric, orthogonal, etc.). One example is the family of algebras {S_n \ltimes A^{\otimes n}} for {A} an associative algebra. This is a rather simple one; a more complicated one is given by the family of Hecke algebras. The additional relations and generators corresponding to the part of these objects not in the classical groups can, however, often be stated in a uniform, categorical manner independent of {n}.

Using this, Etingof proposed a program of studying the representation categories of these objects in complex rank, which he constructed out of Deligne’s categories. I will briefly explain what this is all about. Consider the example of the family of semidirect product algebras; it’s simpler than what Etingof focuses on, but I’d be horrendously unqualified to really say anything about the more complicated ones. (more…)

[This post, a continuation of the series on representation theory in complex rank, discusses the irreducibles in Deligne’s category \mathrm{Rep}(S_t) for t \notin \mathbb{Z}_{\geq 0} and what one can do with them.]

OK, so we now know that Deligne’s categories {\mathrm{Rep}(S_t)} are semisimple when {t \notin  \mathbb{Z}_{\geq 0}}. But this looks like a paradox: Deligne’s categories, a family of categories constructed to interpolate the semisimple categories of representations of {S_n, n \in \mathbb{Z}_{\geq  0}}, are semisimple precisely on the complement of the nonnegative integers!

The resolution is that, when {t \in \mathbb{Z}_{\geq 0}}, {\mathrm{Rep}(S_t)} is not equivalent to the ordinary category {\mathrm{Rep}^{\mathrm{ord}}(S_t)}: the formal morphisms of Deligne’s category (spanned by equivalence relations) no longer correspond to linearly independent morphisms of honest representations. Deligne in fact shows that the ordinary category can be obtained as a quotient of his {\mathrm{Rep}(S_t)} (via the tensor ideal of “negligible morphisms”), but this isn’t really important for the story I’m telling.

1. Motivation and remarks

Today, I want to talk about what the simple objects in {\mathrm{Rep}(S_t), t \notin \mathbb{Z}_{\geq 0}}, look like. We know what the simple {S_n}-representations are; they are the Specht modules, parametrized by the Young diagrams of size {n}. It turns out that the simple objects in {\mathrm{Rep}(S_t)} are parametrized by the Young diagrams of arbitrary size. There is an interesting way of thinking about this that Etingof explains in his talk, and which I will try to motivate here now.

OK. So, just as we defined a filtration on Deligne’s categories yesterday, let’s define a filtration on the ordinary representation categories {\mathrm{Rep}^{\mathrm{ord}}(S_n), n \in \mathbb{Z}_{\geq  0}}. Namely, we let {\mathrm{Rep}^{\mathrm{ord}}(S_n)^{(N)}} denote the category generated by {\mathfrak{h}^{\otimes p}, p \leq  N}, for {\mathfrak{h} = \mathbb{C}^n} the permutation representation. When {N} is large enough, this becomes the full category, so we will always pretend that {n} is really really large relative to {N} (which is kinda ironic when you think about the notation…).

Anyhow, we want to look at the simple objects in {\mathrm{Rep}^{\mathrm{ord}}(S_n)^{(N)}}. Well, these are going to have to correspond to some Young diagrams of size {n}, but the question is which ones?

I claim that the Young diagrams that arise are precisely those with at most {N} boxes (in total) below the top row.
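For instance, granting this claim, when {N = 1} the only diagrams that occur are {(n)} and {(n-1,1)}: the simple objects of {\mathrm{Rep}^{\mathrm{ord}}(S_n)^{(1)}} are just the trivial representation and the standard {(n-1)}-dimensional representation, which is what one expects since {\mathfrak{h} \simeq \mathbf{1} \oplus \mathrm{std}}.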

In particular, as {n} gets large, the top row must get really long, but the number of simple objects stays bounded. (more…)

This post continues my series on representation theory in complex rank, begun here with a discussion of Deligne’s interpolation of the representation categories of the symmetric group, introduced in his 2004 paper.

Semisimplicity is the basic structure theorem for Deligne’s categories, and I would be extremely remiss in my discussion of representation theory in complex rank if I did not say something about it.

So, let’s review. In the first post, I explained and motivated the definition of Deligne’s categories {\mathrm{Rep}(S_t)}. Incidentally, Deligne did the same for the other classical groups, i.e. {GL_n, O_n, Sp_{2n}}, but I shall not discuss them. The categories {\mathrm{Rep}(S_t)} are defined as the pseudo-abelian envelope of the {\mathbb{C}}-linear category generated by objects {\mathfrak{h}^{\otimes p}}, where the hom-spaces {\hom( \mathfrak{h}^{\otimes p}, \mathfrak{h}^{\otimes r})} are free on the equivalence relations on a set of {p+r} elements, and composition is given by a combinatorial expression which is polynomial in the rank {t} (hence interpolable).
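Recall the classical fact motivating this description: if {\mathfrak{h} = \mathbb{C}^n} is the permutation representation of {S_n} and {n \geq p + r}, then

\displaystyle \dim \hom_{S_n}( \mathfrak{h}^{\otimes p}, \mathfrak{h}^{\otimes r}) = B_{p+r},

the Bell number counting equivalence relations on a set with {p+r} elements; the morphisms attached to the individual equivalence relations form a basis, and it is this basis that Deligne’s category keeps at arbitrary rank {t}.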

Now, we just have an abstract category with formal objects and morphisms corresponding in no obvious way to anything concrete. To prove it is semisimple, we therefore cannot use techniques such as those in the proofs of Maschke’s theorem or Weyl’s complete reducibility theorem.

But we can do it by appealing to what I discussed in the second post of this series: by proving that the endomorphism rings are semisimple and the category is nonnilpotent. In fact, since direct products and factor rings of semisimple rings are semisimple, we only need to prove that the algebras {\hom_{\mathrm{Rep}(S_t)}(\mathfrak{h}^{\otimes p}, \mathfrak{h}^{\otimes p})} are semisimple (in addition to nonnilpotence). This endomorphism ring (depending on the size {p} and the rank {t}) is an important object, called the partition algebra, and you can look it up e.g. here. But I don’t know how to prove directly that the partition algebra is semisimple. So I will follow Deligne (and Knop) in the (inductive) proof (which will also imply semisimplicity of the partition algebra).

I will do this in two steps. First, I will use a little bit of combinatorics to show that when {t \notin \mathbb{Z}_{\geq 0}}, the category {\mathrm{Rep}(S_t)} is nonnilpotent. Next, I will use this to prove semisimplicity.

(more…)

I was initially going to talk about why Deligne’s categories of representations of the symmetric group on a nonintegral number of elements are semisimple generically. This is a rather difficult result, and takes quite a bit of preparation in his paper.  However, I got sidetracked. Instead, I will devote this post to a general discussion of semisimple categories.  According to the material here, it follows that in order to show that Deligne’s categories are semisimple, one has to show that the so-called “partition algebra” is a semisimple ring.


1. Review of semisimple categories

Before we specialize to the case of Deligne’s categories, it may help to go through a little abstract nonsense. Suppose {\mathcal{C}} is a semisimple category. This means that {\mathcal{C}} is abelian, and each object in {\mathcal{C}} is a direct sum of simple objects, where simple means having no nonzero proper subobject. So for instance, the {A}-modules for {A} a semisimple algebra form a semisimple category. The finite-dimensional representations of a semisimple Lie algebra form a semisimple category (though the finite-dimensional condition is necessary; the enveloping algebra is not in general a semisimple algebra).

Now, I want to look at the hom-spaces in a semisimple category. But first, in the next lemma, there is no need for the semisimplicity assumption, so I drop it.

Remember Schur’s lemma, that lemma in group representation theory which says that any morphism between irreducible representations over {\mathbb{C}} is a scalar? The proof of it in different textbooks tends to vary between nonintuitive and clean (depending on the extent of the allegiance of said textbook to category theory). When thought of categorically, I claim, it is essentially trivial.

Lemma 1 (Schur, categorical version) Let {X} be a simple object in a {\mathbb{C}}-linear abelian category with finite-dimensional hom-spaces. Then {\hom(X,X) \simeq  \mathbb{C}}. Also, {\hom(X,Y) =0} if {X,Y} are simple and nonisomorphic.

So, let’s prove this. We will first show that any morphism between simple objects {X,Y} is either zero or an isomorphism. If a morphism were nonzero but not an isomorphism, it would have either a nonzero kernel or a nonzero cokernel; in the first case {X} would have a nonzero proper subobject (the kernel), and in the second {Y} would have one (the image). Neither can happen for simple {X,Y}.

It is now clear that {\hom(X,Y) = 0} when {X,Y} are nonisomorphic, because a nontrivial morphism would be an isomorphism by the above.

Well, then {\hom(X,X)} is a ring where every nonzero element is invertible, that is to say, a division algebra. It is also finite-dimensional over {\mathbb{C}} by the assumption on the hom-spaces. But every finite-dimensional division algebra over {\mathbb{C}} is {\mathbb{C}} itself; indeed, if {\alpha \notin \mathbb{C}} belonged to such a division algebra, then {\mathbb{C}(\alpha)} would be a finite extension field (yes, commutative: {\alpha} commutes with itself!) and this cannot happen since {\mathbb{C}} is algebraically closed.

In particular, {\hom(X,X) = \mathbb{C}}. This proves Schur’s lemma. Not entirely trivial, but at least swift.

So that’s done. I claim then that, in a semisimple category {\mathcal{C}}, the endomorphism ring {\hom(X,X)} of any object is ring-isomorphic to a product of matrix algebras over {\mathbb{C}}. This is now straightforward: decompose {X} as a sum of simple objects {S_1 \oplus S_2 \oplus \dots \oplus S_k}. Partition {S_1, \dots, S_k} into equivalence classes based on isomorphism and take the sums {T_j, 1 \leq j  \leq l} of the {S_i} in each equivalence class. By Schur’s lemma, each {T_j} has endomorphism ring isomorphic to a matrix algebra, and {\hom(T_i, T_j) = 0} for {i \neq j}, so the claim is clear.
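Spelled out: if {X \simeq U_1^{\oplus n_1} \oplus \dots \oplus U_l^{\oplus n_l}} with the {U_j} pairwise nonisomorphic simple objects, then Schur’s lemma gives

\displaystyle \hom(X, X) \simeq \prod_{j=1}^{l} M_{n_j}(\mathbb{C}),

since a map between two copies of the same simple object is a scalar, and maps between nonisomorphic simples vanish.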

In particular, the hom-rings of a semisimple category are—surprise, surprise—semisimple algebras!

2. What if the hom-spaces are semisimple?

The 45-million-dollar question now arises whether the converse might be true. In fact, I think it is, with certain hypotheses: this isn’t really about Deligne’s paper anymore, but it’s something that I learned from Friedrich Knop’s very interesting paper “Tensor envelopes of regular categories.” Knop actually generalizes Deligne’s construction and axiomatizes it, constructing large classes of interesting tensor categories (such as representation categories of wreath products {S_t \ltimes  G^{t}} for {G} finite and {t} complex). I may talk more about Knop’s paper later, but right now I am just using it as a source of some fun abstract nonsense.

(more…)

[Updated, 6/12; various errors fixed]

I’ve just uploaded to arXiv my paper (submitted to J. of Algebra) “Categories parametrized by schemes and representation theory in complex rank,” an outgrowth of my RSI project started last summer, where I worked with Pavel Etingof and Dustin Clausen.  I will devote this post to talking about some of the story surrounding it. In short, the project is about looking at this program of studying representation theory when the dimension is complex (admittedly nobody has ever seen a vector space of dimension {\pi}; I will explain this precisely below) in the simplest possible case.  But the categories of interest in the project are built out of certain symmetric tensor categories that Deligne defined back in 2004, and I’ll talk a bit about those today. I could have just jumped straight into my paper, but I figured this would make things potentially more accessible, and would be more fun.

I also recommend looking at these posts of David Speyer and Noah Snyder, which talk about some of Deligne’s work as well (and which I learned a lot from). Also, cf. this talk of Pavel Etingof. The talk goes further (into the non-semisimple analogs of Deligne’s categories) than I will today; I hope to cover that in a later post. Finally, the paper of Deligne is available here.

1. Motivation

The whole story behind this starts with the representation theory of the classical groups—these are objects like {S_n, GL(n),  O(n)}, etc. And in particular, I’m going to zoom in on the symmetric group—or more precisely, the family of symmetric groups {S_n, n \in \mathbb{Z}_{\geq 0}}.

The symmetric group is a very complicated object (indeed, any finite group is a subgroup of a symmetric group, by Cayley’s theorem), but its representation theory has been understood for 100 years and has many interesting combinatorial facets.

In the modern language, we can package the entire representation theory of {S_n} into a category {\mathrm{Rep}^{\mathrm{ord}}(S_n)} (depending on the nonnegative integer {n}). This is a very interesting category for several reasons. The first, and most obvious, part of its structure is that it is a {\mathbb{C}}-linear abelian category.

More interestingly, it’s semisimple: every exact sequence splits. This is because the group algebra {\mathbb{C}[S_n]} is semisimple, by Maschke’s theorem. In addition, it is a tensor category: we can define the tensor product of any two representations of a group in a natural way, and {S_n} is no exception. It is even a symmetric tensor category because we have a nice isomorphism {X \otimes Y \rightarrow Y \otimes X} for any two representations {X,Y}.

Technically, all this works for any finite group. What’s special about the symmetric group is, for instance, the very nice way the simple objects of {\mathrm{Rep}^{\mathrm{ord}}(S_n)} (i.e. irreducible representations) are parametrized. Namely, (as for every finite group) they are in bijection with the conjugacy classes of {S_n}, but (unlike for other groups) we have an explicit map between such conjugacy classes and irreducible representations. Since each conjugacy class of {S_n} corresponds to a partition of {n} (a well-known fact easily seen because any permutation can be written as a product of disjoint cycles), the irreducible representations of {S_n} are parametrized by partitions of {n}, or equivalently by Young diagrams with {n} boxes.
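For example, for {n = 3} the three partitions {(3), (2,1), (1,1,1)} correspond respectively to the trivial representation, the two-dimensional standard representation, and the sign representation of {S_3}.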

The whole idea behind Deligne’s work is that, while there isn’t any such thing as a symmetric group on {\pi} elements, there is nevertheless a category {\mathrm{Rep}(S_\pi)} (or more generally {\mathrm{Rep}(S_t)} for {t \in  \mathbb{C}}) that has much of the same structure. Deligne constructed these categories via an interpolation procedure.

2. Interpolation

(more…)

So, the blog stats show that semisimple Lie algebras haven’t exactly been popular.  Traffic has actually been unusually high, but people have been reading about the heat equation or Ricci curvature rather than Verma modules.  Which is interesting, since I thought there was a dearth of analysts in the mathosphere.  At MathOverflow, for instance, there have been a few complaints that everyone there is an algebraic geometer.     Anyway, there wasn’t going to be that much more I would say about semisimple Lie algebras in the near future, so for the next few weeks I plan random and totally disconnected posts at varying levels (but loosely related to algebra or algebraic geometry, in general).

I learned a while back that there is a classification of the simple modules over the semidirect product of a group with a commutative algebra, which works the same way as in the more specific case of a group with an abelian group.  The result for abelian groups rather than commutative rings appears in a lot of places, e.g. Serre’s Linear Representations of Finite Groups or Pavel Etingof’s notes.  I couldn’t find a source for the more general result, though.  I wanted to work that out here, though I got a bit confused near the end, at which point I’ll toss out a bleg.

Let {G} be a finite group acting on a finite-dimensional commutative algebra {A} over an algebraically closed field {k} of characteristic prime to the order of {G}. Then the irreducible representations of {A} correspond to maximal ideals in {A}, or equivalently (by Hilbert’s Nullstellensatz!) to homomorphisms {\chi: A \rightarrow k}, called characters. In other words, {A} acts on a 1-dimensional space via the character {\chi}. (more…)
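As a toy example (my own, not from the post): take {A = k[x]/(x^2)}. Its unique maximal ideal is {(x)}, so the only character is {\chi(x) = 0}, and indeed the only irreducible {A}-module is the 1-dimensional one on which {x} acts by zero.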
