So, suppose we are given a root system ${\Phi}$ in a euclidean space ${E}$, arising from a semisimple Lie algebra and a Cartan subalgebra as before. The first goal of this post is to discuss the “splitting”

$\displaystyle \Phi = \Phi^+ \cup \Phi^-$

(disjoint union) in a particular way, into positive and negative roots, and the basis decomposition into simple roots. Here ${\Phi^- = - \Phi^+}$.

To do this, choose ${v \in E}$ such that ${(v, \alpha) \neq 0}$ for all ${\alpha \in \Phi}$; such a ${v}$ exists because the finitely many hyperplanes ${\alpha^{\perp}}$ cannot cover ${E}$. Then define ${\Phi^+}$ to be those roots ${\alpha}$ with ${(v,\alpha)>0}$ and ${\Phi^-}$ those with ${(v,\alpha) < 0}$. This was easy. We talked about positive and negative roots before using a real-valued linear functional; here that functional is simply ${(v, \cdot)}$.
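To make the splitting concrete, here is a small Python sketch (my own illustration, not from the original discussion) using the ${A_2}$ root system modeled as ${\{e_i - e_j : i \neq j\}}$ in ${\mathbb{R}^3}$:

```python
# Sketch: split the A2 root system, modeled as {e_i - e_j : i != j}
# in R^3, using a vector v with (v, alpha) != 0 for every root alpha.

def dot(u, w):
    return sum(a * b for a, b in zip(u, w))

# the six roots e_i - e_j, i != j, as integer tuples
roots = []
for i in range(3):
    for j in range(3):
        if i != j:
            r = [0, 0, 0]
            r[i], r[j] = 1, -1
            roots.append(tuple(r))

v = (3, 2, 1)  # regular: (v, alpha) != 0 since the entries are distinct
assert all(dot(v, a) != 0 for a in roots)

positive = sorted(a for a in roots if dot(v, a) > 0)
negative = sorted(a for a in roots if dot(v, a) < 0)

# the splitting is disjoint, and Phi^- = -Phi^+
assert set(negative) == {tuple(-x for x in a) for a in positive}
print(positive)  # [(0, 1, -1), (1, -1, 0), (1, 0, -1)]
```

Any other regular ${v}$ would do; different choices give (finitely many) different splittings, which is the point of the Weyl chambers discussed below.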

Bases

OK. Next, I claim it is possible to choose a linearly independent set ${\Delta \subset \Phi^+}$ such that every root is a combination

$\displaystyle \alpha = \sum k_i \delta_i, \quad \delta_i \in \Delta, \ k_i \in \mathbb{Z}$

with all the ${k_i \geq 0}$ or all the ${k_i \leq 0}$.

Then ${\Delta}$ will be called a base. It is not unique, but I will show how to construct one below.

An amusing application of the condition on the simple roots is that ${(\alpha, \beta) \leq 0}$ when ${\alpha, \beta \in \Delta, \alpha \neq \beta}$; otherwise, if ${(\alpha,\beta)>0}$ we would have

$\displaystyle s_{\alpha}(\beta) = \beta - 2\frac{ (\beta, \alpha)}{(\alpha, \alpha)} \alpha \in \Phi$

but this has ${\beta \in \Delta}$ occurring with coefficient ${1}$ and ${\alpha}$ occurring with the negative coefficient ${-2(\beta,\alpha)/(\alpha,\alpha)}$. This contradicts the hypothesis on simple roots, namely that the coefficients all have the same sign.
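Just to illustrate the reflection formula (a sketch of my own, in the ${A_2}$ model ${\{e_i - e_j\} \subset \mathbb{R}^3}$), one can check that each ${s_{\alpha}}$ permutes ${\Phi}$:

```python
# Sketch: s_alpha(beta) = beta - 2 (beta, alpha)/(alpha, alpha) * alpha,
# checked against the A2 root system {e_i - e_j : i != j} in R^3.
from fractions import Fraction  # exact arithmetic

def dot(u, w):
    return sum(a * b for a, b in zip(u, w))

def reflect(alpha, beta):
    c = Fraction(2 * dot(beta, alpha), dot(alpha, alpha))
    return tuple(b - c * a for a, b in zip(alpha, beta))

roots = {(1, -1, 0), (-1, 1, 0), (1, 0, -1),
         (-1, 0, 1), (0, 1, -1), (0, -1, 1)}

# each reflection s_alpha sends the root system to itself
for a in roots:
    for b in roots:
        assert reflect(a, b) in roots
```

For instance, with ${\alpha = e_1 - e_2}$ and ${\beta = e_2 - e_3}$ one has ${(\beta, \alpha) = -1}$, so ${s_{\alpha}(\beta) = \beta + \alpha = e_1 - e_3}$, again a root.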

Anyway, to construct ${\Delta}$, first choose ${v}$ as above and the corresponding splitting ${\Phi = \Phi^+ \cup \Phi^-}$. Let ${\Delta}$ denote the set of positive roots that cannot be written as the sum of two positive roots—in other words, the indecomposable ones. I claim ${\Delta}$ spans ${E}$. Since ${\Phi}$ spans ${E}$, it is enough to check that each positive root is a sum of indecomposable ones. If not, choose ${\alpha \in \Phi^+}$ with ${(\alpha, v)}$ minimal among positive roots that are not sums of indecomposable ones. In particular ${\alpha}$ is not itself indecomposable, so we can write ${\alpha = \beta + \beta'}$ for ${\beta, \beta' \in \Phi^+}$. Since ${(\beta, v), (\beta', v) < (\alpha, v)}$, minimality implies ${\beta, \beta'}$ are sums of indecomposable roots; thus so is ${\alpha}$, contradiction.
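Continuing the ${A_2}$ sketch from above (again my own illustration), the indecomposable positive roots can be found by brute force:

```python
# Sketch: compute the indecomposable positive roots of A2,
# modeled as {e_i - e_j : i != j} in R^3, with v = (3, 2, 1).

def dot(u, w):
    return sum(a * b for a, b in zip(u, w))

def vadd(u, w):
    return tuple(a + b for a, b in zip(u, w))

roots = [(1, -1, 0), (-1, 1, 0), (1, 0, -1),
         (-1, 0, 1), (0, 1, -1), (0, -1, 1)]
v = (3, 2, 1)
positive = [a for a in roots if dot(v, a) > 0]

# Delta = positive roots not expressible as a sum of two positive roots
delta = [a for a in positive
         if not any(vadd(b, c) == a for b in positive for c in positive)]

print(sorted(delta))  # [(0, 1, -1), (1, -1, 0)]
# pairwise inner products of distinct elements are <= 0
assert all(dot(a, b) <= 0 for a in delta for b in delta if a != b)
```

Here ${e_1 - e_3 = (e_1 - e_2) + (e_2 - e_3)}$ is the one decomposable positive root, so ${\Delta = \{e_1 - e_2, e_2 - e_3\}}$.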

Next, I claim that ${\Delta}$ as constructed above via indecomposable roots satisfies ${(\alpha, \beta) \leq 0}$ for distinct ${\alpha, \beta \in \Delta}$, which we showed was necessary for ${\Delta}$ to be a base. If ${(\beta, \alpha)>0}$, then ${\beta - \alpha}$ is a root (cf. the discussion of maximal strings here). If ${\beta - \alpha \in \Phi^+}$, then ${\beta = (\beta - \alpha) + \alpha}$ is decomposable; similarly, if ${\beta - \alpha \in \Phi^-}$, then ${\alpha = (\alpha -\beta) + \beta}$ is decomposable. Either way we contradict ${\alpha, \beta \in \Delta}$. This will let us prove linear independence. Indeed, if we had an expression

$\displaystyle \sum a_i \delta_i = \sum b_j \delta'_j$

where ${\{\delta_i\}}$ and ${\{\delta'_j\}}$ are disjoint subsets of ${\Delta}$ and ${a_i, b_j \geq 0}$, not all zero, then writing ${\lambda}$ for the common value and taking the inner product of ${\lambda}$ with itself yields

$\displaystyle 0 \leq (\lambda, \lambda) = \sum_{i,j} a_i b_j (\delta_i, \delta'_j) \leq 0$

since each term of the sum is ${\leq 0}$ by the previous paragraph. Hence ${\lambda = 0}$. But taking inner products with ${v}$ then gives ${\sum a_i (\delta_i, v) = 0}$ with each ${(\delta_i, v) > 0}$, forcing all the ${a_i}$ (and likewise all the ${b_j}$) to vanish, a contradiction. Since any linear dependence among ${\Delta}$ can be rearranged into such an expression, ${\Delta}$ is linearly independent.

Incidentally, this resembles the argument in Sylvester’s theorem on quadratic forms.

Proposition 1 ${\Delta}$ as just defined is a base.

We will denote it by ${\Delta(v)}$ to emphasize the dependence on ${v \in E}$.

It is in fact the case that any base ${\Delta}$ can be obtained in such a fashion. To see this, we will construct ${v}$ such that ${(\alpha, v) > 0}$ for all ${\alpha \in \Delta}$ (and consequently ${v}$ is orthogonal to no member of ${\Phi}$). I claim then that ${\Delta = \Delta(v)}$. Indeed, ${\Delta}$ leads to a decomposition ${\Phi = \Phi^+ \cup \Phi^-}$, where ${\Phi^+}$ consists of the roots that are nonnegative integral combinations of ${\Delta}$ and ${\Phi^-}$ of those that are nonpositive ones. This splitting is the same one obtained by taking inner products with ${v}$ as at the beginning of this post—i.e., ${\Phi^+}$ consists of those roots that bracket positively with ${v}$. Now each element of ${\Delta}$ is indecomposable in ${\Phi^+}$: if ${\delta = \beta + \beta'}$ with ${\beta, \beta' \in \Phi^+}$, then expanding ${\beta}$ and ${\beta'}$ in ${\Delta}$ would write ${\delta}$ as a nonnegative integral combination of ${\Delta}$ whose coefficients sum to at least ${2}$, contradicting linear independence. So ${\Delta \subset \Delta(v)}$, the set of indecomposable elements of ${\Phi^+}$, which is itself a base by the arguments of Proposition 1. A basis of ${E}$ contained in another basis must equal it, so ${\Delta = \Delta(v)}$.

It now remains to construct ${v}$ from ${\Delta}$, and in fact we can do it for any basis of ${E}$. For each ${\alpha \in \Delta}$, define ${v_{\alpha}}$ to be the component of ${\alpha}$ orthogonal to the span of ${\Delta - \{ \alpha \}}$, i.e. ${\alpha}$ minus its projection onto that span. Then ${v_{\alpha}}$ is orthogonal to ${\Delta - \{ \alpha \}}$, while ${(\alpha, v_{\alpha}) = (v_{\alpha}, v_{\alpha}) > 0}$ because ${\alpha}$ does not lie in the span of the others. So ${v = \sum_{\alpha} v_{\alpha}}$ brackets positively with all of ${\Delta}$.

Proposition 2 Any base can be written as ${\Delta(v)}$ for some ${v \in E}$ orthogonal to no element of ${\Phi}$.
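The construction of ${v}$ above is mechanical enough to code. Here is a sketch with exact rational arithmetic (the Gram–Schmidt step is my own implementation choice for computing the projection):

```python
# Sketch: build v with (v, alpha) > 0 for every simple root alpha,
# by summing the components v_alpha of each alpha orthogonal to the
# span of the other simple roots. A2 model {e_i - e_j} in R^3.
from fractions import Fraction  # exact arithmetic

def dot(u, w):
    return sum(a * b for a, b in zip(u, w))

def sub(u, w):
    return tuple(a - b for a, b in zip(u, w))

def scale(c, u):
    return tuple(c * a for a in u)

def orth_component(alpha, others):
    # alpha minus its orthogonal projection onto span(others);
    # Gram-Schmidt orthogonalizes `others` so the projections add up
    basis = []
    for b in others:
        for q in basis:
            b = sub(b, scale(Fraction(dot(b, q), dot(q, q)), q))
        basis.append(b)
    for q in basis:
        alpha = sub(alpha, scale(Fraction(dot(alpha, q), dot(q, q)), q))
    return alpha

delta = [(1, -1, 0), (0, 1, -1)]  # a base of A2

parts = [orth_component(a, [b for b in delta if b != a]) for a in delta]
v = tuple(sum(col) for col in zip(*parts))

print([float(x) for x in v])  # [1.5, 0.0, -1.5]
assert all(dot(v, a) > 0 for a in delta)
```

Here ${v = (3/2, 0, -3/2)}$, which indeed pairs positively with both simple roots and is orthogonal to no root of ${A_2}$.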

Next, there is another result:

Proposition 3 Given ${\alpha \in \Phi^+}$, we can write ${\alpha = \delta_1 + \dots + \delta_r}$ with each ${\delta_i \in \Delta}$ (not necessarily distinct) such that each partial sum ${\delta_1 + \dots + \delta_k}$ is a positive root.

We can write ${\alpha}$ as a sum ${\sum_{\delta \in \Delta} k_{\delta} \delta}$ where ${k_{\delta} \in \mathbb{Z}_{\geq 0}}$. Induct on the degree ${\sum k_{\delta}}$; if it is one, then ${\alpha \in \Delta}$ already. If not, note that we cannot have ${(\alpha, \delta) \leq 0}$ for all ${\delta \in \Delta}$: by the inner-product argument used for linear independence above, ${\{ \alpha \} \cup \Delta}$ would then be linearly independent, which is absurd since ${\Delta}$ already spans ${E}$. So we can choose ${\delta \in \Delta }$ with

$\displaystyle (\alpha, \delta) > 0.$

As a result ${\alpha - \delta \in \Phi}$ (again by the string argument), and it is moreover a positive root: since the degree is at least two and ${2\delta}$ is not a root, some simple root occurs in ${\alpha - \delta}$ with positive coefficient, so all of its coefficients must be nonnegative. We can apply the inductive hypothesis to ${\alpha - \delta}$ and use ${\alpha = (\alpha - \delta) + \delta}$.
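The proof of Proposition 3 is effectively an algorithm; here is a sketch of my own in the ${A_2}$ model:

```python
# Sketch: write a positive root of A2 as an ordered sum of simple
# roots so that every partial sum is again a positive root.

def dot(u, w):
    return sum(a * b for a, b in zip(u, w))

def sub(u, w):
    return tuple(a - b for a, b in zip(u, w))

positive = {(1, -1, 0), (0, 1, -1), (1, 0, -1)}  # Phi^+ for A2
delta = [(1, -1, 0), (0, 1, -1)]                 # the simple roots

def decompose(alpha):
    # return [d_1, ..., d_r], simple roots summing to alpha,
    # with every partial sum d_1 + ... + d_k a positive root
    if alpha in delta:
        return [alpha]
    for d in delta:
        if dot(alpha, d) > 0 and sub(alpha, d) in positive:
            return decompose(sub(alpha, d)) + [d]
    raise ValueError("not a positive root")

seq = decompose((1, 0, -1))
print(seq)  # [(0, 1, -1), (1, -1, 0)]

# check the partial sums
partial = (0, 0, 0)
for d in seq:
    partial = tuple(a + b for a, b in zip(partial, d))
    assert partial in positive
```

The recursion mirrors the induction: peel off a simple root ${\delta}$ with ${(\alpha, \delta) > 0}$, decompose ${\alpha - \delta}$, and append ${\delta}$ at the end.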

Weyl chambers

Consider ${Y = \{ v\in E: (v,\alpha) \neq 0 \ \forall \alpha \in \Phi \}}$; this is the complement of the union of finitely many hyperplanes. Each ${v \in Y}$ corresponds to a base ${\Delta(v)}$, and it is clear that ${\Delta(v) = \Delta(v')}$ if ${v,v'}$ are sufficiently close to one another, so ${v \mapsto \Delta(v)}$ is locally constant on ${Y}$. In particular, by Proposition 2, the connected components of ${Y}$ are in one-to-one correspondence with the bases of ${\Phi}$.

These connected components are called Weyl chambers. Next, we’ll define a group that permutes the Weyl chambers.
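In the ${A_2}$ model one can count the chambers directly: the six orderings of the coordinates of ${v}$ give one regular vector per chamber, and each chamber yields a distinct base (a sketch of my own):

```python
# Sketch: enumerate the bases of A2 by sampling one regular vector
# per Weyl chamber, i.e. one per ordering of the coordinates of v.
from itertools import permutations

def dot(u, w):
    return sum(a * b for a, b in zip(u, w))

def vadd(u, w):
    return tuple(a + b for a, b in zip(u, w))

roots = [(1, -1, 0), (-1, 1, 0), (1, 0, -1),
         (-1, 0, 1), (0, 1, -1), (0, -1, 1)]

def base_of(v):
    pos = [a for a in roots if dot(v, a) > 0]
    # indecomposable positive roots, as in the earlier sketch
    return frozenset(a for a in pos
                     if not any(vadd(b, c) == a for b in pos for c in pos))

bases = {base_of(v) for v in permutations((3, 2, 1))}
print(len(bases))  # 6: one base per Weyl chamber of A2
```

The sign of ${(v, e_i - e_j)}$ depends only on whether ${v_i > v_j}$, so the six orderings really do exhaust the chambers here.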

I learned this from Shlomo Sternberg’s book.