This is the last homework set of the term. It is due Friday, April 27, 2012, at the beginning of lecture, but I am fine collecting it during dead week, if that works better.
Recall that, given a group $G$, the commutator of $a,b\in G$ is $[a,b]=aba^{-1}b^{-1}$. The derived group $[G,G]$ is the subgroup of $G$ generated by the commutators,
$$[G,G]=\langle\, [a,b]\mid a,b\in G\,\rangle.$$
- Suppose that $n\ge 2$. In class we saw that $[S_n,S_n]\le A_n$, and that equality holds for $n\ge 5$. Determine (with a proof or a counterexample) whether equality holds for all $n$.
[Here is one possible approach: Can you write any product of two transpositions as a commutator? You may want to consider two cases, depending on whether the two transpositions are disjoint or not. Do these products generate $A_n$, or what else is needed?]
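For instance (just to illustrate the hint, with the commutator convention above): if $\sigma$ and $\tau$ are transpositions, then $\sigma^{-1}=\sigma$ and $\tau^{-1}=\tau$, so $[\sigma,\tau]=(\sigma\tau)^2$. Taking the overlapping pair $\sigma=(1\,2)$ and $\tau=(2\,3)$, the product $\sigma\tau$ is a $3$-cycle, and so is its square $[\sigma,\tau]$; in this way a $3$-cycle can be realized as a commutator of two transpositions.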
- The book lists, for each small value of $n$, all groups (up to isomorphism) of order $n$; see pages 210-213. Pick a nonabelian group $G$ from that list, and show that
$$[G,G]=\{[a,b]\mid a,b\in G\}.$$
(Note the difference with the definition above. Here we are saying that every element of $[G,G]$ is a commutator, rather than simply saying that it is a product of commutators, as in the definition.)
It is not true that for every group $G$, the elements of $[G,G]$ are commutators. The smallest counterexample has order 96, so such groups are not that easy to identify.
- As an extra credit problem, show that if $|G|<96$, then any element of $[G,G]$ is a commutator.
Here I sketch an argument (due to Phyllis Joan Cassidy) showing an explicit (infinite) counterexample. For additional references, you may want to see this MathOverflow question, or this Math.StackExchange question.
Suppose that $f$ is a polynomial in $x$, that $g$ is a polynomial in $y$, and that $h$ is a polynomial in $x$ and $y$; we are allowing $h$ to depend on only one of the two variables, or to be constant, and similarly with $f$ and $g$.
Define $M(f,g,h)$ to be the matrix
$$M(f,g,h)=\begin{pmatrix}1 & f & h\\ 0 & 1 & g\\ 0 & 0 & 1\end{pmatrix}.$$
Let $G$ be the set of all these matrices.
- Show that the matrix inverse of $M(f,g,h)$ is $M(-f,-g,fg-h)$.
- Show that the product of $M(f_1,g_1,h_1)$ and $M(f_2,g_2,h_2)$ is the matrix $M(f_1+f_2,\,g_1+g_2,\,h_1+h_2+f_1g_2)$, and conclude that $G$ is a group (with matrix multiplication as the operation).
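As a quick check of these formulas with some illustrative choices of entries: $M(x,0,0)\,M(0,y,0)=M(x,y,xy)$ while $M(0,y,0)\,M(x,0,0)=M(x,y,0)$, so the order of the factors matters and $G$ is not abelian. Also, $M(f,g,h)\,M(-f,-g,fg-h)=M(0,0,\,h+(fg-h)+f\cdot(-g))=M(0,0,0)$, the identity matrix, in agreement with the previous item.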
- Show that the commutator $[M(f_1,g_1,h_1),M(f_2,g_2,h_2)]$ is $M(0,0,\,f_1g_2-f_2g_1)$.
- Define $N$ to be the (abelian) group of matrices of the form $M(0,0,h)$, with $h$ a polynomial in $x$ and $y$. Note that the operation is particularly simple here: $M(0,0,h_1)\,M(0,0,h_2)=M(0,0,h_1+h_2)$. Conclude that $N$ is isomorphic to the (additive) group of polynomials in $x$ and $y$. Also, show that $[G,G]\le N$.
- In fact, $[G,G]=N$. To see this, we must show that, for any polynomial $h$ in $x$ and $y$, we have that $M(0,0,h)$ is a product of commutators. Prove this by showing that, if $h(x,y)=\sum_{i=0}^{n} x^i\,p_i(y)$ for some polynomials $p_0(y),\dots,p_n(y)$, then
$$M(0,0,h)=\prod_{i=0}^{n}\bigl[M(x^i,0,0),\,M(0,p_i(y),0)\bigr].$$
(As usual, you may want to try a few examples first to make sure you understand this formula.)
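For example, take the illustrative choice $h(x,y)=2x+x^3y^2$, so that $p_1(y)=2$, $p_3(y)=y^2$, and all other $p_i$ are $0$ (those terms contribute identity commutators). By the commutator formula above, $[M(x,0,0),M(0,2,0)]=M(0,0,2x)$ and $[M(x^3,0,0),M(0,y^2,0)]=M(0,0,x^3y^2)$, so indeed
$$M(0,0,\,2x+x^3y^2)=[M(x,0,0),M(0,2,0)]\cdot[M(x^3,0,0),M(0,y^2,0)].$$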
Now we get to the crucial part of the argument: We want to show that not every element of $N$ is a commutator. In fact, let’s show that, for any $n$, there is an $h$ such that $M(0,0,h)$ is not a product of $n$ commutators:
- Let $h(x,y)=\sum_{i=0}^{2n} x^i y^i$. From the formulas above, show that what we need to prove is that we cannot have an equality of the form
$$\sum_{i=0}^{2n} x^i y^i=\sum_{j=1}^{m} p_j(x)\,q_j(y)$$
for some polynomials $p_1,\dots,p_m$ in $x$ and $q_1,\dots,q_m$ in $y$, with $m\le 2n$.
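(To see what this says in the simplest case, take $n=1$, a single commutator: by the commutator formula, $M(0,0,h)$ is a commutator exactly when $h=f_1g_2-f_2g_1$ for some polynomials $f_1,f_2$ in $x$ and $g_1,g_2$ in $y$, so the claim for $n=1$ is that $1+xy+x^2y^2$ cannot be written as $p_1(x)q_1(y)+p_2(x)q_2(y)$.)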
To prove this, argue by contradiction: Suppose we have an equation of this form. Consider both sides of the equation as polynomials in $x$ whose “coefficients” are polynomials in $y$:
Write $p_j(x)=\sum_{i} a_{ji}x^i$ for each $j$, and $h(x,y)=\sum_{i=0}^{2n} x^i\cdot y^i$, and compare the coefficients of $x^i$ on the left and right of the equation.
- Check that we get, for each $i$ with $0\le i\le 2n$, an equality of the form
$$y^i=\sum_{j=1}^{m} a_{ji}\,q_j(y).$$
- Now, we invoke some linear algebra to finish: The polynomials $1,y,y^2,\dots,y^{2n}$ are linearly independent. However, the equations above show that they all belong to the space generated by the polynomials $q_j(y)$, $1\le j\le m$. Explain why this is a contradiction.
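(The relevant count to keep in mind: a space spanned by $m\le 2n$ polynomials has dimension at most $2n$, while $1,y,\dots,y^{2n}$ are $2n+1$ linearly independent polynomials.)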