Permutation Groups


Bijections on a finite set, which can be shown to be permutations, form a type of group called a permutation group.

The study of permutation groups originated from, and is tied to, the solvability by radicals of algebraic equations. Ever since the solutions to cubic and quartic equations were published, hundreds of mathematicians floundered in the swamp of finding a solution to quintic equations, and none of them succeeded. Lagrange, who summarized and refined the former works in depth, was the first mathematician to realize there might be no solution by radicals to quintic equations. Mathematicians like Ruffini, Abel and Galois followed the path pointed out by Lagrange, and succeeded in showing that the solvability of an algebraic equation is strongly tied to the symmetries of permutations of its roots, proving there is no solution by radicals for quintic and higher equations due to their fundamental differences in symmetry from the solvable ones. Their feat undoubtedly proved the usefulness and necessity of group theory to mathematicians, and commenced the study of group theory and abstract algebra.

Besides its monumental role in history, the permutation group can also be used as a powerful tool in studying group theory, which we will see in later texts. In this text, we will focus on building an intuition for the fundamental properties of permutation groups.

Permutation groups

Given a set $A$ of $n$ elements, let's put them into an array, and imagine what will happen when we apply a bijection $\sigma: A \to A$ to it.

For every element in $A$, it must appear at least once after applying the bijection $\sigma$ to the array, otherwise there would be an element not mapped to, violating $\sigma$'s surjectivity. Similarly, every element must appear at most once, otherwise $\sigma$'s injectivity would be violated. As a conclusion, every element in the set $A$ must appear exactly once after being mapped by the bijection $\sigma$.

How do we achieve that? This time, imagine the array is made up of puzzle pieces, each containing one element, that can be taken down and rearranged. After we rearrange them by applying the bijection's mapping rules, they line up in an array again, in our desired order, with none lost.

This might not be a formal proof, but I believe it makes it more intuitive that a bijection from $A$ to $A$ is equivalent to a permutation of $A$'s elements.

Let's put the array before applying the bijection in the first line, and the array after applying the bijection in the second line, like:

$$\begin{pmatrix} a_1 & a_2 & \cdots & a_n \\ \sigma(a_1) & \sigma(a_2) & \cdots & \sigma(a_n) \end{pmatrix}.$$

This can be used to represent a permutation of $A$'s elements. As you may have noticed, it also exhaustively enumerates every mapping rule of $\sigma$. So permutations and bijections are thereby equivalent, and we will apply the notation applicable to bijections to permutations from now on.
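For example, for $A = \{1, 2, 3\}$ the two-line symbol

$$\sigma = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 3 & 1 \end{pmatrix}$$

denotes the bijection with $\sigma(1) = 2$, $\sigma(2) = 3$ and $\sigma(3) = 1$.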

Symmetric groups and permutation groups

For two permutations $\sigma, \tau$ of $A$, their composition $\sigma \circ \tau$ is also a permutation of $A$, defined by $(\sigma \circ \tau)(a) = \sigma(\tau(a))$.
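As a small worked example on $A = \{1, 2, 3\}$, composing $\sigma = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 3 & 1 \end{pmatrix}$ with $\tau = \begin{pmatrix} 1 & 2 & 3 \\ 2 & 1 & 3 \end{pmatrix}$ gives

$$\sigma \circ \tau = \begin{pmatrix} 1 & 2 & 3 \\ 3 & 2 & 1 \end{pmatrix},$$

since, for instance, $(\sigma \circ \tau)(1) = \sigma(\tau(1)) = \sigma(2) = 3$.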

Denote the set of all permutations of $A$ as $S_A$, and consider the binary operation $\circ$; $(S_A, \circ)$ is a group because:

  1. (Closed) For any $\sigma, \tau \in S_A$, $\sigma \circ \tau \in S_A$.
  2. (Identity) There is $e \in S_A$, the identity map on $A$, such that $e \circ \sigma = \sigma \circ e = \sigma$ for any $\sigma \in S_A$.
  3. (Invertible) For any $\sigma \in S_A$, the inverse map $\sigma^{-1}$ is also a bijection and $\sigma \circ \sigma^{-1} = \sigma^{-1} \circ \sigma = e$.
  4. (Associative) For any $\sigma, \tau, \rho \in S_A$, $(\sigma \circ \tau) \circ \rho = \sigma \circ (\tau \circ \rho)$.

For convenience, we usually omit the binary operator so that $\sigma\tau = \sigma \circ \tau$.

The group $(S_A, \circ)$ is called the symmetric group of $A$. For a set $A$ with $n$ elements, $S_A$ has $n!$ elements. Specially, the symmetric group of $\{1, 2, \dots, n\}$, where the elements are the positive integers not above $n$, is denoted as $S_n$ and called the symmetric group on $n$ letters.
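For example, $|S_3| = 3! = 6$; written in two-line notation, the six elements are

$$\begin{pmatrix}1&2&3\\1&2&3\end{pmatrix}, \begin{pmatrix}1&2&3\\1&3&2\end{pmatrix}, \begin{pmatrix}1&2&3\\2&1&3\end{pmatrix}, \begin{pmatrix}1&2&3\\2&3&1\end{pmatrix}, \begin{pmatrix}1&2&3\\3&1&2\end{pmatrix}, \begin{pmatrix}1&2&3\\3&2&1\end{pmatrix}.$$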

Obviously, for any symmetric group $S_A$ with $|A| = n$, we can enumerate and label all elements in $A$, like $A = \{a_1, a_2, \dots, a_n\}$. Then the bijection $f: A \to \{1, 2, \dots, n\}$ defined by $f(a_i) = i$ induces a group isomorphism under which there's:

$$S_A \cong S_n, \qquad \sigma \mapsto f \circ \sigma \circ f^{-1}.$$

So we just need to study $S_n$ in order to study the properties of any $S_A$. We will focus the discussion on $S_n$ from now on, whenever possible.

A subgroup of a symmetric group is called a permutation group. For example, the dihedral group of order $2n$ is a permutation group on the vertices of the regular $n$-gon.
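As a concrete illustration (the vertex labels $1$ through $4$ are our own choice), the symmetries of a square permute its vertices: the counterclockwise rotation by $90^\circ$ acts as

$$\begin{pmatrix} 1 & 2 & 3 & 4 \\ 2 & 3 & 4 & 1 \end{pmatrix},$$

and the eight symmetries of the square together form a subgroup of $S_4$, the dihedral group of order $8$.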

Cayley's theorem

Cayley's theorem states that each finite group $G$ of order $n$ is isomorphic to some permutation group on $n$ letters, that is:

$$G \cong H \leq S_n \quad \text{for some subgroup } H.$$

Recall the map $\lambda_g: G \to G$ defined by $\lambda_g(x) = gx$. We've already proved it's a bijection. And since $G$ is finite, $\lambda_g$ is also a permutation, and an element of $S_G$.

Consider the map $\varphi: G \to S_G$ defined by $\varphi(g) = \lambda_g$. Note that $\lambda_{gh}(x) = (gh)x = g(hx) = (\lambda_g \circ \lambda_h)(x)$, so $\varphi(gh) = \varphi(g)\varphi(h)$ and $\varphi$ is also a group homomorphism.

Consider the kernel of $\varphi$: the identity in $S_G$ is the identity map, and it is mapped from $g$ only when $\lambda_g(x) = x$ for every $x$. And since $\lambda_g(e) = g$, it is only possible that $g = e$, so $\ker\varphi = \{e\}$ and $\varphi$ is injective.

Finally, combined with $G \cong G/\ker\varphi \cong \varphi(G) \leq S_G \cong S_n$, we've concluded the proof. Anyway, it is quite intuitive if you are familiar with the bijectivity within a group.
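As a small illustration (the choice of group here is ours, not the text's), take $G = \mathbb{Z}_3 = \{0, 1, 2\}$ under addition modulo $3$. Then $\lambda_1$ sends $0 \mapsto 1$, $1 \mapsto 2$, $2 \mapsto 0$, and

$$\varphi(0) = e, \qquad \varphi(1) = \begin{pmatrix}0&1&2\\1&2&0\end{pmatrix}, \qquad \varphi(2) = \begin{pmatrix}0&1&2\\2&0&1\end{pmatrix},$$

so $\mathbb{Z}_3$ is isomorphic to the subgroup of $S_3$ consisting of these three permutations.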

Cycle notation

Let's temporarily rewind to the moment when we are about to rearrange the puzzle pieces in the array. One could break up the array and reassemble the pieces in many ways, but we want to discuss the rearrangement done with the following algorithm:

  1. From left to right, find the first piece that is not in its desired place; the algorithm is done if all of them are in place.
  2. Take down the piece that is not in place; now the spot where the piece used to be is vacant, and the piece is in your hand.
  3. Find the desired place of the piece in your hand: if there's a piece at that place, swap it with the piece in your hand and repeat step 3; otherwise that place is the vacant one, so put the piece down and go to step 1.

We need to show the algorithm halts eventually. First, the pieces that are already in their places will not be touched, and since the number of pieces in the array is finite, the number of pieces not in their places is also finite. Each time we execute step 3, the piece in hand is settled into its desired place, so the number of pieces not in their places is reduced by 1 and counts down to 0 eventually.

The piece that gets swapped out in step 3 is mapped to by the bijection from a different element, and must itself be mapped to yet another place, otherwise the bijection's injectivity would be contradicted. The vacant place formed in step 2 will be filled eventually, otherwise no element would be mapped to the place of the piece taken down in step 2, contradicting the bijection's surjectivity.

By now, we've shown the algorithm accomplishes the task of rearrangement with respect to an arbitrary bijection.

Let's investigate the algorithm more closely. It won't be hard to find there are two cases of motion for each element in the rearrangement: either the piece stays in its place, or the piece gets involved in a closed mapping circuit.

For the latter case, we would like to introduce the cycle notation $\sigma = (a_1\,a_2\,\cdots\,a_k)$, such that $\sigma(a_i) = a_{i+1}$ for $1 \leq i < k$ and $\sigma(a_k) = a_1$, while $\sigma(x) = x$ for every $x$ not in the cycle. Such a $\sigma$ is also called a $k$-cycle.

An arbitrary permutation can also be represented as a composition of disjoint cycles: each time we return to step 1 of the algorithm, we close up a cycle, and every element in that cycle is in its place and will not be touched from then on.
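As a computational companion (a minimal sketch; representing a permutation of $\{0, 1, \dots, n-1\}$ as a Python list where `perm[i]` is the image of `i` is our own convention, not part of the text), the following function mirrors the algorithm above: repeatedly pick the first element not yet visited and follow the mapping until the circuit closes.

```python
def disjoint_cycles(perm):
    """Split a permutation of {0, ..., n-1} into its disjoint cycles.

    `perm` is a list where perm[i] is the image of i under the permutation.
    """
    seen = [False] * len(perm)
    cycles = []
    for start in range(len(perm)):
        if seen[start]:
            continue
        # follow start -> perm[start] -> ... until the cycle closes
        cycle, i = [], start
        while not seen[i]:
            seen[i] = True
            cycle.append(i)
            i = perm[i]
        if len(cycle) > 1:  # fixed points are usually left out of the notation
            cycles.append(tuple(cycle))
    return cycles

# 0->1, 1->2, 2->0, 3->4, 4->3 decomposes into (0 1 2) and (3 4).
print(disjoint_cycles([1, 2, 0, 4, 3]))  # [(0, 1, 2), (3, 4)]
```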

Basic properties

Obviously the starting element from which we commence the enumeration of the cycle's elements doesn't matter, that is, $(a_1\,a_2\,\cdots\,a_k) = (a_2\,a_3\,\cdots\,a_k\,a_1)$. What matters is the cyclic order of the elements in the loop; in other words, each cycle can be inverted by reversing the order of enumeration, that is, $(a_1\,a_2\,\cdots\,a_k)^{-1} = (a_k\,a_{k-1}\,\cdots\,a_1)$. Just think of undoing the effect of being permuted by a cycle, by moving each element back to its previous position, and you will reach this result naturally.

For two cycles $\sigma, \tau$, if the elements enrolled in the two cycles are disjoint, obviously their composition commutes, that is, $\sigma\tau = \tau\sigma$. Conversely, the composition of non-disjoint cycles usually doesn't commute, like $(1\,2)(1\,3) = (1\,3\,2)$ while $(1\,3)(1\,2) = (1\,2\,3)$.

Specially, though no element is touched, we usually denote the identity of $S_n$ in cycle notation as $(1)$.

The order of a $k$-cycle is $k$, otherwise there would be discordance with the embedded bijection. For two disjoint cycles $\sigma$ and $\tau$ which are a $p$-cycle and a $q$-cycle, the order of $\sigma\tau$ is $\mathrm{lcm}(p, q)$. First there's $(\sigma\tau)^m = \sigma^m\tau^m$, and $\sigma^m\tau^m = e$ only when $\sigma^m = \tau^m = e$, otherwise it would imply $\sigma$ and $\tau$ are not disjoint. Then since $\sigma^m = e$ requires $p \mid m$ and $\tau^m = e$ requires $q \mid m$, in order for $(\sigma\tau)^m$ to be $e$ it is required that both $p$ and $q$ divide $m$, while $\mathrm{lcm}(p, q)$ is the minimum such $m$. The order of a composition of more disjoint cycles can be evaluated analogously.
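For example, in $S_5$ the permutation $(1\,2\,3)(4\,5)$ is a composition of a disjoint $3$-cycle and $2$-cycle, so its order is

$$\mathrm{lcm}(3, 2) = 6,$$

and indeed no smaller positive power of it returns every element to its place.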

Parity of permutations

Consider a cycle $(a_1\,a_2\,\cdots\,a_n)$. We would like to show it is possible to split the cycle at $a_k$ ($1 < k < n$), forming two cycles with a junction at $a_k$, that:

$$(a_1\,a_2\,\cdots\,a_n) = (a_1\,a_2\,\cdots\,a_k)(a_k\,a_{k+1}\,\cdots\,a_n).$$

To prove it, consider how the elements are moved by $(a_1\,\cdots\,a_k)(a_k\,\cdots\,a_n)$ and how these motions are compatible with $(a_1\,a_2\,\cdots\,a_n)$'s, case by case: for $i < k$, the element $a_i$ is untouched by the right cycle and carried to $a_{i+1}$ by the left one; for $k \leq i < n$, the element $a_i$ is carried to $a_{i+1}$ by the right cycle and then left untouched; and $a_n$ is carried to $a_k$ by the right cycle and then to $a_1$ by the left one.

So the result of $(a_1\,\cdots\,a_k)(a_k\,\cdots\,a_n)$ is compatible with $(a_1\,a_2\,\cdots\,a_n)$, and they are the same permutation.

This rule gives us a way to decompose a large cycle into small cycles. Obviously we can decompose a cycle by splitting off one $2$-cycle at a time, starting from $(a_1\,a_2)$, and finally there's

$$(a_1\,a_2\,\cdots\,a_n) = (a_1\,a_2)(a_2\,a_3)\cdots(a_{n-1}\,a_n).$$

Each $2$-cycle is called a transposition, and a cycle can be viewed as a composition of a series of non-disjoint transpositions.
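For example, splitting $(1\,2\,3\,4\,5)$ at $3$ gives $(1\,2\,3\,4\,5) = (1\,2\,3)(3\,4\,5)$, and splitting all the way down to transpositions gives

$$(1\,2\,3\,4\,5) = (1\,2)(2\,3)(3\,4)(4\,5).$$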

As you can see, the decomposition of a cycle into transpositions is not unique; the cycle $(a_1\,a_2\,\cdots\,a_n)$ above can also be decomposed like $(a_1\,a_n)(a_1\,a_{n-1})\cdots(a_1\,a_2)$. And you can even insert a transposition together with its inverse (itself) somewhere in the composition, which is legal by the symmetric group's properties. However, we are about to show the parity is kept invariant no matter what kind of decomposition is undergone.

A permutation can be written as a composition of a series of disjoint cycles, and each cycle can be written as a composition of a series of non-disjoint transpositions. Parity means that a permutation can be written either as a composition of an odd number of transpositions or as a composition of an even number of transpositions, and the two cases are mutually exclusive. With respect to each case, the permutation is called an odd permutation or an even permutation.
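For example, following the decomposition above, an $n$-cycle splits into $n - 1$ transpositions, so a $3$-cycle such as $(1\,2\,3) = (1\,2)(2\,3)$ is an even permutation, while a $4$-cycle such as $(1\,2\,3\,4) = (1\,2)(2\,3)(3\,4)$ is an odd one.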

First, we need to prove that the identity $e$ can only be written as a composition of an even number of transpositions, and is therefore an even permutation.

Let's write down a possible decomposition of $e$ into transpositions, $e = \tau_1\tau_2\cdots\tau_m$. When $m = 0$, the statement is trivially true. It's impossible for $m = 1$, otherwise $e = \tau_1$ would render a transposition equal to the identity, which is a contradiction. When $m = 2$, it is only possible that $\tau_1 = \tau_2$, since the inverse of a transposition is itself.

When $m > 2$, for the last two transpositions $\tau_{m-1}\tau_m$, we use the letters $a, b, c, d$ to represent the cases of element occurrence in the two transpositions, where $a$ is an element moved by $\tau_m$, and perform an identical replacement with respect to each possible case:

  1. $(a\,b)(a\,b) = e$.
  2. $(a\,c)(a\,b) = (a\,b)(b\,c)$.
  3. $(b\,c)(a\,b) = (a\,c)(b\,c)$.
  4. $(c\,d)(a\,b) = (a\,b)(c\,d)$.

Case 1 merely removes the trailing two transpositions and reduces $e$ to a composition of $m - 2$ transpositions.

Cases 2, 3 and 4 perform the replacement $\tau_{m-1}\tau_m = \tau'_{m-1}\tau'_m$ where $a$ occurs in $\tau'_{m-1}$ but does not occur in $\tau'_m$. Let's iteratively perform the replacement on the pair $\tau_{m-2}\tau'_{m-1}$, then $\tau_{m-3}\tau'_{m-2}$, and so on, pushing the rightmost occurrence of $a$ one position to the left each time. If at some point case 1 occurs, the pair cancels out and $e$ reduces to a composition of $m - 2$ transpositions. Otherwise the transformation terminates at a decomposition in which $a$ occurs only in the leftmost transposition. Given that $a$ never occurs in the other transpositions, the composition would move $a$, which is a contradiction to it being $e$.

So we've proved that besides the trivial case (with $0$ transpositions), every possible decomposition of $e$ into $m$ transpositions ($m \geq 2$) can be reduced down to a decomposition into $m - 2$ transpositions, and eventually all of them cancel out, rendering $m$ even in any circumstance. Now we've concluded the proof that $e$ can only be written as a composition of an even number of transpositions.

Then, consider a permutation $\sigma$ with two of its decompositions into transpositions, $\sigma = \tau_1\cdots\tau_p = \rho_1\cdots\rho_q$. Obviously there's $e = \sigma\sigma^{-1} = \tau_1\cdots\tau_p\,\rho_q\cdots\rho_1$. Since $e$ can only be written as a composition of an even number of transpositions, $p + q$ is even. So either $p$ and $q$ are both odd or both even, corresponding to $\sigma$ being an odd permutation or an even permutation respectively.

By now, we've successfully revealed the mechanism behind the parity of permutations.

Alternating groups

For extracting the parities of permutations, consider the sign function $\mathrm{sgn}: S_n \to \{1, -1\}$ defined by $\mathrm{sgn}(\sigma) = (-1)^m$ where $\sigma = \tau_1\tau_2\cdots\tau_m$ and the $\tau_i$'s are transpositions. Although the decomposition of $\sigma$ into transpositions is not unique, $m$ is always odd or always even in accordance with $\sigma$'s parity, so $\mathrm{sgn}$ is well-defined. Meanwhile, $(\{1, -1\}, \times)$ is a group with $1$ as identity, and there's:

$$\mathrm{sgn}(\sigma\tau) = \mathrm{sgn}(\sigma)\,\mathrm{sgn}(\tau).$$

So $\mathrm{sgn}$ is a group homomorphism with its kernel being the subgroup of all even permutations. Let's denote this subgroup as $A_n$ and call it the alternating group on $n$ letters.
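The parity can also be read off mechanically. Below is a minimal sketch (the list-based, $0$-indexed encoding and the function name `sign` are our own choices, not anything from the text): it counts the disjoint cycles and uses the fact shown above that a $k$-cycle decomposes into $k - 1$ transpositions.

```python
def sign(perm):
    """Return +1 for an even permutation of {0, ..., n-1}, -1 for an odd one.

    `perm` is a list where perm[i] is the image of i under the permutation.
    """
    seen = [False] * len(perm)
    transpositions = 0
    for start in range(len(perm)):
        if seen[start]:
            continue
        # walk one disjoint cycle and record its length
        length, i = 0, start
        while not seen[i]:
            seen[i] = True
            i = perm[i]
            length += 1
        transpositions += length - 1  # a k-cycle is (k - 1) transpositions
    return 1 if transpositions % 2 == 0 else -1

print(sign([1, 2, 0]))  # the 3-cycle 0->1->2->0 is even: prints 1
print(sign([1, 0, 2]))  # a single transposition is odd: prints -1
```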

Consider the order of $A_n$. When $n = 1$, there's no odd permutation in $S_1$, so $\mathrm{sgn}$ is not surjective and $A_1 = S_1$. When $n \geq 2$, there are odd permutations and $\mathrm{sgn}$ is surjective, so $S_n/A_n \cong \{1, -1\}$ and $[S_n : A_n] = 2$. Given that, it won't be hard to find $|A_n| = n!/2$.

Consider the simplicity of $A_n$. When $n \leq 2$, $|A_n| = 1$, so $A_n$ is trivial. When $n = 3$, $|A_3| = 3$ is a prime, so $A_3$ is simple. When $n = 4$, $A_4$ is not trivial and has a non-trivial proper normal subgroup $V = \{e, (1\,2)(3\,4), (1\,3)(2\,4), (1\,4)(2\,3)\}$ (which we will meet again below), so $A_4$ is not simple.

Simplicity of alternating groups

Consider the simplicity of $A_n$ for larger $n$. We've already known $A_n$ is trivial when $n \leq 2$, so let's focus on the cases when $n \geq 3$.

Denote the set of all $3$-cycles in $S_n$ as $C$. First, we would like to show that when $n \geq 3$, $A_n$ is exactly the set of compositions of elements of $C$, that is, $A_n$ is generated by the $3$-cycles.

All elements in $A_n$ can be written as a composition of an even number of transpositions; we would like to show we can identically substitute each pair of transpositions with $3$-cycles:

  1. $(a\,b)(a\,b) = e$.
  2. $(a\,b)(a\,c) = (a\,c\,b)$.
  3. $(a\,b)(b\,c) = (a\,b\,c)$.
  4. $(a\,b)(c\,d) = (a\,c\,b)(a\,c\,d)$.

So every element in $A_n$ can also be written as a composition of $3$-cycles. And conversely, since every $3$-cycle is even and shows up in $A_n$, every composition of $3$-cycles must also show up in $A_n$, otherwise the binary operation of $A_n$ would not be closed.
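For instance, taking concrete letters, case 4 gives $(1\,2)(3\,4) = (1\,3\,2)(1\,3\,4)$ and case 2 gives $(1\,2)(1\,3) = (1\,3\,2)$:

$$(1\,2)(3\,4) = (1\,3\,2)(1\,3\,4), \qquad (1\,2)(1\,3) = (1\,3\,2),$$

so pairing up the transpositions of any even permutation and substituting in this way rewrites it as a composition of $3$-cycles.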

Then, we would like to show that for any normal subgroup $N \trianglelefteq A_n$ with $n \geq 5$ that contains a $3$-cycle, there's $C \subseteq N$ and thus $N = A_n$.

Take any $3$-cycle $(a\,b\,c)$ from $N$; by $N$'s closure there's also $(a\,b\,c)^2 = (a\,c\,b) \in N$. Motivated by $N$'s normality, that $\pi\sigma\pi^{-1} \in N$ for any $\sigma \in N$ and $\pi \in A_n$, let's ramble with identities of this form:

$$\pi\,(a\,b\,c)\,\pi^{-1} = (\pi(a)\,\pi(b)\,\pi(c)).$$

Despite single transpositions being odd permutations and not inside $A_n$, we can combine two of them to get even permutations: since $n \geq 5$, there are at least two extra letters $d, e$ outside $\{a, b, c\}$, so products like $(a\,d)(b\,e)$ and $(x\,y)(d\,e)$ (with $x, y$ among $a, b, c$) do lie in $A_n$. So for the $3$-cycle $(a\,b\,c) \in N$, conjugating by such even permutations shifts it into $3$-cycles on other letters while staying inside $N$. With repeated shifts, together with $(a\,b\,c)^2 = (a\,c\,b)$ which reverses the cyclic order, we are able to shift $(a\,b\,c)$ into any prescribed $3$-cycle, which covers all elements of $C$. By now, we've concluded the proof that when $n \geq 5$, there're $C \subseteq N$ and $N = A_n$.
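As a concrete instance in $A_5$ (with letters chosen by us), conjugating by the even permutation $(1\,2)(4\,5)$ shifts one $3$-cycle into another:

$$\big((1\,2)(4\,5)\big)\,(1\,2\,3)\,\big((1\,2)(4\,5)\big)^{-1} = (2\,1\,3) = (1\,3\,2).$$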

Finally, we would like to show that for $n \geq 5$, any non-trivial $N \trianglelefteq A_n$ contains a $3$-cycle, so that $N = A_n$, which means $A_n$ is simple.

For any non-identity element $\sigma \in N$, it can be written as a composition of disjoint cycles, and the cycles commute. Let's begin by taking one of these disjoint cycles out, say $\sigma = \alpha\pi$ with $\alpha = (a_1\,a_2\,\cdots\,a_k)$ and $\pi$ the composition of the remaining cycles, and discuss its properties. We might also need to discuss the lengths and the parities of the cycles when it comes to the details.

Similarly, motivated by $N$'s normality, we want to see what kind of elements are forced into $N$ through conjugation. Consider the commutator of $\sigma$ and $\delta$ defined by $[\sigma, \delta] = \sigma^{-1}\delta^{-1}\sigma\delta$; by $N$'s normality and group property, there's $[\sigma, \delta] = \sigma^{-1}(\delta^{-1}\sigma\delta) \in N$ for any $\delta \in A_n$. This is a wise choice since for $\sigma = \alpha\pi$ with $\alpha$ and $\pi$ disjoint, if we are able to find $\delta$ disjoint with $\pi$, then $[\sigma, \delta] = [\alpha, \delta]$, canceling out the $\pi$'s.

Choose $\delta = (a_1\,a_2\,a_3)$, which is disjoint with $\pi$, there's:

$$\sigma^{-1}\delta^{-1}\sigma\delta = \alpha^{-1}\delta^{-1}\alpha\delta = (a_2\,a_3\,a_k) \in N.$$

Please notice this handling method requires $k \geq 4$ to work. When $k = 3$, $\delta = \alpha$ and there's $\sigma^{-1}\delta^{-1}\sigma\delta = e$, so it is not applicable. When $k = 2$, $\delta = (a_1\,a_2\,a_3)$ is ill-formed (there is no $a_3$ inside $\alpha$) and it is also not applicable. However, this covers the case of $\sigma$ when any cycle in $\sigma$ is longer than $3$: $N$ contains a $3$-cycle. And from now on, we just need to think about $\sigma$'s being compositions of disjoint $3$-cycles and transpositions.

When $\sigma$ is a composition of a single $3$-cycle and $0$ or more disjoint transpositions, say $\sigma = (a_1\,a_2\,a_3)\,\pi$ with $\pi$ a composition of disjoint transpositions, there's $\sigma^2 = (a_1\,a_2\,a_3)^2\pi^2 = (a_1\,a_3\,a_2) \in N$, and $N$ contains a $3$-cycle through operations on its own elements in this case.

When $\sigma$ is a composition of more than one $3$-cycle (and possibly other disjoint cycles), say $\sigma = (a_1\,a_2\,a_3)(a_4\,a_5\,a_6)\,\pi$, choose $\delta = (a_1\,a_2\,a_4)$, which is disjoint with $\pi$ this time, there's:

$$\sigma^{-1}\delta^{-1}\sigma\delta = (a_1\,a_2\,a_4\,a_3\,a_6) \in N.$$

Which means the cycle $(a_1\,a_2\,a_4\,a_3\,a_6)$ of length 5 is also in $N$. Apply the case of cycles longer than $3$ again and we get a $3$-cycle also in $N$.

When $\sigma$ has no $3$-cycle, that is, when it is composed merely of an even number of disjoint transpositions, say $\sigma = (a_1\,a_2)(a_3\,a_4)\,\pi$, preprocess it with $\delta = (a_1\,a_2\,a_3)$, there's:

$$\sigma^{-1}\delta^{-1}\sigma\delta = (a_1\,a_4)(a_2\,a_3) \in N.$$

Which means $(a_1\,a_4)(a_2\,a_3)$ is also in $N$ after the preprocessing, and we get rid of discussing $\pi$ while choosing the next $\delta$.

For the group $A_n$ with $n \geq 5$, it is always possible to choose some $a_5$ such that $a_5 \notin \{a_1, a_2, a_3, a_4\}$, and with $\delta = (a_1\,a_4\,a_5)$ there's:

$$\big((a_1\,a_4)(a_2\,a_3)\big)^{-1}\delta^{-1}\big((a_1\,a_4)(a_2\,a_3)\big)\delta = (a_1\,a_5\,a_4) \in N.$$

So for an element in $N$ composed of disjoint transpositions, it's always possible to choose some $a_5$ distinct from $a_1, a_2, a_3, a_4$ so that a $3$-cycle is also in $N$.

By now, all possible cases of elements in $N$ have been discussed. Just as you can see, a non-trivial $N$ being normal in $A_n$ ($n \geq 5$) will inevitably have a $3$-cycle planted in it, the $3$-cycle will then sprout into all of $C$, and $N$ finally grows into the whole group $A_n$.

For $A_3$, there's no element composed of two disjoint transpositions at all. Actually, besides the identity, there're only $(1\,2\,3)$ and $(1\,3\,2)$.

For any double transposition $\sigma = (a_1\,a_2)(a_3\,a_4) \in A_4$, we know it is impossible to choose some $a_5$ distinct from $a_1, a_2, a_3, a_4$ to form some $3$-cycle in $N$. Consider what elements can be generated by $\sigma^{-1}\delta^{-1}\sigma\delta$ where $\delta$ is a $3$-cycle in $A_4$. Since $\sigma$ moves all four letters and $\delta$ moves exactly three of them, we can safely align $a_4$ with the "hole" of the missing letter in the $3$-cycle, without losing generality. And all the $3$-cycles that can be formed by $a_1, a_2, a_3$ are $(a_1\,a_2\,a_3)$ and $(a_1\,a_3\,a_2)$, so:

$$\sigma^{-1}(a_1\,a_2\,a_3)^{-1}\sigma(a_1\,a_2\,a_3) = (a_1\,a_4)(a_2\,a_3), \qquad \sigma^{-1}(a_1\,a_3\,a_2)^{-1}\sigma(a_1\,a_3\,a_2) = (a_1\,a_3)(a_2\,a_4).$$

So no $3$-cycle can be generated in this way, and if $N$ contains one composition of two transpositions, it should contain the other compositions of two transpositions in order to be normal.

Consider the compositions of two disjoint transpositions in $S_4$; there're three of them: $(1\,2)(3\,4)$, $(1\,3)(2\,4)$ and $(1\,4)(2\,3)$. Note that $(1\,2)(3\,4) \cdot (1\,3)(2\,4) = (1\,4)(2\,3)$, and similarly the product of any two of them is the third, so the set $V = \{e, (1\,2)(3\,4), (1\,3)(2\,4), (1\,4)(2\,3)\}$ is closed. And since $(1\,2)(3\,4)$ and $(1\,3)(2\,4)$ are both of order 2 and commute, $V$ is a group and the internal direct product of $\{e, (1\,2)(3\,4)\}$ and $\{e, (1\,3)(2\,4)\}$. And there's $V \trianglelefteq S_4$ (hence $V \trianglelefteq A_4$), since conjugation preserves cycle structure: a conjugate of a composition of two disjoint transpositions can only be a composition of two disjoint transpositions, $V$ contains all of them, and none of the $3$-cycles is involved.
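For completeness, the remaining products can be checked the same way, e.g.

$$(1\,3)(2\,4) \cdot (1\,4)(2\,3) = (1\,2)(3\,4), \qquad \big((1\,2)(3\,4)\big)^2 = e,$$

so $V$'s multiplication table closes onto itself.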

By now, all possible structures of the normal subgroups have been discussed and can be generalized as below:

$$\{e\} \trianglelefteq A_n \trianglelefteq S_n \ (n \neq 4), \qquad \{e\} \trianglelefteq \{e, (1\,2)(3\,4)\} \trianglelefteq V \trianglelefteq A_4 \trianglelefteq S_4.$$

Such a series of subgroups, each normal in the next, is called a subnormal series of $S_n$. Special attention must be paid since $H \trianglelefteq K$ and $K \trianglelefteq G$ do not necessarily mean $H \trianglelefteq G$; for example, $\{e, (1\,2)(3\,4)\}$ is normal in $V$ but not normal in $S_4$. When all the subgroups enumerated are normal in the whole group, such a series of subgroups is called a normal series.

Non-trivial normal subgroup in symmetric groups

While $A_n$ is normal in $S_n$, is it possible that there's some other non-trivial proper normal subgroup $N \trianglelefteq S_n$ ($n \geq 5$) such that $N \neq A_n$?

Since $N \cap A_n \trianglelefteq A_n$ and $A_n$ is simple, there's either $N \cap A_n = \{e\}$ or $N \cap A_n = A_n$.

For the latter case, $A_n \subseteq N$, so $|A_n| = n!/2$ divides $|N|$. And since $|N|$ also divides $|S_n| = n!$, it is only possible for $N$ to be $A_n$ or $S_n$. Neither of them is a valid candidate for the $N$ we are looking for.

For the former case, according to the second isomorphism theorem, there's $N \cong N/(N \cap A_n) \cong NA_n/A_n \leq S_n/A_n$. So $|N|$ divides $|S_n/A_n| = 2$. Since $N$ is not trivial, $|N| = 2$. And since $A_n$ contains all even permutations while $N \cap A_n = \{e\}$, $N$ contains only odd permutations except for $e$. The odd permutation in $N$ must be of order 2, and can only be a composition of an odd number of disjoint transpositions.

When the odd permutation is a single transposition $(a_1\,a_2)$ for some $a_1, a_2$, there's $(a_1\,a_3)(a_1\,a_2)(a_1\,a_3) = (a_2\,a_3) \notin N$, which is a contradiction for $N$ to be normal.

When the odd permutation is a composition of three or more disjoint transpositions, say $(a_1\,a_2)(a_3\,a_4)(a_5\,a_6)\cdots$ for some distinct letters, there's $(a_2\,a_3)\,(a_1\,a_2)(a_3\,a_4)(a_5\,a_6)\cdots\,(a_2\,a_3) = (a_1\,a_3)(a_2\,a_4)(a_5\,a_6)\cdots \notin N$, which is a contradiction for $N$ to be normal.

So it is not possible for such an $N$ to exist, and we've concluded the proof that $A_n$ is the only non-trivial proper normal subgroup in $S_n$ when $n \geq 5$.

Conclusion

In this text, we began by showing how bijections on a finite set are equivalent to permutations, and how these permutations form a group called the symmetric group. Later we showed how the bijectivity embedded in a group's binary operation renders a finite group isomorphic to some subgroup of a symmetric group, called a permutation group.

Then we focused on some fundamental properties of permutations, the elements of a symmetric group, like the decomposition of a permutation into disjoint cycles, the consequent decomposition of these cycles into transpositions, and the way parity constrains the operations on them.

Finally we focused on the subgroup called the alternating group, which is formed by all even permutations in the symmetric group, by studying how $3$-cycles are inevitable in a non-trivial normal subgroup of an alternating group on more than $4$ letters, how these $3$-cycles decide the simplicity of the alternating group, and finally what kind of normal structure a symmetric group can have, by generalizing it into subnormal series.

March 16, 2023