## 403/503 – Determinants

Here is a problem that you may enjoy thinking about. Given an $n\times n$ matrix $A,$ define a new $n\times n$ matrix $e^A$ by the power series

$\displaystyle\sum_{k=0}^\infty\frac{1}{k!}A^k.$

This means, of course, the matrix whose entries are the limits of the corresponding entries of the partial sums $\sum_{k=0}^N\frac{1}{k!}A^k$ as $N\to\infty.$
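As a quick numerical sketch (not part of the problem), one can compute these partial sums directly and compare against a library implementation of the matrix exponential. The matrix `A` below is an arbitrary example of my choosing, and `scipy.linalg.expm` is used only as a reference value:

```python
import numpy as np
from scipy.linalg import expm

# Arbitrary example matrix (a rotation generator); not from the text.
A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])

def exp_partial(A, N):
    """Partial sum sum_{k=0}^N A^k / k!, built term by term."""
    term = np.eye(A.shape[0])   # k = 0 term: the identity
    total = term.copy()
    for k in range(1, N + 1):
        term = term @ A / k     # A^k/k! obtained from A^{k-1}/(k-1)!
        total += term
    return total

approx = exp_partial(A, 20)
exact = expm(A)
err = np.linalg.norm(approx - exact)
```

Already twenty terms of the series agree with the reference value to machine precision, reflecting how quickly $1/k!$ decays.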

(This limit actually exists. Those of you who have seen Hilbert spaces should see a proof easily: Recall we defined the norm $\|A\|$ of $A$ as $\sup_{\|v\|=1}\|Av\|,$ where in this supremum $\|v\|$ and $\|Av\|$ denote the usual norm (of $v$ or $Av,$ respectively) in ${\mathbb C}^n$ defined in terms of the usual inner product. One checks that a series of vectors $\sum_{k=0}^\infty v_k$ converges (in any reasonable sense) in a Banach space if it converges absolutely, i.e., if $\sum_{k=0}^\infty \|v_k\|$ converges. Since the operator norm is submultiplicative, i.e., $\|AB\|\le\|A\|\|B\|,$ we have $\|A^k\|\le\|A\|^k,$ so the series defining $e^A$ is dominated term by term by the scalar series for $e^{\|A\|}$ and clearly converges absolutely.)
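The bound $\|A^k\|\le\|A\|^k$ can be observed numerically. In the sketch below (a sanity check under my own choice of random matrix, not part of the argument), `np.linalg.norm(M, 2)` computes the operator norm, i.e., the largest singular value:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # arbitrary test matrix

def opnorm(M):
    """Operator norm sup_{|v|=1} |Mv|, i.e., the largest singular value."""
    return np.linalg.norm(M, 2)

# Check submultiplicativity consequence ||A^k|| <= ||A||^k for k = 1..10.
Ak = np.eye(4)
bound_holds = True
for k in range(1, 11):
    Ak = Ak @ A
    # small tolerance only to absorb floating-point roundoff
    bound_holds = bound_holds and (opnorm(Ak) <= opnorm(A)**k + 1e-9)
```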

The matrix $e^A$ is actually a reasonable object to study. For example, the function $f(t)=e^{tA}v_0$ is the unique solution to the differential equation $f'(t)=Af(t),$ $f(0)=v_0.$ Here, $v_0\in{\mathbb C}^n$ is a fixed vector.
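This differential-equation characterization is easy to test numerically. The following sketch (with an arbitrary matrix and initial vector of my choosing, using `scipy.linalg.expm` for $e^{tA}$) compares a central-difference approximation of $f'(t)$ against $Af(t)$:

```python
import numpy as np
from scipy.linalg import expm

# Arbitrary example data, not from the text.
A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
v0 = np.array([1.0, 0.0])

def f(t):
    """The claimed solution f(t) = e^{tA} v0."""
    return expm(t * A) @ v0

t, h = 0.7, 1e-6
deriv = (f(t + h) - f(t - h)) / (2 * h)  # central-difference estimate of f'(t)
resid = np.linalg.norm(deriv - A @ f(t))
```

The residual is tiny, consistent with $f'(t)=Af(t)$, and $f(0)=v_0$ holds since $e^{0\cdot A}=I.$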

Note that, for any $A,$ the matrix $e^A$ is invertible, since $e^Ae^{-A}=I,$ as a direct computation verifies. (The key point is that $A$ and $-A$ commute, so the product of the two series can be rearranged just as for scalar exponentials; in general $e^Ae^B=e^{A+B}$ requires $AB=BA.$)
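A one-line numerical check of this identity, again with a random matrix of my choosing:

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))  # arbitrary test matrix

# e^A e^{-A} should be the identity, up to roundoff.
err = np.linalg.norm(expm(A) @ expm(-A) - np.eye(3))
```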

Anyway, the problem: Show that for any matrix $A,$ we have $\det(e^A)=e^{{\rm tr}(A)}.$ Note this is not completely unreasonable to expect: A direct computation shows that if $v$ is an eigenvector of $A$ with eigenvalue $\lambda,$ then $e^Av=e^\lambda v.$ So if $A$ is diagonalizable with eigenvalues $\lambda_1,\dots,\lambda_n,$ then $e^A$ has eigenvalues $e^{\lambda_1},\dots,e^{\lambda_n},$ and $\det(e^A)=e^{\lambda_1}\cdots e^{\lambda_n}=e^{\lambda_1+\dots+\lambda_n}=e^{{\rm tr}(A)},$ so the formula is true whenever $A$ is diagonalizable.
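As a final sanity check (which of course proves nothing, but may be reassuring before you attempt the problem), the identity can be tested on a matrix that is *not* diagonalizable, say a Jordan block of my choosing:

```python
import numpy as np
from scipy.linalg import expm

# A non-diagonalizable example: a 2x2 Jordan block with eigenvalue 2.
J = np.array([[2.0, 1.0],
              [0.0, 2.0]])

lhs = np.linalg.det(expm(J))  # det(e^J)
rhs = np.exp(np.trace(J))     # e^{tr J} = e^4
gap = abs(lhs - rhs)
```

The two sides agree to machine precision, suggesting the formula indeed holds beyond the diagonalizable case.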