Exterior powers of subdomains of a field of fractions

Let R be an integral domain, and let F be its field of fractions.

  1. Considering F as an R-module in the usual way, prove that \bigwedge^2(F) = 0.
  2. Let I \subseteq F be a submodule. Prove that \bigwedge^k(I) is torsion for all k \geq 2.
  3. Exhibit an integral domain R and an R-submodule I \subseteq F such that \bigwedge^k(I) \neq 0 for all k \geq 0.

Let \frac{a}{b} \otimes \frac{c}{d} \in \mathcal{T}^2(F) be a simple tensor. Note that \frac{a}{b} \otimes \frac{c}{d} = \frac{ad}{bd} \otimes \frac{cb}{bd} = abcd\left(\frac{1}{bd} \otimes \frac{1}{bd}\right) \in \mathcal{A}^2(F), since this is a scalar multiple of a symmetric simple tensor. In particular, \frac{a}{b} \wedge \frac{c}{d} = 0 in \bigwedge^2(F). Since \bigwedge^2(F) is generated by such simple wedges, \bigwedge^2(F) = 0.
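The rewriting above rests on two elementary identities in the field of fractions: \frac{a}{b} = \frac{ad}{bd}, \frac{c}{d} = \frac{cb}{bd}, and (ad)(cb) = abcd. A quick sanity check of these over \mathbb{Q} (a sketch only; it verifies the scalar arithmetic, not the tensor algebra itself):

```python
from fractions import Fraction
from itertools import product

# Check the identities used to rewrite a/b (x) c/d as
# abcd * (1/(bd) (x) 1/(bd)), a multiple of a symmetric tensor.
for a, b, c, d in product([-3, -1, 1, 2, 5], repeat=4):
    assert Fraction(a, b) == Fraction(a * d, b * d)   # a/b = ad/(bd)
    assert Fraction(c, d) == Fraction(c * b, b * d)   # c/d = cb/(bd)
    assert (a * d) * (c * b) == a * b * c * d         # pulled-out scalar
print("ok")
```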

Now let k \geq 2 and consider a simple wedge \frac{a_1}{b_1} \wedge \frac{a_2}{b_2} \wedge \cdots \wedge \frac{a_k}{b_k} \in \bigwedge^k(I). If some a_i = 0 the wedge is already zero, so suppose each a_i is nonzero; certainly the b_i are nonzero. Then a_1a_2b_1b_2 = (a_2b_1)(a_1b_2) \neq 0 in R, and distributing the factor a_2b_1 into the first slot and a_1b_2 into the second, a_1a_2b_1b_2\left(\frac{a_1}{b_1} \wedge \frac{a_2}{b_2} \wedge \cdots \wedge \frac{a_k}{b_k}\right) = \frac{a_1a_2}{1} \wedge \frac{a_1a_2}{1} \wedge \cdots \wedge \frac{a_k}{b_k} = 0, since the first two factors are equal. So every simple wedge in \bigwedge^k(I) is torsion. Since R is a domain, the torsion elements form a submodule, so every element of \bigwedge^k(I) (a finite sum of simple wedges) is torsion, and \bigwedge^k(I) is torsion as an R-module.
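The torsion argument hinges on a single identity: a_2b_1 \cdot \frac{a_1}{b_1} = a_1b_2 \cdot \frac{a_2}{b_2} = a_1a_2, so that after multiplying by a_1a_2b_1b_2 the first two wedge factors coincide. A check of that identity over \mathbb{Q} (a sketch of the scalar bookkeeping only):

```python
from fractions import Fraction

# Multiplying the first wedge factor by a2*b1 and the second by a1*b2
# (total scalar: a1*a2*b1*b2) makes the first two factors equal,
# so the wedge is killed by a nonzero ring element.
for (a1, b1, a2, b2) in [(2, 3, 5, 7), (-4, 9, 1, 6), (10, -3, 7, 2)]:
    assert (a2 * b1) * Fraction(a1, b1) == (a1 * b2) * Fraction(a2, b2) == a1 * a2
    assert (a2 * b1) * (a1 * b2) == a1 * a2 * b1 * b2
print("ok")
```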

Now consider R = \mathbb{Z}[x_1,\ldots,x_n], and let I = (x_1,\ldots,x_n). We claim that if \sum \alpha_ix_i = \sum \beta_ix_i in I (with \alpha_i, \beta_i \in R), then \alpha_i - \beta_i \in I for each i. To see this, fix some j and consider (\alpha_j - \beta_j)x_j = \sum_{i \neq j} (\beta_i - \alpha_i)x_i. Now x_j divides the left hand side, hence also the right hand side; say \sum_{i \neq j} (\beta_i - \alpha_i)x_i = x_jh_j, so that \alpha_j - \beta_j = h_j (cancelling x_j, as R is a domain). Every monomial on the right hand side is divisible by some x_i other than x_j, and the monomials of x_jh_j are exactly x_j times the monomials of h_j; hence every monomial of h_j is divisible by some x_i other than x_j. In particular, h_j \in I, so \alpha_j - \beta_j \in I. Since j was arbitrary, \alpha_i - \beta_i \in I for all i.
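This claim can be checked concretely in a small case. The sketch below works in \mathbb{Z}[x,y] with hand-rolled polynomials (a dict \{(i,j): c\} for the term c\,x^iy^j); membership in I = (x,y) is equivalent to having zero constant term:

```python
# Two representations of the same element of I = (x, y) in Z[x, y]:
# the coefficient differences must lie in I (zero constant term).

def padd(p, q):
    r = dict(p)
    for m, c in q.items():
        r[m] = r.get(m, 0) + c
        if r[m] == 0:
            del r[m]
    return r

def pneg(p):
    return {m: -c for m, c in p.items()}

def pmul(p, q):
    r = {}
    for (i1, j1), c1 in p.items():
        for (i2, j2), c2 in q.items():
            m = (i1 + i2, j1 + j2)
            r[m] = r.get(m, 0) + c1 * c2
            if r[m] == 0:
                del r[m]
    return r

X = {(1, 0): 1}
Y = {(0, 1): 1}

# x*y = y*x + 0*y  and also  x*y = 0*x + x*y: two coefficient tuples.
alpha = [Y, {}]
beta = [{}, X]
lhs = padd(pmul(alpha[0], X), pmul(alpha[1], Y))
rhs = padd(pmul(beta[0], X), pmul(beta[1], Y))
assert lhs == rhs  # same element of I

for a, b in zip(alpha, beta):
    diff = padd(a, pneg(b))
    assert diff.get((0, 0), 0) == 0  # difference has no constant term, so it lies in I
print("ok")
```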

Now represent the elements of I as “column vectors”: write \sum \alpha_i x_i as [\alpha_1\ \alpha_2\ \ldots\ \alpha_n]^\mathsf{T}. This does not give a unique representation of the elements of I. However, if two column vectors A and B represent the same element of I, then by the claim above there exists a third column vector H, with entries in I, such that A = B+H.

Now identify a tuple (v_1, \ldots, v_n) \in I^n with an n \times n matrix over R whose columns are coefficient vectors for the v_i, and consider the determinant of such a matrix, as an element of R, reduced mod I. The determinant is R-multilinear and alternating in the columns; we claim that mod I it is also well defined despite the nonuniqueness of this matrix representation. To that end, suppose matrices A and B represent the same element of I^n; by the above, A = B+H for a matrix H whose entries lie in I. Using the combinatorial formula for computing determinants, \mathsf{det}(B+H) = \sum_{\sigma \in S_n} \epsilon(\sigma) \prod_i (\beta_{\sigma(i),i} + h_{\sigma(i),i}). Expanding each product, every term involving at least one entry h_{\sigma(i),i} lies in I, and so goes to zero in the quotient R/I. So in fact \mathsf{det}(A) = \mathsf{det}(B+H) \equiv \mathsf{det}(B) \mod\ I; thus \mathsf{det} : I^n \rightarrow R/I is a well-defined alternating n-multilinear map, and so factors through \bigwedge^n(I). The tuple (x_1, \ldots, x_n) is represented by the identity matrix, so \mathsf{det}(x_1 \wedge \cdots \wedge x_n) = 1, which is nonzero in R/I \cong \mathbb{Z}; hence \bigwedge^n(I) \neq 0. In fact \bigwedge^k(I) \neq 0 for all 0 \leq k \leq n: the cases k = 0, 1 are clear, and if \bigwedge^k(I) were zero for some 2 \leq k < n, then \bigwedge^n(I) would vanish too, since every n-fold wedge is a product of a k-fold wedge with an (n-k)-fold wedge. Finally, for a single example with \bigwedge^k(I) \neq 0 for all k \geq 0, take R = \mathbb{Z}[x_1, x_2, \ldots] and I = (x_1, x_2, \ldots): for each fixed k, the determinant of the k \times k matrix of coefficients of x_1, \ldots, x_k gives, by the same argument, a nonzero alternating map I^k \rightarrow R/I, so \bigwedge^k(I) \neq 0.
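The determinant argument can be verified in a small case. The sketch below works in \mathbb{Z}[x,y] with n = 2, again using dict-based polynomials (\{(i,j): c\} for c\,x^iy^j); the image of a polynomial in R/I \cong \mathbb{Z} is its constant term. It checks that two different coefficient matrices for the same pair of elements of I have determinants that differ in R but agree mod I:

```python
# Determinant mod I = (x, y) in Z[x, y]: well defined on I^2 despite
# nonunique coefficient matrices.  Polynomials are dicts {(i, j): c}.

def padd(p, q):
    r = dict(p)
    for m, c in q.items():
        r[m] = r.get(m, 0) + c
        if r[m] == 0:
            del r[m]
    return r

def psub(p, q):
    return padd(p, {m: -c for m, c in q.items()})

def pmul(p, q):
    r = {}
    for (i1, j1), c1 in p.items():
        for (i2, j2), c2 in q.items():
            m = (i1 + i2, j1 + j2)
            r[m] = r.get(m, 0) + c1 * c2
            if r[m] == 0:
                del r[m]
    return r

def det2(M):  # determinant of a 2x2 matrix of polynomials
    return psub(pmul(M[0][0], M[1][1]), pmul(M[0][1], M[1][0]))

X, Y, ONE = {(1, 0): 1}, {(0, 1): 1}, {(0, 0): 1}

# Columns are coefficient vectors [alpha_1, alpha_2]^T for alpha_1*x + alpha_2*y.
# The pair (x, y) is represented by the identity matrix A, but also by B,
# using x = (1 + y)*x + (-x)*y.
A = [[ONE, {}],
     [{},  ONE]]
B = [[padd(ONE, Y), {}],
     [psub({}, X),  ONE]]

for col in (0, 1):  # both matrices represent the same pair of elements of I
    eltA = padd(pmul(A[0][col], X), pmul(A[1][col], Y))
    eltB = padd(pmul(B[0][col], X), pmul(B[1][col], Y))
    assert eltA == eltB

dA, dB = det2(A), det2(B)           # det(A) = 1, det(B) = 1 + y
assert dA != dB                     # different in R...
assert dA.get((0, 0), 0) == dB.get((0, 0), 0) == 1  # ...but congruent mod I, and nonzero
print("ok")
```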
