February 1, 2014

http://coursework.tylerlogic.com/courses/upenn/math503/homework01
Some Language. Similar to Dummit and Foote [DF04, p. 365], for a left R-module N, a right R-module M, and an abelian group L, we will call a map α : M × N → L R-balanced if it satisfies all three of

α(m + m′, n) = α(m, n) + α(m′, n),
α(m, n + n′) = α(m, n) + α(m, n′),
α(mr, n) = α(m, rn),

for all m, m′ ∈ M, n, n′ ∈ N, and r ∈ R.

Let V and W be modules over a commutative ring R. Then according to the definition of tensor products, we have the existence of balanced maps α : V × W → V ⊗_{R}W and β : V × W → W ⊗_{R}V given by α(v, w) = v ⊗ w and β(v, w) = w ⊗ v. By the universal property of tensor products, β induces a group homomorphism s : V ⊗_{R}W → W ⊗_{R}V such that the diagram

commutes. Furthermore, since α(v, w) = v ⊗ w and β(v, w) = w ⊗ v, it follows that s(v ⊗ w) = w ⊗ v for each v ∈ V and w ∈ W.

Finally, since R is commutative, the above argument holds symmetrically when V and W are
swapped, as each is both a left and a right R-module. This implies the existence of a homomorphism
s′ : W ⊗_{R}V → V ⊗_{R}W such that s′(w ⊗ v) = v ⊗ w. Hence s′ ∘ s fixes the generators v ⊗ w of V ⊗_{R}W and s ∘ s′ fixes the generators w ⊗ v of W ⊗_{R}V, so s is invertible and therefore an isomorphism.
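For the finite-dimensional case V = F^m and W = F^n, the map s can be made concrete in coordinates: identifying v ⊗ w with the Kronecker product of the coordinate vectors, s is realized by a permutation matrix (often called the commutation matrix). The following is a minimal numpy sketch of that identification; the helper name `commutation_matrix` is ours, and floating-point vectors stand in for an arbitrary field.

```python
import numpy as np

def commutation_matrix(m, n):
    """Permutation matrix K with K @ np.kron(v, w) == np.kron(w, v)
    for v of length m and w of length n."""
    K = np.zeros((m * n, m * n))
    for i in range(m):
        for j in range(n):
            # the entry v_i * w_j sits at index i*n + j in kron(v, w)
            # and at index j*m + i in kron(w, v)
            K[j * m + i, i * n + j] = 1
    return K

v = np.array([1.0, 2.0, 3.0])   # element of V = F^3
w = np.array([4.0, 5.0])        # element of W = F^2
K = commutation_matrix(3, 2)

# s(v ⊗ w) = w ⊗ v, realized by the permutation matrix K
assert np.allclose(K @ np.kron(v, w), np.kron(w, v))
# K is orthogonal, hence invertible, witnessing that s is an isomorphism
assert np.allclose(K.T @ K, np.eye(6))
```

Since K is a permutation matrix, its inverse K.T plays the role of s′ above.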

Since under standard scalar multiplication (that is, matrix multiplication), R_{n} is a right M_{n}(F)-module and C_{n} is a left M_{n}(F)-module, the tensor product R_{n} ⊗_{M_{n}(F)}C_{n} of abelian groups is defined.

First let us define φ : R_{n} × C_{n} → F by φ(r, c) = rc, the product of the row vector r with the column vector c, viewed as an element of F. This map is M_{n}(F)-balanced, since (rA)c = r(Ac) for any A ∈ M_{n}(F), so it induces a group homomorphism ϕ : R_{n} ⊗_{M_{n}(F)}C_{n} → F with ϕ(r ⊗ c) = rc.

Lemma 2.1. Let ϕ be defined as it is above. For all r ∈ R_{n} and c ∈ C_{n}, if ϕ(r ⊗ c) = 0 then r ⊗ c = 0 ∈
R_{n} ⊗_{Mn(F)}C_{n}.

Proof. Suppose ϕ(r ⊗ c) = rc = 0. If r = 0, then r ⊗ c = 0 and we are done, so assume r ≠ 0. Then there is an invertible matrix P ∈ M_{n}(F) with rP = e_{1}, the first standard row vector. Writing d = P^{−1}c, we have r ⊗ c = rP ⊗ P^{−1}c = e_{1} ⊗ d, and e_{1}d = rPP^{−1}c = rc = 0, so the first component of d is zero. Now let E ∈ M_{n}(F) be the matrix whose (1,1) entry is one and whose remaining entries are zero. Then e_{1}E = e_{1} and Ed = 0, hence

r ⊗ c = e_{1} ⊗ d = e_{1}E ⊗ d = e_{1} ⊗ Ed = e_{1} ⊗ 0 = 0.

□
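The matrix identities driving Lemma 2.1 can be checked numerically in a small case. The sketch below (floats standing in for F = Q, with concrete choices of P and E) shows that rc = 0 can hold even when no component of r or c vanishes, and verifies the balanced-map manipulations that nonetheless force r ⊗ c = 0.

```python
import numpy as np

r = np.array([[1.0, 1.0]])        # row vector in R_2 with no zero entries
c = np.array([[1.0], [-1.0]])     # column vector in C_2
assert np.allclose(r @ c, 0)      # rc = 0, yet every component is nonzero

# Invertible P with rP = e_1, the first standard row vector
P = np.array([[1.0, -1.0], [0.0, 1.0]])
e1 = np.array([[1.0, 0.0]])
assert np.allclose(r @ P, e1)
assert abs(np.linalg.det(P)) > 1e-12      # P is invertible

d = np.linalg.inv(P) @ c          # d = P^{-1} c, so r ⊗ c = e_1 ⊗ d
assert np.allclose(e1 @ d, 0)     # e_1 d = rc = 0
assert np.allclose(d[0, 0], 0)    # hence the first component of d vanishes

# The matrix unit E_11 fixes e_1 on the right and kills d on the left
E = np.array([[1.0, 0.0], [0.0, 0.0]])
assert np.allclose(e1 @ E, e1)    # e_1 E = e_1
assert np.allclose(E @ d, 0)      # E d = 0, so e_1 ⊗ d = e_1 ⊗ Ed = 0
```

Each assertion corresponds to one step of the chain r ⊗ c = e_{1} ⊗ d = e_{1}E ⊗ d = e_{1} ⊗ Ed = 0.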

Lemma 2.2. For every element x ∈ R_{n} ⊗_{Mn(F)}C_{n} there exist elements r ∈ R_{n} and c ∈ C_{n} such that x = r ⊗ c.

Proof. As R_{n} ⊗_{M_{n}(F)}C_{n} is made up of finite linear combinations of elements of the form r ⊗ c, it suffices to show that any two such elements can be combined into one. So let r_{1} ⊗ c_{1} + r_{2} ⊗ c_{2} be arbitrary in R_{n} ⊗_{M_{n}(F)}C_{n}, and denote the components of r_{1} by r_{11},…,r_{1n}. If r_{1} = 0, then r_{1} ⊗ c_{1} = 0 and the sum is already the single tensor r_{2} ⊗ c_{2}, so assume r_{1} ≠ 0 and fix an index j with r_{1j} ≠ 0. Define A ∈ M_{n}(F) to be the matrix whose j-th row is r_{1j}^{−1}r_{2} and whose remaining rows are zero, so that r_{1}A = r_{2}. Therefore we have

r_{1} ⊗ c_{1} + r_{2} ⊗ c_{2} = r_{1} ⊗ c_{1} + r_{1}A ⊗ c_{2} = r_{1} ⊗ c_{1} + r_{1} ⊗ Ac_{2} = r_{1} ⊗ (c_{1} + Ac_{2}).

□
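The key step in Lemma 2.2 is producing a matrix A with r_{1}A = r_{2}, after which the two tensors collapse into one. The numpy sketch below (floats standing in for F, with one concrete choice of the row-selection construction) verifies this identity, and also checks that applying ϕ, i.e. the matrix product, to both sides of the collapsed expression agrees.

```python
import numpy as np

r1 = np.array([[2.0, 0.0, 5.0]])   # note the zero component in r1
r2 = np.array([[7.0, 1.0, -3.0]])
c1 = np.array([[1.0], [2.0], [3.0]])
c2 = np.array([[4.0], [0.0], [-1.0]])

# r1 is nonzero; pick j with r1_j != 0 and put r2 / r1_j in row j of A
j = 0
A = np.zeros((3, 3))
A[j, :] = r2 / r1[0, j]
assert np.allclose(r1 @ A, r2)     # r1 A = r2

# Hence r1⊗c1 + r2⊗c2 = r1⊗c1 + r1⊗(A c2) = r1 ⊗ (c1 + A c2),
# and applying ϕ (the matrix product) to both sides gives the same scalar
lhs = r1 @ c1 + r2 @ c2
rhs = r1 @ (c1 + A @ c2)
assert np.allclose(lhs, rhs)
```

The example deliberately uses an r_{1} with a zero component, where no diagonal A could satisfy r_{1}A = r_{2}, to show the row construction still works.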

Now let r_{1} ⊗ c_{1} + ⋯ + r_{n} ⊗ c_{n} in R_{n} ⊗_{M_{n}(F)}C_{n} be such that ϕ(r_{1} ⊗ c_{1} + ⋯ + r_{n} ⊗ c_{n}) = 0. By Lemma 2.2 there are r ∈ R_{n} and c ∈ C_{n} such that r_{1} ⊗ c_{1} + ⋯ + r_{n} ⊗ c_{n} = r ⊗ c. By Lemma 2.1, r_{1} ⊗ c_{1} + ⋯ + r_{n} ⊗ c_{n} = 0 since
ϕ(r_{1} ⊗ c_{1} + ⋯ + r_{n} ⊗ c_{n}) = ϕ(r ⊗ c) = 0.

Hence ϕ is an injective linear transformation whose codomain F is a one-dimensional vector space. Then, since ϕ is not the zero map (for instance, ϕ(e_{1} ⊗ e_{1}^{⊤}) = 1, where e_{1} is the first standard row vector), its image is all of F, and so R_{n} ⊗_{M_{n}(F)}C_{n} must also be a one-dimensional vector space. Furthermore, in light of Lemma 2.2, ϕ is the explicit isomorphism we need, mapping r ⊗ c to rc.

Let F be a field and U, V , and W be vector spaces over F. Furthermore let u, u′ ∈ U, v, v′ ∈ V , w, w′ ∈ W, and a ∈ F be arbitrary.

Let β : U × V × W → U ⊗ (V ⊗ W) be the map given by β(u, v, w) = u ⊗ (v ⊗ w).

Let T : U × V × W → X be an F-trilinear map, where X is some F-vector space. Then for a fixed u ∈ U, we can define T_{u} : V × W → X by T_{u}(v, w) = T(u, v, w). Each T_{u} is F-bilinear, so by the universal property of tensor products there is a unique F-linear map φ_{u} : V ⊗ W → X such that

T_{u} = φ_{u} ∘ i,    (3.1)

where i : V × W → V ⊗ W is the “inclusion” map (v, w) ↦ v ⊗ w; i.e. the diagram

commutes.

Next, we will use these φ_{u} maps to obtain the f for which we’re looking. So define ϕ : U × (V ⊗ W) → X by
ϕ(u, t) = φ_{u}(t); in particular, ϕ(u, v ⊗ w) = φ_{u}(v ⊗ w) = T(u, v, w). Then equation 3.1 gives us

ϕ(au + u′, v ⊗ w) = T(au + u′, v, w) = aT(u, v, w) + T(u′, v, w) = aϕ(u, v ⊗ w) + ϕ(u′, v ⊗ w)

and

ϕ(u, a(v ⊗ w) + v′ ⊗ w′) = φ_{u}(a(v ⊗ w) + v′ ⊗ w′) = aφ_{u}(v ⊗ w) + φ_{u}(v′ ⊗ w′) = aϕ(u, v ⊗ w) + ϕ(u, v′ ⊗ w′),

so ϕ is F-bilinear. By the universal property of tensor products, there is therefore a unique F-linear map f : U ⊗ (V ⊗ W) → X such that ϕ = f ∘ i′, where i′ : U × (V ⊗ W) → U ⊗ (V ⊗ W) is the inclusion map (u, t) ↦ u ⊗ t. Finally, defining ψ : U × V × W → U × (V ⊗ W) by ψ(u, v, w) = (u, v ⊗ w), we have ϕ ∘ ψ = T and i′ ∘ ψ = β, so
the equation ϕ ∘ ψ = f ∘ i′ ∘ ψ implies T = f ∘ β.
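In coordinates, the factorization T = f ∘ β can be seen directly for finite-dimensional spaces: any F-trilinear T is given by a 3-dimensional array of coefficients, β is the iterated Kronecker product, and f is the flattening of that array. A numpy sketch (random data over the reals, with dimensions chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(0)
dimU, dimV, dimW = 2, 3, 2

# A generic F-trilinear map T(u, v, w) = sum_{ijk} M[i,j,k] u_i v_j w_k
M = rng.standard_normal((dimU, dimV, dimW))
def T(u, v, w):
    return np.einsum('ijk,i,j,k->', M, u, v, w)

# beta(u, v, w) = u ⊗ (v ⊗ w): in coordinates, an iterated Kronecker product
def beta(u, v, w):
    return np.kron(u, np.kron(v, w))

# The unique linear f with T = f ∘ beta is the flattening of M,
# since kron uses the same index ordering as reshape (row-major)
f = M.reshape(-1)

u = rng.standard_normal(dimU)
v = rng.standard_normal(dimV)
w = rng.standard_normal(dimW)
assert np.allclose(T(u, v, w), f @ beta(u, v, w))
```

Uniqueness is also visible here: f is determined on the vectors β(e_i, e_j, e_k), which form a basis of the 12-dimensional space U ⊗ (V ⊗ W).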

Due to the symmetry of the situation, it is a laborious plug-and-chug operation to prove that

Given an F-trilinear map T : U × V × W → X, where X is some F-vector space, there exists a unique F-linear map f : (U ⊗ V ) ⊗ W → X such that T = f ∘ β′, where β′ : U × V × W → (U ⊗ V ) ⊗ W is given by β′(u, v, w) = (u ⊗ v) ⊗ w,

given that we already have the above proof in our hands. So we omit the proof and simply admit the above statement as fact. As there are already grounds for a multilinear universal property of tensor products [Lan02, p. 603], we will overload the terminology by referring to the above statement, the statement proven in the previous part of this problem, and the original two-dimensional property all as “the universal property of tensor products”.

So by part (a) of this problem, we have that β : U × V × W → U ⊗ (V ⊗ W) is F-trilinear. The same argument holds, by shuffling around parentheses, for β′ : U × V × W → (U ⊗ V ) ⊗ W, implying that it is F-trilinear. Therefore, the universal property of tensor products implies that there exist unique group homomorphisms α′ : (U ⊗ V ) ⊗ W → U ⊗ (V ⊗ W) and α : U ⊗ (V ⊗ W) → (U ⊗ V ) ⊗ W such that α′((u ⊗ v) ⊗ w) = β(u, v, w) and α(u ⊗ (v ⊗ w)) = β′(u, v, w), i.e. the following diagrams commute

But the uniqueness implies that α′ and α are inverses of each other. Thus α is the desired isomorphism between U ⊗ (V ⊗ W) and (U ⊗ V ) ⊗ W.
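In coordinates, this isomorphism is invisible: under the Kronecker-product identification, u ⊗ (v ⊗ w) and (u ⊗ v) ⊗ w are literally the same vector of products u_i v_j w_k, so α is the identity map on coordinates. A one-line numpy check (floats standing in for F):

```python
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, 4.0, 5.0])
w = np.array([6.0, 7.0])

# The Kronecker product is associative, reflecting that alpha acts as
# the identity once both triple tensor products are written in coordinates
assert np.allclose(np.kron(u, np.kron(v, w)), np.kron(np.kron(u, v), w))
```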

Let’s start off by defining, for each φ ∈ U^{∨} and ϕ ∈ V ^{∨}, the functional Φ_{φϕ} ∈ (U ⊗ V )^{∨} to be the F-linear map induced, via the universal property of tensor products, by the F-bilinear map U × V → F given by (u, v) ↦ φ(u)ϕ(v); in particular, Φ_{φϕ}(u ⊗ v) = φ(u)ϕ(v).

Now define f : U^{∨} × V ^{∨} → (U ⊗ V )^{∨} by f(φ, ϕ) = Φ_{φϕ}. Yet again we have an F-bilinear map here, by the following
equations, using what we have shown above.

Now we define g : (U ⊗ V )^{∨} → U^{∨} ⊗ V ^{∨} (this will be our inverse of f) to be the map

where, for α ∈ (U ⊗ V )^{∨}, the map α̃ : U × V → F is the bilinear map associated with α according to the universal property of tensor products, i.e. α̃(u, v) = α(u ⊗ v). By the following we see that g is the inverse of f.
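For finite-dimensional spaces, the functional Φ_{φϕ} has a concrete coordinate description: writing functionals as row vectors, Φ_{φϕ} is the Kronecker product of the rows for φ and ϕ, and its defining property Φ_{φϕ}(u ⊗ v) = φ(u)ϕ(v) is the mixed-product rule for Kronecker products. A numpy sketch (random real data; `psi` is used in place of the document's second functional ϕ to keep the two lookalike symbols apart):

```python
import numpy as np

rng = np.random.default_rng(1)
dimU, dimV = 2, 3

# Functionals phi ∈ U^∨ and psi ∈ V^∨, represented as row vectors
phi = rng.standard_normal(dimU)
psi = rng.standard_normal(dimV)

# f(phi, psi) = Phi, the functional on U ⊗ V acting by phi(u) * psi(v);
# in coordinates, Phi is the Kronecker product of the two rows
Phi = np.kron(phi, psi)

u = rng.standard_normal(dimU)
v = rng.standard_normal(dimV)
assert np.allclose(Phi @ np.kron(u, v), (phi @ u) * (psi @ v))
```

This also makes the dimension count behind the claimed isomorphism visible: both U^{∨} ⊗ V ^{∨} and (U ⊗ V )^{∨} have dimension (dim U)(dim V).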