Hey guys, so just to be up front and clear, this is a homework question. I've looked at it all day, and am fairly stumped (a novel feeling for me). It makes some sense intuitively, but when it comes to a logic chain, I've got next to nothing. Bear with me, because half the reason I'm posting this is to think out loud to someone.

Assumptions:

Let [imath]V,W,X,Y[/imath] be vector spaces.

Let [imath]f:V \rightarrow W[/imath] be a linear transformation

Let [imath]Y \subseteq W[/imath] be a subspace

Let [imath]X := \lbrace v \in V : f(v) \in Y \rbrace[/imath]

In the first part of the problem, we prove that X is a subspace of V; pretty standard stuff. Where I have trouble is in the second part, worded thus:

"Prove that if f is surjective, then [imath]dim(V) - dim(X) = dim(W) - dim(Y)[/imath]."
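To make the statement concrete, here is a small numeric sanity check (my own toy example, not part of the assignment): take [imath]V = \mathbb{R}^3[/imath], [imath]W = \mathbb{R}^2[/imath], [imath]f(x,y,z) = (x,y)[/imath], and [imath]Y = span\lbrace (1,0) \rbrace[/imath]. Since [imath]f(v) \in Y[/imath] exactly when the second coordinate of [imath]f(v)[/imath] vanishes, [imath]dim(X)[/imath] can be computed as the nullity of the composite of [imath]f[/imath] with a basis of the annihilator of [imath]Y[/imath]:

```python
from fractions import Fraction

def rank(rows):
    """Rank of a small matrix over the rationals, by Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0  # next pivot row
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                factor = m[i][c] / m[r][c]
                m[i] = [a - factor * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

# f: R^3 -> R^2, f(x, y, z) = (x, y), written as a matrix (rank 2, so surjective).
A = [[1, 0, 0],
     [0, 1, 0]]
# Y = span{(1, 0)}; f(v) lies in Y iff (0, 1) . f(v) = 0, i.e. C @ A @ v = 0.
C = [[0, 1]]  # basis of the annihilator of Y in W
CA = [[sum(C[i][k] * A[k][j] for k in range(2)) for j in range(3)]
      for i in range(len(C))]
dim_V, dim_W, dim_Y = 3, 2, 1
dim_X = dim_V - rank(CA)  # X = f^{-1}(Y) is the kernel of C @ A
print(dim_V - dim_X, dim_W - dim_Y)  # 1 1
```

Here 3 - 2 = 2 - 1 = 1 on both sides, matching the identity to be proved.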

So far, I have about 3 pages of nonsensical notes and no real solution. Probably the only productive progress I've made is this:

Let [imath]g: X \rightarrow W[/imath] be defined by [imath]g(x) = f(x)[/imath] for [imath]x \in X[/imath]

Let [imath]\lbrace e_1, e_2, \ldots , e_n \rbrace[/imath] be a basis for X

Then it can be extended to [imath]\lbrace e_1, e_2, \ldots , e_n, e_{n+1}, \ldots , e_m \rbrace[/imath] as a basis for V

Similarly, let [imath]\lbrace d_1, d_2, \ldots , d_l \rbrace[/imath] be a basis for Y

Then it can be extended to [imath]\lbrace d_1, d_2, \ldots , d_l, d_{l+1}, \ldots, d_k \rbrace[/imath] as a basis for W

So really, what I'm trying to show is that

[math]m-n = k - l[/math]

I know that the definition of X combined with f being surjective must fix its dimension at [imath]n = m - (k - l)[/imath], but I'm lost on how to show it. Y is the image of g, and by the way X is defined, it is impossible to get an element v of V such that v isn't in X but f(v) is in Y. So such a v must be a linear combination of the V basis vectors [imath]e_{n+1}[/imath] through [imath]e_m[/imath], and f(v) must be a linear combination of [imath]d_{l+1}[/imath] through [imath]d_k[/imath], which is the difference I'm trying to prove is equal.

In other words, my current strategy is to consider places where X must be and where X must not be. Still, I can't really find a path through yet.

Any answers, advice, or comments are greatly appreciated.

## Dimensions of related vector spaces


### Dimensions of related vector spaces

Last edited by SirDucky on Sat Aug 28, 2010 11:45 am UTC, edited 1 time in total.

### Re: Dimensions of related vector spaces

Have you learned about supplementary spaces yet? If so, then you just need to show that a space supplemental to X in V and a space supplemental to Y in W have the same dimension. I haven't had too much caffeine yet this morning, but that doesn't look too horrific at first glance.

I will give one sound piece of advice. Don't consider an arbitrary basis for W, since {f(e1), f(e2), ..., f(em)} is a basis that is clearly far more relevant to your issues.


- jestingrabbit
- Factoids are just Datas that haven't grown up yet

### Re: Dimensions of related vector spaces

I'd suggest you break up your basis for V such that you have a subset that is a basis for the kernel of f and another subset that is the span of X. If you can prove such a basis exists, and establish the relation between the subsets, you're pretty much done.

ameretrifle wrote:Magic space feudalism is therefore a viable idea.

### Re: Dimensions of related vector spaces

Thanks for the responses, guys!

I haven't in fact learned about supplemental spaces, and Wikipedia doesn't have an entry for them. Do you mean complementary spaces?

The reason I didn't use {f(e1), f(e2), ..., f(em)} as a basis for W is that it's not necessarily linearly independent (a good example being projections). So although we can arrange things so that every d_i has some e_j with d_i = f(e_j), it's possible for several of the e_j's to map onto the same basis vector. (This is work I've done since posting.) Thanks for the help!

The basis as a span of X is sort of what I've done by extending the basis of X to a basis of V. However, I fail to see what the kernel has to do with anything. Could you maybe clarify?

And again, if anyone else has contributions, I'm all ears.


- jestingrabbit

### Re: Dimensions of related vector spaces

The kernel helps in that when you apply f to the basis, you get some number of zero terms and some number of nonzero terms. You should be able to prove that the nonzero terms are a basis for W, and that a subset of them is a basis for Y.


### Re: Dimensions of related vector spaces

Thanks, that actually makes a lot of sense. However, since f is an unspecified, arbitrary transformation, it could have a large, small, or trivial kernel. I agree that the non-kernel elements of the basis map to a set that spans W, but this is the whole linear-independence issue I brought up in my last post. This may be my own mental density, but I can't see how setting aside a subset of basis vectors that spans the kernel would help me in my proof.

- jestingrabbit

### Re: Dimensions of related vector spaces

If you isolate the elements of the basis that span the kernel, you can prove that the nonzero images are linearly independent. I suppose that's the tricky part. Try to work out how to get from an element of the span of the nonzero images to a unique element of the span of the basis elements that aren't in the kernel.


- imatrendytotebag

### Re: Dimensions of related vector spaces

As people before me have alluded to, there is a relation between the dimension of the domain of f, V, the dimension of the kernel of f and the dimension of the range of f, W (since f is surjective). In particular, you should make a basis of V by first making a basis for the kernel of f, then extending. The next question is, where do X and Y come in? Well, use your function g:X -> W and apply the same relation. (Hint: the kernel of g = the kernel of f. Why?)
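As an illustration of this hint (a toy example of my own, not from the thread): for [imath]f(x,y,z) = (x,y)[/imath] and [imath]Y = span\lbrace (1,0) \rbrace[/imath], [imath]X[/imath] is the xz-plane, and applying rank-nullity to both [imath]f[/imath] and [imath]g = f|_X[/imath] shows the two kernels have the same dimension:

```python
from fractions import Fraction

def rank(rows):
    """Rank of a small matrix over the rationals, by Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0  # next pivot row
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                factor = m[i][c] / m[r][c]
                m[i] = [a - factor * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

A = [[1, 0, 0],          # f(x, y, z) = (x, y)
     [0, 1, 0]]
# X = {(x, y, z) : y = 0} has basis (1,0,0), (0,0,1); g = f restricted to X,
# written in that basis, sends them to (1,0) and (0,0) respectively:
G = [[1, 0],
     [0, 0]]
dim_V, dim_X = 3, 2
dim_ker_f = dim_V - rank(A)  # rank-nullity for f: 3 - 2
dim_ker_g = dim_X - rank(G)  # rank-nullity for g: 2 - 1
print(dim_ker_f, dim_ker_g)  # 1 1
```

The two nullities agree because 0 is in Y, so every kernel vector of f already lies in X; that is exactly the "kernel of g = kernel of f" hint.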

Hey baby, I'm proving love at nth sight by induction and you're my base case.

### Re: Dimensions of related vector spaces

All right guys! I finally got it. You were right, the nullspace was key to the whole thing. I figured I should post my proof here, if only for closure (I'm lazy and didn't take out all the $'s for inlines):

First, let us examine the linear transformation [imath]f:V\rightarrow W[/imath]:

Since [imath]f:V\rightarrow W[/imath] is a linear transformation, it can be represented as a matrix as per (Definition 2.2.10):

[math]A =
\begin{bmatrix}
a_{1,1} & \ldots & a_{1,j}\\
\vdots & \ddots & \vdots\\
a_{i,1} & \ldots & a_{i,j}
\end{bmatrix}[/math]

where [imath]j = dim(V)[/imath] and [imath]i = dim(W)[/imath].

Let [imath]A'[/imath], with [imath]A \thicksim A'[/imath], be the row-reduced equivalent of [imath]A[/imath]

As we know from Problem 3c:

- Columns containing leading entries in [imath]A'[/imath] correspond to a basis for [imath]Im(f)[/imath]
- Columns not containing leading entries in [imath]A'[/imath] correspond to a basis for [imath]Ker(f)[/imath]

[math]dim(Im(f)) + dim(Ker(f)) = j = dim(V)[/math]

Since [imath]g:X\rightarrow W[/imath] is also a linear transformation, we can similarly say:

[math]dim(Im(g)) + dim(Ker(g)) = dim(X)[/math]

Now, by definition, if [imath]v\in Ker(f)[/imath] then [imath]f(v) = 0[/imath]

And [imath]Y[/imath] is a vector space [imath]\Rightarrow 0\in Y \Rightarrow v\in X[/imath]

Therefore, [imath]Ker(f) = Ker(g)[/imath]

So let nullspace [imath]N = Ker(f) = Ker(g)[/imath]

Therefore [imath]dim(N) = dim(Ker(f)) = dim(Ker(g))[/imath]

Furthermore, we know that [imath]f[/imath] is surjective. Therefore:

[math]Im(f) = W[/math]

For [imath]g:X\rightarrow W[/imath], if [imath]x\in X[/imath] then [imath]g(x) \in Y[/imath] by definition. Conversely, if [imath]y \in Y[/imath] then, because [imath]f[/imath] is surjective, there exists [imath]x \in V[/imath] with [imath]f(x) = y[/imath]; since [imath]f(x) \in Y[/imath], we have [imath]x \in X[/imath] by definition of [imath]X[/imath], so [imath]g(x) = y[/imath]. Therefore:

[math]Im(g) = Y[/math]

So plugging these into our previous equations, we find:

[math]dim(W) + dim(N) = dim(V)[/math]

[math]dim(Y) + dim(N) = dim(X)[/math]

Therefore:

[math]dim(V) - dim(X) = dim(W) + dim(N) - (dim(Y) + dim(N))[/math]

[math]dim(V) - dim(X) = dim(W) - dim(Y)[/math]

Which is the proposition we set out to prove.
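The identity can also be spot-checked numerically. Below is a sketch (my own code; the check() helper and the annihilator-based construction of X are assumptions of this example, not part of the proof) that verifies [imath]dim(V) - dim(X) = dim(W) - dim(Y)[/imath] for a sample surjective map:

```python
from fractions import Fraction

def rank(rows):
    """Rank of a small matrix over the rationals, by Gaussian elimination."""
    m = [[Fraction(x) for x in row] for row in rows]
    r = 0  # next pivot row
    for c in range(len(m[0])):
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue
        m[r], m[piv] = m[piv], m[r]
        for i in range(len(m)):
            if i != r and m[i][c] != 0:
                factor = m[i][c] / m[r][c]
                m[i] = [a - factor * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def matmul(P, Q):
    """Plain list-of-lists matrix product."""
    return [[sum(P[i][k] * Q[k][j] for k in range(len(Q)))
             for j in range(len(Q[0]))] for i in range(len(P))]

def check(A, ann_Y, dim_Y):
    """Verify dim(V) - dim(X) == dim(W) - dim(Y), where A is a surjective
    map V -> W and ann_Y is a basis of the annihilator of Y in W."""
    dim_V, dim_W = len(A[0]), len(A)
    assert rank(A) == dim_W                 # f must be surjective
    dim_X = dim_V - rank(matmul(ann_Y, A))  # X = ker(ann_Y . f) = f^{-1}(Y)
    return dim_V - dim_X == dim_W - dim_Y

# f(x, y, z, w) = (x + y, z + w), Y = span{(1, 1)}, annihilator span{(1, -1)}
print(check([[1, 1, 0, 0], [0, 0, 1, 1]], [[1, -1]], 1))  # True
```

Computing X as the kernel of (annihilator of Y) composed with f mirrors the definition X = {v : f(v) in Y} without ever writing down a basis for X.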


- Yakk
- Poster with most posts but no title.

### Re: Dimensions of related vector spaces

Proof commentary...

A diagram:

Code: Select all

`       f
   V  ->  W
sub|      |sub
   X  ->  Y
      f|X`

that makes things look cooler. (ObCategory theory joke: "the above diagram commutes. QED.")

Are you sure you need to prove:

f:A->B linear -> dim(A) = dim(image(f)) + dim(ker(f))

? That is the kind of thing that you might have as a lemma/theorem from class.

What you call g can also be called f|_{X}: X -> W -- pronounced "f restricted to X".

Your ker(f|_{X}) = ker(f) needs at least one iff (or double-headed arrow).

f|_{X}: X -> Y seems useful to prove. But I guess you did it when you showed that img(f|_{X}) = Y, which avoided having to prove that your function f|_{X} is surjective onto Y as well.

One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision - BR

Last edited by JHVH on Fri Oct 23, 4004 BCE 6:17 pm, edited 6 times in total.

