## Vectors and Matrix Determinants


pogrmman
Posts: 553
Joined: Wed Jun 29, 2016 10:53 pm UTC
Location: Probably outside

### Vectors and Matrix Determinants

So, matrices have always kind of puzzled me, but what puzzles me more has to do with the determinant of a matrix of vectors.

If you have a matrix composed of n-1 n-dimensional vectors and a row of the form [i,j,k,...] (i.e. the unit vectors along each of the axes), the determinant is a vector perpendicular to the ones you started with. (So, if you have the vector <1,2> and put it in a matrix with [i,j] as the other row, you'll get the vector 2i-j or <2,-1>, which is perpendicular to <1,2>. You could also get j-2i or <-2,1> which is also perpendicular.)

You've probably seen this in calculating the cross product. What I want to know is why this works. (It seems to work for n at least up to 6 -- I couldn't be bothered to take the determinant of a 7x7 matrix, but my guess is it will work)

Do any of you have a way to explain why this works?

Thanks!

EDIT: The position of the row [i,j,k,...] might matter -- I'm not sure -- but I'm not about to calculate the determinant of a 3x3 matrix without scratch paper... (It appears as if it works if this row is on top or on the bottom -- and 3x3 is the minimum to test somewhere that isn't the top or bottom...)
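The construction above is easy to check numerically. Here's a minimal sketch in pure Python (the function names are my own, made up for illustration): expanding the determinant along the symbolic [i, j, k, ...] row means component i of the result is (-1)^i times the determinant of your vectors with column i deleted. The determinant itself is done with the Leibniz permutation formula, which is fine at these sizes.

```python
from itertools import permutations

def det(m):
    """Determinant by the Leibniz formula -- fine for the small sizes here."""
    n = len(m)
    total = 0
    for perm in permutations(range(n)):
        # Sign of the permutation = (-1)^(number of inversions)
        inversions = sum(perm[i] > perm[j]
                         for i in range(n) for j in range(i + 1, n))
        prod = 1
        for row, col in enumerate(perm):
            prod *= m[row][col]
        total += (-1) ** inversions * prod
    return total

def generalized_cross(vectors):
    """Expand the determinant along the symbolic [i, j, k, ...] top row:
    component i is (-1)**i times the determinant of the (n-1) x n array
    of input vectors with column i deleted."""
    n = len(vectors) + 1
    return [(-1) ** i * det([row[:i] + row[i + 1:] for row in vectors])
            for i in range(n)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# The 2D example from the post: top row [i, j], second row [1, 2]
print(generalized_cross([[1, 2]]))    # [2, -1]

# A 4D check: the result should be orthogonal to all three inputs
vs = [[1, 2, 3, 4], [0, 1, 0, 2], [3, 0, 1, 1]]
w = generalized_cross(vs)
print([dot(w, v) for v in vs])        # [0, 0, 0]
```

The orthogonality check works for any n: replacing the symbolic row with one of the input vectors gives a matrix with a repeated row, whose determinant is zero, and that determinant is exactly the dot product of the result with that input vector.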

cyanyoshi
Posts: 389
Joined: Thu Sep 23, 2010 3:30 am UTC

### Re: Vectors and Matrix Determinants

Let's try to find an n×1 vector vₙ so that v₁ᵀvₙ = ... = vₙ₋₁ᵀvₙ = 0. I define V = [v₁ v₂ ... vₙ₋₁] so that we have Vᵀvₙ = [0 ... 0]ᵀ. Now let's make up a vector "w" and matrix "X" so that [V w] and [X vₙ] are square, and [V w]ᵀ[X vₙ] = Iₙ, where Iₙ is the n×n identity matrix. Expanding this out gives the system of equations VᵀX = Iₙ₋₁, Vᵀvₙ = 0, wᵀX = 0, wᵀvₙ = 1. So far so good.

Now we can just worry about finding a matrix inverse, since we would have that [X vₙ] = [V w]⁻ᵀ. We can get a vₙ that works by looking at the last column of [V w]⁻ᵀ. It turns out there is a nifty way to calculate the inverse of a matrix using cofactors. What happens is that the i-th element of vₙ is some constant times (-1)ⁱ times the determinant of (V with the i-th row deleted). This is the same as the determinant of the matrix you described, times a constant!

Therefore a vector constructed by your method must be orthogonal to all the other vectors: if I give you any vector "w" that is linearly independent from {v₁, ..., vₙ₋₁}, the last column of the matrix [V w]⁻ᵀ is orthogonal to v₁, ..., vₙ₋₁, and it happens to be a constant times the vector you construct by your method.

Demki
Posts: 199
Joined: Fri Nov 30, 2012 9:29 pm UTC

### Re: Vectors and Matrix Determinants

In addition to cyanyoshi's post, I think these two videos might help you, at least for the case of 3D vectors:
https://youtu.be/eu6i7WJeinw
https://youtu.be/BaM7OCEm3G0

These two videos are part of a series about linear algebra.

pogrmman
Posts: 553
Joined: Wed Jun 29, 2016 10:53 pm UTC
Location: Probably outside

### Re: Vectors and Matrix Determinants

Thanks!