## [Linear Algebra] Finding an orthogonal vector


Qaanol
The Cheshirest Catamount
Posts: 3067
Joined: Sat May 09, 2009 11:55 pm UTC

### [Linear Algebra] Finding an orthogonal vector

Given a basis for a hyperplane in ℝⁿ, what’s the best way to obtain a vector orthogonal to it?

In particular, given n linearly independent vectors in ℝⁿ, is there an efficient way to calculate a vector orthogonal to the hyperplane containing all of their differences (v_i − v_0)?

The general problem would be to find the null space of the matrix formed by those differences, but since we know that null space has dimension 1, is there a way to take advantage of that? In the case I am interested in, all the given vectors have unit length, if that helps.
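A quick sketch of that general null-space approach in numpy (the helper name is mine, not from the thread; the SVD recovers the one-dimensional null space directly):

```python
import numpy as np

def hyperplane_normal(points):
    """Unit vector orthogonal to the hyperplane through the given
    n points in R^n, i.e. orthogonal to every difference p_i - p_0."""
    points = np.asarray(points, dtype=float)
    diffs = points[1:] - points[0]      # (n-1) x n matrix of differences
    # The null space of diffs has dimension 1; the last right singular
    # vector spans it.
    _, _, vt = np.linalg.svd(diffs)
    return vt[-1]                       # rows of vt are unit vectors
```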
wee free kings

Cauchy
Posts: 602
Joined: Wed Mar 28, 2007 1:43 pm UTC

### Re: [Linear Algebra] Finding an orthogonal vector

Don't you do that determinant thing that gets you the cross product?

For n-1 vectors in R^n, look at the matrix

[v_11 v_12 ... v_1n]
[v_21 v_22 ... v_2n]
.
.
.
[v_{n-1}1 v_{n-1}2 ... v_{n-1}n]
[x_1 x_2 ... x_n]

where x_i stands for the ith standard basis vector (1 in the ith position, 0 everywhere else), and take its determinant, expanding formally along the bottom row. It's an abuse of notation, I know, but it works.

More formally, let v_i, 1 <= i <= n-1 be the differences, so the hyperplane is spanned by the v_i. For 1 <= k <= n, let A_k be the determinant of the n-1 by n-1 matrix

[v_11 v_12 ... v_1{k-1} v_1{k+1} ... v_1n]
.
.
.
[v_{n-1}1 v_{n-1}2 ... v_{n-1}{k-1} v_{n-1}{k+1} ... v_{n-1}n]

where the kth component of each vector has been stripped out. These are the minors of the first matrix I wrote when you expand along the bottom row. Let A be the vector whose kth entry is the cofactor (-1)^(n+k) A_k. Then, for arbitrary w, the determinant of

[v_1]
[v_2]
.
.
.
[v_{n-1}]
[w]

is A dot w, by expanding the determinant along the bottom row. It follows that A dot w = 0 when w is any of the v_i, because the determinant then has a repeated row, so A is orthogonal to each of the v_i. A is also not the 0 vector: since the v_i are linearly independent, the (n-1) by n matrix with rows v_i has rank n-1, so at least one of its (n-1) by (n-1) minors A_k is nonzero.
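That cofactor expansion translates directly into code. A numpy sketch (the function name is mine), computing component k as the signed minor of the matrix with column k stripped out:

```python
import numpy as np

def generalized_cross(vectors):
    """Generalized cross product of n-1 vectors in R^n, via cofactor
    expansion along the symbolic bottom row of basis vectors."""
    M = np.asarray(vectors, dtype=float)   # shape (n-1, n)
    n = M.shape[1]
    w = np.empty(n)
    for k in range(n):
        minor = np.delete(M, k, axis=1)    # strip out the kth column
        # Cofactor sign for the bottom-row entry (row n, column k+1
        # in 1-based indexing) is (-1)^(n + k + 1) with 0-based k.
        w[k] = (-1) ** (n + k + 1) * np.linalg.det(minor)
    return w
```

For n = 3 this reduces to the ordinary cross product, which makes a convenient sanity check.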
(∫|p|²)(∫|q|²) ≥ (∫|pq|)²
Thanks, skeptical scientist, for knowing symbols and giving them to me.

cyanyoshi
Posts: 399
Joined: Thu Sep 23, 2010 3:30 am UTC

### Re: [Linear Algebra] Finding an orthogonal vector

If I am understanding correctly, you have an (n-1)-dimensional hyperplane defined as all affine combinations of a set of linearly independent vectors {v_0, ..., v_{n-1}}, all embedded in an n-dimensional vector space. You then want to compute a unit vector orthogonal to this plane.

Let V = [v_0, ..., v_{n-1}] (the given vectors as columns). If the plane doesn't pass through the origin, then the vector w = V(V^T V)^{-1} 1 will be orthogonal to the plane, where 1 is the vector of ones: by construction V^T w = 1, so v_i · w = 1 for every i, and hence w is orthogonal to every difference v_i − v_j. If that inverse can't be taken, you can generate a random vector a, add it to each v_i, and try again. (This translates the hyperplane away from the origin.) I haven't worked through the details, but I suspect that this should work.
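A sketch of that formula in numpy (naming is mine; it assumes the v_i are the columns of V and the plane misses the origin, so V^T V is invertible):

```python
import numpy as np

def normal_via_gram(V):
    """Compute w = V (V^T V)^{-1} 1.  Then v_i . w = 1 for every
    column v_i of V, so w is orthogonal to each difference v_i - v_j."""
    ones = np.ones(V.shape[1])
    # Solve the Gram system rather than forming the inverse explicitly.
    return V @ np.linalg.solve(V.T @ V, ones)
```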

Or you could do that determinant method. That's probably a better idea.

Qaanol
The Cheshirest Catamount
Posts: 3067
Joined: Sat May 09, 2009 11:55 pm UTC

### Re: [Linear Algebra] Finding an orthogonal vector

Cauchy wrote: Don't you do that determinant thing that gets you the cross product?

Thank you much!
wee free kings
