
### [Linear Algebra] Finding an orthogonal vector

Posted: **Sat Jun 03, 2017 10:24 pm UTC**

by **Qaanol**

Given a basis for a hyperplane in ℝⁿ, what’s the best way to obtain a vector orthogonal to it?

In particular, given n independent vectors in ℝⁿ, is there an efficient way to calculate a vector orthogonal to the hyperplane containing all of their differences (v_{i}−v_{0})?

The general problem would be to find the null space of the matrix formed by those differences, but since we know it has dimension 1, is there a way to take advantage of that? For the case I am interested in, all the given vectors have unit length, if that helps.
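For concreteness, here is a minimal numpy sketch of the null-space approach (the function name is mine): stack the differences into an (n−1)×n matrix and take the right singular vector belonging to the smallest singular value, which spans the one-dimensional null space.

```python
import numpy as np

def normal_via_svd(points):
    """Unit vector orthogonal to the hyperplane through n points in R^n.

    points: (n, n) array whose rows are v_0, ..., v_{n-1}.
    The differences v_i - v_0 form an (n-1) x n matrix; its 1-D null
    space is spanned by the last right singular vector.
    """
    points = np.asarray(points, dtype=float)
    D = points[1:] - points[0]      # (n-1) x n matrix of differences
    _, _, Vt = np.linalg.svd(D)     # rows of Vt are right singular vectors
    return Vt[-1]                   # the one for the smallest singular value

# Example in R^3: the plane through e_1, e_2, e_3.
pts = np.array([[1.0, 0.0, 0.0],
                [0.0, 1.0, 0.0],
                [0.0, 0.0, 1.0]])
n = normal_via_svd(pts)             # ±(1, 1, 1)/sqrt(3), sign not fixed
```

The sign of the result is arbitrary, and this costs a full SVD; the determinant trick below is cheaper if you only need one normal.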

### Re: [Linear Algebra] Finding an orthogonal vector

Posted: **Sun Jun 04, 2017 12:02 am UTC**

by **Cauchy**

Don't you do that determinant thing that gets you the cross product?

For n-1 vectors in R^n, look at the matrix

```
[ v_11       v_12       ...   v_1n       ]
[ v_21       v_22       ...   v_2n       ]
[   ⋮          ⋮                ⋮        ]
[ v_{n-1}1   v_{n-1}2   ...   v_{n-1}n   ]
[ x_1        x_2        ...   x_n        ]
```

where x_i stands for the standard basis vector e_i (1 in the ith position and 0 everywhere else), and take its determinant. It's an abuse of notation, I know, but it works.

More formally, let v_i, 1 <= i <= n-1 be the differences, so the hyperplane is spanned by the v_i. For 1 <= k <= n, let A_k be the determinant of the n-1 by n-1 matrix

```
[ v_11       ...   v_1{k-1}       v_1{k+1}       ...   v_1n       ]
[   ⋮                  ⋮               ⋮                  ⋮        ]
[ v_{n-1}1   ...   v_{n-1}{k-1}   v_{n-1}{k+1}   ...   v_{n-1}n   ]
```

where the kth component of each vector has been stripped out. These are the minors of the first matrix I wrote when you expand along the bottom row. Let A = ((-1)^{n+1} A_1, (-1)^{n+2} A_2, ..., (-1)^{2n} A_n), the vector of bottom-row cofactors (the alternating signs matter, since expansion along a row uses cofactors, not bare minors). Then, for arbitrary w, the determinant of

```
[ v_1     ]
[ v_2     ]
[   ⋮     ]
[ v_{n-1} ]
[ w       ]
```

is A dot w, by expanding the determinant along the bottom row. It follows that A dot w = 0 when w is any of the v_i, because the determinant then has a repeated row, so A is orthogonal to each of the v_i. A is also not the 0 vector: since the v_i are linearly independent, they extend to a basis of ℝⁿ by adding some vector w, and for that w the determinant (which equals A dot w) is nonzero.
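For what it's worth, a minimal numpy sketch of this cofactor construction (the function name is mine): component k of the result is (−1)^{n+k} times the determinant of the matrix with column k deleted, i.e. the bottom-row cofactor described above.

```python
import numpy as np

def generalized_cross(V):
    """Vector orthogonal to the n-1 rows of the (n-1) x n matrix V.

    Component k is the bottom-row cofactor: (-1)^(n+k) times the
    determinant of V with column k removed (k counted from 1).
    """
    V = np.asarray(V, dtype=float)
    m, n = V.shape
    assert m == n - 1, "need n-1 vectors in R^n"
    A = np.empty(n)
    for k in range(n):                          # k is 0-based here
        minor = np.delete(V, k, axis=1)         # drop column k
        A[k] = (-1) ** (n + k + 1) * np.linalg.det(minor)
    return A

# Example in R^3: rows e_1 and e_2 give the usual cross product e_3.
V = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
A = generalized_cross(V)
```

This costs n determinants of (n−1)×(n−1) matrices, but it needs no translation trick and the result is guaranteed nonzero whenever the rows are linearly independent.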

### Re: [Linear Algebra] Finding an orthogonal vector

Posted: **Sun Jun 04, 2017 12:32 am UTC**

by **cyanyoshi**

If I am understanding correctly, you have an (n-1)-dimensional hyperplane defined as all affine combinations of a set of linearly independent vectors {v_{0}, ... , v_{n-1}}, all embedded in an n-dimensional vector space. You then want to compute a unit vector orthogonal to this plane.

Let V=[v_{0}, ... , v_{n-1}]. If the plane doesn't pass through the origin, then the vector V(V^{T}V)^{-1}1 will be orthogonal to the plane, where 1 is the vector of ones. If that inverse can't be taken, then you can generate a random vector a to add to each v_{i} and try again. (This translates the hyperplane away from the origin.) I haven't worked through the details, but I suspect that this should work.
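A quick numpy sketch of this suggestion (the helper name is mine), for the square invertible case, where V(V^{T}V)^{-1}1 reduces to solving V^{T}n = 1:

```python
import numpy as np

def affine_normal(V):
    """Normal to the affine hyperplane through the columns of V (n x n).

    Computes n = V (V^T V)^{-1} 1, which for square invertible V is the
    solution of V^T n = 1 (each v_i . n = 1, so differences v_i - v_0
    are orthogonal to n). Requires the hyperplane to miss the origin,
    i.e. V invertible.
    """
    V = np.asarray(V, dtype=float)
    ones = np.ones(V.shape[1])
    return np.linalg.solve(V.T, ones)   # solve V^T n = 1

# Example: columns are e_1, e_2, e_3, so V = I and the normal is (1, 1, 1).
V = np.eye(3)
n = affine_normal(V)
```

If `np.linalg.solve` raises because V is singular, the plane passes through the origin; that is exactly the case where the random-translation trick above would be applied first.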

Or you could do that determinant method. That's probably a better idea.

### Re: [Linear Algebra] Finding an orthogonal vector

Posted: **Sun Jun 04, 2017 10:36 pm UTC**

by **Qaanol**

Cauchy wrote:
> Don't you do that determinant thing that gets you the cross product?

Thank you much!