## An Eigenbasis Problem


Carlington
Posts: 1588
Joined: Sun Mar 22, 2009 8:46 am UTC
Location: Sydney, Australia.

### An Eigenbasis Problem

In the interest of giving a source, this problem comes from this video.

To paraphrase the video (and just by the by, if anyone has a better way to represent matrices here, please share it):

Take the 2x2 matrix A = ((0,1),(1,1)). Computing powers of this matrix by hand quickly becomes tedious. However, as shown in the video, if a 2x2 matrix has two linearly independent eigenvectors, these vectors can be used as the basis vectors for a new basis. Given that two eigenvectors of this matrix are v1 = (2, 1+sqrt(5)) and v2 = (2, 1-sqrt(5)), can you find a way to simply compute any arbitrary A^n by first changing to an eigenbasis, computing the representation of A^n in that basis, and then converting back to our standard basis?

On a conceptual level, I understand what needs to be done here. I need some change-of-basis matrix B and its inverse B^-1, and I need to compute B^-1A^nB. Where I'm sticking is in computing B^-1, and possibly also in computing AB. My method at the moment is to compute AB, since I know how to do that, then apply A to the result repeatedly, and apply B^-1 at the end. Is it going to be simpler to compute B^-1AB, take the output A', and then compute A'^n? If so, is there some way to identify an easier path to computing the inverse of a matrix, apart from the standard "write B and I next to each other, perform row operations until you have turned B into I, and then whatever sits in place of where I was is now B^-1"? Because doing that is resulting in headbanging and terms like (-1/(1-sqrt(5)))(1 - (1+sqrt(5))/(1-sqrt(5))), and I'd rather not.
Kewangji: Posdy zwei tosdy osdy oady. Bork bork bork, hoppity syphilis bork.

Eebster the Great: What specifically is moving faster than light in these examples?
doogly: Hands waving furiously.

Please use he/him/his pronouns when referring to me.

Flumble
Yes Man
Posts: 2162
Joined: Sun Aug 05, 2012 9:35 pm UTC

### Re: An Eigenbasis Problem

Carlington wrote:is there some way to identify an easier path to computing the inverse of a matrix, apart from the standard "write B and I next to each other, and then perform row operations until you have turned B into I, and then whatever sits in place of where I was is now B^-1"?

For the general case, Gaussian elimination (I guess that's analogous to what you're doing) is the most accessible method (or at least the only one I understand).
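In Python, the closed-form 2x2 inverse can be sketched like so (the function name and example matrix are my own, not from the thread):

```python
def inv2x2(a, b, c, d):
    """Invert the 2x2 matrix ((a, b), (c, d)) via the closed form:
    swap the diagonal, negate the off-diagonal, divide by the determinant."""
    det = a * d - b * c
    if det == 0:
        raise ValueError("matrix is singular")
    return (d / det, -b / det, -c / det, a / det)

# ((2, 1), (5, 3)) has determinant 1, so its inverse is ((3, -1), (-5, 2)).
print(inv2x2(2.0, 1.0, 5.0, 3.0))
```

No row reduction needed, which avoids exactly the sqrt(5) headbanging described above.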
Luckily, for a 2x2 matrix ((a,b),(c,d)) the inversion comes down to a closed form: swap the diagonal entries, negate the off-diagonal ones, and divide by the determinant, giving (1/(ad-bc))*((d,-b),(-c,a)).

Tub
Posts: 430
Joined: Wed Jul 27, 2011 3:13 pm UTC
### Re: An Eigenbasis Problem

You've been told that B^-1AB is a diagonal matrix, which means that once you have B^-1, and thus B^-1AB, you can easily calculate (B^-1AB)^n.

All you need to figure out is a way to get from (B^-1AB)^n to A^n.

(Or you could calculate A^100 with no more than 8 matrix multiplications, without any knowledge of eigenvalues.)
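Tub's parenthetical is presumably exponentiation by squaring. A rough Python sketch (my own illustration; the helper names are made up) that also counts the multiplications:

```python
def mat_mul(X, Y):
    """Multiply two 2x2 matrices given as ((a, b), (c, d)) nested tuples."""
    (a, b), (c, d) = X
    (e, f), (g, h) = Y
    return ((a * e + b * g, a * f + b * h),
            (c * e + d * g, c * f + d * h))

def mat_pow(M, n):
    """M^n (n >= 1) by repeated squaring: one squaring per bit of n, plus
    one extra multiplication for each set bit after the lowest.
    Returns (M^n, multiplication count)."""
    result = None
    count = 0
    while n:
        if n & 1:
            if result is None:
                result = M           # first set bit: just copy, no multiply
            else:
                result = mat_mul(result, M)
                count += 1
        n >>= 1
        if n:                        # skip the final, unused squaring
            M = mat_mul(M, M)
            count += 1
    return result, count

A = ((0, 1), (1, 1))
A100, count = mat_pow(A, 100)
# count comes out as 8 here, matching the claim above; the entries of
# A^100 are (huge) Fibonacci numbers.
```

The chain is A^2, A^4, A^8, A^16, A^32, A^64 (six squarings), then A^4·A^32 and A^36·A^64 (two more products) to reach A^100.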

Carlington
Posts: 1588
Joined: Sun Mar 22, 2009 8:46 am UTC
Location: Sydney, Australia.

### Re: An Eigenbasis Problem

This feels fake because it seems like it's too simple, but to get from (B^-1AB)^n back to A^n, I think I can just find the linear transformation within the basis determined by v1 and v2 which transforms those basis vectors back to the standard î and ĵ, and apply it to (B^-1AB)^n.

EDIT: I'm going to have a rest and attack this again tomorrow, I think, because out of B^-1AB I get A back, which is definitely not a diagonal matrix.

cyanyoshi
Posts: 394
Joined: Thu Sep 23, 2010 3:30 am UTC

### Re: An Eigenbasis Problem

The way I think about this sort of thing is: if the eigenvectors of A form a basis, then A can be represented as A = BDB^-1, where each column of B is an eigenvector and D is a diagonal matrix of eigenvalues. Then taking powers of A is easy, because A^n = (BDB^-1)^n = (BDB^-1)(BDB^-1)...(BDB^-1) = BDD...DB^-1 = BD^nB^-1. This is easier to work with because taking powers of a diagonal matrix is easy.

Think about why A can be written as BDB^-1 in the first place. First of all, if there are not enough eigenvectors to span the entire space, then B cannot be invertible, because either 1) B isn't square, or 2) the columns of B are linearly dependent. One way to calculate Ax is to write x as a sum of eigenvectors (this is what an "eigenbasis" does). So hopefully x = α1v1 + ... + αnvn for some scalars αi. This can be written in matrix form as x = [v1 v2 ... vn]α = Bα, where α is a vector with elements (α1, ..., αn). Notice that the matrix B turns the standard basis vectors into eigenvectors of A.

In short: multiplying Ax is the same as finding the "α" coefficients of the vector x, then scaling each term by the appropriate eigenvalue λ. Convince yourself that multiplying by A^n can be done by scaling each of these terms by λ^n.
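That recipe can be sketched in Python (floating point; the eigenvectors are the ones given in the problem, the function name is mine):

```python
from math import sqrt

# Eigen-data for A = ((0, 1), (1, 1)): the eigenvectors from the problem
# statement, with eigenvalues (1 +/- sqrt(5)) / 2.
s5 = sqrt(5.0)
lam1, lam2 = (1 + s5) / 2, (1 - s5) / 2
v1, v2 = (2.0, 1 + s5), (2.0, 1 - s5)

def apply_An(x, n):
    """Compute A^n x by expanding x in the eigenbasis, scaling each
    coefficient by lambda^n, and recombining -- no matrix powers at all."""
    # Solve a1*v1 + a2*v2 = x for a1, a2 (Cramer's rule on a 2x2 system).
    det = v1[0] * v2[1] - v2[0] * v1[1]
    a1 = (x[0] * v2[1] - v2[0] * x[1]) / det
    a2 = (v1[0] * x[1] - x[0] * v1[1]) / det
    # Scale each coefficient by its eigenvalue raised to the n-th power.
    a1 *= lam1 ** n
    a2 *= lam2 ** n
    return (a1 * v1[0] + a2 * v2[0], a1 * v1[1] + a2 * v2[1])

# A^10 applied to (0, 1) gives (F_10, F_11) = (55, 89), up to rounding,
# since A is the Fibonacci step matrix.
print(apply_An((0.0, 1.0), 10))
```

The point is that only the coefficients a1, a2 ever get touched; the eigenvectors themselves are just scaled.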
Last edited by cyanyoshi on Wed May 03, 2017 6:24 am UTC, edited 1 time in total.

Meteoric
Posts: 333
Joined: Wed Nov 23, 2011 4:43 am UTC

### Re: An Eigenbasis Problem

Carlington wrote:On a conceptual level, I understand what needs to be done here. I need some change of basis matrix B and its inverse B^-1, and I need to compute B^-1A^nB.

I don't think that is what you want to do.

Fundamentally, the trick you're trying to use is that exponents of diagonal matrices are easy, but exponents of other matrices usually are not. So you take your change-of-basis matrix and apply it the other way around:

A = BDB^-1

where D is diagonal. Specifically, the entries of D should be the eigenvalues of A, and the columns of B should be the eigenvectors of A.
Now to calculate A^n, you know

A^n = (BDB^-1)^n
= BDB^-1BDB^-1...BDB^-1
= BD^nB^-1 (because all the adjacent B^-1B pairs cancel out)

And D^n is easy to calculate, since D is a diagonal matrix.
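Putting the whole A^n = BD^nB^-1 pipeline together for this thread's matrix, as a floating-point Python sketch (the names are mine; the eigenvectors are those given in the problem):

```python
from math import sqrt

s5 = sqrt(5.0)
# Columns of B are the eigenvectors from the problem; lam1, lam2 are the
# matching eigenvalues of A = ((0, 1), (1, 1)).
B = ((2.0, 2.0), (1 + s5, 1 - s5))
lam1, lam2 = (1 + s5) / 2, (1 - s5) / 2

def power_via_eigenbasis(n):
    """A^n = B D^n B^-1, computed entirely in floating point."""
    # D^n: raise each diagonal eigenvalue to the n-th power.
    d1, d2 = lam1 ** n, lam2 ** n
    # B^-1 from the 2x2 closed form (adjugate over determinant).
    det = B[0][0] * B[1][1] - B[0][1] * B[1][0]
    Binv = ((B[1][1] / det, -B[0][1] / det),
            (-B[1][0] / det, B[0][0] / det))
    # B * D^n just scales column i of B by d_i; then multiply by B^-1.
    BD = ((B[0][0] * d1, B[0][1] * d2),
          (B[1][0] * d1, B[1][1] * d2))
    return tuple(tuple(sum(BD[i][k] * Binv[k][j] for k in range(2))
                       for j in range(2))
                 for i in range(2))

# A^10 should come out as ((34, 55), (55, 89)) up to rounding:
# consecutive Fibonacci numbers, since A is the Fibonacci step matrix.
print(power_via_eigenbasis(10))
```

The sqrt(5) terms that caused the headbanging earlier in the thread all live inside B, D, and B^-1 and cancel out to (near-)integers in the final product.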
No, even in theory, you cannot build a rocket more massive than the visible universe.