Matrix multiplication seems so arbitrary

 Posts: 236
 Joined: Mon Mar 07, 2011 9:18 pm UTC
Matrix multiplication seems so arbitrary
In order to multiply 2 matrices you multiply the rows of one with the columns of the other, add some numbers up, and then you get a new matrix that is a different size than the ones you started with (many times it is actually smaller). All of this seems really arbitrary, and for some reason it is the only way (that I know of) to multiply 2 sets of numbers. Why not do something simpler, like multiplying the columns with each other, or not adding a bunch of numbers together? Also, why are there no other ways to multiply 2 sets of numbers together?
I am sure there is a good explanation for these things, since I have seen plenty of seemingly arbitrary decisions mathematicians make get explained, but so far I can't find any explanation for this.
 Yakk
 Poster with most posts but no title.
 Posts: 11115
 Joined: Sat Jan 27, 2007 7:27 pm UTC
 Location: E pur si muove
Re: Matrix multiplication seems so arbitrary
Do you know the dot product? A matrix multiplication is a bunch of dot products arranged into an output grid.
Row 1 of left, dot column 2 of right, produces row 1 column 2 of result.
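A minimal pure-Python sketch of that grid-of-dot-products view (the helper names `dot` and `matmul` are mine, purely for illustration):

```python
# Entry (i, j) of the product is the dot of row i of A with column j of B.

def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(x * y for x, y in zip(u, v))

def matmul(A, B):
    """Multiply an m x n matrix A by an n x p matrix B (lists of rows)."""
    cols_B = list(zip(*B))  # columns of B, as tuples
    return [[dot(row, col) for col in cols_B] for row in A]

A = [[1, 2],
     [3, 4]]
B = [[5, 6],
     [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```

Note that e.g. the (1, 2) entry, 22, really is row 1 of A dotted with column 2 of B: 1*6 + 2*8.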
The dot product is also the "projection" operation.
Matrix mathematics is also related to linear algebra, and linear transformations. NxM matrices (and MxN) define every linear transformation between spaces of dimension N and M. If you used a different operation, they wouldn't.
Matrix multiplication corresponds to linear operation composition.
Matrix multiplication by a vector corresponds to linear operation application to a vector.
If you want seemingly arbitrary functions, have you met the determinant? Did you prove that if you want a handful of abstract properties it has, that it is unique?
One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision. - BR
Last edited by JHVH on Fri Oct 23, 4004 BCE 6:17 pm, edited 6 times in total.
Re: Matrix multiplication seems so arbitrary
Here's a practical example. Suppose you are a furniture maker.
You can make a table out of 10 planks of wood, 20 screws, and 40 man-hours.
You can make a chair out of 8 planks of wood, 10 screws, and 30 man-hours.
This gives one matrix B:
[math]\left [ \begin{array}{rcccl} 10 & 20 & 40 \\ 8 & 10 & 30 \end{array}\right ][/math]
The rows are the furniture types, the columns what goes into making them.
Now you get 4 orders: 1 table and 4 chairs; 2 tables and 8 chairs; 1 chair; 1 table and 1 chair.
This gives a second matrix A:
[math]\left [ \begin{array}{rccl} 1 & 4 \\ 2 & 8 \\ 0 & 1 \\ 1 & 1 \end{array}\right ][/math]
The rows are the orders, the columns the types of furniture in the orders.
Now multiply the two matrices AxB, to get:
[math]\left [ \begin{array}{rccl} 42 & 60 & 160 \\ 84 & 120 & 320 \\ 8 & 10 & 30 \\ 18 & 30 & 70 \end{array}\right ][/math]
Each row is an order, and each column gives the amount of wood/screws/man-hours it takes to fulfil that order.
[Edited: changed BxA to AxB, thanks gorcee]
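As a quick sanity check, here is this example worked in pure Python (a sketch; `matmul` is just an illustrative helper, not a standard function):

```python
# Rows of A are orders (counts of tables and chairs);
# rows of B are furniture types (planks, screws, man-hours per item).

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

B = [[10, 20, 40],   # one table: planks, screws, man-hours
     [8, 10, 30]]    # one chair
A = [[1, 4],         # order 1: 1 table, 4 chairs
     [2, 8],         # order 2: 2 tables, 8 chairs
     [0, 1],         # order 3: 1 chair
     [1, 1]]         # order 4: 1 table, 1 chair

print(matmul(A, B))
# [[42, 60, 160], [84, 120, 320], [8, 10, 30], [18, 30, 70]]
```

The first row, for instance, is 1 table + 4 chairs: 10 + 4*8 = 42 planks, 20 + 4*10 = 60 screws, 40 + 4*30 = 160 man-hours.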
Last edited by jaap on Tue Mar 15, 2011 6:27 pm UTC, edited 1 time in total.
 Torn Apart By Dingos
 Posts: 817
 Joined: Thu Aug 03, 2006 2:27 am UTC
Re: Matrix multiplication seems so arbitrary
Linear algebra is really about linear transformations. By the properties of linearity, a linear transformation is uniquely defined by what it does to the unit vectors, because if you know that f((1,0))=u and f((0,1))=v, then you know that f((a,b))=f((a,0))+f((0,b))=af((1,0))+bf((0,1))=au+bv (I really want my vectors here to be column vectors, but I'm writing them lying down for brevity's sake). Let's throw the vectors u and v (as column vectors) into a matrix A. Since f is defined by A, we might as well write A instead of f everywhere. If x is a vector, let's write f(x)=A*x. This defines multiplication between matrices and column vectors. If the linear transformation f has the matrix A and the linear transformation g has the matrix B, then f(g(x))=A*(B*x). Now, matrix multiplication is defined such that (A*B)*x=A*(B*x).
In short, matrix multiplication is defined such that we can write any linear transformation f as f(x)=A*x, and such that if g(x)=B*x, then f(g(x))=(A*B)*x.
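This defining property, (A*B)*x = A*(B*x), can be checked numerically; a small pure-Python sketch (the helper names are mine):

```python
# Check that (A*B)*x == A*(B*x), i.e. the matrix product represents composition.

def matvec(M, x):
    return [sum(m * xi for m, xi in zip(row, x)) for row in M]

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 1]]
x = [5, 7]

assert matvec(matmul(A, B), x) == matvec(A, matvec(B, x))  # both give [31, 69]
```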

 Posts: 236
 Joined: Mon Mar 07, 2011 9:18 pm UTC
Re: Matrix multiplication seems so arbitrary
"The rows are the furniture types, the columns what goes into making them."
"The rows are the orders, the columns the types of furniture in the orders."
These 2 decisions both seem pretty arbitrary to me. Also I don't understand why multiplying these matrices would solve this problem.
"Each row is an order, with in each column the amount of wood/screws/manhours it takes to fulfil that order."
Why is this true?
Re: Matrix multiplication seems so arbitrary
scratch123 wrote:"The rows are the furniture types, the columns what goes into making them."
"The rows are the orders, the columns the types of furniture in the orders."
These 2 decisions both seem pretty arbitrary to me. Also I don't understand why multiplying these matrices would solve this problem.
"Each row is an order, with in each column the amount of wood/screws/manhours it takes to fulfil that order."
Why is this true?
It's how we set up the problem.
Here's another toy problem:
A contractor needs to spend some money on a capital investment to earn a tax credit. In total, he needs to spend 10000 dollars to earn the credit. He has decided that he can spend the money on buying computers, at 1000 each, or plotters at 2000 each. Because of shipping requirements from his supplier, he must buy exactly six items.
One way to write this is:
x + y = 6
1000*x + 2000*y = 10000
However, we can set this problem up as a matrix equation:
[math]\left(\begin{array}{cc} 1 & 1 \\
1000 & 2000 \end{array}\right)
\left(\begin{array}{c} x \\
y \end{array}\right) = \left(\begin{array}{c} 6 \\
10000 \end{array}\right)[/math]
x represents the number of computers to buy, and y represents the number of plotters.
One way to solve this is through Gaussian elimination. The basic principle is that we try to isolate the bottom row to have only 1 nonzero entry, which will uniquely solve for the value of y. The shortcut way of doing this is to "imagine" multiplying the first row by some number such that when you subtract the result from the bottom row, you have a zero in the (2,1) position.
However, more formally, what we're doing is multiplying both sides of this equation (on the left) by another matrix:
[math]\left(\begin{array}{cc} 1 & 0 \\
-1000 & 1 \end{array}\right)
\left(\begin{array}{cc} 1 & 1 \\
1000 & 2000 \end{array}\right)
\left(\begin{array}{c} x \\
y \end{array}\right) = \left(\begin{array}{cc} 1 & 0 \\
-1000 & 1 \end{array}\right) \left(\begin{array}{c} 6 \\
10000 \end{array}\right)[/math]
This results in:
[math]\left(\begin{array}{cc} 1 & 1 \\
0 & 1000 \end{array}\right)
\left(\begin{array}{c} x \\
y \end{array}\right) = \left(\begin{array}{c} 6 \\
4000 \end{array}\right)[/math]
Which gives us the answer.
To recap, we start with 2 linear equations in 2 unknowns. It is natural to write this in matrix form, because matrices are linear structures, and our 2 unknowns are mathematically equivalent to a 2D vector. In order to solve this equation, we want to left-multiply both sides of the equation by a new matrix (call it P; this is the [1 0; -1000 1] matrix above) that will isolate one of our variables. However, the left-hand side of our equation contains a matrix, and the right-hand side contains a vector. So in order to left-multiply by P, we have to have a system of multiplication that will work for both matrices and vectors. That means that the number of columns in P must equal the number of rows in our equation.
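The elimination step can be reproduced in a few lines of Python (a sketch using a throwaway `matmul` helper):

```python
# Left-multiplying by the elimination matrix P zeroes out the (2,1) entry.

def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

P = [[1, 0], [-1000, 1]]
M = [[1, 1], [1000, 2000]]     # coefficient matrix
b = [[6], [10000]]             # right-hand side as a column vector

assert matmul(P, M) == [[1, 1], [0, 1000]]
assert matmul(P, b) == [[6], [4000]]
# So 1000*y = 4000 gives y = 4 plotters, and then x = 6 - 4 = 2 computers.
```

As a final check: 2 computers at 1000 plus 4 plotters at 2000 is indeed 10000 dollars, and 2 + 4 = 6 items.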
Re: Matrix multiplication seems so arbitrary
scratch123 wrote:"The rows are the furniture types, the columns what goes into making them."
"The rows are the orders, the columns the types of furniture in the orders."
These 2 decisions both seem pretty arbitrary to me. Also I don't understand why multiplying these matrices would solve this problem.
"Each row is an order, with in each column the amount of wood/screws/manhours it takes to fulfil that order."
Why is this true?
Another way of looking at the other example:
Use the data. Solve the problem the manual way.
For the first problem, you have to make 1 table and 4 chairs. Each table takes 10 planks, 20 screws and 40 hours. Each chair takes 8 planks, 10 screws, and 30 hours.
So to make 1 table + 4 chairs you need:
10 planks + 4 * 8 planks, 20 screws + 4 * 10 screws, 40 hours + 4 * 30 hours = 42 planks, 60 screws, 160 hours. Notice how this is exactly the first row in the product A*B (also note, jaap wrote B*A, which was probably a typo, since the way he defined B and A, they can't be multiplied like that. But A*B gives the intended result).
We set up the matrices this way because in a matrix, you can think of the columns or rows as vectors. In a vector, v = (x,y), no matter how much you change x, y will remain unchanged until you explicitly change y. So if you're building furniture, think about it this way: no matter how hard you try, you can never turn a screw into a plank, and you can never turn a plank into a man-hour. In my example, you can never turn a plotter into a computer, or vice versa. These are fully independent quantities. When you have independent dimensions like this, it is a natural choice to think of them as vectors, because with a vector, you can never affect the value along one axis by only doing something to a value along another axis.
Re: Matrix multiplication seems so arbitrary
I can somewhat sympathize with the OP.
I remember taking an intro linear algebra course in my first semester at university, and being a little worried that it wasn't "intuitive" to me why we multiply matrices the way we do. Why do we take a row of the first matrix and a column of the second matrix?
Let's say we're inventing matrix algebra from scratch, and we have a problem like jaap's.
It makes sense to put these in a matrix. There are two options. We could have a row for each type of furniture:
[ 10 20 40 ]
[ 8 10 30 ]
or a column for each type of furniture:
[ 10 8 ]
[ 20 10 ]
[ 40 30 ]
It probably doesn't really matter which of those two conventions we choose. But once we choose one, we might then find it convenient to choose other conventions in a way that fits well with our previous choices.
So say we choose the first option, where we have a row for each type of furniture.
Next, we get some orders for particular quantities of furniture. Say one order is for 2 tables and 8 chairs. We can put those numbers in a matrix as well. What do we choose? Maybe
[ 2 8 ]
or maybe
[ 2 ]
[ 8 ]
If all we ever wanted to do was just multiply one matrix by one vector, we could probably choose either of those two options, and choose our definitions and conventions in a way that makes it work.
But maybe if we start doing more things with matrices, we'll see reasons for preferring one convention to another.
Now let's think a little more algebraically.
Suppose we have variables t and c that are linear functions of other variables s, w, and m:
t = 10w + 20s + 40m
c = 8w + 10s + 30m
Then, say we have other variables p, q, r that are linear functions of t and c. (p, q, r could represent "orders".)
p = 1t + 4c
q = 2t + 8c
r = 1t + 1c
The dependence of (t,c) on (w,s,m) could be summarized in a matrix. And the dependence of (p,q,r) on (t,c) could be summarized in a matrix. And we might have a choice among different conventions, but we probably want to be "consistent" in some sense (e.g. the first of these two matrices will have 2 rows and 3 columns, and the second will have 3 rows and 2 columns).
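To make that "consistency" concrete: composing the two substitutions by hand gives exactly the same coefficients as the matrix product. A pure-Python sketch (helper names are mine):

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

# (t, c) in terms of (w, s, m):
M1 = [[10, 20, 40],
      [8, 10, 30]]
# (p, q, r) in terms of (t, c):
M2 = [[1, 4],
      [2, 8],
      [1, 1]]

# The product expresses (p, q, r) directly in terms of (w, s, m):
M = matmul(M2, M1)

# Cross-check against direct substitution at a sample point (w, s, m) = (1, 2, 3):
w, s, m = 1, 2, 3
t = 10*w + 20*s + 40*m
c = 8*w + 10*s + 30*m
p = 1*t + 4*c
assert p == M[0][0]*w + M[0][1]*s + M[0][2]*m
```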
What I've said may not be super precise and I may have rambled a bit, but I hope these types of considerations can make the conventions for matrix multiplication seem a little less arbitrary.
Re: Matrix multiplication seems so arbitrary
scratch123 wrote:Why not do something simpler like multiply the columns with each other instead or not add a bunch of numbers together? Also why are there no other ways to multiply 2 sets of numbers together?
Others have addressed in depth why our usual matrix multiplication is defined the way it is. They all boil down to, "we do it that way because it's useful for solving certain types of problems". If we want to talk about other multiplications then we need to be looking at other types of problems. AND such problems and other multiplications do exist.
The Kronecker product (http://en.wikipedia.org/wiki/Kronecker_product) is one such product. The Schur product is another.
Most people won't see these (or other alternative) multiplications in a one-semester first-introduction-to-matrices class, but in a second or third semester of an undergrad matrix/linear algebra sequence, or in an entry-level graduate class, these are not uncommon products... Of course most of the material even in these courses will often deal with multiplication in a "normal" setting. Why? Because that's the setting we find ourselves in... and indeed the linear setting is what we try to force many of our "harder" problems into, because once we have a linear approximation we have all the tools of linear algebra at our disposal.
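For the curious, both alternative products are easy to sketch in pure Python (these helpers are illustrative, not from any library):

```python
def kron(A, B):
    """Kronecker product: each entry of A scales a full copy of B."""
    return [[A[i][j] * B[k][l]
             for j in range(len(A[0])) for l in range(len(B[0]))]
            for i in range(len(A)) for k in range(len(B))]

def schur(A, B):
    """Schur (Hadamard) product: entrywise multiplication of same-shape matrices."""
    return [[a * b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

A = [[1, 2], [3, 4]]
B = [[0, 1], [1, 0]]
assert kron(A, B) == [[0, 1, 0, 2], [1, 0, 2, 0], [0, 3, 0, 4], [3, 0, 4, 0]]
assert schur(A, A) == [[1, 4], [9, 16]]
```

Notice neither of these composes linear maps the way the usual product does; they solve different kinds of problems.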

 Posts: 236
 Joined: Mon Mar 07, 2011 9:18 pm UTC
Re: Matrix multiplication seems so arbitrary
I was trying to find a type of math that was like polynomials, except instead of the variables representing numbers they represented sets of numbers. I was hoping learning more about matrices would lead to something like this, but it hasn't. Matrices just seem too focused on geometry. I was also hoping to find some connection between matrices and cellular automata, but I haven't found one there either. Overall it seems like matrices are only used to solve problems that I am not interested in.
Re: Matrix multiplication seems so arbitrary
I wrote a handout on linear maps for a multivariable calculus class I was TAing this quarter. It is designed to have leading questions which force you to discover how linear maps work on your own. For this, it is very important that you work through the packet in order  don't skip ahead before you have solved an exercise. By the end of the packet, you should have a pretty strong understanding of why matrix multiplication works the way it does. If you have any questions, just reply at this thread and I will get back to you.
 Attachments

 254.01Handout1.pdf
 (98.81 KiB) Downloaded 140 times
Re: Matrix multiplication seems so arbitrary
@your last comment about matrices only being used for problems you are not interested in - this is just totally not true. Linear algebra is one of the great building blocks of modern mathematics. Most of mathematics could be viewed as a big machine for turning really hard problems into basic calculus, combinatorics, or linear algebra.
A big example comes from multivariable calculus. If you appreciate the power of the derivative, understand that it is something which gives you linear approximations to a function. In several variables, the derivative also gives you locally linear approximations - that is, in multivariable calculus, the derivative is a matrix (even if this was kept secret from you when/if you learned multiV).
Homology and cohomology are essentially tools for turning topological problems into linear algebra problems.
I know nothing about cellular automata, but I would be extremely surprised if linear maps never made an appearance.
Please don't underestimate the power of linear algebra.
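The "derivative is a matrix" point can be illustrated numerically: for the (arbitrarily chosen) map f(x, y) = (xy, x + y), the Jacobian matrix is [[y, x], [1, 1]], and finite differences recover it. A sketch (function and helper names are mine):

```python
def f(x, y):
    return (x * y, x + y)

def exact_jacobian(x, y):
    # Rows correspond to outputs of f, columns to inputs.
    return [[y, x],
            [1, 1]]

def numeric_jacobian(x, y, h=1e-6):
    """Approximate the Jacobian by bumping each input by h."""
    base = f(x, y)
    J = [[0.0, 0.0], [0.0, 0.0]]
    for j, (dx, dy) in enumerate([(h, 0.0), (0.0, h)]):
        bumped = f(x + dx, y + dy)
        for i in range(2):
            J[i][j] = (bumped[i] - base[i]) / h
    return J

J_num = numeric_jacobian(2.0, 3.0)
J_exact = exact_jacobian(2.0, 3.0)
assert all(abs(J_num[i][j] - J_exact[i][j]) < 1e-4
           for i in range(2) for j in range(2))
```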
Re: Matrix multiplication seems so arbitrary
scratch123 wrote:I was trying to find a type of math that was like polynomials, except instead of the variables representing numbers they represented sets of numbers. I was hoping learning more about matrices would lead to something like this, but it hasn't. Matrices just seem too focused on geometry. I was also hoping to find some connection between matrices and cellular automata, but I haven't found one there either. Overall it seems like matrices are only used to solve problems that I am not interested in.
Matrices have deep relationships to polynomials. Let's say you're doing a polynomial interpolation of data, and you want to exactly fit an (n-1)th-degree polynomial to n points. You'll form what's known as a Vandermonde matrix to do so.
Or, perhaps you have n data points, but only want an mth-order polynomial, where n > m. A Vandermonde matrix allows you to set up a linear least-squares problem.
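A tiny sketch of the Vandermonde idea (illustrative code, not a library routine): evaluating a polynomial at several points is exactly a Vandermonde matrix times the coefficient vector.

```python
def vandermonde(xs, degree):
    # Row i is [1, x_i, x_i^2, ..., x_i^degree].
    return [[x ** k for k in range(degree + 1)] for x in xs]

# p(x) = 2 + 3x + x^2, coefficients in increasing-degree order:
coeffs = [2, 3, 1]
xs = [0, 1, 2]
V = vandermonde(xs, 2)
values = [sum(row[k] * coeffs[k] for k in range(3)) for row in V]
assert values == [2, 6, 12]  # p(0), p(1), p(2)
```

Interpolation just runs this backwards: given the values, solve the linear system for the coefficients.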
Matrices also appear in things like dynamical systems and differential equations. In control systems, you might talk about Lyapunov stability, or the Riccati equations; these are matrix equations that have some very elegant and beautiful solution techniques.
The way we multiply matrices has deeper connections than just "it's convenient to do so." There are properties of matrices that can only hold if that method of multiplication is used. For example, let's say you do an eigenvalue decomposition of a full-rank square matrix: [imath]A = Q^{-1}VQ[/imath]. Then you can easily compute the eigenvalues of the inverse of A, since [imath]A^{-1} = (Q^{-1}VQ)^{-1} = Q^{-1}V^{-1}Q[/imath]. Why does this work? Well, when you invert a product of matrices, you invert each matrix and reverse the multiplication order. (If this seems arbitrary, it's not: once you understand a matrix inverse, you can see how this property immediately follows.) Well, since you've got an inverse Q on one side, and then a normal Q on the other, when you invert the whole shebang, your inverse Q and normal Q end up staying where they were before! All that changes is that you go from V to inverse V. What's even more interesting is that V is a diagonal matrix, so its inverse is just one over its diagonal entries.
Edited, because the eigenvector matrix need not be orthogonal. The rest holds, however.
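That eigen-identity is easy to spot-check with a concrete 2x2 example (matrices chosen arbitrarily by me; `matmul` is a throwaway helper):

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

Q    = [[1, 1], [0, 1]]
Qinv = [[1, -1], [0, 1]]
V    = [[2, 0], [0, 5]]        # diagonal eigenvalue matrix
Vinv = [[0.5, 0], [0, 0.2]]    # just one over the diagonal entries

A = matmul(matmul(Qinv, V), Q)        # A = Q^-1 V Q
Ainv = matmul(matmul(Qinv, Vinv), Q)  # claimed inverse: Q^-1 V^-1 Q

product = matmul(A, Ainv)             # should be (numerically) the identity
assert all(abs(product[i][j] - (1 if i == j else 0)) < 1e-12
           for i in range(2) for j in range(2))
```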
Last edited by gorcee on Wed Mar 16, 2011 3:38 pm UTC, edited 1 time in total.
 Yakk
 Poster with most posts but no title.
 Posts: 11115
 Joined: Sat Jan 27, 2007 7:27 pm UTC
 Location: E pur si muove
Re: Matrix multiplication seems so arbitrary
If you can connect your cellular automata to graph theory (and who can't?), then matrix multiplication is ridiculously useful.
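For example, powers of an adjacency matrix count walks in a graph: entry (i, j) of A^k is the number of length-k walks from node i to node j. A small pure-Python sketch:

```python
def matmul(A, B):
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)] for row in A]

# Adjacency matrix of the path graph 1 - 2 - 3:
Adj = [[0, 1, 0],
       [1, 0, 1],
       [0, 1, 0]]

A2 = matmul(Adj, Adj)
assert A2[0][2] == 1   # exactly one length-2 walk from node 1 to node 3 (via node 2)
assert A2[1][1] == 2   # two length-2 walks from node 2 back to itself
```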
One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision. - BR
Last edited by JHVH on Fri Oct 23, 4004 BCE 6:17 pm, edited 6 times in total.

 Posts: 236
 Joined: Mon Mar 07, 2011 9:18 pm UTC
Re: Matrix multiplication seems so arbitrary
The math I am interested in is usually much simpler than the stuff described in this topic, and is probably different from the math most people are interested in. I like math where simple rules create complex behavior (if it's not complex that's sometimes OK too) and math that is related to games (game theory and combinatorial game theory). Since I mainly learn math through Wikipedia, I know about the existence of many fields of math, but my depth of knowledge in them isn't enough to solve even the simplest problems in many of those fields. In other words my breadth of knowledge is great but my depth of knowledge is very low.
I don't like calculus much because it deals with infinite things too much. Also infinity reminds me of repetition which I don't like.
Re: Matrix multiplication seems so arbitrary
scratch123 wrote:The math I am interested in is usually much simpler than the stuff described in this topic, and is probably different from the math most people are interested in. I like math where simple rules create complex behavior (if it's not complex that's sometimes OK too) and math that is related to games (game theory and combinatorial game theory). Since I mainly learn math through Wikipedia, I know about the existence of many fields of math, but my depth of knowledge in them isn't enough to solve even the simplest problems in many of those fields. In other words my breadth of knowledge is great but my depth of knowledge is very low.
I don't like calculus much because it deals with infinite things too much. Also infinity reminds me of repetition which I don't like.
I hate to break it to you, but going from basic game theory to uncovering some of the elegant and important properties of the theory requires calculus. In particular, there's something called a Stieltjes Integral that appears frequently in the study of games. And in order to understand those, you have to understand calculus.
Infinity rarely has to do with repetition. Calculus is (more or less) based on the notion of limits, and while in studying limits you can encounter things like repeated sums, this is really a step in the process, and not necessarily the process itself.
 Antimony120
 Posts: 830
 Joined: Wed Apr 09, 2008 4:16 am UTC
 Location: Wherever you can look  wherever there's a fight, so hungry people can eat.
Re: Matrix multiplication seems so arbitrary
scratch123 wrote:The math I am interested in is usually much simpler than the stuff described in this topic, and is probably different from the math most people are interested in. I like math where simple rules create complex behavior (if it's not complex that's sometimes OK too) and math that is related to games (game theory and combinatorial game theory). Since I mainly learn math through Wikipedia, I know about the existence of many fields of math, but my depth of knowledge in them isn't enough to solve even the simplest problems in many of those fields. In other words my breadth of knowledge is great but my depth of knowledge is very low.
I don't like calculus much because it deals with infinite things too much. Also infinity reminds me of repetition which I don't like.
Matrices are GREAT for game theory; you can use them in a whole shwack of places if you set the problem up correctly. The reason everything seems to relate to geometry isn't because mathematicians have a really hard time drawing straight lines, it's because nearly everything can be represented geometrically if you want.
And if you don't like calculus you're going to have issues with a lot of math, and miss out on a lot of simple rules creating complex behaviour.
Wolydarg wrote:That was like a roller coaster of mathematical reasoning. Problems! Solutions! More problems!
****************Signature Dehosted, New Signature Under Construction************************
 Yakk
 Poster with most posts but no title.
 Posts: 11115
 Joined: Sat Jan 27, 2007 7:27 pm UTC
 Location: E pur si muove
Re: Matrix multiplication seems so arbitrary
https://secure.wikimedia.org/wikipedia/ ... ncy_matrix
Nothing requiring real numbers at all. No linear algebra, no calculus.
Still useful.
(Well, sort of linear algebra - but linear algebra on the integers, which isn't what is normally meant when people say linear algebra.)
That will, as noted, show up in analysis of automata and game theory.
One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision. - BR
Last edited by JHVH on Fri Oct 23, 4004 BCE 6:17 pm, edited 6 times in total.
Re: Matrix multiplication seems so arbitrary
Yakk wrote:https://secure.wikimedia.org/wikipedia/en/wiki/Adjacency_matrix
Nothing requiring real numbers at all. No linear algebra, no calculus.
Until you start to compute the spectrum of a graph (the eigenvalues of the adjacency matrix). Then both real/imaginary numbers and linear algebra show up at the same time.
The sad truth is, you can't really dodge calculus or linear algebra. Differential operators and matrices show up WAY too often, even if you are working with formal power series where you don't care a thing about limits or convergence.
Re: Matrix multiplication seems so arbitrary
Linear algebra is essential to probability. You can't do game theory without probability.
Even when it comes to CGT, linear algebra can be important depending on what game you're analyzing.
In fact, out of all branches of "mainstream" mathematics, linear algebra is perhaps the one with the most varied and pervasive applications.