## Macro Machines


### Macro Machines

I was reading an article on Turing machines and came across an interesting topic: the acceleration of Turing machines. What I mean by this is that you have a Turing machine, for example the current 6-state, 2-symbol best contender, which takes >10^56354 steps to reach the halt state. It is absolutely useless to run that machine step by step on an initially blank tape, simply because there is not enough computational power in the entire universe to run >10^56354 steps. Now add an acceleration technique, and suddenly you can simulate many more steps of this Turing machine in much less time than simulating each step individually.

One acceleration technique that is particularly interesting to me is the concept of a macro machine: a Turing machine with many more states, and possibly more symbols, that simulates what the original Turing machine would do, but where each step of the macro machine corresponds to at least one, and usually many, steps of the original machine. One way these machines do this is by taking a block of n symbols on the original tape and turning it into a single symbol of the macro machine; with additional state transitions, you can define how each group of n symbols behaves when the original Turing machine moves onto that part of the tape.

Now there are three possibilities for what the macro machine will do. It can change the symbol on the tape and move on to the next square (possibly changing state); it can reach a halting state, because if the original Turing machine would halt in the middle of the n symbols, then the macro machine must halt too; or it can enter a loop state, which means that the machine loops and never leaves that group of n symbols. You can test whether the original Turing machine loops within those n symbols, because there are only finitely many configurations (state, head position, and block contents), so you can run it for that many steps and check whether it ever leaves.
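The loop test described above can be sketched directly in code. This is a minimal illustration with names of my own choosing (not taken from any particular simulator): it runs the base machine inside one n-cell block until the head exits, the machine halts, or a configuration-counting bound guarantees that it loops forever.

```python
def macro_step(delta, n_states, block, state, entry_side):
    """Compute one macro-machine transition by direct simulation.

    delta: dict mapping (state, symbol) -> (write, move, new_state),
           with move in {-1, +1}; halting transitions are simply absent.
    block: tuple of n symbols; entry_side: 'L' or 'R'.
    Returns ('exit', new_block, new_state, side), ('halt',), or ('loop',).
    """
    tape = list(block)
    n = len(tape)
    pos = 0 if entry_side == 'L' else n - 1
    # There are at most n_states * n * |alphabet|**n distinct
    # configurations inside the block, so any longer run must repeat
    # a configuration and therefore loops forever.
    alphabet = {s for (_, s) in delta} | {w for (w, _, _) in delta.values()}
    bound = n_states * n * len(alphabet) ** n
    for _ in range(bound + 1):
        key = (state, tape[pos])
        if key not in delta:
            return ('halt',)          # the base machine halts inside the block
        write, move, state = delta[key]
        tape[pos] = write
        pos += move
        if pos < 0:
            return ('exit', tuple(tape), state, 'L')
        if pos >= n:
            return ('exit', tuple(tape), state, 'R')
    return ('loop',)
```

For example, a one-state machine that always writes 1 and moves right, entering the block (0, 0, 0) from the left, exits on the right with the block rewritten to (1, 1, 1); a machine that bounces back and forth between two cells is detected as looping.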

My question is: Is there an algorithm that, given the original Turing machine, can find a macro machine (it does not have to be optimal or anything like that) that groups n symbols (n is also given) in the way mentioned above, and speeds up the simulation of the Turing machine?
I have discovered a truly marvelous proof of this, which this margin is too narrow to contain.
tomtom2357

Posts: 433
Joined: Tue Jul 27, 2010 8:48 am UTC

### Re: Macro Machines

Sounds a lot like an abstraction of RISC vs CISC to me.
Please be gracious in judging my English. (I am not a native speaker/writer.)
lorb

Posts: 135
Joined: Wed Nov 10, 2010 10:34 am UTC
Location: Austria

### Re: Macro Machines

Random guess: probably. You compress the 2^n different possibilities for a block of n cells (expressed with n bits) into a single symbol with 2^n possible values. Reformulate the Turing machine in terms of these symbols, and it should work.
However, each step of this macro machine is more complicated than a step of the original machine. If you want to run both as simulations on a computer (and you don't want to build this physically anyway), it might be quicker, but in the worst case it might even be slower.

In addition, "add an acceleration technique" is not necessarily a solution. Does it speed up the calculation by a factor of 1000? Great, now we need only >10^56351 steps instead of >10^56354!
A factor of 10 billion? Now it is >10^56344.
The square root? This would be an exceptional speedup, by a factor of 10^28177. Now it is >10^28177.
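This arithmetic is just exponent bookkeeping: dividing 10^56354 by a constant factor subtracts that factor's log from the exponent, while a square root halves the exponent. A quick sanity check (the variable name is mine):

```python
from math import log10

exponent = 56354                          # step count ~ 10**56354
print(exponent - round(log10(1000)))      # factor 1000       -> exponent 56351
print(exponent - round(log10(10**10)))    # factor 10 billion -> exponent 56344
print(exponent // 2)                      # square root       -> exponent 28177
```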

A logarithmic reduction would help, but how do you get this?
mfb

Posts: 803
Joined: Thu Jan 08, 2009 7:48 pm UTC

### Re: Macro Machines

tomtom2357 wrote:My question is: Is there an algorithm that, given the original Turing machine, can find a macro machine (it does not have to be optimal or anything like that) that groups n symbols (n is also given) in the way mentioned above, and speeds up the simulation of the Turing machine?

Yes. Your recipe for a macro machine that groups blocks of n symbols wouldn't be hard to formalize. As you say, take a Turing machine T and create a new Turing machine T' which simulates T with a speedup factor of roughly n. The tape symbols of T' are strings of n original symbols, and its states are pairs (original state of T, L/R) together with a "loop" state, where the symbol L or R indicates whether the simulated T is reading the leftmost or rightmost cell of the current string. The transition function takes T' to the state simulating where T would be when it first leaves the current n-cell string and enters a neighboring string (or, if T loops forever within the current string, T' enters the loop state and gets stuck there). It wouldn't be exactly a factor-of-n speedup, because T can take more or fewer than n steps between entering a strip of n cells, coded by a single cell of T', and leaving it. But it takes at least n steps between entering and leaving whenever it leaves on the opposite side from the one it entered, so I think you can prove that for sufficiently large n the speedup factor can be made arbitrarily large (assuming T halts).
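One way to formalize this construction is to enumerate every macro state (q, side) and every n-symbol block and tabulate the outcome by simulating T inside the block. A rough standalone sketch, with names and conventions of my own choosing:

```python
from itertools import product

def build_macro_table(delta, states, alphabet, n):
    """Tabulate T': for each state q of T, each entry side, and each
    n-symbol block, run T inside the block until it exits, halts, or
    provably loops (a configuration must repeat once the step count
    exceeds len(states) * n * len(alphabet)**n).
    delta: (state, symbol) -> (write, move, new_state), move in {-1, +1};
    halting transitions are simply absent from delta."""
    bound = len(states) * n * len(alphabet) ** n
    table = {}
    for q, side in product(states, 'LR'):
        for block in product(alphabet, repeat=n):
            tape = list(block)
            pos, state = (0 if side == 'L' else n - 1), q
            outcome = ('loop',)
            for _ in range(bound + 1):
                key = (state, tape[pos])
                if key not in delta:
                    outcome = ('halt',)          # T halts inside the block
                    break
                write, move, state = delta[key]
                tape[pos] = write
                pos += move
                if pos < 0 or pos >= n:          # T leaves the block
                    outcome = ('exit', tuple(tape), state,
                               'L' if pos < 0 else 'R')
                    break
            table[(q, side, block)] = outcome
    return table
```

Note that the table has |states| * 2 * |alphabet|^n entries, which is exactly the exponential blow-up in machine size that buys the constant-factor speedup.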

However, as mfb says, an arbitrary constant speedup factor doesn't get you very far, if you need a Turing machine which is 2^13 times as complicated to get a speedup factor of 13. What would be nice is a machine which is faster by more than just a constant: one which computes the same function as the original Turing machine, but where the time or space complexity is lower. This is what mfb was driving at:
mfb wrote:The square root? This would be an exceptional speedup, by a factor of 10^28177. Now it is >10^28177.

A logarithmic reduction would help, but how do you get this?

However, while this may be possible for some machines, it is not possible in general, by the time hierarchy theorem. By arguments similar to the proof of the time hierarchy theorem, there's no general procedure which tells you whether a Turing machine can be replaced by a significantly asymptotically faster Turing machine, and no general procedure for finding one if it exists. Here, "significantly asymptotically faster" should be about the same as the bound in the time hierarchy theorem.
I'm looking forward to the day when the SNES emulator on my computer works by emulating the elementary particles in an actual, physical box with Nintendo stamped on the side.

"With math, all things are possible." —Rebecca Watson

skeptical scientist
closed-minded spiritualist

Posts: 5920
Joined: Tue Nov 28, 2006 6:09 am UTC

### Re: Macro Machines

What you do is take the macro machine and construct a macro machine for that macro machine, and so on, until you have enough speedup to be able to calculate what happens. You also compute the transitions lazily, so that each transition is only calculated when it is needed. This way you end up with a massive speedup, so you can actually simulate the Turing machine in feasible time.
I have discovered a truly marvelous proof of this, which this margin is too narrow to contain.
tomtom2357

Posts: 433
Joined: Tue Jul 27, 2010 8:48 am UTC

### Re: Macro Machines

tomtom2357 wrote:What you do is take the macro machine and construct a macro machine for that macro machine, and so on, until you have enough speedup to be able to calculate what happens. You also compute the transitions lazily, so that each transition is only calculated when it is needed. This way you end up with a massive speedup, so you can actually simulate the Turing machine in feasible time.

Except you can't actually do that, because if you could speed arbitrary computations up into the "feasible" range (whatever you mean by "feasible"), you could violate the time hierarchy theorem.
I'm looking forward to the day when the SNES emulator on my computer works by emulating the elementary particles in an actual, physical box with Nintendo stamped on the side.

"With math, all things are possible." —Rebecca Watson

skeptical scientist
closed-minded spiritualist

Posts: 5920
Joined: Tue Nov 28, 2006 6:09 am UTC

### Re: Macro Machines

*facepalm*
No. No you can't.

At each step, you've reduced the time scale, but increased the size of the state-transition matrix. Maybe if you had (near) infinite memory for the state transition table with constant lookup (impossible, of course), and could represent one of a (near) infinite number of symbols in each cell of the tape, then you could do it.

So how bad is the tradeoff? To get a speedup of a constant factor, you need an exponentially larger matrix. Good luck.
Ben-oni

Posts: 270
Joined: Mon Sep 26, 2011 4:56 am UTC

### Re: Macro Machines

OK then, assuming you're right and it's impossible to violate the time hierarchy theorem, how did anybody simulate the 6-state best contender in the first place? It runs >10^56354 steps, but how could anyone know that without first simulating it? Also, the state transition matrix does not need to be fully evaluated before starting the simulation; you can evaluate each transition as it is needed, so that you have far fewer transitions to store.
I have discovered a truly marvelous proof of this, which this margin is too narrow to contain.
tomtom2357

Posts: 433
Joined: Tue Jul 27, 2010 8:48 am UTC

### Re: Macro Machines

You can analyze it by hand or with a sufficiently advanced program. While there is no algorithm to solve all problems, you can find ways to analyze a lot of individual problems.
However, finding these ways requires a lot of tricks, which cannot be generalized.
mfb

Posts: 803
Joined: Thu Jan 08, 2009 7:48 pm UTC

### Re: Macro Machines

mfb wrote:You can analyze it by hand or with a sufficiently advanced program. While there is no algorithm to solve all problems, you can find ways to analyze a lot of individual problems.
However, finding these ways requires a lot of tricks, which cannot be generalized.

You are right, of course. No trick to prove halting (or non-halting) can be generalized, or it would solve the halting problem. However, my trick does not claim to solve the halting problem for all machines, it simply speeds up the execution, and proves non-halting for a limited set of machines. My original question has been answered too, so thanks.

I have another question: I was looking up Turing machines and came across a few articles that seemed to suggest that there is no way to find BB(19) or higher busy beaver numbers, because the value is undecidable for some reason. Is this true? And if it is, in what axiom system are they asserting it is undecidable? ZFC?
I have discovered a truly marvelous proof of this, which this margin is too narrow to contain.
tomtom2357

Posts: 433
Joined: Tue Jul 27, 2010 8:48 am UTC

### Re: Macro Machines

tomtom2357 wrote:I have another question: I was looking up Turing machines and came across a few articles that seemed to suggest that there is no way to find BB(19) or higher busy beaver numbers, because the value is undecidable for some reason. Is this true? And if it is, in what axiom system are they asserting it is undecidable? ZFC?

Yes. In any axiom system for which it is decidable whether a given statement is an axiom, the set of theorems will be computably enumerable. Then it is possible to write a program P (using the recursion theorem) that checks whether any theorem of the form "the program P halts in at most n steps or else never halts" is ever enumerated, and if it is, runs another n steps and then halts. If the theory is sound, therefore, no such theorem will ever be enumerated. Since one can (easily) prove "BB(k) ≤ n ⇒ every Turing machine with at most k states halts in at most n steps or else runs forever," this implies that no value of BB(k) can be proven in the given axiom system, assuming it is sound, when k is more than the number of states needed to implement P as a Turing machine.

So the statement is true for EVERY axiom system, so long as the set of axioms is computable (or even just computably enumerable), and sound. In particular, it applies to ZFC and PA (Peano arithmetic). However, the least value k for which no value for BB(k) can be proven will depend on the axiom system. I'm not sure where the bound 19 came from, or what axiom system it's supposed to apply to.
I'm looking forward to the day when the SNES emulator on my computer works by emulating the elementary particles in an actual, physical box with Nintendo stamped on the side.

"With math, all things are possible." —Rebecca Watson

skeptical scientist
closed-minded spiritualist

Posts: 5920
Joined: Tue Nov 28, 2006 6:09 am UTC

### Re: Macro Machines

While macro machines can provide a small linear speedup in some cases (and more if iterated), it seems that the best speedups for simulating the current winning 5 and 6 state Turing machines come from parametrizable pattern matching, where you can prove that repeating patterns of certain forms but possibly of arbitrary length will always transition in a certain way.

See for example this simulation:
http://www.drb.insel.de/~heiner/BB/simmbL6_3-a.html

The simulation is quite obviously achieving an exponential speedup over plain simulation by using a simple type of pattern matching along with run-length encoding of the tape. The log of the number of steps taken is approximately linear in how far you go down the page, so it's pretty easy to imagine how this TM could have been run for 10^50000 steps had it not halted earlier. Turing machines with more states might produce complex enough behavior to both run for a long time and not be readily amenable to pattern matching quite this simple, but it seems that most of the 5- and 6-state Turing machines that run for long times are also easily sped up in this way.
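A toy version of the run-length idea can be sketched as follows. This is my own minimal sketch, not how the linked simulator actually works: the tape is kept as run-length pairs, and when a transition writes a symbol and moves right without changing state, the head sweeps the entire run in one "chain step."

```python
def push_run(stack, sym, k):
    """Append k copies of sym to a run-length stack (head end is stack[-1])."""
    if stack and stack[-1][0] == sym:
        stack[-1] = (sym, stack[-1][1] + k)
    else:
        stack.append((sym, k))

def pop_one(stack):
    """Remove one symbol from the head end; the tape is blank (0) beyond."""
    if not stack:
        return 0
    sym, k = stack[-1]
    if k == 1:
        stack.pop()
    else:
        stack[-1] = (sym, k - 1)
    return sym

def simulate_rle(delta, start='A', max_macro=10_000):
    """Simulate a TM on a blank two-way tape stored as run-length stacks.
    The head sits on the first symbol of right[-1]. Chain step: when
    (q, s) -> (w, +1, q), the head sweeps the whole run of s at once,
    counting run-length many base steps. Missing delta entries halt.
    Returns (result, base_steps)."""
    left, right = [], []
    q, steps = start, 0
    for _ in range(max_macro):
        s = right[-1][0] if right else 0
        if (q, s) not in delta:
            return ('halt', steps)
        w, move, q2 = delta[(q, s)]
        if move == 1 and q2 == q and right:
            _, k = right.pop()          # chain step: k base steps at once
            push_run(left, w, k)
            steps += k
            continue
        pop_one(right)                  # ordinary single step
        steps += 1
        if move == 1:
            push_run(left, w, 1)
        else:
            push_run(right, w, 1)               # written cell, now ahead
            push_run(right, pop_one(left), 1)   # new head cell
        q = q2
    return ('timeout', steps)
```

Real simulators go further: they also chain over repeated multi-symbol words and prove parametrized rules about runs of arbitrary length, which is where the exponential speedup comes from.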
lightvector

Posts: 184
Joined: Tue Jun 17, 2008 11:04 pm UTC

### Re: Macro Machines

tomtom2357 wrote:assuming you're right, and its impossible to violate the time hierarchy theorem
I think you mean, "Assuming mathematics is consistent, and this mathematical theorem is actually true."
In the future, there will be a global network of billions of adding machines.... One of the primary uses of this network will be to transport moving pictures of lesbian sex by pretending they are made out of numbers.

gmalivuk
Archduke Vendredi of Skellington the Third, Esquire

Posts: 19450
Joined: Wed Feb 28, 2007 6:02 pm UTC
Location: Here, There, Everywhere (near Boston, anyway)

### Re: Macro Machines

gmalivuk wrote:
tomtom2357 wrote:assuming you're right, and its impossible to violate the time hierarchy theorem
I think you mean, "Assuming mathematics is consistent, and this mathematical theorem is actually true."

There are undoubtedly many false published theorems. Of course, chances are much higher that someone screwed something up than that any of them points to an inconsistency in mathematics.

...of course, I doubt that the time hierarchy theorem is one of them. Most are probably in papers that have only been read by a handful of people.
I'm looking forward to the day when the SNES emulator on my computer works by emulating the elementary particles in an actual, physical box with Nintendo stamped on the side.

"With math, all things are possible." —Rebecca Watson

skeptical scientist
closed-minded spiritualist

Posts: 5920
Joined: Tue Nov 28, 2006 6:09 am UTC

### Re: Macro Machines

skeptical scientist wrote: Most are probably in papers that have only been read by a handful of people.
That describes every paper I've ever written.
addams wrote:This forum has some very well educated people typing away in loops with Sourmilk. He is a lucky Sourmilk.
mike-l

Posts: 2535
Joined: Tue Sep 04, 2007 2:16 am UTC

### Re: Macro Machines

mike-l wrote:
skeptical scientist wrote: Most are probably in papers that have only been read by a handful of people.
That describes every paper I've ever written.

I'm in the same boat. Then again, my area is a pretty small one.
I'm looking forward to the day when the SNES emulator on my computer works by emulating the elementary particles in an actual, physical box with Nintendo stamped on the side.

"With math, all things are possible." —Rebecca Watson

skeptical scientist
closed-minded spiritualist

Posts: 5920
Joined: Tue Nov 28, 2006 6:09 am UTC

### Re: Macro Machines

skeptical scientist wrote:
mike-l wrote:
skeptical scientist wrote: Most are probably in papers that have only been read by a handful of people.
That describes every paper I've ever written.

I'm in the same boat. Then again, my area is a pretty small area.

Pretty much every paper really only gets read by a handful of people. The most widely read papers are either groundbreaking or foundational. Everything else builds on those, so you're either adding a bit of value to a growing body of literature, or you're adding value in a very niche field built off some other groundbreaking or foundational work.
gorcee

Posts: 1501
Joined: Sun Jul 13, 2008 3:14 am UTC
Location: Charlottesville, VA