Mind Blowing Algorithms


Mind Blowing Algorithms

Postby snow5379 » Mon Jan 09, 2012 3:19 am UTC

Have you ever seen an algorithm that just blew your mind and made you wonder how it was ever thought of? For example, I find the following one amazing:

Code: Select all
static int bitCount(int a){int b=0;while(a>0){a&=a-1;b++;}return b;}

Re: Mind Blowing Algorithms

Postby mfb » Mon Jan 09, 2012 10:56 am UTC


Re: Mind Blowing Algorithms

Postby troyp » Mon Jan 09, 2012 8:55 pm UTC

snow5379 wrote:Have you ever seen an algorithm that just blew your mind and made you wonder how it was ever thought of? For example, I find the following one amazing:

Code: Select all
static int bitCount(int a){int b=0;while(a>0){a&=a-1;b++;}return b;}


I'm sorry, I have to ask: can you think of a *more* obvious and straightforward way to count the number of bits in an int than the one you quoted? Because personally, I would find *that* amazing.

Re: Mind Blowing Algorithms

Postby scarecrovv » Mon Jan 09, 2012 9:13 pm UTC

troyp wrote:
snow5379 wrote:Have you ever seen an algorithm that just blew your mind and made you wonder how it was ever thought of? For example, I find the following one amazing:

Code: Select all
static int bitCount(int a){int b=0;while(a>0){a&=a-1;b++;}return b;}


I'm sorry, I have to ask: can you think of a *more* obvious and straightforward way to count the number of bits in an int than the one you quoted? Because personally, I would find *that* amazing.

Well the most obvious and straightforward way is:

Code: Select all
b = 0;
while (a > 0) {
  if (a & 1) {
    b++;
  }
  a = a >> 1;
}


You've got to admit that the a&=a-1 optimization is clever.
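
To see why it works: subtracting 1 from a flips the lowest set bit and all the zeros below it, so ANDing with a clears exactly that one bit, and the loop runs once per set bit instead of once per bit position. A throwaway illustration (the starting value is arbitrary):

Code: Select all
#include <stdio.h>

/* Illustration only: watch a &= a - 1 peel off one set bit per iteration. */
int main(void) {
    unsigned a = 0xB4;   /* binary 1011 0100, four bits set */
    int b = 0;
    while (a > 0) {
        printf("a = 0x%02X\n", a);   /* prints 0xB4, 0xB0, 0xA0, 0x80 */
        a &= a - 1;                  /* a-1 flips the lowest set bit and the zeros below it */
        b++;
    }
    printf("bit count = %d\n", b);   /* prints 4 */
    return 0;
}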

Re: Mind Blowing Algorithms

Postby shawnhcorey » Mon Jan 09, 2012 9:14 pm UTC

troyp wrote:I'm sorry, I have to ask: can you think of a *more* obvious and straightforward way to count the number of bits in an int than the one you quoted? Because personally, I would find *that* amazing.


Code: Select all
while( a > 0 ){

    /* add one if a is odd, add zero otherwise */
    b += ( a & 1 );

    /* divide a by 2 */
    a >>= 1;

}

Re: Mind Blowing Algorithms

Postby troyp » Mon Jan 09, 2012 9:29 pm UTC

scarecrovv wrote:Well the most obvious and straightforward way is...
You've got to admit that the a&=a-1 optimization is clever.


Yeah, you're right: the most obvious way would use shifts. I guess I was thinking of the a&=a-1 as just an incremental optimization of the same basic algorithm, but it is notable. I couldn't call it a "mind-blowing algorithm" with a straight face, though. Maybe "interesting bit hack".

On a general note, I like this topic and I have come across algorithms which really impressed me with their creativity, but I can't recall any at the moment. Except for maybe half of Oleg Kiselyov's website (or half of the fraction I can understand).

Re: Mind Blowing Algorithms

Postby Proginoskes » Tue Jan 10, 2012 7:36 am UTC

shawnhcorey wrote:
troyp wrote:I'm sorry, I have to ask: can you think of a *more* obvious and straightforward way to count the number of bits in an int than the one you quoted? Because personally, I would find *that* amazing.


Code: Select all
while( a > 0 ){

    /* add one if a is odd, add zero otherwise */
    b += ( a & 1 );

    /* divide a by 2 */
    a >>= 1;

}


You forgot to initialize. Also, a for loop makes it one statement long:

Code: Select all
for (b = 0; a; a >>= 1) b += (a & 1);

Re: Mind Blowing Algorithms

Postby shawnhcorey » Tue Jan 10, 2012 1:31 pm UTC

Proginoskes wrote:
shawnhcorey wrote:
troyp wrote:I'm sorry, I have to ask: can you think of a *more* obvious and straightforward way to count the number of bits in an int than the one you quoted? Because personally, I would find *that* amazing.


Code: Select all
while( a > 0 ){

    /* add one if a is odd, add zero otherwise */
    b += ( a & 1 );

    /* divide a by 2 */
    a >>= 1;

}


You forgot to initialize. Also, a for loop makes it one statement long:

Code: Select all
for (b = 0; a; a >>= 1) b += (a & 1);


You forgot the comments.

Re: Mind Blowing Algorithms

Postby letterX » Tue Jan 10, 2012 11:13 pm UTC

Why are your comments helpful? Especially the one that says "divide a by 2". We actually don't care that we're dividing a by 2 -- what makes the algorithm useful is that we're shifting all the bits right by 1, so that the algorithm can consume the bits of a one-by-one. Really, the one-liner is fine if you wrap it up in a function or (god forbid) a macro called CountBitsSet or somesuch. A one-line function with a self-evident name should really be all that's necessary to know what the function does.
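
Something like this, say (purely illustrative; the name and the unsigned parameter are just my choices):

Code: Select all
/* Wrap the obvious loop behind a self-evident name; callers never
 * need to look inside. */
static unsigned CountBitsSet(unsigned a) {
    unsigned b;
    for (b = 0; a; a >>= 1)
        b += a & 1;
    return b;
}

A call site like flags_set = CountBitsSet(flags); then documents itself.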

Re: Mind Blowing Algorithms

Postby shawnhcorey » Wed Jan 11, 2012 1:46 am UTC

letterX wrote:Why are your comments helpful?


Documentation is like sex: when it's good, it's very, very good; and when it's bad, it's still better than nothing.
-- Dick Brandon

Re: Mind Blowing Algorithms

Postby EvanED » Wed Jan 11, 2012 1:56 am UTC

shawnhcorey wrote:
letterX wrote:Why are your comments helpful?


Documentation is like sex: when it's good, it's very, very good; and when it's bad, it's still better than nothing.
-- Dick Brandon

When documentation is bad, it is at best distracting and at worst misleading, as well as one more thing to maintain.

No documentation is (almost) always better than incorrect documentation, and a lot of the time no documentation is better than bad documentation.

Re: Mind Blowing Algorithms

Postby Proginoskes » Wed Jan 11, 2012 6:52 am UTC

shawnhcorey wrote:You forgot the comments.


Real Men don't use comments. If it was hard to write, it should be hard to figure out.

Getting back to the original topic ... There is a polynomial-time (cubic, even) algorithm which checks to see whether a graph G can be embedded in 3-dimensional space with every cycle being (homeomorphic to) the unknot. (This is due to a series of papers about graph minors published by Robertson and Seymour; you just check to see whether G has a "minor" isomorphic to a graph in a certain finite set S.) However, no one has been able to exhibit such an algorithm, not even a double-exponential one!

This has to be a mind-blowing result, if anything.

Re: Mind Blowing Algorithms

Postby WarDaft » Wed Jan 11, 2012 3:51 pm UTC

Proginoskes wrote:Getting back to the original topic ... There is a polynomial-time (cubic, even) algorithm which checks to see whether a graph G can be embedded in 3-dimensional space with every cycle being (homeomorphic to) the unknot. (This is due to a series of papers about graph minors published by Robertson and Seymour; you just check to see whether G has a "minor" isomorphic to a graph in a certain finite set S.) However, no one has been able to exhibit such an algorithm, not even a double-exponential one!

This has to be a mind-blowing result, if anything.


See, this is probably the sort of thing we'd get even if we did find P=NP.

Re: Mind Blowing Algorithms

Postby Proginoskes » Thu Jan 12, 2012 9:17 am UTC

WarDaft wrote:
Proginoskes wrote:Getting back to the original topic ... There is a polynomial-time (cubic, even) algorithm which checks to see whether a graph G can be embedded in 3-dimensional space with every cycle being (homeomorphic to) the unknot. (This is due to a series of papers about graph minors published by Robertson and Seymour; you just check to see whether G has a "minor" isomorphic to a graph in a certain finite set S.) However, no one has been able to exhibit such an algorithm, not even a double-exponential one!

This has to be a mind-blowing result, if anything.


See, this is probably the sort of thing we'd get even if we did find P=NP.


You don't need P=NP for this algorithm; all you need is the set S. Unfortunately, no one even knows how big S is.

(If you're testing whether a graph is planar, you can use a similar algorithm; |S| = 2, by the minor version of Kuratowski's Theorem (Wagner's Theorem). If you're checking for projective-plane embedding, |S| = 35 (Archdeacon's Theorem). If you're looking at whether a graph embeds on the torus, |S| is known to be at least a few thousand.)

Re: Mind Blowing Algorithms

Postby WarDaft » Fri Jan 13, 2012 12:00 am UTC

No, I mean... if we somehow did find that P=NP, we'd probably end up with a non-constructive proof and no clue how to actually take advantage of it, while simultaneously becoming terrified that all of the algorithms our current security is based on are now actually proven insecure.

Re: Mind Blowing Algorithms

Postby phlip » Fri Jan 13, 2012 1:28 am UTC

Well, there exists a meta-algorithm that can solve any NP problem, which has the property that if P=NP then it will run in polynomial time.

It just has an absolutely ridiculous constant term and is in no way usable in the real world (essentially, if the shortest program that could solve the problem in polynomial time is n bits long, then the meta-algorithm will solve the problem in polynomial time with a constant term on the order of 4^n).

Re: Mind Blowing Algorithms

Postby Proginoskes » Fri Jan 13, 2012 5:57 am UTC

phlip wrote:Well, there exists a meta-algorithm that can solve any NP problem, which has the property that if P=NP then it will run in polynomial time.

It just has an absolutely ridiculous constant term and is in no way usable in the real world (essentially, if the shortest program that could solve the problem in polynomial time is n bits long, then the meta-algorithm will solve the problem in polynomial time with a constant term on the order of 4^n).


Source?

Re: Mind Blowing Algorithms

Postby phlip » Fri Jan 13, 2012 6:19 am UTC

Wikipedia covers the basics, but it's pretty simple. First you need some sort of enumeration of all the possible Turing machines, so that you can say "run Turing machine number 17 for 30 steps" and see what you get. Then you need F(inp, ans), which returns true if ans is the correct answer for the input inp of a particular NP puzzle (by definition, there is code to calculate F() in polynomial time... that's what NP means). With those two pieces, you can do this:
Code: Select all
for n in [1..]:
  for i in [1..n]:
    Simulate Turing machine i, with input inp, for n steps
    ans := the result of the simulated Turing machine
    if F(inp, ans) returns true:
      return ans
Now, if Turing machine i solves the problem, and it takes n steps, then this will take on the order of max(i^2, n^2) loops to find, and each loop will run the F() function, which is polynomial-time.

So the end result is that, if Turing machine i solves the problem in O(f(n)) time, and our checker-function can check a solution in O(g(n)) time, then this meta-algorithm will accept the problem in O(max(i^2, f(n)^2) * g(n)) time. All the terms there are polynomial in terms of n (i^2 is a constant for a given problem in terms of n, g(n) is polynomial by definition, f(n) is polynomial if P=NP). So the full expansion is also a polynomial (it's one polynomial, squared, times another polynomial). But i^2 is the constant term, and it's very big (if the program found is k bits long, then i = 2^k, so the constant term is i^2 = 4^k).

Now, if P != NP, then this meta-algorithm will still work the same... but it won't necessarily find a polynomial-time solution to simulate (indeed, for an NP-complete problem, it won't find one), so the runtime of the meta-algorithm will blow out and won't be polynomial either. However, the meta-algorithm will always work if there exists a machine that will solve the problem, and it's been proven that every NP problem can be solved in exponential time with a brute-force search. So if all else fails, the meta-algorithm will find a Turing machine that does that brute-force search, which will in turn eventually find the solution.

Re: Mind Blowing Algorithms

Postby letterX » Fri Jan 13, 2012 7:03 pm UTC

It's also worth pointing out that on inputs that should return "no", the algorithm runs forever (note that there isn't any mechanism for the program even outputting "no"). So it's only a polynomial-time recognizer for an NP-complete language, not a full algorithm. I don't actually know of an algorithm which is guaranteed to halt in p(n) steps for some polynomial, assuming P=NP, but it's possible that there is one that I don't know about.

Re: Mind Blowing Algorithms

Postby WarDaft » Fri Jan 13, 2012 11:38 pm UTC

phlip wrote:Wikipedia covers the basics, but it's pretty simple. First you need some sort of enumeration of all the possible Turing machines, so that you can say "run Turing machine number 17 for 30 steps" and see what you get. Then you need F(inp, ans), which returns true if ans is the correct answer for the input inp of a particular NP puzzle (by definition, there is code to calculate F() in polynomial time... that's what NP means). With those two pieces, you can do this:
Code: Select all
for n in [1..]:
  for i in [1..n]:
    Simulate Turing machine i, with input inp, for n steps
    ans := the result of the simulated Turing machine
    if F(inp, ans) returns true:
      return ans
Now, if Turing machine i solves the problem, and it takes n steps, then this will take on the order of max(i^2, n^2) loops to find, and each loop will run the F() function, which is polynomial-time.

So the end result is that, if Turing machine i solves the problem in O(f(n)) time, and our checker-function can check a solution in O(g(n)) time, then this meta-algorithm will accept the problem in O(max(i^2, f(n)^2) * g(n)) time. All the terms there are polynomial in terms of n (i^2 is a constant for a given problem in terms of n, g(n) is polynomial by definition, f(n) is polynomial if P=NP). So the full expansion is also a polynomial (it's one polynomial, squared, times another polynomial). But i^2 is the constant term, and it's very big (if the program found is k bits long, then i = 2^k, so the constant term is i^2 = 4^k).

Now, if P != NP, then this meta-algorithm will still work the same... but it won't necessarily find a polynomial-time solution to simulate (indeed, for an NP-complete problem, it won't find one), so the runtime of the meta-algorithm will blow out and won't be polynomial either. However, the meta-algorithm will always work if there exists a machine that will solve the problem, and it's been proven that every NP problem can be solved in exponential time with a brute-force search. So if all else fails, the meta-algorithm will find a Turing machine that does that brute-force search, which will in turn eventually find the solution.


Actually, it doesn't even need to find an algorithm that brute-forces it... it is an algorithm that brute-forces it. It will quickly find machines that output specific outputs for the given input, possibly faster than it finds machines that brute-force the problem themselves, and certainly faster than those machines would complete the process.

Re: Mind Blowing Algorithms

Postby lightvector » Sat Jan 14, 2012 1:39 am UTC

letterX wrote:It's also worth pointing out that on inputs that should return "no", the algorithm runs forever (note that there isn't any mechanism for the program even outputting "no"). So it's only a polynomial-time recognizer for an NP-complete language, not a full algorithm. I don't actually know of an algorithm which is guaranteed to halt in p(n) steps for some polynomial, assuming P=NP, but it's possible that there is one that I don't know about.


I also don't know of one if we require the program to remain explicit. I'd be interested to know if there is a way. But otherwise we can still get pretty close.

One simple way is to augment the meta-program that phlip described with a polynomial time bound, where if it fails to output an answer within that bound, it halts and declares "no". Then if P = NP, for any NP problem, there is *some* time bound where it will work and correctly output "no" for the "no" instances of the problem. This loses a little bit of the explicitness, though, since we don't know the time bound.

Another more fun way is to augment the meta-program with a ZFC-set-theory proof checker. It says "yes" if any of the Turing machines output a proof in ZFC that the answer to that instance of the problem is "yes" and it outputs "no" if any of the Turing machines output a proof in ZFC that the answer to that instance is "no". Then, if there exists any polynomial time algorithm for the problem that can be proven correct, the meta-algorithm will solve the problem in polynomial time because it will eventually find the Turing machine that runs that algorithm, outputs the proof of its correctness, and outputs a record of the computation it did to show that it indeed ran that algorithm.

This meta-program is still perfectly explicit, and it also gets all the "no" answers in polynomial time if P = NP. Except it might not work in the unfortunate case where there are polynomial-time algorithms for a problem, but none of these algorithms can be proven to be correct!

Re: Mind Blowing Algorithms

Postby Thesh » Sat Jan 14, 2012 5:48 am UTC

For counting 1 bits (Hamming weight), see:

http://graphics.stanford.edu/~seander/b ... etParallel

Code: Select all
v = v - ((v >> 1) & 0x55555555);                    // reuse input as temporary
v = (v & 0x33333333) + ((v >> 2) & 0x33333333);     // temp
c = ((v + (v >> 4) & 0xF0F0F0F) * 0x1010101) >> 24; // count

Re: Mind Blowing Algorithms

Postby Sc4Freak » Sun Jan 15, 2012 7:31 am UTC

Something simple is C++'s next_permutation function. Without looking at the source I have no idea how it works. It's like black magic.

Re: Mind Blowing Algorithms

Postby phlip » Sun Jan 15, 2012 8:44 am UTC

Sc4Freak wrote:Without looking at the source I have no idea how it works.

I'm not sure exactly how that particular function works, but I know how I'd write it... Thinking about it, you need to find the longest sublist at the end of the list which is sorted in decreasing order - obviously you can't create the "next" permutation just by reordering those elements, they're already in the maximal order. So the element before that sublist is the one you need to change. Call that previous element a, and the sublist b_1...b_n. By definition, a < b_1, and b_i > b_{i+1} for all i.

Now, find the "next" element in the b's that's bigger than a... that is, find the largest i such that b_i > a. Then you just need to reorder these trailing elements so that b_i is first, and the rest are sorted into ascending order. Which means you need to have them in this order: b_i, b_n...b_{i+1}, a, b_{i-1}...b_1. (NB: either of those two ranges may be empty if i is 1 or n.) Or, in more direct terms: you swap a with b_i, and then reverse the list of b's (which takes floor(n/2) swaps).

The only time you can't do this is if the entire list is in descending order, in which case you have the last permutation... so you just reverse the entire list to get the first permutation again.

I'm not sure what you do if there are elements that compare as equal... this algorithm might still work, or it might not, I'm not certain. It definitely won't come up with strictly every permutation, but I think it might come up with every permutation if you consider two permutations where only two equal elements have been swapped to be "the same" permutation.

[edit]
Yep, works fine with repeated elements too. Implemented in Python, so this is using list-reversals and such instead of explicit swapping, but still:
Code: Select all
#!/usr/bin/python
def next_permutation(lst):
   tail_size = 1
   # Grow the tail while it is still in non-increasing order.
   while lst[-tail_size-1] >= lst[-tail_size]:
      tail_size += 1
      if tail_size >= len(lst):
         # Whole list is descending: wrap around to the first permutation.
         return lst[::-1]
   # The element just before the tail; it's the one that has to change.
   a = lst[-tail_size-1]
   # Scan the tail from the right for the smallest element greater than a.
   for i in xrange(1,tail_size+1):
      if lst[-i] > a:
         break
   return lst[:-tail_size-1] + [lst[-i]] + lst[len(lst)-i+1:][::-1] + [lst[-tail_size-1]] + lst[-tail_size:-i][::-1]

def tst(a, n):
   print a
   for i in xrange(n):
      a = next_permutation(a)
      print a

tst([1,2,3,4], 24) # simple permutations
print "====="
tst([1,2,2,3], 12) # permutations with repeated elements
print "====="
class A:
   def __init__(self,a,b):
      self.a = a
      self.b = b
   def __str__(self):
      return "A(%s,%s)" % (self.a, self.b)
   __repr__ = __str__
   def __cmp__(self, other):
      return self.a.__cmp__(other.a)
tst([A(1,1), A(2,1), A(2,2), A(3,1)], 12) # permutations with repeated but distinct elements for tracking

Re: Mind Blowing Algorithms

Postby Jplus » Sun Jan 15, 2012 4:10 pm UTC

Thesh wrote:For counting 1 bits (hamming weight), see:

http://graphics.stanford.edu/~seander/b ... etParallel

Code: Select all
v = v - ((v >> 1) & 0x55555555);                    // reuse input as temporary
v = (v & 0x33333333) + ((v >> 2) & 0x33333333);     // temp
c = ((v + (v >> 4) & 0xF0F0F0F) * 0x1010101) >> 24; // count

If that algorithm actually worked it would be sort of mind-blowing, but that doesn't seem to be the case. As an example, try it with 0x13d1; you'll get a result that is way off.
I had a look at the page that you linked to, and I found an "unabbreviated" version of the quoted algorithm just above it:
Code: Select all
unsigned int v; // count bits set in this (32-bit value)
unsigned int c; // store the total here
static const int S[] = {1, 2, 4, 8, 16}; // Magic Binary Numbers
static const int B[] = {0x55555555, 0x33333333, 0x0F0F0F0F, 0x00FF00FF, 0x0000FFFF};

c = v - ((v >> 1) & B[0]);
c = ((c >> S[1]) & B[1]) + (c & B[1]);
c = ((c >> S[2]) + c) & B[2];
c = ((c >> S[3]) + c) & B[3];
c = ((c >> S[4]) + c) & B[4];

That version still doesn't work, but at least it hints at the proper algorithm:
Code: Select all
c = (v & 0x55555555) + ((v >> 1) & 0x55555555);
c = (c & 0x33333333) + ((c >> 2) & 0x33333333);
c = (c & 0xf0f0f0f) + ((c >> 4) & 0xf0f0f0f);
c = (c & 0xff00ff) + ((c >> 8) & 0xff00ff);
c = (c & 0xffff) + ((c >> 16) & 0xffff);

Which is just "divide and conquer", nothing special IMHO. The multiplication by 0x1010101 (followed by the right-shift by 24 bits) in the algorithm that was quoted first is an optimization to replace the last two lines of the proper algorithm, so the following is equivalent:
Code: Select all
c = (v & 0x55555555) + ((v >> 1) & 0x55555555);
c = (c & 0x33333333) + ((c >> 2) & 0x33333333);
c = (c & 0xf0f0f0f) + ((c >> 4) & 0xf0f0f0f);
c = (c * 0x1010101) >> 24;

However, I can't explain what the subtraction in the first two algorithms was supposed to do. It's not equivalent to the first line of the proper algorithm, so given that the rest is more or less the same, I don't see how it could ever guarantee to yield the right result.
Edit: I just realized the subtraction in the failing quotes is equivalent to the first line of the proper algorithm (reason about what happens to two consecutive bits in either approach to see what I mean). So the only reason for the first two algorithms to fail is that they skipped some bitmasks. Here's the final algorithm, almost the same as the first quoted algorithm:
Code: Select all
c = v - ((v >> 1) & 0x55555555);
c = (c & 0x33333333) + ((c >> 2) & 0x33333333);
c = (c & 0xf0f0f0f) + ((c >> 4) & 0xf0f0f0f);
c = (c * 0x1010101) >> 24;
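
If anyone wants to convince themselves, here's a throwaway cross-check of that final version against the naive loop (my own harness, nothing from the bithacks page; it assumes 32-bit values):

Code: Select all
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* Hypothetical test harness: compare the divide-and-conquer version
 * above with the naive shift-and-mask loop on a few sample values. */
static unsigned popcount_hack(uint32_t v) {
    uint32_t c;
    c = v - ((v >> 1) & 0x55555555);
    c = (c & 0x33333333) + ((c >> 2) & 0x33333333);
    c = (c & 0x0f0f0f0f) + ((c >> 4) & 0x0f0f0f0f);
    return (c * 0x01010101) >> 24;
}

static unsigned popcount_naive(uint32_t v) {
    unsigned b = 0;
    for (; v; v >>= 1)
        b += v & 1;
    return b;
}

int main(void) {
    uint32_t samples[] = { 0x0, 0x1, 0x13d1, 0xdeadbeef, 0xffffffff };
    for (unsigned i = 0; i < sizeof samples / sizeof samples[0]; i++)
        assert(popcount_hack(samples[i]) == popcount_naive(samples[i]));
    puts("all samples agree");
    return 0;
}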

Re: Mind Blowing Algorithms

Postby Proginoskes » Mon Jan 16, 2012 4:27 am UTC

I saw the following code online for the number of 1s. Haven't tested it.

Code: Select all
/* count number of 1's in 9-bit argument (Schroeppel) */
unsigned count_ones(unsigned36 a) {
  return ((a * 01001001001)     /* 4 adjacent copies */
             & 042104210421)    /* every 4th bit */
             % 15;              /* casting out 15.'s in hexadecimal */
}
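
For anyone who wants to test it: the multiply lays down four copies of a at bit offsets 0, 9, 18 and 27, the octal mask keeps every 4th bit (which picks up each of the nine input bits exactly once, at a power-of-16 position), and since 16 = 1 (mod 15) the final % 15 adds them up. Here's a quick exhaustive check, ported to a 64-bit type since the original assumes a 36-bit word (count_ones9 is a made-up name):

Code: Select all
#include <assert.h>
#include <stdint.h>
#include <stdio.h>

/* My own port of the Schroeppel trick to a 64-bit type; checked
 * exhaustively against a naive loop below. */
static unsigned count_ones9(uint64_t a) {
    return ((a * 01001001001ULL)     /* 4 adjacent copies at offsets 0, 9, 18, 27 */
              & 042104210421ULL)     /* every 4th bit: each input bit exactly once */
              % 15;                  /* 16 == 1 (mod 15), so this sums the bits */
}

int main(void) {
    for (unsigned a = 0; a < 512; a++) {       /* all 9-bit values */
        unsigned naive = 0;
        for (unsigned v = a; v; v >>= 1)
            naive += v & 1;
        assert(count_ones9(a) == naive);
    }
    puts("all 512 nine-bit values agree");
    return 0;
}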

Re: Mind Blowing Algorithms

Postby WarDaft » Mon Jan 16, 2012 5:00 am UTC

Actually...

phlip wrote:Well, there exists a meta-algorithm that can solve any NP problem, which has the property that if P=NP then it will run in polynomial time.

It just has an absolutely ridiculous constant term and is in no way usable in the real world (essentially, if the shortest program that could solve the problem in polynomial time is n bits long, then the meta-algorithm will solve the problem in polynomial time with a constant term on the order of 4n).

... while not necessarily practical, I wonder whether, given a dense enough and expressive enough syntax, this could in fact be tractable in the real world for some problems with enough hardware thrown at it. Suppose we had tiny cores designed to run reduced BF (+, >, <, [, ]) or BLC as a hardware language (BF is easier to hard-code, BLC more expressive, I believe). Millions to billions of them on a single board. There may in fact be some interesting problems solved by programs that we could afford to enumerate at this scale.

Also, we can reduce it to 2^n in the number of bits by choosing a desirable upper bound for the running time of the discovered polynomial-time program. Suppose we decide we want it to halt in 2^32 steps or less for small inputs (we might even be able to be more restrictive; every bit we take off here lets us add a bit to the size of the programs we are searching). Assuming the mini-CPUs can execute at least 2^32 steps per second (they could probably do more, the whole chip will be microscopic so no instruction will have to travel very far...) and we have 100 boards with a billion mini-CPUs each, then we can search through all 64-bit programs in a year for solutions as fast as we requested. We can then take all the solutions that correctly answered the problem and check them against other inputs to see if they are general solutions.

Probably a waste of money, but possibly not!

Because really, we don't know what problems many small programs actually solve. Some might be what we're looking for.
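
Just to make the enumeration half of that concrete, here's roughly what one of those cores would have to do, sketched in C rather than silicon. This is entirely my own sketch: the base-5 encoding of programs, the tape size and the step limit are arbitrary choices, and since the reduced set has no output instruction I just treat whatever ends up in cell 0 as the result.

Code: Select all
#include <stdint.h>
#include <stdio.h>

#define TAPE     256
#define MAX_PROG 32

static const char OPS[5] = { '+', '>', '<', '[', ']' };   /* the reduced BF set */

/* Decode an integer into a reduced-BF program (base-5 digits) and run it
 * for at most max_steps steps. Returns the value left in cell 0, or -1 if
 * the program is malformed or exceeds its step/tape budget. */
static int run_encoded(uint64_t code, long max_steps) {
    char prog[MAX_PROG];
    int len = 0;
    while (code && len < MAX_PROG) {
        prog[len++] = OPS[code % 5];
        code /= 5;
    }
    int match[MAX_PROG], stack[MAX_PROG], sp = 0;
    for (int i = 0; i < len; i++) {            /* pre-match the brackets */
        if (prog[i] == '[') stack[sp++] = i;
        else if (prog[i] == ']') {
            if (sp == 0) return -1;
            match[i] = stack[--sp];
            match[stack[sp]] = i;
        }
    }
    if (sp != 0) return -1;
    unsigned char tape[TAPE] = {0};
    int pc = 0, ptr = 0;
    for (long steps = 0; pc < len; steps++, pc++) {
        if (steps >= max_steps) return -1;
        switch (prog[pc]) {
            case '+': tape[ptr]++; break;
            case '>': if (++ptr >= TAPE) return -1; break;
            case '<': if (--ptr < 0) return -1; break;
            case '[': if (tape[ptr] == 0) pc = match[pc]; break;
            case ']': if (tape[ptr] != 0) pc = match[pc]; break;
        }
    }
    return tape[0];
}

int main(void) {
    /* Enumerate the first few thousand "programs" and report the ones that
     * leave something nontrivial in cell 0 within the step budget. */
    for (uint64_t code = 0; code < 10000; code++) {
        int r = run_encoded(code, 1L << 16);
        if (r > 1)
            printf("program %llu leaves %d in cell 0\n",
                   (unsigned long long)code, r);
    }
    return 0;
}

Checking whichever candidates come out against other inputs, as described above, would then be the other half of the job.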

Re: Mind Blowing Algorithms

Postby mfb » Mon Jan 16, 2012 10:35 pm UTC

I think for all problems which are solvable by 64 bits of brainfuck (~30 symbols), humans can find a reasonable solution, too. That is just a guess; I have no way to check it.
Don't forget another thing: you want to look for a large number of problems. So you need a lot of inputs with known solutions, or extra CPUs, to check each output against each problem.

