Deliberately bad algorithms

A place to discuss the science of computers and programs, from algorithms to computability.

Formal proofs preferred.

Moderators: phlip, Moderators General, Prelates

User avatar
hotaru
Posts: 1025
Joined: Fri Apr 13, 2007 6:54 pm UTC

Re: Deliberately bad algorithms

Postby hotaru » Mon Jun 22, 2015 10:44 pm UTC

Derek wrote:And if you need two posts on a forum, hell if you need even one post, to explain how your single line of code works, your code is unreadable and should be rewritten.

well, that's kind of the point of using "fix". with "fix", you just need to point people who don't get it to an existing explanation of "fix", and then it's pretty obvious what the code does.
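for anyone who hasn't met "fix": fix f = f (fix f), the least fixed point. the same trick can even be forced into Python (my own sketch, not Haskell's implementation; the lambda delays the recursion because Python is strict):

Code: Select all

```python
def fix(f):
    # fix f = f (fix f); the outer lambda delays the recursive call so
    # this works for function-valued fixed points despite strictness
    return lambda *args: f(fix(f))(*args)

# the usual demo: factorial with no explicit self-reference
fact = fix(lambda rec: lambda n: 1 if n == 0 else n * rec(n - 1))
```

fact(5) evaluates to 120, even though the inner lambda never names itself.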

Code: Select all

factorial = product . enumFromTo 1
isPrime n =
  factorial (n - 1) `mod` n == n - 1

Derek
Posts: 2154
Joined: Wed Aug 18, 2010 4:15 am UTC

Re: Deliberately bad algorithms

Postby Derek » Tue Jun 23, 2015 4:30 am UTC

hotaru wrote:
Derek wrote:And if you need two posts on a forum, hell if you need even one post, to explain how your single line of code works, your code is unreadable and should be rewritten.

well, that's kind of the point of using "fix". with "fix", you just need to point people who don't get it to an existing explanation of "fix", and then it's pretty obvious what the code does.

"Fix" is not the source of confusion in that code.

User avatar
hotaru
Posts: 1025
Joined: Fri Apr 13, 2007 6:54 pm UTC

Re: Deliberately bad algorithms

Postby hotaru » Tue Jun 23, 2015 6:33 am UTC

Derek wrote:
hotaru wrote:
Derek wrote:And if you need two posts on a forum, hell if you need even one post, to explain how your single line of code works, your code is unreadable and should be rewritten.

well, that's kind of the point of using "fix". with "fix", you just need to point people who don't get it to an existing explanation of "fix", and then it's pretty obvious what the code does.

"Fix" is not the source of confusion in that code.

then what is? "fmap"? "=<<"?

Code: Select all

factorial = product . enumFromTo 1
isPrime n =
  factorial (n - 1) `mod` n == n - 1

User avatar
phlip
Restorer of Worlds
Posts: 7543
Joined: Sat Sep 23, 2006 3:56 am UTC
Location: Australia
Contact:

Re: Deliberately bad algorithms

Postby phlip » Tue Jun 23, 2015 7:50 am UTC

Well, let me put it this way: when I read it the first time (as a very novice-level Haskell programmer), I had to sit down with a pen and paper for a while and figure out exactly what "fmap . flip id" actually meant... you're passing higher-level functions into other higher-er level functions and it all gets a bit brain-bending for someone who hasn't had a long time of experience with that stuff. Eventually figured out that what you meant was "\x -> fmap ($x)", which is just so much easier to read. Maybe if you'd used "($)" instead of "id", I'd have been able to figure it out sooner, but figuring out what "flip id" meant (how do you flip a one-argument function?) was only one of the small stumbling blocks to decyphering this.

Then, we get to (=<<). Monads are weird at the best of times, for a new user, and the function monad in particular I'd worked with exactly zero times before I looked at this. And even then you're being tricksy with partial application and whatnot.

I eventually gave up at this point and started attacking it from the other side - the type signature of the final thing was [[a] -> a] -> [a] (for the list functor), and given the "fix" in there, I just guessed the rest, that the [a] that was being used as a parameter was the same as the [a] that was the result, and tried it, and it seemed to match my guess. So that's what I posted. I was lucky that I'd already spent the down-payment of time in learning how "fix" works, or I'd have been even more lost.

It was only later that I actually decyphered what (=<<) actually means for the function monad, as something akin to the S combinator (another thing that I'd hardly class as "intuitive")... so when the "fix" does essentially:

Code: Select all

loeb = (fmap . flip id) =<< loeb
what it's actually doing is making it so that the parameter to "loeb" gets passed to both sides:

Code: Select all

loeb fs = (fmap . flip id) (loeb fs) fs

Code: Select all

loeb fs = fmap ($loeb fs) fs
which is... again... just so much simpler to read.

Maybe all this stuff comes as second nature for someone who's been working with Haskell for years... but for me it took quite some time to be able to decypher.
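For anyone else decyphering along at home, here's "loeb fs = fmap ($loeb fs) fs" forced into Python (my own sketch, names mine: since Python is strict, each cell takes an accessor get in place of the lazy result list, and the cache plays the role of laziness):

Code: Select all

```python
def loeb(fs):
    # loeb fs = fmap ($ loeb fs) fs; each fs[i] is a function that may
    # look up other cells of the final result via get(i)
    cache = {}
    def get(i):
        if i not in cache:
            cache[i] = fs[i](get)  # memoised, like a lazy thunk
        return cache[i]
    return [get(i) for i in range(len(fs))]

# spreadsheet-style cells referring to one another
# (a cyclic reference would recurse forever, just as the Haskell would loop)
cells = [lambda g: 1,
         lambda g: g(0) + 1,
         lambda g: g(0) + g(1)]
```

loeb(cells) gives [1, 2, 3]: each cell is evaluated against the very list being built.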

Code: Select all

enum ಠ_ಠ {°□°╰=1, °Д°╰, ಠ益ಠ╰};
void ┻━┻︵​╰(ಠ_ಠ ⚠) {exit((int)⚠);}
[he/him/his]

User avatar
benneh
Posts: 74
Joined: Tue Aug 26, 2008 8:24 am UTC

Re: Deliberately bad algorithms

Postby benneh » Thu Jun 25, 2015 2:51 pm UTC

Yakk wrote:Strange. I'm getting merely n! from that:

Code: Select all

Let S(n) be the cost of sorting n elements.

Let S(n,m) be the cost where the first element is m away from the right place.
S(n) <= S(n, n-1)

S(1) = 1
S(n,0) <= 1 + 2S(n-1) (the case where head(list)<=min(tail(list)))
S(n,k) <= 1 + S(n-1) + S(n, k-1) (otherwise)

So S(n) <= n + (n+1) S(n-1)

which works out to (n+1)! no?

So the n=6 case should be only 7 times slower than the n=5 case.

What did I miss?

The test I ran was on the algorithm in the spoiler:
benneh wrote:

Code: Select all

import Prelude hiding (minimum)

import Control.Applicative (liftA2)
import Data.List (permutations, sort)

stupidesterSort :: (Ord a) => [a] -> [a]
stupidesterSort as = find' (\ bs -> isSorted bs && bs `elem` permutations as) (listPower (length as) as)

listPower :: Int -> [a] -> [[a]]
listPower n as = foldr (liftA2 (:)) [[]] (replicate n as)

isSorted :: (Ord a) => [a] -> Bool
isSorted [] = True
isSorted [_] = True
isSorted (a:as) = a <= minimum as && isSorted as

minimum :: (Ord a) => [a] -> a
minimum = head . sort

find' :: (a -> Bool) -> [a] -> a
find' condition = head . filter condition
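The same algorithm translated to Python (my rendering, not benneh's code): try every length-n word over the input's elements, which is n^n candidates, and return the first that is both sorted and a permutation of the input.

Code: Select all

```python
from itertools import product, permutations

def stupidester_sort(xs):
    n = len(xs)
    for cand in product(xs, repeat=n):        # n^n candidate words
        is_sorted = all(cand[i] <= cand[i + 1] for i in range(n - 1))
        # membership test itself walks all n! permutations
        if is_sorted and any(cand == p for p in permutations(xs)):
            return list(cand)
```

stupidester_sort([3, 1, 2]) returns [1, 2, 3], after wading through most of the 27 candidates.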

mousewiz
Posts: 107
Joined: Wed Oct 26, 2011 6:50 pm UTC

Re: Deliberately bad algorithms

Postby mousewiz » Tue Jun 30, 2015 7:03 pm UTC

Code: Select all

int floor(float f) {
    int i;
    for(i = 0; i <= f; i++) ;
    return i - 1;
}


Negative floors need not apply.
Last edited by mousewiz on Tue Jun 30, 2015 7:07 pm UTC, edited 1 time in total.

User avatar
Yakk
Poster with most posts but no title.
Posts: 11053
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Re: Deliberately bad algorithms

Postby Yakk » Tue Jun 30, 2015 7:06 pm UTC

All algorithms are improved by recursion:

Code: Select all

int floor(float f, int guess = 0);
int ceil(float f, int guess = 0);

int ceil(float f, int guess) {
    if (f < 0.) return -floor( -f, guess );
    if (guess >= f) return guess;
    int i;
    for(i = guess; i < ceil(f,i+1); i++) ;
    return i;
}
int floor(float f, int guess) {
    if (f < 0.) return -ceil( -f, guess );
    if (guess > f) return guess-1;
    int i;
    for(i = guess; i < floor(f,i+1); i++) ;
    return i;
}
One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision - BR

Last edited by JHVH on Fri Oct 23, 4004 BCE 6:17 pm, edited 6 times in total.

User avatar
Isaac Hill
Systems Analyst????
Posts: 506
Joined: Wed Mar 14, 2007 9:35 pm UTC
Location: Middletown, RI

Re: Deliberately bad algorithms

Postby Isaac Hill » Thu Jul 02, 2015 2:45 am UTC

I haven't used real programming languages in years, so this is based on MATLAB scripting. To sort a list of positive integers, first, find the maximum value:

Code: Select all

M = input(1);
for i = 2:length(input)
    if input(i) > M,  M = input(i); end
end

Then create a new array, and fill it with the values that appear in the input:

Code: Select all

output = [];
for i = 1:M
   for j = 1:length(input)
      if input(j) == i, output = [output i]; end
   end
end

The order doesn't just depend on your list size, but also the maximum value. If your input is [2 100 1], it takes 2 iterations of the first part to determine the max value, then 300 iterations of the second part to check each value from 1 to 100 to see if it's in the 3 item list. Plus, you're dynamically allocating memory since the size of output keeps changing.
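The same idea in Python for non-MATLAB readers (my translation; the function name is mine), showing the M*n second pass:

Code: Select all

```python
def count_up_sort(values):
    # positive integers only, as in the MATLAB version
    m = values[0]
    for v in values[1:]:          # first pass: find the maximum
        if v > m:
            m = v
    out = []
    for i in range(1, m + 1):     # second pass: M * n comparisons
        for v in values:
            if v == i:
                out.append(i)
    return out
```

count_up_sort([2, 100, 1]) does its 300 comparisons and returns [1, 2, 100].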
Alleged "poems"
that don't follow a rhyme scheme
are not poetry

User avatar
You, sir, name?
Posts: 6974
Joined: Sun Apr 22, 2007 10:07 am UTC
Location: Chako Paul City
Contact:

Re: Deliberately bad algorithms

Postby You, sir, name? » Thu Jul 09, 2015 8:06 pm UTC

Code: Select all

int multiplyByTwo(int i) {
    for (int j=0;;j++) {
        double val = pow(2*cos(i)*sin(i),2) + pow(cos(j),2);
        if (fabs(val-1) < 0.0000001) {
            return i < 0 ? -j : j;
        }
    }
}
I edit my posts a lot and sometimes the words wrong order words appear in sentences get messed up.

User avatar
Dinosaur!
Posts: 30
Joined: Fri Dec 14, 2012 8:45 pm UTC
Location: Santa Cruz, CA

Re: Deliberately bad algorithms

Postby Dinosaur! » Fri Jul 17, 2015 3:52 am UTC

You, sir, name? wrote:

Code: Select all

int multiplyByTwo(int i) {
    for (int j=0;;j++) {
        double val = pow(multiplyByTwo(cos(i)*sin(i)),2) + pow(cos(j),2);
        if (abs(val-1) < 0.0000001) {
            return i < 0 ? -j : j;
        }
    }
}


Fixed that for ya :D

User avatar
Wildcard
Candlestick!
Posts: 251
Joined: Wed Jul 02, 2008 12:42 am UTC
Location: Outside of the box

Re: Deliberately bad algorithms

Postby Wildcard » Fri Jul 17, 2015 4:47 am UTC

Dinosaur! wrote:
You, sir, name? wrote:

Code: Select all

int multiplyByTwo(int i) {
    for (int j=0;;j++) {
        double val = pow(multiplyByTwo(cos(i)*sin(i)),2) + pow(cos(j),2);
        if (abs(val-1) < 0.0000001) {
            return i < 0 ? -j : j;
        }
    }
}


Fixed that for ya :D

You need a base for an inductive method. The way you changed this one, it will just recursively call forever and not return the correct answer...no good. "Deliberately bad" refers to writing *correct* algorithms that are extremely, mind-bogglingly inefficient, whether in terms of time or space (or both).

My favorite example from this thread so far, is:
benneh wrote:For a maximum balance of simplicity and inefficiency:

Code: Select all

sort(list):
    if length(list) < 2:
        return list
    else:
        if head(list) <= minimum(tail(list)):
            return head(list) followed by sort(tail(list))
        else:
            return sort(tail(list) followed by head(list))

minimum(list):
    return head(sort(list))

What I find impressive is that with *that* much superfluous recursion, the algorithm is still finite and correct. I haven't done a formal analysis of the time complexity but would love to see one. :)
There's no such thing as a funny sig.

lightvector
Posts: 222
Joined: Tue Jun 17, 2008 11:04 pm UTC

Re: Deliberately bad algorithms

Postby lightvector » Sun Jul 19, 2015 3:30 am UTC

Wildcard wrote:My favorite example from this thread so far, is:
benneh wrote:For a maximum balance of simplicity and inefficiency:

Code: Select all

sort(list):
    if length(list) < 2:
        return list
    else:
        if head(list) <= minimum(tail(list)):
            return head(list) followed by sort(tail(list))
        else:
            return sort(tail(list) followed by head(list))

minimum(list):
    return head(sort(list))

What I find impressive is that with *that* much superfluous recursion, the algorithm is still finite and correct. I haven't done a formal analysis of the time complexity but would love to see one. :)


Let n be the length of the list, and T(n) the worst case cost of sorting that length of list. Worst case, the recursion will run n times rotating the list before "head(list) <= minimum(tail(list))" is satisfied, and each rotation costs T(n-1) via the call to "minimum" - everything else is constant time or trivial in comparison, including "length" at the start - and then will call sort one more time for a total of n+1 calls to a smaller sort.

So T(n) is O((n+1) * T(n-1)).
So T(n) is O((n+1)!)

I don't know if you can obtain a slightly tighter bound. It seems to me that with the way the rotation works, you can't hit the worst case number of rotations all the time within the recursive calls.
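As a sanity check (mine, not part of the analysis above): the recurrence T(n) = n + (n+1)*T(n-1) with T(1) = 1, from Yakk's earlier estimate, actually solves exactly to T(n) = (n+1)! - 1, which a quick tabulation confirms:

Code: Select all

```python
import math

def T(n):
    # worst-case bound from T(n) <= n + (n+1)*T(n-1), with T(1) = 1
    return 1 if n == 1 else n + (n + 1) * T(n - 1)

for n in range(1, 8):
    # note T(n) + 1 == (n+1)!, so T(n) is Theta((n+1)!)
    print(n, T(n), math.factorial(n + 1))
```

(T(n)+1 = (n+1)(T(n-1)+1) with T(1)+1 = 2!, hence T(n)+1 = (n+1)! by induction.)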

User avatar
You, sir, name?
Posts: 6974
Joined: Sun Apr 22, 2007 10:07 am UTC
Location: Chako Paul City
Contact:

Re: Deliberately bad algorithms

Postby You, sir, name? » Sun Jul 19, 2015 10:21 am UTC

lightvector wrote:I don't know if you can obtain a slightly tighter bound. It seems to me that with the way the rotation works, you can't hit the worst case number of rotations all the time within the recursive calls.


Stirling's approximation? ;-)
I edit my posts a lot and sometimes the words wrong order words appear in sentences get messed up.

User avatar
Flumble
Yes Man
Posts: 1951
Joined: Sun Aug 05, 2012 9:35 pm UTC

Re: Deliberately bad algorithms

Postby Flumble » Sun Jul 19, 2015 2:49 pm UTC

lightvector wrote:I don't know if you can obtain a slightly tighter bound. It seems to me that with the way the rotation works, you can't hit the worst case number of rotations all the time within the recursive calls.

Counting all the recursive calls to sort, I got the sequence 1, 3, 11, 47, 227, 1215, 7107, 44959, 305091, 2206399.*
It's equivalent to A035009, which seems to grow 'only' exponentially.

Compared to a factorial, it immediately diminishes.

It seems to be squeezed between e^x and e^(x^2). No guarantees though, because the sequences stop after only a couple dozen numbers.


*I tried all permutations of lists of length < 7 and each time the reversed list took the longest. So assuming a reversed list takes the longest, I performed the sort on those lists, resulting in 10 elements before also that took too long.
Spoiler:

Code: Select all

import Prelude hiding (minimum, maximum)
import Data.List hiding (sort)

printLn = mapM_ print

tryAll len = map logSort $ permutations [1..len]
   where logSort list = show list ++ ": " ++ show (snd $ sort list) ++ " iterations"

tryReverse = [snd $ sort $ reverse [1..n] | n <- [1..]]

maximum = maximumBy (\a b -> compare (snd $ sort a) (snd $ sort b))


--forgive me for not using a monad
sort list = sort'(list,0)
   where
      minimum (list,n) = (head sorted,n')
         where
            (sorted,n') = sort'(list,n)

      sort' ([],n) = ([],n+1)
      sort' ([element],n) = ([element],n+1)
      sort' ((first:rest),n) = if first <= min then (first:sortedTail,nTail) else (sortedRot,nRot)
         where
            (min,n') = minimum (rest,n)
            (sortedTail, nTail) = sort'(rest,n')
            (sortedRot , nRot ) = sort'(rest++[first],n')
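For cross-checking, the same instrumented sort in Python (my translation; like the Haskell, it counts only the base-case calls to sort') reproduces the start of the sequence on reversed lists:

Code: Select all

```python
def sort_counting(xs):
    count = [0]
    def srt(lst):
        if len(lst) < 2:
            count[0] += 1                 # count base-case calls only
            return lst
        head, tail = lst[0], lst[1:]
        if head <= srt(tail)[0]:          # minimum(tail) via a full sort
            return [head] + srt(tail)
        return srt(tail + [head])
    return srt(xs), count[0]

# reversed lists are (empirically) the worst case
counts = [sort_counting(list(range(n, 0, -1)))[1] for n in range(1, 5)]
```

counts comes out [1, 3, 11, 47], matching the sequence above.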

Derek
Posts: 2154
Joined: Wed Aug 18, 2010 4:15 am UTC

Re: Deliberately bad algorithms

Postby Derek » Mon Jul 20, 2015 3:40 am UTC

Wildcard wrote:My favorite example from this thread so far, is:
benneh wrote:For a maximum balance of simplicity and inefficiency:

Code: Select all

sort(list):
    if length(list) < 2:
        return list
    else:
        if head(list) <= minimum(tail(list)):
            return head(list) followed by sort(tail(list))
        else:
            return sort(tail(list) followed by head(list))

minimum(list):
    return head(sort(list))

What I find impressive is that with *that* much superfluous recursion, the algorithm is still finite and correct. I haven't done a formal analysis of the time complexity but would love to see one. :)

Isn't that algorithm wrong? If head(list) is greater than minimum(tail(list)), but less than maximum(tail(list)), won't this algorithm incorrectly place head(list) at the end of the final list?

User avatar
ConMan
Shepherd's Pie?
Posts: 1632
Joined: Tue Jan 01, 2008 11:56 am UTC
Location: Beacon Alpha

Re: Deliberately bad algorithms

Postby ConMan » Mon Jul 20, 2015 3:48 am UTC

Derek wrote:
Wildcard wrote:My favorite example from this thread so far, is:
Isn't that algorithm wrong? If head(list) is greater than minimum(tail(list)), but less than maximum(tail(list), won't this algorithm incorrectly place head(list) at the end of the final list?

You're looking at the return statement wrong. It's saying to return sort(tail followed by head), meaning that it puts the head at the end of the list then tries to sort the *entire* list again. So it will keep recursing on that statement until it finally finds the smallest element on the list, then keeps that at the start of the list, and sorts the rest.
pollywog wrote:
Wikihow wrote:* Smile a lot! Give a gay girl a knowing "Hey, I'm a lesbian too!" smile.
I want to learn this smile, perfect it, and then go around smiling at lesbians and freaking them out.

User avatar
phlip
Restorer of Worlds
Posts: 7543
Joined: Sat Sep 23, 2006 3:56 am UTC
Location: Australia
Contact:

Re: Deliberately bad algorithms

Postby phlip » Mon Jul 20, 2015 4:36 am UTC

If you replace the "minimum" definition with something more reasonable, then the sort function ends up behaving essentially like a selection sort. Rotating the list until the smallest value is at the start, then moving on to the next sublist. Inefficient, but not embarrassingly so. It's when "minimum" is in turn defined in terms of "sort" that it suddenly starts becoming ridiculous.
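In Python, with the built-in min (my sketch, names mine), the sane version looks like this; each level does at most n rotations, each paying an O(n) min plus an O(n) rebuild, so it's merely polynomial rather than factorial:

Code: Select all

```python
def rotate_sort(lst):
    # benneh's recursion with a sane minimum: rotate until the smallest
    # element reaches the front, keep it, then recurse on the rest
    if len(lst) < 2:
        return lst
    if lst[0] <= min(lst[1:]):
        return [lst[0]] + rotate_sort(lst[1:])
    return rotate_sort(lst[1:] + [lst[0]])
```

rotate_sort([3, 1, 2]) rotates twice, then selects 1, 2, 3 in turn.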

Code: Select all

enum ಠ_ಠ {°□°╰=1, °Д°╰, ಠ益ಠ╰};
void ┻━┻︵​╰(ಠ_ಠ ⚠) {exit((int)⚠);}
[he/him/his]

Derek
Posts: 2154
Joined: Wed Aug 18, 2010 4:15 am UTC

Re: Deliberately bad algorithms

Postby Derek » Mon Jul 20, 2015 5:05 am UTC

ConMan wrote:You're looking at the return statement wrong. It's saying to return sort(tail followed by head), meaning that it puts the head at the end of the list then tries to sort the *entire* list again. So it will keep recursing on that statement until it finally finds the smallest element on the list, then keeps that at the start of the list, and sorts the rest.

Ah, yeah, that's exactly what I did. I missed that closing parens.

User avatar
Yakk
Poster with most posts but no title.
Posts: 11053
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Re: Deliberately bad algorithms

Postby Yakk » Mon Jul 20, 2015 12:36 pm UTC

Is there a way to use the recursive structure of conway arrows in an algorithm that seems to be trying to do something sensible, like sorting?
One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision - BR

Last edited by JHVH on Fri Oct 23, 4004 BCE 6:17 pm, edited 6 times in total.

User avatar
quintopia
Posts: 2906
Joined: Fri Nov 17, 2006 2:53 am UTC
Location: atlanta, ga

Re: Deliberately bad algorithms

Postby quintopia » Thu Jul 30, 2015 1:25 am UTC

I had this idea today for finding the arithmetic mean of a list without ever averaging more than two of its elements at a given time. It's pretty obvious that it works, but I have no idea what its worst case running time is. I'm sure it's at least O(n²b) (where b bits of precision are used in the mantissa), but it's not clear to me how to compute it exactly. Some help?

(As a side note, if you want an algorithm that's actually terrible, you could simply repeatedly replace two randomly chosen elements with their average, stopping when all elements are equal. My goal here was to optimize the basic algorithm as much as possible to see how bad the "fundamental idea" was, rather than concoct the absolute worst recursive random over-working algorithm ever.)

Code: Select all

# -*- coding: utf-8 -*-
from __future__ import division

def bubble_avg(L):
    if len(L)==1:
        return L[0]
    while True:
        t = (L[0] +L[-1])/2
        inserted = False
        for i in range(0,len(L)-1):
            if not inserted and L[i+1]<=t:
                L[i]=L[i+1]
            if inserted:
                L[i],t = t,L[i]
            if L[i+1]>t and not inserted:
                L[i]=t
                inserted=True
        L[-1]=t
        if L[0]==L[-1]:
            return L[0]
           
           
           
test = [1, 2, 4, 8, 16, 32, 64]
test.sort()
print bubble_avg(test)
print float(sum(test))/len(test)


Here's another way to do basically the same thing. I'm pretty sure it's slower because even though it converges in half as many passes, each pass requires a full sort, though in this case, Timsort will have a lot of monotonic runs to work with, and will return very rapidly.

Code: Select all

# -*- coding: utf-8 -*-
from __future__ import division

def sort_avg(L):
    while True:
        for i in range(len(L)//2):
            t = (L[i]+L[-(i+1)])/2
            L[i] = t
            L[-(i+1)] = t
        L.sort()
        if L[0]==L[-1]:
            return L[0]
           
test = [1, 2, 4, 8, 16, 32, 64]
print sort_avg(test)
print float(sum(test))/len(test)
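The "actually terrible" variant from the side note, sketched in Python (the eps tolerance and fixed seed are my additions, so the floating-point version terminates deterministically):

Code: Select all

```python
import random

def random_pair_average(xs, eps=1e-9, seed=0):
    # repeatedly replace two randomly chosen elements with their average,
    # stopping when the spread is below eps (added so floats terminate)
    rng = random.Random(seed)
    xs = [float(x) for x in xs]
    while max(xs) - min(xs) > eps:
        i, j = rng.sample(range(len(xs)), 2)
        xs[i] = xs[j] = (xs[i] + xs[j]) / 2  # sum is (nearly) preserved
    return xs[0]
```

Since each step preserves the sum (up to rounding), the common value it converges to is the mean.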

User avatar
Shufflepants
Posts: 16
Joined: Wed Jul 20, 2016 3:12 pm UTC

Slowest Sorting Algorithms that Terminate

Postby Shufflepants » Mon Aug 15, 2016 6:55 pm UTC

For some reason I'm interested in really slow sorting algorithms. We've all heard about terrible ones like Bogo Sort, which has an average running time around O(n * n!) but is not guaranteed to terminate. There are others, such as Slow Sort, which has a greater-than-polynomial running time without really doing any obviously unnecessary work.

My own idea for a really slow one is what I call Turing Sort.

Turing Machines can be explicitly enumerated starting with a machine with 1 state, then all the machines with 2 states, and so on. So the idea for the sorting algorithm is to simulate Turing Machines.

Here's the rough pseudo-code:

Code: Select all

int i=0;
while(true) {
  initialize Turing Machine i and add it to the list
  i = i + 1
  for each Turing Machine in the list {
    simulate one step
    if the machine has halted {
      check to see if what the machine has produced on its tape is the sorted list; if it is, return it
      otherwise, remove this machine from the list
    }
  }
}


Because for any given string there is some Turing machine which can produce it given a blank tape, we guarantee that this will eventually terminate even though many of the machines being simulated never will.
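The dovetailing can be sketched with Python generators standing in for Turing machines (entirely my stand-in, not a real machine enumeration: each "machine" yields once per simulated step and, if it halts, returns its tape):

Code: Select all

```python
def dovetail_search(machines, is_target):
    # start machine i at round i; step every started machine once per
    # round; return the first halted tape that passes is_target
    # (loops forever if no machine ever produces a passing tape)
    todo = iter(machines)
    running = []
    while True:
        m = next(todo, None)
        if m is not None:
            running.append(m())            # start the next machine
        for g in list(running):
            try:
                next(g)                    # simulate one step
            except StopIteration as halt:
                running.remove(g)          # halted: inspect its tape
                if halt.value is not None and is_target(halt.value):
                    return halt.value

# toy machines: one loops forever, one halts wrong, one halts sorted
def loops():
    while True:
        yield
def halts_wrong():
    yield
    return [2, 1]
def halts_sorted():
    yield
    yield
    return [1, 2]
```

dovetail_search([loops, halts_wrong, halts_sorted], lambda t: t == sorted(t)) returns [1, 2] despite the first machine never halting.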

I estimated that this algorithm has an approximate running time of n! (the number of strings of the correct length) times 2^k (the number of Turing Machines with k states, where k will somehow depend on n). I'm sure there need to be several more factors, including the constant that represents the proportion of Turing Machines that halt (which is uncomputable), because you continue to process steps for all the machines which have fewer states than the one that eventually produces the sorted list but never halt.

As a funny side note, this algorithm also finds you the Turing Machine that minimizes some function of number of states and number of steps to produce the sorted list; such efficiency!

If any one has any better estimates on the average running time or any of their own really slow running sorting algorithms that are guaranteed to terminate, I'd love to hear them.

User avatar
Soupspoon
You have done something you shouldn't. Or are about to.
Posts: 2544
Joined: Thu Jan 28, 2016 7:00 pm UTC
Location: 53-1

Re: Slowest Sorting Algorithms that Terminate

Postby Soupspoon » Mon Aug 15, 2016 8:38 pm UTC

I think the 'slowest possible terminating' algorithm has to always move just enough elements in just the right ways, per iteration, to get a net movement towards 'sorted' (and towards zero 'unsortedness') by the lowest possible (non-zero) amount. The reason being that any algorithm that can increase the 'unsort' assessment in an given iteration can just resort/reunsort/reresort ad infinitum and enter into a never finishing fugue (ditto a neutral sort that unsorts one element for every one sorted).

This makes me think that O(n!) sounds about right as a limit (a cycle needed for every element, and for every displacement from 'sorted' that element can have within the constraints by the previously misplaced/displaced elements), in reality a sort of a reverse(sort(list)) will (with the most deliberately unintelligent method, so described) have to move individual elements something like 2*(n!/2), but in any non-negative 'net' sorting movement of two elements, that reduces the unsorted metric by two positions, so that's back to half n! (still O(n!), though).

So I'm wondering about shifting trios. Can arbitrary elements A, B and C have an unsortedness shift of -1, -1 and +1..? I can't get round in my head how I could actually accomplish that (nor, for any odd number 2n+1, make n+1 elements lose minimal unsortedness, whilst the remaining n gain unsortedness) and I don't think that's possible. Two microsortings can balance with a single two-unit microunsorting, but that's a net zero, so not allowed for our terminating algorithm. Per cycle, I think that an even (including zero and negative, if we hadn't disallowed them for the above reasons) reduction of unsortedness is the only possible outcome (until zero is reached), but someone will surely be able to correct me on that.

Unless we 'cheat' and define a meta-cycle of iterations (of arbitrary length?) that in total produce a net reduction of unsortedness by the minimum amount (2?), but only by summing positive, negative and zero motions, doing 'wasted' shuffling along the way towards minimum unsortedness. But if that's "list=reverse(list); list=reverse(list); list=minimallysort(list)" then if at any point our list is actually reverse(fullysorted(list)) then at the ⅓rd point we have list=fullysorted(list)), in this example, and it surely can't count as a unit of iteration if its status as fully sorted isn't even tested for and termination of the sort process sought. And it doesn't affect the Order of the sort, anyway.

Or... What about an algorithm that always generates every possible (re-)ordering of a list, and then proceeds to test pairs of generated lists to produce the 'most sorted'. That would be an O(n!) process, stage one, followed by an O(n) process (howsoever the pairs are compared, bracketing or 'winner stays on against next to be tested')... Is that O(n!+n)? My head still hurts from trying to create a non-even reduction of unsortedness to zero, so someone will tell me if that's not actually O(n!) in all but name. It's been too long...

User avatar
Xanthir
My HERO!!!
Posts: 5228
Joined: Tue Feb 20, 2007 12:49 am UTC
Location: The Googleplex
Contact:

Re: Slowest Sorting Algorithms that Terminate

Postby Xanthir » Tue Aug 16, 2016 1:03 am UTC

O(n!+n) is equal to O(n!), so if all you care about is the big-O notation, you're covered either way.
(defun fibs (n &optional (a 1) (b 1)) (take n (unfold '+ a b)))

User avatar
Soupspoon
You have done something you shouldn't. Or are about to.
Posts: 2544
Joined: Thu Jan 28, 2016 7:00 pm UTC
Location: 53-1

Re: Slowest Sorting Algorithms that Terminate

Postby Soupspoon » Tue Aug 16, 2016 1:55 am UTC

Thought so, but my head was swimming.

(So much that I got it drastically wrong in my head. After(/during) creating n! reorderings, there are n!-1 pairs to whittle down. So, it's 2n!, which is back to O(n!)... Yes?)

Anyway, that's not changing much to what I wrote. I shall leave the implementation up to others, but I'd suggest an iterator. I don't have any handy programming tools on this tablet, but mixing coding styles a bit, for freehand pseudocode:

Code: Select all

nextPermute = iteratePermute(list); // exercise to reader, returns function producing next unique recombination
bestTest=&nextPermute();
bestDisorder=disorderMetric(bestTest);
while (newTest=&nextPermute()) {
  newDisorder=disorderMetric(newTest);
  if (newDisorder < bestDisorder) { // we're not bothered about keeping all 'equally but differently sorted' examples,
                                   // but <= would be ~linearly more intensive?
    bestDisorder=newDisorder; // yes, recalculating this alongside newDisorder, each time, would be wasteful,
                              // but that isn't the objective
    bestTest=newTest;
 }
}
return bestTest // from whatever container function (unstated) was made to enclose this...
 

Tub
Posts: 322
Joined: Wed Jul 27, 2011 3:13 pm UTC

Re: Slowest Sorting Algorithms that Terminate

Postby Tub » Tue Aug 16, 2016 11:48 am UTC

Shufflepants wrote:

Code: Select all

      check to see if what the machine has produced on its tape is the sorted list, if it is return it

I love how the fastest algorithms for checking set equality of both lists involve... sorting both lists.

About the upper bound of the runtime complexity. Enumerating a turing machine by some bitwise encoding is great for saying that you have 2^n turing machines with n bits, but that doesn't help us in figuring out what these turing machines can do.
So I'll be using a turing machine defined by 5-tuples: (state, symbol, new symbol, new state or terminate, tape movement), and we will iterate by the amount of tuples required to define the machine.
How many machines with t tuples are there? There's at most as many states as tuples, there are exactly n symbols, and tape movement is (left, hold, right), so we have t^2 * n * (n+1) * 3 possible tuples on a t-tuple machine with n symbols, thus:
s(t, n) := (t^2 * n * (n+1) * 3) ^ t possible machines with t tuples over n symbols. [1]

With this definition, for a list of n elements, there is a trivial turing machine with n tuples that outputs a permutation of that list after n steps, like this:
(1, *, 1, 2, right)
(2, *, 2, 3, right)
...
(n, *, n, terminate, hold)
where * is whatever symbol the empty tape contains.
In other words, after having executed n steps on any turing machine with n tuples, the algorithm must have terminated.

Now we can calculate how many machines get instantiated before our trivial winner is found:
sum(i from 1 to n) (i^2*n^2)^i

Add n iterations to actually run the machine, figure out that the whole loop requires O(n^2) for n iterations [2], so in theory we have everything we need to establish an upper bound.
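The count can be tabulated directly (just the formula above in Python; function names are mine):

Code: Select all

```python
import math

def s(t, n):
    # machines with t tuples over n symbols: (t^2 * n * (n+1) * 3) ^ t
    return (t**2 * n * (n + 1) * 3) ** t

def machines_started(n):
    # machines instantiated before the trivial n-tuple winner appears
    return sum(s(t, n) for t in range(1, n + 1))

for n in range(1, 6):
    print(n, machines_started(n), n**n, math.factorial(n))
```

Already at small n the sum dwarfs both n^n and n!, which is what makes the eyeballed bound below plausible.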

Eyeballing the thing seems to lead to something like O(n^n), so you might actually have created something worse than O(n!).


[1] Many of those machines will have unreachable states, and we might find a more efficient iteration, but we're interested in an upper bound.
[2] Assuming no turing machine is ever removed from the list, though I would be surprised if removal of the terminated machines would lower the complexity class. If you actually have a useful bound on the fraction of turing machines that terminate after a set amount of steps, I have a suspicion that you've gotten terribly close to computing the non-computable busy beaver numbers.

User avatar
jaap
Posts: 2073
Joined: Fri Jul 06, 2007 7:06 am UTC
Contact:

Re: Slowest Sorting Algorithms that Terminate

Postby jaap » Tue Aug 16, 2016 12:08 pm UTC

Tub wrote:Eyeballing the thing seems to lead to something like O(n^n), so you might actually have created something worse than O(n!).

I don't think so. Because of Stirling's approximation of n!, I think O(n!) is about the same as O(n^n).

User avatar
Shufflepants
Posts: 16
Joined: Wed Jul 20, 2016 3:12 pm UTC

Re: Slowest Sorting Algorithms that Terminate

Postby Shufflepants » Tue Aug 16, 2016 2:04 pm UTC

Tub wrote:2 Assuming no turing machine is ever removed from the list, though I would be surprised if removal of the terminated machines would lower the complexity class. If you actually have a useful bound on the fraction of turing machines that terminate after a set amount of steps, I have a suspicion that you've gotten terribly close to computing the non-computable busy beaver numbers.


What I referred to, but not by name in my original post: https://en.wikipedia.org/wiki/Chaitin%27s_constant a non-computable constant that represents the proportion of machines that halt for a given encoding.
I didn't expect the removal of machines to lower the complexity class as far as Big O is concerned, it would only be a constant factor, but a very interesting constant!

My true dream is to come up with a sorting algorithm whose running time grows as a non-computable function. But I suspect that any such algorithm could not be guaranteed to terminate; otherwise we'd have found a way to calculate the function, e.g. by sorting all lists up to a certain size and measuring the running time.

Tub
Posts: 322
Joined: Wed Jul 27, 2011 3:13 pm UTC

Re: Slowest Sorting Algorithms that Terminate

Postby Tub » Tue Aug 16, 2016 4:20 pm UTC

jaap wrote:
Tub wrote:
Shufflepants wrote:Eyeballing the thing seems to lead to something like O(n^n), so you might actually have created something worse than O(n!).

I don't think so. Because of Stirling's approximation of n!, I think O(n!) is about the same as O(n^n).

n! = 1 * 2 * 3 * ... * n
n^n = n * n * n * ... * n
O(n^n) is at least as large as O(n!). In fact, if you take the limit, n^n will always be at least a factor of n larger than n!, making it its own class.
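A quick numerical check of the gap (a sketch; by Stirling's approximation, n! ~ sqrt(2πn)·(n/e)^n, so the ratio n^n / n! grows like e^n / sqrt(2πn), far more than any constant factor):

Code: Select all

```python
import math

# The ratio n^n / n! should track Stirling's estimate e^n / sqrt(2*pi*n).
# Since that ratio grows without bound, O(n^n) strictly contains O(n!).
for n in (10, 20, 40):
    exact = n**n / math.factorial(n)
    stirling = math.e**n / math.sqrt(2 * math.pi * n)
    print(n, exact, stirling)
```

For n = 10 the ratio is already around 2750, and it roughly squares every time n doubles.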

Shufflepants wrote:What I referred to, but not by name in my original post: https://en.wikipedia.org/wiki/Chaitin%27s_constant a non-computable constant that represents the proportion of machines that halt for a given encoding.

I see. But that'll only tell you whether they halt, not when. It's useless for proving a better upper bound on the loop; after all, it's entirely possible that your winning Turing machine is the first to halt.

Zowayix
Posts: 9
Joined: Fri Nov 16, 2012 7:20 pm UTC

Re: Slowest Sorting Algorithms that Terminate

Postby Zowayix » Fri Aug 26, 2016 1:51 pm UTC

Has worstsort been mentioned yet? Given any computable function, it's supposed to generate a terminating sorting algorithm whose running time is worse than that function.
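For reference, a minimal Python sketch of worstsort as usually described (bubblesort at recursion depth 0; at depth k, build the list of all permutations of the input, sort *that* list at depth k-1, and take its first element). The function names here are just illustrative:

Code: Select all

```python
from itertools import permutations

def bubblesort(lst):
    # Plain O(n^2) bubblesort; the base case of the recursion.
    lst = list(lst)
    for i in range(len(lst)):
        for j in range(len(lst) - 1 - i):
            if lst[j] > lst[j + 1]:
                lst[j], lst[j + 1] = lst[j + 1], lst[j]
    return lst

def badsort(lst, k):
    if k == 0:
        return bubblesort(lst)
    # Sort the list of all n! permutations (they compare lexicographically)
    # using badsort at depth k-1; its first element is the sorted input.
    perms = [list(p) for p in permutations(lst)]
    return badsort(perms, k - 1)[0]

def worstsort(lst, f):
    # f is any computable function; the runtime is a tower of factorials
    # of height f(len(lst)), which eventually dominates f itself.
    return badsort(lst, f(len(lst)))
```

Even `badsort(lst, 2)` on a 4-element list already means sorting a list of (4!)! = 620,448,401,733,239,439,360,000 permutations, so don't try anything past toy sizes.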

User avatar
BlackSails
Posts: 5314
Joined: Thu Dec 20, 2007 5:48 am UTC

Re: Deliberately bad algorithms

Postby BlackSails » Sun Oct 02, 2016 2:11 pm UTC

What about

Code: Select all

k = 1
While the list isn't sorted
      Pick a random Turing machine of size k
      Calculate BB_k
      Run the Turing machine on the list (just pick an encoding, don't see why it would matter) until it either halts or runs for BB_k steps
      If the Turing machine outputs a sorted list, return the list
      k++

Tub
Posts: 322
Joined: Wed Jul 27, 2011 3:13 pm UTC

Re: Deliberately bad algorithms

Postby Tub » Sun Oct 02, 2016 2:30 pm UTC

BlackSails wrote:

Code: Select all

      Calculate BB_k

Can you please provide an implementation for this line that terminates after a bounded, finite amount of steps?

User avatar
BlackSails
Posts: 5314
Joined: Thu Dec 20, 2007 5:48 am UTC

Re: Deliberately bad algorithms

Postby BlackSails » Sun Oct 02, 2016 8:45 pm UTC

Sure, just takes specialized hardware. You put a mathematician in a box and then query the box.

User avatar
Soupspoon
You have done something you shouldn't. Or are about to.
Posts: 2544
Joined: Thu Jan 28, 2016 7:00 pm UTC
Location: 53-1

Re: Deliberately bad algorithms

Postby Soupspoon » Sun Oct 02, 2016 8:49 pm UTC

A Chinese mathematician? Or a feline one?

korona
Posts: 495
Joined: Sun Jul 04, 2010 8:40 pm UTC

Re: Deliberately bad algorithms

Postby korona » Sun Oct 02, 2016 10:01 pm UTC

Unless the mathematician has infinite handwriting speed (or some other method to process arbitrarily long proofs in bounded time) he cannot compute BB_k either. :D

User avatar
Yakk
Poster with most posts but no title.
Posts: 11053
Joined: Sat Jan 27, 2007 7:27 pm UTC
Location: E pur si muove

Re: Deliberately bad algorithms

Postby Yakk » Mon Oct 03, 2016 2:19 pm UTC

For some k she cannot calculate BB_k.

But the bound on which the mathematician certainly cannot calculate BB_k is pretty high.

However, to make the algorithm more likely to terminate, we should have the mathematician both calculate BB_k and try to prove it cannot be calculated.

If it cannot be calculated, the mathematician should come up with a new axiomatic framework where it can be calculated, then she should continue.
Last edited by Yakk on Mon Oct 03, 2016 6:24 pm UTC, edited 1 time in total.
One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision - BR

Last edited by JHVH on Fri Oct 23, 4004 BCE 6:17 pm, edited 6 times in total.

User avatar
Flumble
Yes Man
Posts: 1951
Joined: Sun Aug 05, 2012 9:35 pm UTC

Re: Deliberately bad algorithms

Postby Flumble » Mon Oct 03, 2016 2:41 pm UTC

If you put a mathematician in a box, you can be sure the mathematician will terminate within a week, so that only adds a constant.


Different approach:

Code: Select all

1. Send the computing device into a supermassive black hole
2. Sort the list using quicksort

Tub
Posts: 322
Joined: Wed Jul 27, 2011 3:13 pm UTC

Re: Deliberately bad algorithms

Postby Tub » Mon Oct 03, 2016 8:42 pm UTC

Anyone who claims that replacing a machine with a human will somehow avoid the problems of uncomputability hasn't understood the problems.

Yakk wrote:For some k she cannot calculate BB_k.

For all but a finite number of k, she cannot calculate BB_k. The busy beaver function is a bit more strict in its uncomputability, because knowing BB_a for any a allows you to compute all BB_b with b < a.

For BlackSails' algorithm, that means that the loop can only run a finite number of iterations. But even if it knew all BB_k, it is not guaranteed to terminate. The chance for it to terminate approaches 0 as list length increases.

It doesn't sort, it doesn't terminate, but at least it's slow. One out of three. :D

Yakk wrote:But the bound on which the mathematician certainly cannot calculate BB_k is pretty high.

What bound do you mean? For any k, there's at least one TM (or mathematician) which outputs BB_k, so there's no theoretical upper bound k for which BB_k cannot be known.

For all practical purposes, something as low as BB_10 is too large to be known or uttered or written down by anything that fits a human-sized box.

Flumble wrote:Different approach:

Code: Select all

1. Send the computing device into a supermassive black hole
2. Sort the list using quicksort

So.. uh.. now how do I read the sorted list? I kinda needed that.

(If you tell me I can't, then your algorithm doesn't work. If you tell me to jump after the device, then from my perspective it isn't slow at all.)

User avatar
Flumble
Yes Man
Posts: 1951
Joined: Sun Aug 05, 2012 9:35 pm UTC

Re: Deliberately bad algorithms

Postby Flumble » Mon Oct 03, 2016 9:57 pm UTC

Tub wrote:So.. uh.. now how do I read the sorted list? I kinda needed that.

If you care about the result, why do you use a deliberately bad algorithm? :roll:

Anyway, you have to wait until the SMBH has evaporated to retrieve the computing device. I don't know how much time there is inside a black hole, but hopefully the computer is done once you retrieve it.

User avatar
Xanthir
My HERO!!!
Posts: 5228
Joined: Tue Feb 20, 2007 12:49 am UTC
Location: The Googleplex
Contact:

Re: Deliberately bad algorithms

Postby Xanthir » Tue Oct 04, 2016 6:10 am UTC

Tub wrote:
Yakk wrote:But the bound on which the mathematician certainly cannot calculate BB_k is pretty high.

What bound do you mean? For any k, there's at least one TM (or mathematician) which outputs BB_k, so there's no theoretical upper bound k for which BB_k cannot be known.

The upper bound of possible knowability is the size of the smallest UTM, no? If you knew BB for that size, you could solve the halting problem generally.
(defun fibs (n &optional (a 1) (b 1)) (take n (unfold '+ a b)))

User avatar
Wildcard
Candlestick!
Posts: 251
Joined: Wed Jul 02, 2008 12:42 am UTC
Location: Outside of the box

Re: Deliberately bad algorithms

Postby Wildcard » Tue Oct 04, 2016 1:02 pm UTC

Flumble wrote:If you care about the result, why do you use a deliberately bad algorithm? :roll:


Just to remind you: "Deliberately bad" refers to writing correct algorithms that are extremely, mind-bogglingly inefficient, whether in terms of time or space (or both). But kudos for entertaining imaginativeness. :)

My favorite from this thread is still benneh's, which I've translated to Python:

Code: Select all

def sort(lst):
    # If the head is the minimum, keep it and recurse on the tail;
    # otherwise rotate the head to the back and try again.
    if len(lst) < 2:
        return lst
    elif lst[0] <= minimum(lst[1:]):
        return lst[0:1] + sort(lst[1:])
    else:
        return sort(lst[1:] + lst[0:1])

def minimum(lst):
    # Mutually recursive with sort: the minimum is the head of the sorted list.
    return sort(lst)[0]


I also asked about the time complexity analysis of this algorithm on the Computer Science Stack Exchange a while back; it might make for interesting reading for those here. (I gave credit to benneh, of course.)

Just for fun, here is the same code dropped into a wrapper, to run it on reverse-sorted lists and measure the time it takes:

Code: Select all

$ cat bad_sort.py
#!/usr/bin/python

import sys

def sort(list):
  if len(list) < 2:
    return list
  else:
    if list[0] <= minimum(list[1:]):
      return list[0:1] + sort(list[1:])
    else:
      return sort(list[1:] + list[0:1])

def minimum(list):
  return sort(list)[0]

mylist = list(range(int(sys.argv[1])))
mylist.reverse()

print(sort(mylist))
$ for n in {1..15}; do time -p ./bad_sort.py "$n"; done
[0]
real 0.03
user 0.02
sys 0.00
[0, 1]
real 0.02
user 0.02
sys 0.00
[0, 1, 2]
real 0.03
user 0.02
sys 0.00
[0, 1, 2, 3]
real 0.02
user 0.02
sys 0.00
[0, 1, 2, 3, 4]
real 0.03
user 0.02
sys 0.00
[0, 1, 2, 3, 4, 5]
real 0.03
user 0.02
sys 0.00
[0, 1, 2, 3, 4, 5, 6]
real 0.04
user 0.03
sys 0.00
[0, 1, 2, 3, 4, 5, 6, 7]
real 0.09
user 0.08
sys 0.00
[0, 1, 2, 3, 4, 5, 6, 7, 8]
real 0.48
user 0.47
sys 0.00
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
real 3.34
user 3.33
sys 0.01
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
real 27.03
user 27.01
sys 0.01
[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11]
real 226.84
user 226.78
sys 0.06
^C
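For the curious, the growth factor between successive runs can be estimated straight from the timings (values copied from the run above; the small-n times are mostly interpreter startup, so only n >= 7 is meaningful):

Code: Select all

```python
# Measured "real" times in seconds for reverse-sorted inputs of length n.
times = {7: 0.04, 8: 0.09, 9: 0.48, 10: 3.34, 11: 27.03, 12: 226.84}

# Ratio between consecutive n: roughly 2 to 8.5 and still climbing,
# consistent with super-exponential growth rather than a fixed base.
ratios = {n: times[n] / times[n - 1] for n in sorted(times) if n - 1 in times}
for n, r in sorted(ratios.items()):
    print(n, round(r, 2))
```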
There's no such thing as a funny sig.

