Thoughts on the Singularity?

Things that don't belong anywhere else. (Check first).

Moderators: Moderators General, Prelates, Magistrates

User avatar
illway
Posts: 9
Joined: Thu Dec 24, 2009 1:48 am UTC

Thoughts on the Singularity?

Postby illway » Sat Jan 02, 2010 5:00 am UTC

I'm sure most of you know about Ray Kurzweil. He's an inventor and, more recently, a futurist/transhumanist who has been predicting many things to come in the future, most notably the Singularity, which from my understanding is a massive, sudden jump in technology. Some of you may even know of him as a nutjob who takes over 100 supplements a day so that he will be able to survive up to the point where humans become immortal.

Anyway, I'm reading 'The Singularity is Near: When Humans Transcend Biology' and I'm finding it pretty intriguing so far. I just wanted to know what you guys think of his claims, as well as the Singularity, posthumanism, transhumanism, superintelligent AI, and all that cool stuff.

User avatar
poxic
Eloquently Prismatic
Posts: 4751
Joined: Sat Jun 07, 2008 3:28 am UTC
Location: Left coast of Canada

Re: Thoughts on the Singularity?

Postby poxic » Sat Jan 02, 2010 5:20 am UTC

They're lovely ideas. Unfortunately, we humans seem to like tearing each other apart, preferably those who didn't have the grace of $deity to be born and raised where and how we were born and raised (the heathens).

We really don't do well in packs of more than twenty or so, maybe 100 if we have to push it. If there is lots of entertainment and physical comfort around to numb us to each other, we can pack ourselves into cities fairly well if we each have our own little cage at the end of the day. I suspect that the combination of overpopulation and climate change will be enough to end this civilisation's run within the next two hundred years. More factors would shorten that time, whether those are rogues with dangerous technology or a super-sized solar storm that takes down the entire electric grid. (Can't find a reference just now for that last one, but the potential is there.)

So yeah, I guess I'm a pessimist on just how far we can get with all the fancy tech that Kurzweil and friends think we're so close to achieving. At the bottom of it all, we're just damn dirty apes who can't stop flinging poop at each other when we get riled. And it doesn't take much to get a critical mass of people riled.
A man who is 'ill-adjusted' to the world is always on the verge of finding himself. One who is adjusted to the world never finds himself, but gets to be a cabinet minister.
- Hermann Hesse, novelist, poet, Nobel laureate (2 Jul 1877-1962)

Syntax
Posts: 264
Joined: Fri Mar 30, 2007 3:39 am UTC

Re: Thoughts on the Singularity?

Postby Syntax » Sat Jan 02, 2010 5:51 am UTC

poxic wrote:They're lovely ideas. Unfortunately, we humans seem to like tearing each other apart, preferably those who didn't have the grace of $deity to be born and raised where and how we were born and raised (the heathens).

We really don't do well in packs of more than twenty or so, maybe 100 if we have to push it. If there is lots of entertainment and physical comfort around to numb us to each other, we can pack ourselves into cities fairly well if we each have our own little cage at the end of the day. I suspect that the combination of overpopulation and climate change will be enough to end this civilisation's run within the next two hundred years. More factors would shorten that time, whether those are rogues with dangerous technology or a super-sized solar storm that takes down the entire electric grid. (Can't find a reference just now for that last one, but the potential is there.)

So yeah, I guess I'm a pessimist on just how far we can get with all the fancy tech that Kurzweil and friends think we're so close to achieving. At the bottom of it all, we're just damn dirty apes who can't stop flinging poop at each other when we get riled. And it doesn't take much to get a critical mass of people riled.




Oh--my weltschmerz.... :'(

User avatar
Splendid
Posts: 111
Joined: Sat May 02, 2009 10:43 am UTC
Location: Indy-ish

Re: Thoughts on the Singularity?

Postby Splendid » Sun Jan 03, 2010 12:37 am UTC

From a GQ article titled "The Singularity", Jan 2010:
But what, exactly, are we assembling? "We're building God," Rick Schwall, a retired computer-system administrator who has attended the past three Singularity Summits, whispered to me in the theater at the Y. "And I, for one, think it's an excellent idea. We humans are terrible at being in charge. We're terrible to each other: We kill, cheat, torture, mutilate. So what we need is something else in charge--an intelligence programmed without these impulses. And the minute it gets to where it can talk, we ought to listen to it--and eventually turn the world over to it."

He's right! If only there were a list of good and bad actions, a guide on how to live, for example. THEN all our problems would be solved. Since no one so far has been telling people how to live, that must be the reason for all the violence and anger.
What an idiot. Too bad half the violence in the world is based on religion/belief/moral relativism, because everyone is getting a different set of instructions. Let's add another "deity" -- a program written by a human.

The article goes on to say, though, that the God stuff "annoys most of the serious Singularity folk. They see it as a common dismissal of their vision, the idea that their research is nothing more than a cult dressed up as scholarship."

The first part of the article mentions, basically, their efforts to extend human life to infinity. Upload your brain to a database, and there you are. On a hard drive that can theoretically control anything that will then be "you", and can be "you" forever, as we'll have no need for a "corporeal existence, no need for bodies that will eventually wrinkle, break down, and die."
Consciousness, though, I think, is much different from a hard drive holding a series of instructions on how to be Splendid or illway.

I'm only halfway through the article though.

User avatar
Nath
Posts: 3148
Joined: Sat Sep 08, 2007 8:14 pm UTC

Re: Thoughts on the Singularity?

Postby Nath » Sun Jan 03, 2010 6:31 am UTC

I read one of his books a few years ago, and found it incredibly frustrating. His timeline of predictions reads like plausible but unoriginal science fiction, but he treats it like fact, based on some very weak reasoning and lots of false hope. The 'singularity', as he calls it, is not a particularly novel idea. It's not impossible, but it's certainly not a sure thing. And despite his nice and precise timeline, we have no idea how or when (or if) it will come about.

Basically, he annoys the hell out of me by getting people excited over science fiction that they don't know is science fiction.

User avatar
Josephine
Posts: 2142
Joined: Wed Apr 08, 2009 5:53 am UTC

Re: Thoughts on the Singularity?

Postby Josephine » Sun Jan 03, 2010 10:33 am UTC

This is a topic I've seen alluded to here on the fora and in the various IRC channels spawned by the fora, but nobody ever made a thread. This is a good thread to have.

poxic wrote:They're lovely ideas. Unfortunately, we humans seem to like tearing each other apart, preferably those who didn't have the grace of $deity to be born and raised where and how we were born and raised (the heathens).

We really don't do well in packs of more than twenty or so, maybe 100 if we have to push it. If there is lots of entertainment and physical comfort around to numb us to each other, we can pack ourselves into cities fairly well if we each have our own little cage at the end of the day. I suspect that the combination of overpopulation and climate change will be enough to end this civilisation's run within the next two hundred years. More factors would shorten that time, whether those are rogues with dangerous technology or a super-sized solar storm that takes down the entire electric grid. (Can't find a reference just now for that last one, but the potential is there.)

So yeah, I guess I'm a pessimist on just how far we can get with all the fancy tech that Kurzweil and friends think we're so close to achieving. At the bottom of it all, we're just damn dirty apes who can't stop flinging poop at each other when we get riled. And it doesn't take much to get a critical mass of people riled.
Even with existential risk expanding as fast as technology, we're not likely to completely destroy ourselves. Short of annihilating absolutely everything (i.e. nuclear blasts covering the entire planet), humans will survive. As storage capacity reaches the point where you could carry Wikipedia around with you, knowledge will be preserved and technology will march onwards.

Assuming we don't blow ourselves up, there is plentiful space. Mars, Venus, uploading, building in the ocean; there really is a lot of untapped space to use. People don't have to be crammed into cities. If you can keep people spread out enough not to reach that critical mass, it's entirely possible to expand beyond the damn dirty ape and the fleshsack.



Nath wrote: ...based on some very weak reasoning and lots of false hope.
I actually think his reasoning is very solid. Exponential growth is undeniable. Look at Moore's Law. Look at the achievements made in the last 20 years compared to the last hundred compared to the last thousand. And earlier hominids worked on million-year timescales for technology. Your 'false hope' is the inevitable conclusion of exponential growth.
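To put a number on what steady doubling implies (a toy back-of-the-envelope; the clean two-year doubling period is an idealized assumption, not a measurement):

```python
# Toy model: steady Moore's-law-style doubling every 2 years.
# The doubling period is an idealized assumption, not measured data.

def growth_factor(years, doubling_period=2):
    """Total multiplicative growth after `years` of steady doubling."""
    return 2 ** (years / doubling_period)

print(growth_factor(20))   # two decades of doubling -> a 1024x jump
print(growth_factor(100))  # a century -> a factor of about 1.1e15
```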



EDIT: and Splendid, that deity would actually exist, would wield real power.
Belial wrote:Listen, what I'm saying is that he committed a felony with a zoo animal.

User avatar
Nath
Posts: 3148
Joined: Sat Sep 08, 2007 8:14 pm UTC

Re: Thoughts on the Singularity?

Postby Nath » Sun Jan 03, 2010 11:46 am UTC

nbonaparte wrote:I actually think his reasoning is very solid. Exponential growth is undeniable. Look at Moore's Law. Look at the achievements made in the last 20 years compared to the last hundred compared to the last thousand. And earlier hominids worked on million-year timescales for technology. Your 'false hope' is the inevitable conclusion of exponential growth.

Progress is not an easily quantifiable thing, which means any attempt to fit an exponential curve onto it is going to have to be hand-wavey. And if you're willing to hand wave that much, you can fit any model you want onto it (within reason). For example, you could argue that progress comes in breakthroughs, with sharp bursts of activity, followed by long stagnant periods. The past few decades may have been one such burst of activity.
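To illustrate how much wiggle room that hand-waving leaves (a toy sketch; the logistic parameters below are made up purely for illustration): a logistic curve, which eventually flattens out completely, is nearly indistinguishable from an exponential over its early stretch, so "the data fits an exponential so far" can't rule out the stagnation story.

```python
import math

def logistic(t, limit=1000.0, rate=0.5, midpoint=20.0):
    """An S-curve that eventually flattens out at `limit`.
    All parameters here are arbitrary, for illustration only."""
    return limit / (1.0 + math.exp(-rate * (t - midpoint)))

# "Observe" only the early part of the curve, the way we only
# ever see the history of progress up to now.
ts = list(range(9))
ys = [logistic(t) for t in ts]

# Fit an exponential y = a * exp(b*t) by least squares on ln(y).
n = len(ts)
mean_t = sum(ts) / n
mean_ln = sum(math.log(y) for y in ys) / n
b = (sum((t - mean_t) * (math.log(y) - mean_ln) for t, y in zip(ts, ys))
     / sum((t - mean_t) ** 2 for t in ts))
a = math.exp(mean_ln - b * mean_t)

# The exponential fits the early data almost perfectly...
worst = max(abs(a * math.exp(b * t) - y) / y for t, y in zip(ts, ys))
print(f"fitted rate {b:.3f}, worst relative error {worst:.2%}")

# ...but the two models tell wildly different stories later on.
print(f"at t=40: exponential predicts {a * math.exp(b * 40):,.0f}, "
      f"logistic predicts {logistic(40):,.0f}")
```

Both models explain the early data to within a fraction of a percent; they only disagree about the part of the curve nobody has seen yet.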

I'm not denying things seem to have been speeding up recently, but this trend hasn't been nearly clear enough for me to say with confidence that it will continue for another 100 years. In some ways things have already slowed down. Moore's law is not a long-term prediction; from the beginning, people have known that it would eventually run out of steam. It still has a few generations left in it, and maybe they'll find a clever way to keep it going a bit longer. But even if they do, people greatly overestimate the importance of transistor density. Performance doesn't necessarily improve exponentially along with it. And even if it did, performance is far from the only thing keeping us from human-level AI. And even if it was, many of the simple AI-type problems we're looking at today are exponential in complexity, so even exponential improvements in computing performance won't necessarily save us in the foreseeable future.

That's a lot of 'even-if's, and Kurzweil's predictions are contingent on all of them.

You talk about the things we've achieved in the last 20 years. But think back to 1990. Most of the new things we have today are fairly straightforward extrapolations of ideas that already existed then. They are powerful ideas, which is why their adoption has changed the world so much in so short a time. But new ideas in computer science are created in much the same way as new ideas in any other field of study. People read papers, come up with ideas, run experiments, and write papers. This process has become more streamlined over the years, and the community has gotten larger, but the basic turnaround time hasn't been decreasing exponentially. In fact, it's probably not changed by much since 1990. Or 1890. Ideas are the real limiting factor in our progress. What we need is an exponential speedup in the creation of ideas, not an exponential speedup in computing performance. After the singularity, these will be equivalent. But the singularity isn't here yet, so we have a chicken-and-egg problem.

User avatar
Josephine
Posts: 2142
Joined: Wed Apr 08, 2009 5:53 am UTC

Re: Thoughts on the Singularity?

Postby Josephine » Sun Jan 03, 2010 12:38 pm UTC

Silicon semiconductors will reach their highest possible density soon. The law, despite functioning just fine, was formulated almost 50 years ago, when the upper limits of silicon seemed very far away. But the law works if you look backwards, too. It works for relays, for vacuum tubes, for punchcards, hell, even pen & paper. Who is to say that things like graphene, DNA, and quantum computing won't carry it beyond those boundaries?


You're right, progress is difficult to quantify. But that's why it has to work in paradigms: large, easily noticeable, and largely agreed-upon revolutionary advances in technology. Agriculture (over 9000 years ago (ha)). Printing (600 years ago). The Industrial Revolution (150-200 years ago). Computers (60-70 years ago). The Internet (TCP/IP really came into being 30 years ago). Also notice that all of these assist faster information transfer (if indirectly at times).

I would somewhat agree that the past decade or two have only really been about the maturation of technologies (with massively increased speeds for computers and internet connections), but if you take paradigms into account, you begin to see that that's okay. The next decade will be the last or second-to-last of maturation. That will then give rise to the next paradigm: human-level or greater AI.

Paradigm dates are from Wikipedia. They're all in the introductions of the respective articles.

EDIT: I just noticed you talked about bursts and slow periods. That's sort of what happens; I think I made the cycle clear in my post.
Belial wrote:Listen, what I'm saying is that he committed a felony with a zoo animal.

User avatar
Sir_Elderberry
Posts: 4206
Joined: Tue Dec 04, 2007 6:50 pm UTC
Location: Sector ZZ9 Plural Z Alpha
Contact:

Re: Thoughts on the Singularity?

Postby Sir_Elderberry » Sun Jan 03, 2010 4:35 pm UTC

My biggest problem with the Singularity is that it assumes what the next breakthrough will be. I have no doubt that technology will go on progressing and doing amazing things in the next century, but that doesn't imply artificial intelligence. Maybe the next century will be dominated by biotech. Maybe it'll be nanotech. Maybe computer science will keep on being awesome and next week the "Sentience" App will be in the App Store.
http://www.geekyhumanist.blogspot.com -- Science and the Concerned Voter
Belial wrote:You are the coolest guy that ever cooled.

I reiterate. Coolest. Guy.

Well. You heard him.

User avatar
Splendid
Posts: 111
Joined: Sat May 02, 2009 10:43 am UTC
Location: Indy-ish

Re: Thoughts on the Singularity?

Postby Splendid » Sun Jan 03, 2010 5:47 pm UTC

nbonaparte wrote:EDIT: and Splendid, that deity would actually exist, would wield real power.

Those who believe in a deity now regard him/her as wielding very real power, and do very good and very bad things in his or her name. Adding in an all-knowing, "perfect" robot, I think, wouldn't change much. I for one wouldn't change my beliefs because they C++'d God and told me to listen to the Almighty Robot, and I bet you'd have a hard time convincing a lot of other people in the world to do the same. The basis of most religions, I think, is: be nice to others. The problem is the loons on the far edges who interpret parts of their religion and take them to extremes. Surely there would be loons on the edges of the world's new robot religion too.

Obviously we're already merging with technology (mechanical arms wired to nerves, cameras to replace eyes), so I think what they are going for is a self-taught AI. They write a program and it learns how to learn, like a human.

And let's not forget The Jetsons: a 1960s cartoon depicting what life would be like 100 years in the future, in the year 2062. We aren't close to living in space. It seems to me as if every prediction of "where we'll be" in 5, 10, 20, or 50 years (flying cars, anyone?) is an optimistic guess.

User avatar
Incompetent
Posts: 396
Joined: Mon May 26, 2008 12:08 pm UTC
Location: Brussels

Re: Thoughts on the Singularity?

Postby Incompetent » Sun Jan 03, 2010 11:05 pm UTC

Sir_Elderberry wrote:My biggest problem with the Singularity is that it assumes what the next breakthrough will be. I have no doubt that technology will go on progressing and doing amazing things in the next century, but that doesn't imply artificial intelligence. Maybe the next century will be dominated by biotech. Maybe it'll be nanotech. Maybe computer science will keep on being awesome and next week the "Sentience" App will be in the App Store.


The point of all this Singularity stuff isn't that 'strong AI' will be the next breakthrough. If it happens, it will be the last real breakthrough humans as we know them today ever make. From then on, the AI is in charge, whether or not it is benign. It's not something you can just put in a box, because it's smart enough to persuade you to let it out.

User avatar
bright roar
Posts: 14
Joined: Sat Jan 02, 2010 9:31 pm UTC

Re: Thoughts on the Singularity?

Postby bright roar » Sun Jan 03, 2010 11:17 pm UTC

I haven't the faintest idea who this Kurzweil character is; however, all of those subjects are slightly interesting and can definitely be made more so depending on the context.

User avatar
-.Mateo.-
Posts: 264
Joined: Sun Jun 28, 2009 12:01 am UTC
Location: Location Location

Re: Thoughts on the Singularity?

Postby -.Mateo.- » Mon Jan 04, 2010 5:12 am UTC

Thing is, I don't think we would be able to be immortal. I mean, we could make something that made humans immortal, but that's when things get messy. First, it has to be released to everyone (let's face it, some people wouldn't do it if it wasn't good for them - "If no-one dies, then who buys my meds?" "If I sell it at 1kk the pill, then people will riot to get it"), then we would have overpopulation, lack of food and other resources, not enough jobs for everyone, etc...

Unless immortality includes not needing food, air, shelter, etc. Then the only problem would be space... but maybe (and now it's just pure delusion) immortalityium also gives you the ability to survive in outer space... then we would have no problems at all, we could just live in space (not the moon, not space stations, just float around)
Magus wrote:If history is to change, let it change. If the world is to be destroyed, so be it. If my fate is to die, I must simply laugh.

Just as you touch the energy of every life form you meet, so, too, will their energy strengthen you.

User avatar
Nath
Posts: 3148
Joined: Sat Sep 08, 2007 8:14 pm UTC

Re: Thoughts on the Singularity?

Postby Nath » Mon Jan 04, 2010 5:41 am UTC

nbonaparte wrote:Silicon semiconductors will reach their highest possible density soon. The law, despite functioning just fine, was formulated almost 50 years ago, when the upper limits of silicon seemed very far away. But the law works if you look backwards, too. It works for relays, for vacuum tubes, for punchcards, hell, even pen & paper. Who is to say that things like graphene, DNA, and quantum computing won't carry it beyond those boundaries?

What's the density of transistors for pen and paper? Moore's law (taken literally) makes no sense when applied to anything other than integrated circuits. Clearly, our ability to process information has been speeding up quite rapidly, but these concepts are not rigorous enough for the claim of 'exponential growth' to be treated as any more than a useful metaphor. What exactly is the quantity that's growing exponentially?

Sure, it's possible (and plausible) that our ability to process information will continue to speed up, with quantum computing and such. I said so in my last post. But this is not a sure thing. It's a very subjective, unreliable prediction, with no rigorous reasoning behind it. That doesn't mean it's a bad prediction; there are enough unknowns here that maybe this sort of guessing is the best we can do. But it's still a guess, and not a particularly reliable one. Certainly not one that should be treated as inevitable fact.

The other assumption is that this increased computational performance will be enough to guarantee human-level AI in the near future. This is an even weaker guess, for reasons I went into more in my last post (though, mind you, it's still not implausible). Kurzweil's predictions rely on both these pieces of guesswork.

nbonaparte wrote:I would somewhat agree that the past decade or two have only really been about the maturation of technologies (with massively increased speeds for computers and internet connections), but if you take paradigms into account, you begin to see that that's okay. The next decade will be the last or second-to-last of maturation. That will then give rise to the next paradigm: human-level or greater AI.

Again, quite a plausible guess, but nothing more than a guess. Notice how you stated it as if it were a simple fact? That's my problem with Kurzweil.

User avatar
Sir_Elderberry
Posts: 4206
Joined: Tue Dec 04, 2007 6:50 pm UTC
Location: Sector ZZ9 Plural Z Alpha
Contact:

Re: Thoughts on the Singularity?

Postby Sir_Elderberry » Mon Jan 04, 2010 3:29 pm UTC

-.Mateo.- wrote:Thing is, I don't think we would be able to be immortal. I mean, we could make something that made humans immortal, but that's when things get messy. First, it has to be released to everyone (let's face it, some people wouldn't do it if it wasn't good for them - "If no-one dies, then who buys my meds?" "If I sell it at 1kk the pill, then people will riot to get it"), then we would have overpopulation, lack of food and other resources, not enough jobs for everyone, etc...

Unless immortality includes not needing food, air, shelter, etc. Then the only problem would be space... but maybe (and now it's just pure delusion) immortalityium also gives you the ability to survive in outer space... then we would have no problems at all, we could just live in space (not the moon, not space stations, just float around)

When people talk about the Singularity in this instance, they're often talking about living within computers--consciousness uploading and stuff. Which isn't to say it won't be pretty game-changing, but the whole point is to sidestep the messy biology you're talking about.
http://www.geekyhumanist.blogspot.com -- Science and the Concerned Voter
Belial wrote:You are the coolest guy that ever cooled.

I reiterate. Coolest. Guy.

Well. You heard him.

User avatar
Indon
Posts: 4433
Joined: Thu Oct 18, 2007 5:21 pm UTC
Location: Alabama :(
Contact:

Re: Thoughts on the Singularity?

Postby Indon » Tue Jan 05, 2010 4:28 pm UTC

I'd assert that we're in a technological singularity already, and have been since the symbol was invented, allowing us to start to use tools in order to better use tools.

Not only are we continually accelerating in technological development right now, but we're also manifesting the other major symptom of falling into a singularity - we're stretching out, because some parts of us are falling faster than others.

Yet, despite falling into a technological singularity for hundreds if not thousands of years, we aren't transhuman yet. We still call ourselves humans and we still live a perfectly normal day-to-day life oblivious of the fact that we are fundamentally not the beings we were before we started writing things down.

And when we're colonizing space in genetically engineered cyborg bodies, we'll still be human, doing what we always do.

If you fall into a singularity, by your perception, when do you stop?
So, I like talking. So if you want to talk about something with me, feel free to send me a PM.

My blog, now rarely updated.


User avatar
Josephine
Posts: 2142
Joined: Wed Apr 08, 2009 5:53 am UTC

Re: Thoughts on the Singularity?

Postby Josephine » Tue Jan 05, 2010 9:06 pm UTC

We knew technology was moving quickly. Not at first, though. In the early part of civilization, the base of the curve looked linear to people. But there are two factors that make this part of it different. Thanks to people like Kurzweil, we know about it. And we're on our way towards the asymptote of the technology curve. It isn't fundamentally different from the past (looking at the past is how the Law of Accelerating Returns was conceived), but it is significantly different, if only in the speed of advancement.
Belial wrote:Listen, what I'm saying is that he committed a felony with a zoo animal.

Glo
Posts: 39
Joined: Tue Jan 05, 2010 8:46 am UTC

Re: Thoughts on the Singularity?

Postby Glo » Tue Jan 05, 2010 9:57 pm UTC

If the speed of technological progress is proportional to its current value, then progress will grow exponentially. And it does seem to be like that. But to keep our technical abilities and progress at that level, we should work on them, not waste our energy just discussing what's going on. That's the thing I don't like about SIAI: they don't actually work on AI, they just watch what happens and comment on it.

User avatar
Josephine
Posts: 2142
Joined: Wed Apr 08, 2009 5:53 am UTC

Re: Thoughts on the Singularity?

Postby Josephine » Tue Jan 05, 2010 10:04 pm UTC

The book doesn't really go into detail on how to do it, sure. It's almost more philosophy. It's a reflection on what's possible. It's up to others to create the AI.
Belial wrote:Listen, what I'm saying is that he committed a felony with a zoo animal.

User avatar
Fuzzy_Wuzzy.bmp
Posts: 166
Joined: Tue Jul 22, 2008 3:32 pm UTC
Location: Screaming vacuum of space

Re: Thoughts on the Singularity?

Postby Fuzzy_Wuzzy.bmp » Tue Jan 05, 2010 11:58 pm UTC

nbonaparte wrote:We knew technology was moving quickly. Not at first, though. In the early part of civilization, the base of the curve looked linear to people. But there are two factors that make this part of it different. Thanks to people like Kurzweil, we know about it. And we're on our way towards the asymptote of the technology curve. It isn't fundamentally different from the past (looking at the past is how the Law of Accelerating Returns was conceived), but it is significantly different, if only in the speed of advancement.

Ah, the asymptotes of the exponential function.
Also, it seems awfully bold fitting exponential functions onto something as noisy and difficult to quantify as technological progress. For all we know the curve could be a bloody ArcTan(t) + π/2.
And don't forget about diminishing returns - the difficulty of advancing in a particular field of research might follow nasty functions that probably have an asymptote pointing up along the way - a real asymptote.
Oh and also:
nbonaparte wrote:EDIT: and Splendid, that deity would actually exist, would wield real power.

What, did you not watch The Terminator?

In conclusion: this is why I prefer reading Harry Potter versus "reflections on what's possible". Sweet Jesus.

Edit: Ha ha ha, oh boy, reading the predictions made in the book 'The Singularity Is Near' on Wikipedia and having a jolly good time :D

User avatar
Nath
Posts: 3148
Joined: Sat Sep 08, 2007 8:14 pm UTC

Re: Thoughts on the Singularity?

Postby Nath » Wed Jan 06, 2010 1:40 am UTC

Fuzzy_Wuzzy.bmp wrote:Also, it seems awfully bold fitting exponential functions onto something as noisy and difficult to quantify as technological progress. For all we know the curve could be a bloody ArcTan(t) + π/2.

The curve can be whatever you want, as long as it's generally moving upward. If you're going to pretend that a civilization's technology level is a single real number, you have a lot of wiggle room in deciding how to make that mapping, depending on what point you want to make.

I just attended a talk by a multiprocessor architecture guy, talking about the future of computing performance. As far as he's concerned, the days of computing performance conveniently doubling every couple of years are already gone, and now we're stuck squeezing what performance we can out of parallelism (not that he minded, since parallelism is kind of his thing). This has been the case for long enough that consumers can now feel it. A PC from four years ago does its job just fine now, running pretty current software. In fact, I just looked at the website where I bought my laptop a few years ago. The stuff they're selling today has marginally better specs, and costs about the same. Just a few years ago, four years was forever. Four years could take you from a 233 MHz Pentium II running Windows 95 to a 2 GHz Pentium IV running XP.

For computing performance to continue to increase as fast as we're now used to, we're going to need an unforeseen breakthrough, right now. Unforeseen breakthroughs don't pop up on demand, no matter how many exponential curves you make up.

User avatar
negatron
Posts: 294
Joined: Thu Apr 24, 2008 10:20 pm UTC

Re: Thoughts on the Singularity?

Postby negatron » Fri Jan 08, 2010 12:43 am UTC

Nath wrote:As far as he's concerned, the days of computing performance conveniently doubling every couple of years are already gone, and now we're stuck squeezing what performance we can out of parallelism (not that he minded, since parallelism is kind of his thing)

I'm not seeing a problem. Performance is 'doubling' just the same, it's just that parallelism is required to make use of it.

Nath wrote:This has been the case for long enough that consumers can now feel it.

How do you figure? Why does the consumer care whether their applications are parallelized or not so long as it achieves the desired results?

Nath wrote:A PC from four years ago does its job just fine now, running pretty current software.

I have an old Casio calculator. It also does its job just fine if I want to multiply two numbers. This speaks nothing of how obviously limited its job is.

Nath wrote:The stuff they're selling today has marginally better specs

Perhaps you should judge performance by benchmarks rather than megahurtz. Performance per dollar is at least 10 times higher today than it was four years ago.

Nath wrote:For computing performance to continue to increase as fast as we're now used to

Why would we ever want that? Performance over the last few years has increased far more dramatically than we are used to. Historically, all computation other than raster graphics was done on a CPU. Only in the last few years have arbitrary applications begun to move floating-point computation from the 4 concurrent operations available on a CPU to the hundreds available on a GPU. Metropolis light transport renderers, for example, have effectively become almost 1000 times faster over the last few years. A Pentium 4 at 2GHz couldn't even compute a 100-element Navier-Stokes problem in real time; now I can simulate 100,000 elements at 30fps, plus rendering.

Countless unforeseen breakthroughs have brought us from vacuum tubes to 28nm transistors. In retrospect it's highly implausible to suggest that no further such breakthroughs will be made.
I shouldn't say anything bad about calculus, but I will - Gilbert Strang

User avatar
Nath
Posts: 3148
Joined: Sat Sep 08, 2007 8:14 pm UTC

Re: Thoughts on the Singularity?

Postby Nath » Fri Jan 08, 2010 2:03 am UTC

Sorry about the quote-sniping; you've brought up many separate points, so my response seemed clearer this way.

negatron wrote:I'm not seeing a problem. Performance is 'doubling' just the same, it's just that parallelism is required to make use of it.
...
Why does the consumer care whether their applications are parallelized or not so long as it achieves the desired results?

But that's the thing: performance isn't doubling. We've been putting off parallelization for so long because it doesn't work very well. Many important algorithms are inherently sequential, and can't be highly parallelized. Most software developers don't know how to program effectively in parallel. The actual speedup grows far slower than the number of processors. So even if we keep doubling the number of cores, the performance will improve at a much slower pace.

The customers don't care that it's parallelized. They care that it's not much faster than it used to be. This is because computer architects no longer know how to use the extra transistors to effectively contribute to processor performance. So they parallelize, because it's a bit better than nothing.

negatron wrote:I have an old Casio calculator. It also does it's job just fine if I want to multiply two numbers. This speaks nothing of how obviously limited it's job is.

A PC from 8 years ago would have been obsolete 4 years ago, because software was written to take advantage of the increasing performance. A PC from 4 years ago is almost as useful as a PC bought this morning, because there's not much more performance for software to take advantage of, so old machines and new machines can run pretty much the same stuff.

negatron wrote:Perhaps you should judge performance by benchmarks rather than megahurtz. Performance per dollar is at least 10 times higher today over four years ago.

I just looked up a general benchmark for the processor in my laptop, released almost exactly four years ago. I compared it to a comparably priced machine sold by the same vendor today, with a processor released one year ago. The speedup is slightly under 1.7x, in three years.

negatron wrote:Why would we ever want that? Performance over the last few years has increased far more dramatically than we are used to. Historically, all computation other than raster graphics was done on a CPU. Only in the last few years have arbitrary applications begun to move floating point computation from 4 concurrent operations on a CPU to the hundreds available on a GPU. Metropolis light transport renderers, for example, have effectively become almost 1000 times faster over the last few years. A Pentium 4 2GHz couldn't even compute a 100-element Navier–Stokes problem in real time; now I can simulate 100,000 elements at 30fps, plus rendering.

Many graphics algorithms are highly parallelizable, which is why they have been treated as a special case for several years. While people have been using GPUs for some non-graphical algorithms, they are only really applicable to a restricted set of algorithms.

Caveat: since this is a thread about the singularity, I'll admit that some promising machine learning algorithms are pretty parallelizable, including ones related to search and information retrieval. This is one of the reasons the past few years have seen continued progress in these areas. But this has required active research effort; we couldn't just run the same algorithms we had eight years ago and hope for a big speedup.

negatron wrote:Countless unforeseen breakthroughs have brought us from vacuum tubes to 28nm transistors. In retrospect it's highly implausible to suggest that no further such breakthroughs will be made.

Then it's good that nobody is making this assertion. What I'm saying is that it's silly to assume that these breakthroughs will continue to be made on a schedule of our choosing, particularly given that the trends that have brought us this far have already started to go off the rails. The thing about unforeseen breakthroughs is that they're unpredictable. They'll happen when they happen.

User avatar
negatron
Posts: 294
Joined: Thu Apr 24, 2008 10:20 pm UTC

Re: Thoughts on the Singularity?

Postby negatron » Fri Jan 08, 2010 6:40 am UTC

Nath wrote:But that's the thing: performance isn't doubling

Exponential growth has a constant doubling period, so yes, this must be the case: performance doubles at roughly some fixed interval. If your claim is that growth in circuit capacity is actually polynomial, all of computing history says otherwise.
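(To spell out the claim: for a pure exponential, the doubling period is the same everywhere on the curve. A quick sketch; the growth base below is an arbitrary illustrative choice, not a measured hardware figure.)

```python
import math

# For pure exponential growth f(t) = a * b**t, the doubling period
# T = log(2) / log(b) is a constant, independent of t.
# b = 2**0.5 (doubling every 2 time units) is an arbitrary example value.

def growth(t, a=1.0, b=2 ** 0.5):
    return a * b ** t

def doubling_period(b):
    return math.log(2) / math.log(b)

T = doubling_period(2 ** 0.5)
# The ratio f(t + T) / f(t) is 2 no matter where on the curve you look:
ratios = [growth(t + T) / growth(t) for t in (0, 5, 50)]
```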

Nath wrote:We've been putting off parallelization for so long because it doesn't work very well

http://en.wikipedia.org/wiki/CDC_STAR-100

This piece of crap was built 40 years ago. Its processors had SIMD units. Even basement programmers have been using MMX on CPUs without much difficulty since the 90s.

http://en.wikipedia.org/wiki/File:KL_MDA_Unknown.jpg
This horrid thing here suggests that parallel computation on the desktop has been around since the early 80s.

Parallelization works perfectly well, and it's difficult to imagine where you got the idea it's been put off. It's been around forever. I imagine you think dual-core processors were the start of parallel computation. They're nothing more than an extension of an ancient paradigm.

Nath wrote:Many important algorithms are inherently sequential, and can't be highly parallelized.

Which algorithms would those be? I'm genuinely having trouble recalling just one.

Nath wrote:Most software developers don't know how to program effectively in parallel.

Most software programmers no longer need to. Commercial applications often make use of Intel compilers which support auto-vectorization. Though this approach is not good enough for high scale, performance critical applications, 'most' software developers are unfit to program such applications in any case. For applications which are vastly more parallel, such as rendering, physical simulations, etc, libraries such as linear algebra packages and solvers for either CPUs or GPUs don't require a programmer to have any knowledge of their internal workings whatsoever. Look at the DirectX SDK. The whole thing pretty much exclusively relies on massive parallel computation, and it manages to work regardless of how oblivious an amateur game developer may be of the fact.

Nath wrote:The actual speedup grows far slower than the number of processors.

I don't know what it is you're referring to here, but even high-polynomial and non-polynomial algorithms can make full use of all available resources. Even if the algorithm itself scales non-linearly with the number of elements, its utilization of resources is linear. Even in the highest of complexity classes, such as factorization, doubling computing resources will cut processing time in half.

Nath wrote:I just looked up a general benchmark for the processor in my laptop, released almost exactly four years ago. I compared it to a comparably priced machine sold by the same vendor today, with a processor released one year ago. The speedup is slightly under 1.7x, in three years.

There are any number of applications which can show off the processing power in a modern CPU, yet you managed to stumble on the worst of them all. I am very curious what these benchmarks were. I'm sure an email or two would get these folks to fix their broken software.

http://www.cpubenchmark.net/

Nath wrote:While people have been using GPUs for some non-graphical algorithms, they are only really applicable to a restricted set of algorithms.

No, they're not. Just about everything which was historically considered a CPU problem is moving to the GPU: rendering, simulation, database search, encoding, decoding, pathfinding, machine learning and neural networks, image recognition, Fourier transforms. I think I've listed just about everything computers do. GPUs may not have glamorous prefetching and branch prediction, yet that's somehow not a problem.

Nath wrote:The thing about unforeseen breakthroughs is that they're unpredictable.

They're very predictable. I can't tell you what the next breakthrough in integrated circuits will be, but I can, with much confidence, tell you that it will occur at about the time the industry needs it. Call it magic, I'm not so sure it's not, but every node reduction has been one breakthrough after another for the past 50 years. That's predictable.
I shouldn't say anything bad about calculus, but I will - Gilbert Strang

User avatar
Nath
Posts: 3148
Joined: Sat Sep 08, 2007 8:14 pm UTC

Re: Thoughts on the Singularity?

Postby Nath » Fri Jan 08, 2010 7:56 am UTC

negatron wrote:Exponential growth has a constant doubling period, so yes, this must be the case: performance doubles at roughly some fixed interval. If your claim is that growth in circuit capacity is actually polynomial, all of computing history says otherwise.

It might still be exponential, or it might not. In any case, it's not doubling every two years, which is the usual figure that gets thrown around in these discussions, and upon which Kurzweil's timeline is based.

And I'm aware that parallel computing has been around forever. As I've said earlier in this thread, most of the 'new' ideas of the past few years are just old ideas coming into maturity. But parallelization hasn't been the primary driving force behind computing performance until the last few years. Parallelization was expected to become the next big thing back in the 80s, until people ran into its limitations and realized there were easier ways to move forward (which we've since mostly exhausted). There's an interesting article here about trends in the number of multiprocessor papers in a computer architecture conference. It's interesting how it rose and fell during the 80s and 90s.

negatron wrote:Which algorithms would those be? I'm genuinely having trouble recalling just one.

In the context of AI, most widely-used inference algorithms in graphical models (e.g. belief propagation, MCMC), inductive learning (e.g. FOIL and descendants thereof), natural language understanding, planning, the dynamic programming algorithms for HMMs, etc. were developed for a single processor. Some of them can straightforwardly be adapted to run in parallel, e.g. by running them multiple times with different random seeds. Others (e.g. BP, DP) are deterministic, and parallelizing them is non-trivial. These are now hot research areas (though the occasional paper appeared as parallelization went in and out of fashion), and some progress has already been made, but the limiting factor here is now the algorithms, not the underlying hardware performance.
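To make the DP point concrete, here's a toy Viterbi decoder for a 2-state HMM (a made-up example, not one of the specific models above; all probabilities are illustrative numbers). Note the loop over time steps: each column of the trellis is built from the previous one, so the time dimension can't simply be split across processors.

```python
# Toy Viterbi decoding for a 2-state HMM. The time loop carries a data
# dependency (v at step t is built from v at step t-1), which is what makes
# this family of dynamic-programming algorithms hard to parallelize over time.

pi = [0.6, 0.4]                      # initial state distribution
A = [[0.7, 0.3], [0.4, 0.6]]         # transition probabilities
B = [[0.9, 0.1], [0.2, 0.8]]         # emission probabilities
obs = [0, 1, 0]                      # observed symbol sequence

def viterbi(obs, pi, A, B):
    n = len(pi)
    v = [pi[s] * B[s][obs[0]] for s in range(n)]
    back = []
    for o in obs[1:]:                # inherently sequential: needs previous v
        prev, v = v, []
        col = []
        for s in range(n):
            best = max(range(n), key=lambda r: prev[r] * A[r][s])
            v.append(prev[best] * A[best][s] * B[s][o])
            col.append(best)
        back.append(col)
    # backtrack the most likely state path
    path = [max(range(n), key=lambda s: v[s])]
    for col in reversed(back):
        path.append(col[path[-1]])
    return path[::-1]

print(viterbi(obs, pi, A, B))        # most likely state sequence
```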

negatron wrote:Most software programmers no longer need to. Commercial applications often make use of Intel compilers which support auto-vectorization. Though this approach is not good enough for high scale, performance critical applications, 'most' software developers are unfit to program such applications in any case. For applications which are vastly more parallel, such as rendering, physical simulations, etc, libraries such as linear algebra packages and solvers for either CPUs or GPUs don't require a programmer to have any knowledge of their internal workings whatsoever. Look at the DirectX SDK. The whole thing pretty much exclusively relies on massive parallel computation, and it manages to work regardless of how oblivious an amateur game developer may be of the fact.
...
I don't know what it is you're referring to here, but even high-polynomial and non-polynomial algorithms can make full use of all available resources. Even if the algorithm itself is non-linear with the number of elements, it's utilization of resources is. Even in the highest of complexity classes such as factorization, doubling computing resources will cut processing time in half.

The tools you mentioned are damn useful, but they won't parallelize the whole program, and therefore may not be able to use all available resources. Even highly parallel algorithms have sequential portions, and these become the limiting factor. You know, Amdahl's law and all that. A recent model predicted that even a 99% parallel program running on up to 256 cores will only get about a 72x speedup -- much less than what you would need for the performance increase to stay lined up with Moore's law. To double performance every two years, we'd need to double not only the number of cores but also the parallelism of the code. Maybe this is possible; maybe not. It's certainly not what drove performance increases in the past, so the fact that we've managed to improve performance at this pace before is no guarantee that we can do so in the future. The kind of research needed is entirely different.
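That 72x figure falls straight out of Amdahl's law, S(n) = 1 / ((1 - p) + p/n), where p is the parallel fraction:

```python
# Amdahl's law: speedup of a program with parallel fraction p on n cores.
def amdahl_speedup(p, n):
    return 1.0 / ((1.0 - p) + p / n)

# A 99%-parallel program on 256 cores gets nowhere near a 256x speedup:
s = amdahl_speedup(0.99, 256)        # ~72x
# And as n grows, speedup is capped at 1 / (1 - p) = 100x, no matter
# how many cores you add.
```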

negatron wrote:There are any number of applications which can show off the processing power in a modern CPU yet you managed to stumble on the worst of them all. I am very curious what these benchmarks were. I'm sure an email or two would get these folks to fix their broken software.

http://www.cpubenchmark.net/

Wait, are you saying that CPUbenchmark.net is broken software, or are you saying that the benchmarks I used were broken and recommending this as an alternative?

negatron wrote:No, they're not. Just about everything which was historically considered a CPU problem is moving to the GPU. Rendering, simulating, database search, encoding, decoding, path finding, machine learning and neural networks, image recognition, fourier transforms. I think I've listed just about everything computers do.

Just about any problem can be parallelized, once you design and implement a suitable parallel algorithm. The problem is that most state-of-the-art algorithms for these things are not parallel, which is why parallelizing all of them is such a large, slow research effort. I'm not saying it can't be done; I'm saying that it's difficult, and that we've barely started. It's not enough to improve the hardware; we need to redesign the algorithms and rewrite the software to take advantage of the improved hardware. And it won't be over then. We'll need to continue to make the algorithms and software more and more parallel -- doubly so, in fact -- to continue improving performance at our recent pace.

negatron wrote:They're very predictable. I can't tell you what the next breakthrough in integrated circuits will be, but I can, with much confidence, tell you that it will occur at about the time the industry needs it. Call it magic, I'm not so sure it's not, but every node reduction has been one breakthrough after another for the past 50 years. That's predictable.

I hope you're right, but I'm not convinced. The evidence on your side is a trend a few decades old, saying that computing performance appears to roughly double every two years. The evidence on my side is a couple of millennia of human experience saying that these kinds of rapid changes are rarely long-term, and eventually slow down. Remember, the first stage of a logistic function looks like an exponential, too. Then it becomes linear. Then it becomes static. It's hard to tell just by looking at the beginning whether we've got an exponential curve or a logistic curve, or something else entirely.
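That early indistinguishability is easy to check numerically; the growth rate and ceiling below are arbitrary illustrative values, not a fit to any real hardware trend:

```python
import math

# An exponential curve and a logistic curve with the same initial value
# and growth rate. Rate r and carrying capacity K are made-up examples.
r, K, x0 = 0.5, 1000.0, 1.0

def exponential(t):
    return x0 * math.exp(r * t)

def logistic(t):
    # standard logistic solution with carrying capacity K
    return K / (1.0 + ((K - x0) / x0) * math.exp(-r * t))

# Early on the two are nearly identical...
early = logistic(2) / exponential(2)     # ratio close to 1
# ...but later the logistic flattens toward K while the exponential explodes.
late = logistic(30) / exponential(30)    # ratio close to 0
```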

User avatar
LuNatic
Posts: 973
Joined: Thu Nov 20, 2008 4:21 am UTC
Location: The land of Aus

Re: Thoughts on the Singularity?

Postby LuNatic » Fri Jan 08, 2010 8:56 am UTC

The article goes on to say, though, that the God stuff "annoys most of the serious Singularity folk. They see it as a common dismissal of their vision, the idea that their research is nothing more than a cult dressed up as a scholarship."


Which it pretty much is. It's one thing to have an idea and to try to bring it to fruition. It's quite another to say "Hey everyone, stop what you're doing and come follow me. I have this great plan that will end war, make us all immortal, let us explore the galaxy and make us all happy and intelligent and stuff. It'll be awesome. Oh, one minor issue; we don't know how to do any of this yet, but bear with us, I'm sure we'll figure it out soon."
Cynical Idealist wrote:
Velict wrote:Good Jehova, there are cheesegraters on the blagotube!

This is, for some reason, one of the funniest things I've read today.

User avatar
negatron
Posts: 294
Joined: Thu Apr 24, 2008 10:20 pm UTC

Re: Thoughts on the Singularity?

Postby negatron » Fri Jan 08, 2010 9:12 am UTC

Nath wrote:it's not doubling every two years

Indeed it's not; it's doubling every year by every relevant measure. Combined with algorithm improvements, scientific applications in particular have seen a ~billion-fold improvement in throughput over the past 20 years. There was a really interesting journal publication earlier this year which showed the relative improvements of algorithms and hardware and their impact on scientific computing. I'd happily link to it if I could recall its whereabouts, but in any case floating point throughput has been doubling every year on both CPUs and GPUs for as long as I can remember. Benchmarks have shown this to be true in practice, and they are not hard to find. I don't know what Kurzweil's claims are about the rate of improvement in processing power, but if he's claiming a doubling every two years he is being awfully conservative.

Nath wrote:Others (e.g. BP, DP) are deterministic, and parallelizing them is non-trivial.

Back propagation, unless I'm mistaken, does not have data dependencies, so it too should be trivially parallel. In fact, searching for "back propagation gpu" shows that there are countless implementations for GPUs already, and I'm sure many more for CPUs. I'm not familiar with dynamic programming, but here too "parallel dynamic programming" and "gpu dynamic programming" show many results, although clearly this is a more difficult problem. Fortunately, even non-trivial algorithms only have to be implemented once for all time. Algorithmic complexity should not be, and is not, a burden for an end programmer.
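(The "no data dependencies" claim does hold within a single layer of a neural net: each output unit depends only on the shared input, not on the other units. A toy sketch, with a thread pool standing in for a GPU's parallel units and made-up weights:)

```python
from concurrent.futures import ThreadPoolExecutor

# Forward pass of one fully connected layer: each output unit is a dot
# product of the shared input with that unit's weight row. Units have no
# dependencies on each other, so they can be computed concurrently.
# Weights and input are made-up illustrative numbers.

weights = [[0.5, -1.0, 2.0],
           [1.0, 1.0, 1.0],
           [-2.0, 0.0, 0.5]]
x = [1.0, 2.0, 3.0]

def unit_output(row):
    return sum(w * xi for w, xi in zip(row, x))

# Sequential reference
sequential = [unit_output(row) for row in weights]

# Data-parallel version: one task per output unit (map preserves order)
with ThreadPoolExecutor(max_workers=3) as pool:
    parallel = list(pool.map(unit_output, weights))

assert parallel == sequential        # same result either way
```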

Nath wrote:These are now hot research areas

Indeed they are. It only emphasizes that little, if anything at all, is out of bounds to parallel computation. Does a non-trivial problem even exist for which a solution can only be strictly sequential?

Nath wrote:Even highly parallel algorithms have sequential portions, and these become the limiting factor.

I don't know if that's quite true. A sequential portion would imply a strictly sequential calculation that grows with problem size. I'm familiar with Amdahl's law, but the bottleneck is easily resolved by increasing the problem size. Doing so grows the parallel fraction without growing the sequential fraction, since the sequential work consists of constant-size evaluations, such as logical branching in the main thread, that don't operate on the elements themselves. It's therefore not really a portion of the problem, but a constant overhead that diminishes as the problem grows.
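(Grow-the-problem-to-shrink-the-serial-share is essentially Gustafson's law, S(n) = s + (1 - s)·n, where s is the serial fraction measured on the parallel machine; negatron doesn't name it, and the 1% serial fraction below is an arbitrary illustrative value:)

```python
# Gustafson's law: if the serial part is a fixed overhead and the parallel
# work scales with the problem, the scaled speedup on n cores is
# S(n) = s + (1 - s) * n. The serial fraction s = 0.01 is a made-up example.
def gustafson_speedup(s, n):
    return s + (1.0 - s) * n

# With a 1% fixed serial overhead, 256 cores give a ~253x scaled speedup,
# versus ~72x under Amdahl's fixed-problem-size assumption.
scaled = gustafson_speedup(0.01, 256)
```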

Nath wrote:or are you saying that the benchmarks I used were broken and recommending this as an alternative?

Indeed. The one I linked to is a far more reliable measure. It's an averaging of many different benchmarks.

Nath wrote:these kinds of rapid changes are rarely long-term, and eventually slow down.

They may very well do that, however you suggested they have slowed down already and I'm only here to point out how wrong this is. :D
I shouldn't say anything bad about calculus, but I will - Gilbert Strang

Carnildo
Posts: 2023
Joined: Fri Jul 18, 2008 8:43 am UTC

Re: Thoughts on the Singularity?

Postby Carnildo » Fri Jan 08, 2010 9:23 am UTC

negatron wrote:
Nath wrote:But that's the thing: performance isn't doubling

Exponential growth has a constant doubling period, so yes, this must be the case: performance doubles at roughly some fixed interval. If your claim is that growth in circuit capacity is actually polynomial, all of computing history says otherwise.

"Circuit capacity has grown exponentially in the past, therefore it is growing exponentially now, and will grow exponentially in the future." Do you see the flaw in this argument?

Here's a hint: there are other curves that also look exponential if you only see a small part of them. The most common one is the logistic curve: rapid initial growth that levels off to a plateau, but there are also ones that show exponential initial growth followed by a catastrophic collapse once a limiting resource is exhausted. How do you know we aren't seeing one of the latter?

User avatar
Nath
Posts: 3148
Joined: Sat Sep 08, 2007 8:14 pm UTC

Re: Thoughts on the Singularity?

Postby Nath » Fri Jan 08, 2010 10:04 am UTC

negatron wrote:Indeed it's not; it's doubling every year by every relevant measure. Combined with algorithm improvements, scientific applications in particular have seen a ~billion-fold improvement in throughput over the past 20 years. There was a really interesting journal publication earlier this year which showed the relative improvements of algorithms and hardware and their impact on scientific computing. I'd happily link to it if I could recall its whereabouts, but in any case floating point throughput has been doubling every year on both CPUs and GPUs for as long as I can remember. Benchmarks have shown this to be true in practice, and they are not hard to find. I don't know what Kurzweil's claims are about the rate of improvement in processing power, but if he's claiming a doubling every two years he is being awfully conservative.

Well, if you do dig up that reference, it sounds like an interesting read. And I'm not disputing the fact that we've had dramatic improvement in performance recently, or the fact that these improvements have revolutionized scientific computing. I'm saying that it's showing signs of slowing down. Whether it was doubling every year or every two years in the past, it's taking longer now.

negatron wrote:Back propagation, unless I'm mistaken, does not have data dependencies and it too therefore should be trivially parallel. In fact searching for "back propagation gpu" shows that there are countless implementations for GPUs already, and I'm sure many more for CPUs. I'm not familiar with dynamic programming but here too "parallel dynamic programming" "gpu dynamic programming" shows many results, although clearly this is a more difficult problem. Fortunately even non-trivial algorithms only have to be implemented once for all time. Algorithmic complexity should not and is not a burden for an end programmer

:) Wrong BP. I meant belief propagation. There is a group that's working on parallelizing that, too, but it's been surprisingly difficult to do well. And dynamic programming refers to a pretty wide class of algorithms, all of which are hard to parallelize for much the same reason, but I don't think it's impossible to squeeze some parallelism out of it.

I agree that algorithmic complexity is more a research issue than an end programmer issue, but it's still not a one-time job, because it's not sufficient to come up with any old parallelized algorithm. You need to make all algorithms more and more parallel to keep getting the speedup as the number of cores increases.

Note that none of this was needed when the speedup was on a single core. You could just take your existing algorithm, run it out of the box, and enjoy the hard work of the computer architects. You could scale up as much as the hardware would allow without having to do any additional algorithmic research. That's no longer the case. Now, before a problem can take advantage of the new hardware, some poor schmuck has to sit down and think about it for a year and design a suitable algorithm. And then do it again the following year. By this time the hardware might have quadrupled in its theoretical capabilities, but that's not the limiting factor any more.

negatron wrote:Indeed they are. It only emphasizes that little, if anything at all, is out of bounds to parallel computation. Does a non-trivial problem even exist for which a solution can only be strictly sequential?

I doubt it, and said as much in my last post. The problem isn't the problems :). The problem is the algorithms.

negatron wrote:I don't know if that's quite true. A sequential portion would imply that there is a strictly sequential calculation which increases with problem size. I'm familiar with Amdahl's law, however the bottleneck is always easily resolved by increasing the problem size. Doing this increases the parallel fragment without increase the sequential fragment, as these are constant-set evaluations, such as logical branching in the main thread, and do not perform computation on the elements themselves. It cannot therefore be said to be a portion of the problem, but rather a constant which diminishes with increasing problem size.

This is a valid limitation of Amdahl's law, because it's true for some trivially parallelizable algorithms. Sadly, it doesn't always apply. The straightforward way to parallelize some algorithms, such as BP, requires splitting the problem up into chunks that will be processed on each node (making various approximations in the process), then knitting them back together to bring them back in sync, and repeating as needed. The end result is a lower-quality solution. You can get a better-quality solution by making it less parallel, or a faster but worse one by making it more parallel. Annoying, isn't it?

negatron wrote:Indeed. The one I linked to is a far more reliable measure. It's an averaging of many different benchmarks.

So the benchmarks you recommended are more reliable than the ones I used? Interesting, since those are the ones I used. Awkward.

What made you think you knew what benchmarks I was using?

Carnildo wrote:Here's a hint: there are other curves that also look exponential if you only see a small part of them. The most common one is the logistic curve: rapid initial growth that levels off to a plateau, but there are also ones that show exponential initial growth followed by a catastrophic collapse once a limiting resource is exhausted. How do you know we aren't seeing one of the latter?

Thanks for bringing this up again. I mentioned it at the bottom of my last post, but it was apparently so long that nobody read it all the way through. :)

Apparently, I'm making a habit of this.

