Aelfyre wrote:To be clear, I would *love* to be wrong on this one... and see the Singularity occur while I was reasonably young, but I think there are several breakthroughs yet to be made that rely upon several technologies yet to be developed.
I just personally believe that we are not even anywhere in the vicinity of creating anything that can be meaningfully called AI, even in a weak sense.
Sure, we can program computers, with the application of massive amounts of processing power and storage capacity, to perform single-task objectives, like chess or Jeopardy, reasonably well. But that's just, in my opinion, not AI. As long as you're brute-forcing the functionality, you haven't created anything that is meaningfully intelligent. Or, at the very least, you haven't laid any of the conceptual groundwork for an AI that can step beyond the single objective it's been brute-force designed to accomplish (which is something any "singularity"-inducing AI would need to be able to do).
And this is the crux of it for me, and where I agree with you that we are missing some key bits of knowledge that we'd need to design an AI capable of bringing about the singularity: our ability to design such an AI is currently hampered less by our lack of technological proficiency than by our lack of understanding of just exactly how "intelligence" functions in the first place.
Any continuing advancements we make re: computational power/efficiency will be made at pre-singularity, human-level (as opposed to 'super-human') rates until we know a lot more about the underlying structure of systems that generate 'intelligence' than we currently do.
EDIT: Also, consider that even the creation of strong AI isn't sufficient to bring about the technological singularity. After all, there are (although this is potentially debatable, depending on who you talk to) around 6 billion strong, non-artificial intelligences walking around on the planet right now, but none of us has the ability to achieve singularity-inducing rates of self-improvement.
To be capable of inducing the singularity, it isn't sufficient for an AI to just be smarter than us - it will need to have a vastly, vastly different and improved spin on "intelligence" than anything we've created or postulated thus far.
Considering that we haven't even come close to creating any AI that can even begin to approximate human-level cognition, it's hard to see how we can be very near creating an AI that vastly outstrips it.