June 26, 2008

The Singularity

Those of us who are well versed in the minutiae of sub-atomic particle physics are acutely aware that when we try to model the 'thing' in the middle of a black hole, our physics breaks down. This 'thing' in the middle of the black hole is termed 'a singularity'. (On a similar wavelength - excuse the pun - I always find it interesting that we can't go faster than light, allegedly, for no reason other than that if we did, the mathematics we've used to model this wouldn't work. Doesn't this suggest that the maths is wrong, rather than that the speed of light is unbreakable?)
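
To make that musing concrete, the specific piece of mathematics that 'breaks' is the Lorentz factor from special relativity:

    \gamma = \frac{1}{\sqrt{1 - v^2/c^2}}

It grows without bound as v approaches the speed of light c, and for v greater than c the quantity under the square root goes negative, so the equations stop returning real-valued answers. Whether that means the maths is 'wrong' or merely out of its depth is exactly the question.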

In a similar way, if we try to model a world where there is a greater intelligence than humans, our ability to model that world breaks down as well.

Putting these two thoughts together has resulted in the creation of The Singularity Institute for Artificial Intelligence, an institute working on the creation of smarter-than-human intelligence. Their web site is worth a quick look.

When we think of Artificial Intelligence (A.I.) we usually think of computers or robots (or even that awful film by Steven Spielberg about computers and robots). However, according to the website there are actually several different routes to it: direct brain-computer interfaces, biological augmentation of the brain, genetic engineering, and ultra-high-resolution scans of the brain followed by computer emulation, for example.

One of the key ideas they are looking at is increasing intelligence by speeding up thought processes. It works like this: human neurons operate by sending electrochemical signals that propagate at a top speed of 150 meters per second along the fastest neurons. By comparison, the speed of light is 300,000,000 meters per second, two million times faster. Similarly, most human neurons can spike a maximum of 200 times per second, while clock speeds in modern computer chips are currently around 2 GHz, a ten millionfold difference, and still increasing exponentially. At the very least it should be physically possible to achieve a million-to-one speedup in thinking, at which rate a subjective year would pass in about 31 physical seconds. At this rate the entire subjective timespan from Socrates in ancient Greece to modern-day humanity would pass in under twenty-two hours.
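
Those figures check out with simple arithmetic. Here's a quick back-of-the-envelope verification in Python (the constants are just the ones quoted above; nothing here comes from the Institute's site):

    # Sanity-check the speedup figures quoted above.
    SIGNAL_SPEED_MPS = 150.0              # fastest neurons, meters/second
    LIGHT_SPEED_MPS = 300_000_000.0       # speed of light, meters/second (approx.)
    SPIKE_RATE_HZ = 200.0                 # max neuron spikes per second
    CHIP_CLOCK_HZ = 2_000_000_000.0       # 2 GHz, a circa-2008 chip

    print(LIGHT_SPEED_MPS / SIGNAL_SPEED_MPS)   # 2,000,000 -> "two million times"
    print(CHIP_CLOCK_HZ / SPIKE_RATE_HZ)        # 10,000,000 -> "ten millionfold"

    SPEEDUP = 1_000_000                   # the million-to-one speedup
    SECONDS_PER_YEAR = 365.25 * 24 * 3600
    print(SECONDS_PER_YEAR / SPEEDUP)     # ~31.6 -> a subjective year in ~31 seconds

    YEARS_SINCE_SOCRATES = 2008 + 399     # Socrates died in 399 BC
    hours = YEARS_SINCE_SOCRATES * SECONDS_PER_YEAR / SPEEDUP / 3600
    print(hours)                          # ~21.1 -> "under twenty-two hours"

Run it through any Python 3 interpreter and the numbers land within rounding distance of the figures in the quote.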

Imagine that - a whole institute devoted to creating intelligence that is greater than human intelligence! Does this sound a little freaky to you? It did to me when I first read it. But then again, as Arthur C. Clarke once said, "Any sufficiently advanced technology is indistinguishable from magic." I suppose if we look at this as some sort of Frankenstein mission to create a better brain, we are doing ourselves, and the Institute, a disservice. But if we look at it as the next leap forward in understanding and improving our own intelligence, then this might be 'magic' that we could all benefit from.

Imagine a situation where we are using powerful artificial intelligence to work through scientific data and identify the tests and trials needed to cure cancer. What about one that resides in an airplane cockpit, processes information a million times faster than the pilot, and helps fly the plane?

Surely this is something to be embraced?

I just checked the commitments and core values of the Singularity Institute. They are:
  • SIAI will not enter any partnership that compromises our values.
  • Technology developed by SIAI will not be used to harm human life.
  • The challenge, opportunity and risk of artificial intelligence is the common concern of all humanity. SIAI will not show ethnic, national, political, or religious favoritism in the discharge of our mission.
I particularly like the one about not harming human life...
