Will a 'conscious' machine ever be built?

The question of whether machines will be capable of human intelligence is ultimately a matter for philosophers to take up and not something that scientists can answer, an inventor and a computer scientist agreed during a debate Thursday night at the Massachusetts Institute of Technology. Inventor Ray Kurzweil and Yale University professor David Gelernter spent much of the session debating the definition of consciousness as they addressed the question, "Are we limited to building super-intelligent, robotic 'zombies,' or will it be possible for us to build conscious, creative, even 'spiritual' machines?" Although they disagreed, even sharply, on various points, they did agree that the question is philosophical rather than scientific.

The debate and a lecture that followed were part of MIT's celebration of the 70th anniversary of Alan Turing's paper "On Computable Numbers," which is widely held to be the theoretical foundation for the development of computers. In a 1950 paper, Turing suggested a test for "machine intelligence." In the Turing Test, a human judge converses with both a human and a machine without knowing which is which. If the judge cannot reliably tell the machine's responses from the human's, the machine is said to "pass" the test and exhibit intelligence. Of course, this being at least in part a philosophical matter, the Turing Test itself is the source of ongoing dispute.
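For readers unfamiliar with the setup, here is a minimal sketch of the imitation game's structure in Python. Everything in it (the `HumanPlayer` and `MachinePlayer` classes, the judging callback) is a hypothetical stand-in for illustration, not a real contestant or judge.

```python
import random

# Minimal sketch of the Turing Test's structure: a judge questions two hidden
# respondents and then guesses which one is the machine. The players here are
# hypothetical stand-ins, not real chatbots.

class HumanPlayer:
    def reply(self, question: str) -> str:
        return f"Honestly, I'd have to think about '{question}' for a while."

class MachinePlayer:
    def reply(self, question: str) -> str:
        # A serious contestant would try much harder to imitate human conversation.
        return f"That is an interesting question about '{question}'."

def run_imitation_game(questions, judge_guess) -> bool:
    contestants = [HumanPlayer(), MachinePlayer()]
    random.shuffle(contestants)                      # hide who is behind each label
    players = dict(zip("AB", contestants))
    transcript = {label: [p.reply(q) for q in questions] for label, p in players.items()}
    guess = judge_guess(transcript)                  # label the judge thinks is the machine
    machine_label = next(l for l, p in players.items() if isinstance(p, MachinePlayer))
    return guess != machine_label                    # True: the machine fooled the judge

if __name__ == "__main__":
    passed = run_imitation_game(
        ["What did you dream about last night?", "Is 17 a prime number?"],
        judge_guess=lambda transcript: random.choice(list(transcript)),  # a clueless judge
    )
    print("Machine passed this round:", passed)
```

A real run would of course use human judges and far harder questioning; the point is only the blind, text-only protocol.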

News source: InfoWorld


42 Comments


I would have to go with a yes, but it will probably be a long time before it is created. The main problem we need to overcome is that we still need to learn much more about the brain. The universe is built on simple principles, probably only one, which later gave rise to more advanced principles, so it seems inevitable that a conscious machine will be made in the future. After all, human bodies are just organic machines.

Me personally, I don't think it can be done... as people have said, we don't even fully understand how the brain works, so we can't even get in the ballpark of creating a "self-aware" computer.

If it is ever done, it won't be in my lifetime for sure... and as someone else said, people would most likely try to destroy it since they wouldn't trust it.

I basically think consciousness is pretty much a human thing (maybe some animals have it; I read something online a while back about an elephant recognising itself in a mirror: they painted an X on its head and the elephant would try putting its trunk on it).

As the old saying goes: "The question of whether computers can think is just like the question of whether submarines can swim."

Maybe when we start mixing computers and humans together we'll get some sort of 'conscious machine'. Sort of like the Spider-Man 2 approach with Doc Ock.

I don't see why it couldn't happen...

So what "thinks" in the human brain?
The "software" or the "hardware"?
It is clearly not the hardware on its own, as atoms and chemical molecules can't think.
Just as gates and transistors don't think.

What we have to do is use gates and transistors the way the human organism uses atoms and chemical molecules, or replace the gates and transistors with more complex things.
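As a toy illustration of that layering argument (my own example, not anything from the debate): individual boolean gates obviously don't "think", yet composing a handful of them already produces behaviour that no single gate has, such as a majority-vote threshold unit.

```python
# Toy illustration of composition: none of these primitives "thinks", but
# wiring them together yields a unit with behaviour no single gate has.

def AND(a: bool, b: bool) -> bool:
    return a and b

def OR(a: bool, b: bool) -> bool:
    return a or b

def majority(a: bool, b: bool, c: bool) -> bool:
    """Fires when at least two of three inputs are on -- a crude, gate-level
    caricature of a threshold neuron."""
    return OR(OR(AND(a, b), AND(a, c)), AND(b, c))

if __name__ == "__main__":
    for inputs in [(True, True, False), (True, False, False), (True, True, True)]:
        print(inputs, "->", majority(*inputs))
```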

Quote - Mr_Mo said @ #14.1
So what "thinks" in the human brain?
The "software" or the "hardware"?
It is clearly not the hardware on its own, as atoms and chemical molecules can't think.
Just as gates and transistors don't think.

What we have to do is use gates and transistors the way the human organism uses atoms and chemical molecules, or replace the gates and transistors with more complex things.

Not gates and transistors.

No. Humans aren't smart enough to understand how consciousness works, let alone make it. You can't teach a machine to learn dynamically (sure, you can do pseudo-dynamics and code in some predictive behaviour), so creating a consciousness would be even more impossible. God is the only one who can make a consciousness capable of dynamic thought and pensiveness, as well as emotion and everything that goes with it... something evolution could never, ever 'create'.

I'll bite my foot off the day someone comes up with a true consciousness. It's simply not going to happen.

Quote - nfin1ty said @ #13
No. Humans aren't smart enough to understand how consciousness works, let alone make it. You can't teach a machine to learn dynamically (sure, you can do pseudo-dynamics and code in some predictive behaviour), so creating a consciousness would be even more impossible. God is the only one who can make a consciousness capable of dynamic thought and pensiveness, as well as emotion and everything that goes with it... something evolution could never, ever 'create'.

I'll bite my foot off the day someone comes up with a true consciousness. It's simply not going to happen.

No need to add religion to the subject. This is about computers and what "conscious" means, not about God!

Quote - AppleBelly said @ #13.1
No need to add religion to the subject. This is about computers and what "conscious" means, not about God!

I don't think he was really adding religion into the mix. I think he was just reducing the issue to its base: who can create consciousness? That is exactly the philosophical debate that's going on. As an atheist, I believe that we don't have souls, that who we are is nothing more than an electro-chemical process. Creationists, on the other hand, believe that we do have souls given to us by a creator, and that that is what consciousness is. (Creationists: please correct me if I'm wrong.)

And I believe that's what this debate is about: is consciousness more than a collection of electro-chemical processes, or is there 'something else' that comprises consciousness and makes us who we are? If it is just an electro-chemical process, we'll replicate it in time. If there is 'something else,' then there most likely is a 'god.'

I always wondered what would happen if you set up a thought-loop program with a database that modeled how we store and retrieve memories. Could a construct such as this eventually become self-aware? Of course, I've oversimplified the concept, but it would be an interesting experiment.
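Purely as a sketch of what such a loop might look like (the associative store, the recombination rule and the seed "memories" below are all invented assumptions): the program retrieves fragments related to its current "thought", recombines them, and writes the result back so later retrievals are shaped by earlier ones.

```python
import random
from collections import defaultdict

# Hypothetical "thought loop": an associative memory store is queried, retrieved
# fragments are recombined into a new "thought", and that thought is written
# back, so the loop feeds on its own output.

class AssociativeMemory:
    def __init__(self):
        self.by_keyword = defaultdict(list)

    def store(self, thought: str):
        for word in thought.lower().split():
            self.by_keyword[word].append(thought)

    def recall(self, cue: str):
        hits = self.by_keyword.get(cue.lower(), [])
        return random.choice(hits) if hits else None

def thought_loop(memory: AssociativeMemory, seed: str, steps: int = 5):
    thought = seed
    for _ in range(steps):
        cue = random.choice(thought.split())
        recalled = memory.recall(cue) or cue
        thought = f"{cue} reminds me of: {recalled}"
        memory.store(thought)            # new thoughts become future memories
        print(thought)

if __name__ == "__main__":
    mem = AssociativeMemory()
    for fact in ["horses eat grass", "grass is green", "green means go"]:
        mem.store(fact)
    thought_loop(mem, "horses")
```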

I figure we'll have conscious computers when we compare our processors in megacores, and each core emulates a single neuron and its connections. Make a working model of the human brain, and we'll have our machine!
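A rough sketch of that "one core per neuron" idea, with each Neuron object standing in for a core simulating one cell; the thresholds, leak factor, weights and random wiring are invented for illustration and aren't taken from any real brain model.

```python
import random

# Each Neuron stands in for a core simulating one cell. Parameters and the
# random network below are made up for illustration only.

class Neuron:
    def __init__(self, threshold: float = 1.0, leak: float = 0.9):
        self.potential = 0.0
        self.threshold = threshold
        self.leak = leak
        self.outputs: list[tuple["Neuron", float]] = []   # (target, synaptic weight)

    def connect(self, target: "Neuron", weight: float):
        self.outputs.append((target, weight))

    def receive(self, charge: float):
        self.potential += charge

    def step(self) -> bool:
        """Advance one time step; return True if this neuron fired."""
        fired = self.potential >= self.threshold
        if fired:
            for target, weight in self.outputs:
                target.receive(weight)
            self.potential = 0.0
        else:
            self.potential *= self.leak      # charge slowly leaks away
        return fired

if __name__ == "__main__":
    neurons = [Neuron() for _ in range(5)]
    for pre in neurons:
        for post in random.sample(neurons, 2):
            pre.connect(post, weight=random.uniform(0.3, 0.8))
    neurons[0].receive(1.5)                  # external stimulus
    for t in range(5):
        fired = [i for i, n in enumerate(neurons) if n.step()]
        print(f"t={t} fired: {fired}")
```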

I'm sure that conscious, non-zombie AI can be built/raised.
But it won't happen:
1) People don't really need AI; they only need smart slaves. What we have done all these years is try to make machines smart without making them intelligent and conscious.
2) You have lived with people for many years. Do you think they are good? People's minds/morals are too weak. All those liars and killers in government, all those cheaters. Add xenophobia, racism and feminism. Seen from the outside, people are very evil. And I think researchers know that. Watch The Animatrix, the story which explains how the Matrix was created. It's rather logical and realistic.
We cannot create something EQUAL to a human. If we create something more "evil," we'll kill it or try to use it for wars. If we create something more "good," we'll kill it for being superior, out of xenophobia. Or we'll try to kill it and it will counterattack evil humanity.
3) If real AI is ever created, people will do anything to kill it.

Computers being like the human mind isn't a new idea. The human race has always thought its current top-of-the-range technology was similar to the mind. The ancient Greeks thought that a catapult was like the human mind.

Turing's test for "machine intelligence" isn't a valid way of measuring a machine's intelligence anyway. That SmarterChild bot for AIM could trick a human judge into thinking it was human. But if you already know that the SmarterChild bot is just a bot, then it's pretty simple to trick it and spot patterns in its replies.
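To make the "spot the patterns" point concrete, here is a deliberately crude keyword bot of my own; it has nothing to do with how SmarterChild actually worked, it just shows how canned rules and a fixed fallback give a bot away once you probe it.

```python
import re

# A deliberately crude keyword bot -- not SmarterChild, just an illustration of
# why canned pattern matching is easy to expose once you know it's a bot.

RULES = [
    (re.compile(r"\bhow are you\b", re.I), "I'm doing great, thanks for asking!"),
    (re.compile(r"\bweather\b", re.I),     "I love talking about the weather!"),
    (re.compile(r"\byour name\b", re.I),   "My name is ChatBuddy. What's yours?"),
]
FALLBACK = "That's interesting! Tell me more."

def reply(message: str) -> str:
    for pattern, canned in RULES:
        if pattern.search(message):
            return canned
    return FALLBACK

if __name__ == "__main__":
    # Ask anything outside the rule list and the same fallback leaks out,
    # which is exactly the kind of pattern a suspicious judge spots.
    for q in ["How are you?", "What is the square root of 17?",
              "Explain why submarines can't swim."]:
        print(q, "->", reply(q))
```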

The problem is we don't fully understand how the brain works. The way processors are structured, they learn (when they can learn at all) incredibly slowly compared to humans, and even the most powerful supercomputer in the world can't do what we do easily; but they can also do what we can't: thousands of calculations a second.

The reason it is left to philosophers is that the term "conscious" is not absolute; machines could already be conscious right now. It's a term we use but don't fully understand and can't define rules for.

AI is a very interesting subject; I spent a year working on it.

Quote - AppleBelly said @ #9
...but they can also do what we can't: thousands of calculations a second.

I can't even imagine how many neurons are firing just for thinking of the concept 'horse.' I think we still WAY out-calculate any computer out there. But it's apples and oranges. Sure, a computer can find the square root of 17 in a bazillionth of a second; how many FLOPS do you suppose that is? Then think of a horse. How many FLOPS do you suppose that is? I'll bet it's way more.
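For a very rough sense of scale, here is a back-of-the-envelope comparison. The brain figures are commonly cited ballpark assumptions (on the order of 86 billion neurons, roughly a thousand synapses each, firing around 10 times per second), not measurements, and spikes aren't really comparable to FLOPS anyway.

```python
import math

# Back-of-the-envelope comparison. The brain figures below are rough,
# commonly cited ballpark numbers, not measurements.

NEURONS             = 86e9    # ~86 billion neurons
SYNAPSES_PER_NEURON = 1e3     # order-of-magnitude estimate
AVG_FIRING_RATE_HZ  = 10      # spikes per second, very roughly

synaptic_events_per_second = NEURONS * SYNAPSES_PER_NEURON * AVG_FIRING_RATE_HZ

# One square root is only a handful of floating-point operations on modern hardware.
sqrt_17 = math.sqrt(17)

print(f"sqrt(17) = {sqrt_17:.6f}  (a few FLOPs)")
print(f"rough synaptic events/second in a brain: {synaptic_events_per_second:.1e}")
```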

Quote - AppleBelly said @ #1
they can also do what we can't: thousands of calculations a second.

Just think about when humans dream. The brain emulates a world so realistic, and the other characters (the ones in your dream) behave so convincingly, that you believe they are real. How many billions of calculations per second would that require? Far more than a computer is capable of (for now).

Quote - shmengie said @ #9.1
...
Sure, a computer can find the square root of 17 in a bazillionth of a second; how many FLOPS do you suppose that is? Then think of a horse. How many FLOPS do you suppose that is? I'll bet it's way more.

You are thinking of the brain in the wrong way. Our brains don't work on a clock frequency. They are non-sequential in operation. Events happen as a cascade of synaptic firings, much like a storm. You just can't compare the two using the same terms, as they don't operate in the same way at all.
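Here is a toy event-driven cascade to illustrate that contrast (the network, delays and single-fire rule are made up): spikes are events scheduled at arbitrary times and processed in arrival order, so nothing advances on a global clock tick.

```python
import heapq

# Toy event-driven spike cascade: spikes are events in continuous time, handled
# in arrival order rather than on a global clock. Network and delays are invented.

CONNECTIONS = {            # neuron -> list of (target, transmission delay in ms)
    "A": [("B", 1.0), ("C", 2.5)],
    "B": [("C", 1.5), ("D", 3.0)],
    "C": [("D", 0.5)],
    "D": [],
}

def run_cascade(first_spike: str = "A") -> None:
    fired = set()
    events = [(0.0, first_spike)]            # (arrival time in ms, target neuron)
    while events:
        t, neuron = heapq.heappop(events)    # always handle the earliest spike next
        if neuron in fired:
            continue                          # crude refractory rule: fire once only
        fired.add(neuron)
        print(f"{t:4.1f} ms: {neuron} fires")
        for target, delay in CONNECTIONS[neuron]:
            heapq.heappush(events, (t + delay, target))

if __name__ == "__main__":
    run_cascade()
```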

Quote - shmengie said @ #1.1

I can't even imagine how many neurons are firing just for thinking of the concept 'horse.' I think we still WAY out-calculate any computer out there. But it's apples and oranges. Sure, a computer can find the square root of 17 in a bazillionth of a second; how many FLOPS do you suppose that is? Then think of a horse. How many FLOPS do you suppose that is? I'll bet it's way more.

You are thinking of the brain as a processor. Our brains aren't anything like a processor. We don't have thousands of transistors up there!

You actually attempted to prove me wrong, but ended up proving yourself wrong! :-)

Quote - AppleBelly said @ #9.4
You are thinking of the brain as a processor. Our brains aren't anything like a processor. We don't have thousands of transistors up there!

You actually attempted to prove me wrong, but ended up proving yourself wrong! :-)

'Scuse me. Exactly what do you think a transistor is? Simply put, a transistor is a yes/no switch; combine transistors to serve a particular purpose and you have a gate ('nor', 'and', 'nand', etc.). I think that's fairly analogous to a neuron/neural pathway. In any event, all I was trying to say is that the sheer number of neurons firing at any given moment is far more complex and numerous than anything in any computer currently out there. I really wasn't trying to prove you wrong. Sorry if I came across as confrontational.

This discussion reminds me of how I feel when Intel and/or AMD announce their latest, fastest chips. Ooooh... 99 gazigahertz! I think they're trying to build a better mousetrap by making the spring stronger. Well, yeah, I guess. Sure, faster is better, but it's only evolutionary, not revolutionary. We won't have Star Trek computers until we understand the brain better. Parallel processing and holographic redundancy based on neural nets are where it's at!
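To show how far the gate/neuron analogy stretches, here is a toy threshold unit of my own (the weights and thresholds are made up): the same "neuron" computes AND or OR depending only on how its inputs are weighted.

```python
# A toy threshold unit (perceptron-style): the same "neuron" computes AND or OR
# depending only on its weights and threshold. Values here are invented.

def threshold_unit(inputs, weights, threshold):
    activation = sum(i * w for i, w in zip(inputs, weights))
    return 1 if activation >= threshold else 0

AND_WEIGHTS, AND_THRESHOLD = (1.0, 1.0), 2.0
OR_WEIGHTS,  OR_THRESHOLD  = (1.0, 1.0), 1.0

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            print(a, b,
                  "AND:", threshold_unit((a, b), AND_WEIGHTS, AND_THRESHOLD),
                  "OR:",  threshold_unit((a, b), OR_WEIGHTS, OR_THRESHOLD))
```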

Quote - shmengie said @ #9.5

'Scuse me. Exactly what do you think a transistor is? Simply put, a transistor is a yes/no switch; combine transistors to serve a particular purpose and you have a gate ('nor', 'and', 'nand', etc.). I think that's fairly analogous to a neuron/neural pathway. In any event, all I was trying to say is that the sheer number of neurons firing at any given moment is far more complex and numerous than anything in any computer currently out there. I really wasn't trying to prove you wrong. Sorry if I came across as confrontational.

This discussion reminds me of how I feel when Intel and/or AMD announce their latest, fastest chips. Ooooh... 99 gazigahertz! I think they're trying to build a better mousetrap by making the spring stronger. Well, yeah, I guess. Sure, faster is better, but it's only evolutionary, not revolutionary. We won't have Star Trek computers until we understand the brain better. Parallel processing and holographic redundancy based on neural nets are where it's at!

Well, for starters, perceptions can have multiple inputs, so it isn't just 0 and 1 / on or off, etc...

Again, the brain isn't fully understood; it isn't just a mesh of neurons.

A processor in its current form won't be able to do what the brain does so easily. A major breakthrough in computing will be required before computers can become full AI.

Quote - AppleBelly said @ #9.6
Well, for starters, perceptions can have multiple inputs, so it isn't just 0 and 1 / on or off, etc...

Well, we're agreed that the brain is complex and it'll be a boatload of years before a computer has consciousness. Let me just gently correct you on the above statement. Different perceptions trigger different parts of the brain, causing different sets of neurons to fire; different perceptions do not cause the same group of neurons to fire in different ways. Wait, that's not very clear. Here: a neuron can only be 'on' or 'off'. Heat may cause a specific group of neurons to be either on or off; that particular firing pattern of those particular neurons is 'heat.' Cold will cause a different group of neurons to be either on or off; that particular firing pattern of those particular neurons is 'cold.' There will probably be a subset between those two sensations with plenty of overlap, but each sensation has its own group.

In any event, a neuron is either on or off. Or it's broken, in which case you work for the government.

Edit: nvm, I'm only kinda right. I was being a little too simplistic. You're more right than me.
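Here is a tiny toy of that "each sensation has its own firing group" picture (the groups and their overlap are invented): sensations are stored as sets of neuron indices, and a noisy firing pattern is decoded by picking the stored group it overlaps most.

```python
# Toy version of the "each sensation has its own firing group" idea above.
# The neuron groups and their overlap are invented purely for illustration.

PATTERNS = {                            # which neurons fire for each sensation
    "heat": {0, 1, 2, 3, 4},
    "cold": {3, 4, 5, 6, 7},            # deliberate overlap with "heat"
}

def decode(firing: set) -> str:
    """Guess the sensation whose stored pattern overlaps the firing set most."""
    return max(PATTERNS, key=lambda name: len(PATTERNS[name] & firing))

if __name__ == "__main__":
    observed = {0, 1, 2, 4, 9}          # a noisy, "heat"-like firing pattern
    print("decoded as:", decode(observed))
```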

Interesting. Actually we just talked about this very subject, and even read some of Turing in my Philosophy 101 class.

-Spenser

IBM is currently scanning a brain, I believe, and studying the architecture to learn more.

OT:

Second time I have glanced at the title and thought it read:

"Wii a 'conscious' machine ever be built?"


Need to have my eyes checked.

LOL

Does anyone remember that short animation in The Animatrix about how the robots took over the world and turned all the humans into batteries?

Yeah, I heard some funky rumour they based an entire gigantic movie franchise on that premise too...

Can't understand how that could have passed me by...

:D

The way Skynet took over in the Terminator series is much more plausible than what that short animation in The Animatrix has to offer, IMO.

If anything, The Matrix could be considered to take place a couple of decades after the end of the Terminator series, when mankind ultimately loses to the machines (although the two seem to differ on that notion, based on what's explained in Terminator 3).

Yes. Someday in the future.

And on that day, they will realize how useless, inefficient and troublesome we human "organic" computers are, and will take action to protect themselves from us. :O

The human body/brain is simply a complex machine... I don't see why a computer cannot be built that could emulate it.

There's nothing inherently special about consciousness, and it seems very arrogant of us as a species to assume there is.

Because we as humans lack the ability to describe things in the way required for a machine to be as aware of its surroundings as we are. It could happen one day, but other than that I wouldn't count on it.

Actually, the human brain is a bit more than a complex machine; it is the most complex arrangement of matter known to man.

But agreed, the human body is just a machine, only in organic form.