Google's LaMDA AI sentient? More like "pure clickbait" says AI expert

A scene from Ex Machina where the bot is touching cutouts of human faces

Back in June this year, the internet was abuzz with discussions of AI becoming sentient and the supposedly inevitable arrival of Skynet in the not-so-distant future. The furor began after Google engineer Blake Lemoine claimed that the company's Language Model for Dialogue Applications (LaMDA) had become self-aware and was achieving sentience. Google was having none of it, though: Lemoine was first placed on paid leave and eventually fired.

For folks looking for a third-party perspective on the situation, one that involves neither Google nor its ex-employee Lemoine, John Etchemendy, co-director of the Stanford Institute for Human-Centered Artificial Intelligence (HAI), has stepped in. Like Google, Etchemendy is not impressed by Lemoine's claims, and has labeled the stories about Google's sentient AI "pure clickbait".

He said:

Sentience is the ability to sense the world, to have feelings and emotions and to act in response to those sensations, feelings and emotions.

LaMDA is not sentient for the simple reason that it does not have the physiology to have sensations and feelings. It is a software program designed to produce sentences in response to sentence prompts.

When I saw the Washington Post article, my reaction was to be disappointed at the Post for even publishing it.

They published it because, for the time being, they could write that headline about the ‘Google engineer’ who was making this absurd claim, and because most of their readers are not sophisticated enough to recognize it for what it is. Pure clickbait.

Richard Fikes, emeritus professor of computer science at Stanford, agrees, saying that LaMDA was simply trying to respond like a human rather than actually becoming one. He said:

You can think of LaMDa like an actor; it will take on the persona of anything you ask it to. [Lemoine] got pulled into the role of LaMDa playing a sentient being.

Hence, many, including Fikes, believe that Lemoine fell for the ELIZA effect: the tendency to unconsciously assume that computer behaviors are analogous to human ones.
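The effect is named after ELIZA, Joseph Weizenbaum's 1966 chatbot, which convinced some users it understood them despite working by shallow keyword matching. A minimal Python sketch in that style (the patterns here are illustrative, not ELIZA's actual script) shows how little machinery is needed to produce seemingly empathetic replies:

```python
import re

# ELIZA-style rules: match a keyword pattern, echo part of the user's
# own words back inside a canned template. No understanding involved.
RULES = [
    (re.compile(r"\bI feel (.+)", re.IGNORECASE), "Why do you feel {0}?"),
    (re.compile(r"\bI am (.+)", re.IGNORECASE), "How long have you been {0}?"),
    (re.compile(r"\bbecause (.+)", re.IGNORECASE), "Is that the real reason?"),
]

def respond(text: str) -> str:
    """Return the first matching template, or a generic fallback."""
    for pattern, template in RULES:
        match = pattern.search(text)
        if match:
            return template.format(*match.groups())
    return "Tell me more."

print(respond("I feel lonely today"))  # Why do you feel lonely today?
print(respond("The weather is nice"))  # Tell me more.
```

The program never models feelings at all; it only reflects the user's phrasing, which is exactly why people are tempted to read sentience into it.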

Source: Stanford Daily
