More than 1,100 people have signed an open letter urging all AI labs to pause the training of AI systems more powerful than GPT-4 for at least six months. The list of notable signatories includes computer scientists as well as tech personalities like Steve Wozniak, Evan Sharp, Elon Musk, and more.
The open letter warns that modern-day AI systems with "human-competitive intelligence" can pose risks to society. It argues that the necessary "planning and management" is lacking, and that in recent months "AI labs [have been] locked in an out-of-control race to develop and deploy ever more powerful digital minds that no one – not even their creators – can understand, predict, or reliably control."
Contemporary AI systems are now becoming human-competitive at general tasks, and we must ask ourselves: Should we let machines flood our information channels with propaganda and untruth? Should we automate away all the jobs, including the fulfilling ones? Should we develop nonhuman minds that might eventually outnumber, outsmart, obsolete and replace us? Should we risk loss of control of our civilization? Such decisions must not be delegated to unelected tech leaders.
The letter says that the pause should be public, verifiable, and include all key actors. If such a pause cannot be enacted quickly, it adds, governments should step in and "institute a moratorium." However, the letter clarifies that it's not asking labs to pause AI development in general but is calling for "merely a stepping back from the dangerous race to ever-larger unpredictable black-box models with emergent capabilities."
It also suggests that the buffer time could be used by AI labs and independent experts to collaborate on a rigorously audited "set of shared safety protocols for advanced AI design and development." While the letter quotes a statement from OpenAI, the list of signatories doesn't include anyone from the research lab (at the time of writing) except Musk, who was among its founding members.
Source: Future of Life via TechCrunch