Elon Musk, AI researchers call for a pause in next-generation AI model development

Risks posed by AI model development to humanity

On March 29, an online signature campaign was launched calling on all AI labs to pause development of next-generation AI systems more powerful than GPT-4, the AI language model OpenAI released in March.

An open letter published by the non-profit Future of Life Institute argues that, given the risks AI development poses to human society, a governance system should be established to manage those risks, along with monitoring and regulatory frameworks. It also states that governments should step in if a moratorium cannot be enacted quickly.

At the time of writing, the letter had gathered more than 1,300 signatures from prominent engineers, academics, and researchers, including Tesla CEO Elon Musk and Apple co-founder Steve Wozniak.

The Future of Life Institute points out that AI with human-like intelligence could pose serious risks to society and humanity. It also expresses alarm that, because AI development is proceeding without planning or oversight, systems are emerging that "even the creators cannot understand, predict, or trust."

OpenAI, the developer of GPT-4, has itself acknowledged this view. In a February 24 statement, the company noted the importance of undergoing independent review before starting to train future systems and of limiting the rate of growth of development.

The open letter says that "now is the time" to pause the race to develop AI of growing unpredictability and risk, and argues that AI labs and independent experts should jointly create safety protocols for advanced AI development, with compliance overseen by independent experts.

These protocols should ensure that systems adhering to them are safe beyond a reasonable doubt.

In parallel, the letter continues, AI developers and policymakers should work together to accelerate the development of AI governance systems. Specifically, it envisions regulators dedicated to AI, certification systems to distinguish real content from synthetic and to track model leaks, and resources to address the economic and political disruption AI may cause.

Such an environment, the Future of Life Institute says, would allow AI research and development to become "more accurate, secure, interpretable, transparent, robust, connected, trustworthy and faithful."

Related: OpenAI begins supporting ChatGPT plugins, enabling internet access

Authenticity of the signatures

The signatories include AI pioneer and Turing Award winner Yoshua Bengio, computer scientist Stuart Russell, Stability AI CEO Emad Mostaque, and Ripple co-founder Chris Larsen, though some have questioned the authenticity of certain signatures.

Reuters reporter Krystal Hu reported that the top 11 names on the list, including Elon Musk, Steve Wozniak, and Yoshua Bengio, had been verified.

Anthony Aguirre, vice president of the Future of Life Institute, told cryptocurrency news outlet Decrypt that the early signatories had been fact-checked. A signature attributed to OpenAI CEO Sam Altman initially appeared on the list but has since been removed. Verification is said to be ongoing, and the names of high-profile figures may be hidden from the list while they are being checked.

On March 14, the US company OpenAI released GPT-4, the latest version of its AI language model. GPT-4 is said to be more reliable, more creative, and able to handle more nuanced instructions than GPT-3.5, which was released on November 30, 2022.

GPT-4 was tested on exams designed for humans, such as the SAT English writing section and the Uniform Bar Examination (UBE); it reportedly passed the bar exam with a score in the top 10% of test takers.

Related: OpenAI releases AI language model GPT-4; AI-related cryptocurrency tokens rally

Written by: Kurt Ebenzer
