The Chernobyl of tech? Experts say unchecked AI could have life-changing consequences



Stuart Russell is a highly respected and well-known expert in the field of artificial intelligence (AI) and machine learning. As a computer science professor at the University of California, Berkeley, he has devoted 45 years to AI research and co-authored “Artificial Intelligence: A Modern Approach”, a widely used textbook in the field.


In a recent interview with Business Today’s Aayush Ailawadi, Russell expressed concern about the potential dangers of unregulated AI development.

Russell emphasized the need for reasonable guidelines and safety measures to prevent catastrophic events with far-reaching consequences, specifically warning of a “Chernobyl for AI.” The reference is to the 1986 nuclear disaster in Ukraine, which caused widespread environmental and health damage. In the context of AI, such an event could mean the catastrophic failure of AI systems, or unintended consequences of their development causing massive harm.

Russell’s concerns about uncontrolled AI development are shared by other prominent voices in the field, including Tesla Inc. CEO Elon Musk and Apple Inc. co-founder Steve Wozniak, even as companies such as OpenAI and GenesisAI continue to integrate AI into many aspects of everyday life. These experts have signed a petition calling for a pause in the development of the next iteration of GPT, a powerful AI language model, pending further investigation of its potential risks and benefits.


Russell’s message emphasized that the development of AI requires care and deliberation to ensure that it is used safely and responsibly for the benefit of humanity.

Russell drew an analogy between AI technology and nuclear power plant construction to illustrate the need for reasonable guidelines and safety measures to avoid catastrophic consequences. He stressed the need to convincingly demonstrate that systems are safe before they are released.

While acknowledging that it is difficult to predict what a Chernobyl-like catastrophe for AI would look like, he cautioned against rushing new AI products to market. He believes that common-sense rules can be applied to prevent new AI systems from posing a threat to society.

On April 4, President Joe Biden met with his science and technology advisory council to discuss the risks and opportunities of rapid advances in AI. During the meeting, he stressed the importance of ensuring that AI technology is safe before it is released to the public. Biden acknowledged the potential dangers of AI, saying he did not yet know whether it is dangerous, and urged tech companies to prioritize safety measures to reduce risks to individual users and national security.


Italy’s recent temporary block on ChatGPT over data privacy concerns has raised questions about AI regulation across the European Union. The US has taken a more laissez-faire approach to the commercial development of AI, even as European Union lawmakers negotiate new rules to limit risky AI products across the 27-country bloc, said Russell Wald, managing director for policy at the Stanford Institute for Human-Centered Artificial Intelligence.

Biden’s recent remarks on AI may not change the U.S. approach any time soon, but Wald said the president’s increased attention to the technology is an important step in setting the stage for a national dialogue on the topic, one he believes is desperately needed.




© 2023 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.



