Commentary: How worried should we be about AI?



By Mohammad Hosseini, Chicago Tribune (Tribune Content Agency)

Geoffrey Hinton, a visionary expert at the heart of many innovations in artificial intelligence and machine learning, recently left Google. In an interview with CNN, he said: “I am just a scientist who suddenly realized that these things are getting smarter than us. I want to blow the whistle and say we should worry seriously about how we stop these things from getting control over us.”
Hinton was one of the influential figures who, in the 1980s, worked on techniques such as backpropagation that are crucial to today’s large language models and generative AI such as ChatGPT. He has been called the “Godfather of AI.”
The proliferation of generative AI has created a battleground between societal and corporate values, shaped by scientists like Hinton who started the fight but pulled out when things began to go wrong. So, what values are we teaching the next generation of scientists?
Hinton claims he is blowing the whistle, but he does not look like a whistleblower at all. If he really wants to blow the whistle, he should tell us what is going on behind the scenes at Google. That is what other computer scientists have done, including Timnit Gebru, the former co-lead of Google’s Ethical AI team. She was fired after co-authoring a paper titled “On the Dangers of Stochastic Parrots: Can Language Models Be Too Big?”, which critically examined large language models such as those behind Google’s search engine.
When a CNN interviewer asked Hinton about Gebru’s criticisms, he said: “I think it’s easier to voice concerns if you leave the company first. And their concerns are less existentially serious than the idea that these things will become smarter than us and take over.”
Among other things, this could be read as undermining the courageous actions of an African woman who raised ethical issues long before the Godfather did, or it could imply that Hinton knows about something far beyond what Gebru warned us about and intends to reveal it.
I think the former is much more likely. For context, Gebru’s paper was published in March 2021, long before ChatGPT was released and before the avalanche of publications on social, legal, and ethical concerns related to large language models that followed.
Among other issues, Gebru and her colleagues emphasized the risks and biases of large language models, their environmental and financial costs, their inscrutability, the illusion of meaning they create, and their potential for language manipulation and for misleading the public.
How are these different from Hinton’s concerns? Well, unlike Hinton’s vague, sci-fi-like claim that AI will take over, Gebru’s concerns were clear and specific.
Moreover, unlike Hinton, who followed his warning that these things could take control over us with the admission that “it’s not clear if we can solve this problem,” Gebru and her co-authors made very specific recommendations: weigh environmental and financial costs first, invest resources in curating and carefully documenting datasets rather than ingesting everything on the web, conduct pre-development exercises to evaluate how the planned approach fits research and development goals and supports stakeholder values, and encourage research directions beyond ever-larger language models.
Will Hinton really blow the whistle now that he has left Google? His current position suggests not, because he believes that “tech companies are the people most likely to think about how to manage this problem.” This could imply many things, one of which is that technology companies may, directly or indirectly, charge us for keeping this technology under control, much like antivirus software, or that they may use this technology to intimidate the public if necessary.
But would they? Maybe not, but they could. What we can expect, however, is for Hinton to act like a responsible scientist who puts society above commercial interests and to behave like a real whistleblower: going beyond punchy, dramatic lines such as “these things will take over” to disclose meaningful, specific information about what is going on inside the tech industry. Perhaps then he could leave a legacy better than a godfather’s, protect his real family, and show his loyalty to us, the people, rather than to Google.




