Funders including the Gates Foundation band together to provide $1 billion to develop AI tools



An alliance of funders including the Gates Foundation and Ballmer Group will spend $1 billion over 15 years to support the development of artificial intelligence tools for public defenders, parole officers, social workers and others who help Americans in precarious circumstances.

The funders announced Thursday that they will create a new entity, NextLadder Ventures, to offer grants and investments to nonprofits and for-profit organizations developing tools for those workers, who often manage huge caseloads with few resources.

The funders said the solutions they are investing in will be delivered by hundreds of entrepreneurs who propose solutions incorporating cutting-edge technology and who come from, or are close to, the lived experience of economic struggle.

Other funders include hedge fund founder John Overdeck and the Valhalla Foundation, which was started by Intuit co-founder Scott Cook and his wife, Signe Ostby. Ballmer Group is the philanthropy of former Microsoft CEO Steve Ballmer and his wife, Connie. The funders declined to specify the exact financial commitment each is making.

A shared focus on promoting economic mobility is what drew all of the funders to invest in these AI tools, they said. The funders believe there are many ways AI technology could help, for example by matching people with resources after a disaster or eviction, or by closing out cases for people who meet all the criteria but are waiting for paperwork to be processed.

AP Audio: Funders commit $1 billion to develop AI tools for frontline workers

Associated Press correspondent Haya Panjwani reports on the investment in AI tools.

The idea for the collaboration emerged when the funders "compared notes about where we were investing and where we saw bigger gaps in the sector," said Kevin Bromer, who leads technology and data strategy at Ballmer Group. He will also serve on the NextLadder board, which includes three independent members and representatives of the other funders.

NextLadder will be led by Ryan Rippel, who previously directed the Gates Foundation's economic mobility portfolio. The funder group has not yet decided whether NextLadder will be incorporated as a nonprofit or a for-profit, but said any returns from its investments will go back into funding new initiatives.

Jim Fruchterman, founder of Tech Matters and author of the recent book "Technology for Good," said he hopes NextLadder will primarily fund nonprofits if it wants to achieve its mission of reaching the poorest people and places. He said he was optimistic about its focus on serving frontline workers rather than trying to replace them.

"The nonprofit sector is about humans helping other humans," Fruchterman said. "Instead of bringing AI to poor or needy people, you ask, 'Hey, you're a frontline worker. What's the least productive part of your job?' They'll tell you, and if you fix that, they can be more successful."

NextLadder will partner with the AI company Anthropic, which will provide technical expertise and access to its technology to the nonprofits and businesses NextLadder invests in. Anthropic is committing about $1.5 million a year to the partnership, said Elizabeth Kelly, who leads Anthropic's beneficial deployments team, a group focused on giving back to society.

"We want to hold our grantees' hands as they use Claude, with the same care and commitment we provide to our biggest enterprise customers," Kelly said, referring to Anthropic's flagship large language model.

Rippel said philanthropic funding can reduce the risk of these kinds of investments and give organizations more time to prove their ideas.

"If we succeed, this will be the first capital that shows what is possible," Rippel said.

Suzy Madigan, responsible AI lead at Care International UK, has researched the risks and benefits of using AI tools in humanitarian contexts. She said she has seen a rush to explore how AI technology can fill gaps as funding is cut.

Deploying artificial intelligence in sensitive contexts raises significant new ethical and governance questions, Madigan said, because even with good intentions behind it, inequality can actually increase.

The key to avoiding harm to vulnerable communities is to involve them in every step of developing, deploying and evaluating AI tools, and to ensure those tools support frontline workers rather than replace them, she said.

Researchers at the Active Learning Network for Accountability and Performance in Humanitarian Action have outlined risks of using AI tools when interacting with sensitive populations or handling high-stakes interactions, for example in humanitarian contexts.

They recommend assessing whether AI is the best tool for the problem and, crucially, whether it will work accurately in a high-risk setting. They also recommend weighing the cost of potential dependency on a particular provider, evaluating tools for bias and considering privacy protections.

The National Institute of Standards and Technology emphasizes that trustworthy AI systems should be accountable to users and that it should be possible to explain or trace how a tool reached a particular conclusion or decision.

Rippel emphasized that the AI tools NextLadder invests in will be shaped by the needs and feedback of frontline workers. Tools that don't work for them won't succeed, he said. Despite the potential risks of AI tools, he said it is essential that groups struggling to climb the economic ladder have access to new technologies.

"The idea that we would deprive people who are struggling in our country of the benefits of cutting-edge solutions is unacceptable," Rippel said.

___

This story has been updated to correct the name of Scott Cook.

___

The Associated Press' philanthropy and nonprofit coverage is supported through the AP's collaboration with The Conversation US, with funding from Lilly Endowment Inc. The AP is solely responsible for this content. For all of AP's philanthropy coverage, visit https://apnews.com/hub/philanthropy.

