
Support staff at research institutions are exploring how they can leverage the technology behind tools like ChatGPT to more effectively assist scientists with tasks like writing grant applications. Credit: Peter Kováč/Alamy
“It's almost magical.” That's how Mads Lykke Berggreen's fellow scientists described his ability to synthesize complex research ideas into compelling writing. But over the past year, as generative artificial intelligence (AI) tools such as ChatGPT have shown themselves to be just as capable, if faster and perhaps better, he's felt his star begin to fade. “Suddenly I became replaceable,” says the research advisor, who is based at VIA University College in Aarhus, Denmark.
Since making this realization, Lykke Berggreen has thought hard about how generative AI could affect his own research-management work, and that of the wider profession, and he has decided to embrace the technology.
For example, he uses ChatGPT to help researchers write the first draft of their research proposals. “I prepare the headline and the structure of the proposal in advance, and then I interview the researchers about the proposal. In the conversation, I just pull out all the things they would have put in the first draft anyway.” He types up notes in a word processor during the interview and then feeds them into ChatGPT. “Then ChatGPT creates the text for me.” This cuts the turnaround time from several working days to a few hours.
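The interview-to-draft workflow Lykke Berggreen describes amounts to assembling a structured prompt from a prepared outline plus interview notes. The sketch below is an invented illustration of that pattern, not his actual tooling; the section names and the `build_draft_prompt` helper are assumptions:

```python
# Illustrative sketch: combine a prepared proposal structure with
# interview notes to form a single drafting prompt for a chat model.
# The headings, helper name and example content are invented.

def build_draft_prompt(headline: str, sections: list[str],
                       notes: dict[str, str]) -> str:
    """Assemble a drafting prompt.

    `sections` is the structure prepared in advance; `notes` maps each
    section to the notes gathered while interviewing the researcher.
    """
    parts = [
        f"Draft a research proposal titled '{headline}'.",
        "Use the following structure and notes for each section:",
    ]
    for section in sections:
        parts.append(f"\n## {section}\n{notes.get(section, '(no notes)')}")
    return "\n".join(parts)

prompt = build_draft_prompt(
    "Mapping urban heat islands",
    ["Background", "Methods"],
    {"Background": "Cities warm faster than their surroundings.",
     "Methods": "Satellite thermal imagery plus ground sensors."},
)
```

The resulting string would then be pasted into (or sent to) a chat model, which returns the first draft for human editing.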
Lykke Berggreen is not alone: research managers around the world are exploring how generative AI can help them in their daily work.
Yolanda David, associate director of research and development at the University of the Witwatersrand in Johannesburg, South Africa, says she uses ChatGPT to draft letters and reports. For example, she gives the tool a brief description of a research project, plus any other information she wants to highlight, and asks it to write a letter of support for funders that emphasizes the potential impact of the research and its importance in the South African context. “When ChatGPT comes back with the results, I review them and make corrections,” she says. That includes making the English sound South African rather than American, and removing the “elaborate adjectives and descriptions” that tend to litter ChatGPT's writing.
Kelly Basinger, senior proposal manager at the University of North Texas Institute for Advanced Environmental Studies in Denton, says she uses ChatGPT to show researchers how to improve the readability of their papers. The tool can paraphrase complex, jargon-filled sentences down to reading levels ranging from late middle school to early college. “Faculty naturally want their ideas to be funded,” Basinger says. “The first step is to get other people to understand the idea.”
Many research managers, such as Nick Claassen, managing director of the European Association of Research Managers and Administrators in Brussels, see AI as an opportunity for research-management careers. But using AI in grant applications is not without risks, says Ellen Schenk, a research funding consultant based in Rotterdam, the Netherlands. One of the more insidious aspects of ChatGPT, she says, is its tendency to fabricate information in an effort to please users, a phenomenon known as hallucination.
Schenk experienced this first-hand when she was working on a proposal for a European funding call on inequalities in access to healthcare. She asked ChatGPT whether the proposed project was suitable for the call. The answer was a resounding yes. But when Schenk asked ChatGPT to back up its claims, it provided references that didn't exist. She says she's now “very, very reluctant” to ask ChatGPT to design a project for her. “If you're not critical of its output, you can have a great-sounding proposal, and the reviewers might well accept it. But the project won't be realistic or feasible.”
Some users are disappointed when ChatGPT produces results that look great at first glance, only to find on closer inspection that they're incorrect or too wordy. Lykke Berggreen says the best way to get past this “word salad” stage is to learn what information ChatGPT needs to produce good output. There are plenty of AI influencers out there who share prompts and give advice, but he says the best way to learn is through trial and error.
Both Lykke Berggreen and Schenk use the subscription version of ChatGPT, which Schenk says offers several advantages over the free version: guaranteed access (the free service was often overwhelmed by demand, a particular issue in the tool's early days); a much higher character limit, resulting in better-quality answers and inferences; and access to AI plugins (tools built with specific tasks in mind, such as database searches of the academic literature).
Scale up
Despite the limitations of AI tools, many research managers believe the technology will have a significant impact on their work. James Shelley, who works in knowledge dissemination and science communication at Western University in Ontario, Canada, says he became interested in developing AI applications for research management in part because he wanted to have a job in the future. He doesn't use ChatGPT itself in his work, but he does use the technology behind the tool.
Shelley and his colleagues pay a few dollars a month for access to back-end technology from OpenAI, the California company that developed ChatGPT, and are using it to develop automated workflows to help with research management. He thinks this kind of bespoke tooling, rather than individual managers copying and pasting text into ChatGPT, will show how AI might be integrated into the profession in the future.
One workflow currently in use at the university creates plain-language summaries of new journal articles published by researchers in its Faculty of Health Sciences. The summaries feed into regular emails sent to the faculty's research-management and communications teams. Nothing like this existed before, Shelley adds, because it wouldn't make sense to hire someone just to summarize every research paper a faculty produces. So far, the feedback has been great.
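Shelley doesn't describe his implementation, but a summary pipeline of this kind can be sketched against OpenAI's chat API. Everything below is an assumption for illustration (the model name, the prompt wording, the `summary_messages` helper), and the network call is guarded behind an environment variable so the sketch runs even without an API key:

```python
import os

def summary_messages(title: str, abstract: str) -> list[dict]:
    """Build a chat-style message list requesting a plain-language summary.

    The prompt wording is invented for illustration; it is not the
    prompt used at Western University.
    """
    return [
        {"role": "system",
         "content": "You write two-sentence plain-language summaries of "
                    "research articles for a general audience."},
        {"role": "user",
         "content": f"Title: {title}\n\nAbstract: {abstract}"},
    ]

messages = summary_messages(
    "Exercise and cognition in older adults",
    "We report a randomized trial of aerobic exercise and memory.",
)

# Only call the API if a key is configured (assumes the OpenAI Python SDK).
if os.environ.get("OPENAI_API_KEY"):
    from openai import OpenAI
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=messages,
    )
    print(response.choices[0].message.content)
```

In a real workflow, a script like this would loop over newly published articles and collect the summaries into the regular email digest.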
Another low-hanging fruit for the technology, Shelley says, is a system that does initial screening of funding proposals to ensure they comply with basic submission guidelines before passing them on to staff. “I think this is probably where most institutions will start to deploy AI at scale in research management,” Shelley says.
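A first-pass compliance screen of the kind Shelley envisages need not even involve a language model. Here is a minimal rule-based sketch; the word limit and required sections are invented examples, not any funder's real guidelines:

```python
# Illustrative pre-screening of a proposal against basic submission rules.
# The limits and section names are made-up examples.

def screen_proposal(text: str, max_words: int = 3000,
                    required_sections: tuple[str, ...] = ("Abstract", "Budget"),
                    ) -> list[str]:
    """Return a list of basic compliance problems; empty means pass to staff."""
    problems = []
    if len(text.split()) > max_words:
        problems.append(f"exceeds {max_words}-word limit")
    for section in required_sections:
        if section.lower() not in text.lower():
            problems.append(f"missing required section: {section}")
    return problems

issues = screen_proposal("Abstract: we will study X. No spending plan given.")
```

Only proposals that come back with an empty list would be passed along for human review; the rest are returned to the applicant with the flagged problems.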
Proper guidance
Several research managers interviewed for this article expressed concern about the lack of guidelines on the appropriate use of the technology in research management. Tse-Xian Chen, a funding advisor and grant-application writer at the University Medical Center Utrecht in the Netherlands, says he expects this to become clearer soon. The European Union is developing AI legislation that will set out rules and guidelines on how to use AI systems safely and legally while respecting fundamental human rights. His institution, in collaboration with Utrecht University, also in the Netherlands, is developing guidelines for the use of generative AI, especially in the context of research support. “I think we're almost certainly not alone” in tackling this kind of work, he says.
Scale is also on Lykke Berggreen's mind: he has developed an AI assistant that writes applications for the Danish funding agency Danmarks Frie Forskningsfond (DFF). The assistant uses the same interview-based approach that Lykke Berggreen developed for writing grant applications with ChatGPT: its questions are customized to extract the details the funder requires in applications, and researchers enter their answers. The tool then produces a first draft aligned with DFF's specifications.
Lykke Berggreen is sanguine about the threat that AI tools pose to his own employment. “AI will definitely replace a lot of research-administration tasks, maybe some research managers,” he says. But he believes that machines can't do key parts of his job, and he hopes that AI will take over the menial tasks, giving him more time to mentor researchers. “When I talk to researchers, I try to tell them that their ideas are good enough, and to build their confidence. I tell them they can and should apply for this grant or that grant. I think it's hard to replace that with a machine,” he says.