In a February 2026 report, the company described several instances where its tools were used to generate content for fraudulent schemes and deceptive online activities.
The incidents also included romance and “task” scams that followed a similar pattern: attracting targets through social media and messaging apps, building emotional involvement, and then demanding payment. In one example, dubbed “Operation Date Bait,” the network used AI-generated promotional messages and chatbots to advertise a fictitious dating service aimed at young people in Indonesia. Conversations were then moved to a private messaging platform, where victims were prompted to make a series of payments.
Another incident, “Operation False Witness,” involved an account posing as a legal expert who offered to help victims recover funds they had previously lost to scams. According to the report, the attackers produced formal-looking communications and demanded an upfront fee before providing the supposed recovery services.
The update also describes several coordinated online campaigns that used AI tools to draft articles, social media posts, and comments on geopolitical issues. In some cases, content was published across multiple platforms and in several languages. The company noted that engagement varied widely and often depended on the size and reach of the accounts sharing the material, rather than on the use of AI itself.
For example, in “Operation Trolling Stone,” accounts created and translated content about the arrest of Russians in Argentina and published articles and comments on Facebook, YouTube, and Medium, which the report characterized as an effort to simulate grassroots engagement.
Additionally, OpenAI said it has identified attempts to use its models in planning or documenting broader online influence efforts. The company said its systems refused certain requests that violated its policies, and that some related content circulating online did not appear to have been generated with its tools.
OpenAI emphasized that it will continue to monitor abuse of its services and work with industry partners and authorities to limit fraudulent and deceptive activities related to AI technology.
Earlier, the Kazinform news agency reported that Singapore-based cryptocurrency mining company Bitdeer had sold all its Bitcoin holdings in an effort to redirect funds to artificial intelligence and high-performance computing projects.
