Recorded Future offers a glimpse into the AI future of threat intelligence


Threat intelligence firm Recorded Future announced Tuesday that it is deploying a generative artificial intelligence tool that relies on a fine-tuned version of OpenAI’s GPT model to synthesize data.

The rapid progress of generative AI in recent months has prompted companies to launch initiatives incorporating the technology into their products, and firms such as Recorded Future, which hold large amounts of proprietary data, may be positioned to commercialize it quickly.

Over the course of nearly 15 years in business, Recorded Future has collected vast amounts of data on malicious hacker activity, technical infrastructure, and criminal operations. The company used that data to fine-tune versions of OpenAI’s deep learning models and build tools that summarize data and events for analysts and clients. Because the models are connected to an intelligence graph that collects data from across the web, they contain near real-time information about commonly exploited vulnerabilities and recent breaches.
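The pattern described above, pulling near real-time records from an intelligence graph and assembling them into context for a fine-tuned model to summarize, can be sketched roughly as follows. The graph structure, record fields, and filtering logic here are hypothetical illustrations; only the overall retrieve-then-summarize design is taken from the article.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical stand-in for an intelligence graph: timestamped records
# about vulnerabilities and breaches gathered from across the web.
INTEL_GRAPH = [
    {"id": "CVE-2023-1234", "kind": "vulnerability",
     "seen": datetime.now(timezone.utc) - timedelta(hours=2),
     "note": "Actively exploited in the wild against VPN appliances."},
    {"id": "BREACH-ACME", "kind": "breach",
     "seen": datetime.now(timezone.utc) - timedelta(days=40),
     "note": "Credential dump posted to a criminal forum."},
]

def recent_records(graph, max_age_hours=24):
    """Keep only near real-time records, mirroring the 'recent data' step."""
    cutoff = datetime.now(timezone.utc) - timedelta(hours=max_age_hours)
    return [r for r in graph if r["seen"] >= cutoff]

def build_summary_prompt(records):
    """Assemble retrieved records into the context a fine-tuned model would summarize."""
    lines = [f"- {r['id']} ({r['kind']}): {r['note']}" for r in records]
    return ("Summarize the following threat intelligence for an analyst:\n"
            + "\n".join(lines))

prompt = build_summary_prompt(recent_records(INTEL_GRAPH))
# The prompt would then be sent to the fine-tuned GPT model (the API call
# is omitted here to keep the sketch self-contained and offline).
```

Because the model only ever sees the freshly retrieved records, the summaries track the graph in near real time without retraining the model itself.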

“As you move information, someone is summarizing it in real time,” Recorded Future co-founder and Chief Technology Officer Staffan Truvé told CyberScoop.

Cybersecurity companies have incorporated AI extensively into their products over the past decade, but the next step in incorporating machine learning into enterprise applications is figuring out how to build useful generative tools.

Companies such as Recorded Future, which have large amounts of internal data, have in recent months adopted deep learning technologies to build generative AI tools. Late last month, Bloomberg rolled out BloombergGPT, a 50 billion parameter model trained on financial data.

By feeding large data holdings into AI models, companies such as Recorded Future and Bloomberg are seeking to build generative AI systems fine-tuned to answer the questions their clients rely on them for. Companies with large amounts of data may look to generative AI to turn that data into a more productive resource.

But Bloomberg and Recorded Future also illustrate the different approaches companies can take to building generative AI models, choices that could have a significant impact on the broader industry. Bloomberg built its own bespoke model, while Recorded Future relies on OpenAI’s underlying GPT model and pays the company based on the volume of queries against it.

Truvé declined to comment on the financial terms of Recorded Future’s relationship with OpenAI, but these types of business-to-business deals may represent a lucrative business model for OpenAI, a company facing a difficult road to profitability and enormous compute costs to train its models.

It is difficult to assess the quality of Recorded Future’s AI products. The company doesn’t test its models against standard AI benchmarking tools, instead relying on in-house analysts to test and validate their accuracy. The company relies on OpenAI’s most advanced GPT model, but OpenAI severely limits the amount of information available regarding its top-of-the-line product.

Advanced AI models are prone to hallucinations as they eagerly try to answer questions, confidently stating as fact information that has no basis in reality. But Truvé said the company’s models are largely used to summarize the set of information returned as part of a query, so they are mostly able to avoid hallucinations.
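One simple way to keep a summarizer grounded in its query results, in the spirit of what Truvé describes, is to check after the fact that every identifier the model cites actually appears in the source records. The validator below is an illustrative sketch, not Recorded Future’s implementation, using CVE identifiers as the checkable facts.

```python
import re

# CVE identifiers have a fixed, easily matched shape, which makes them
# convenient anchors for a cheap post-hoc grounding check.
CVE_PATTERN = re.compile(r"CVE-\d{4}-\d{4,7}")

def ungrounded_ids(summary, source_records):
    """Return CVE IDs cited in the summary that never occur in the sources,
    i.e. likely hallucinations in a summarization workload."""
    cited = set(CVE_PATTERN.findall(summary))
    known = set()
    for record in source_records:
        known |= set(CVE_PATTERN.findall(record))
    return cited - known

sources = ["CVE-2024-0001 is exploited against edge routers."]
good = "Analysts should patch CVE-2024-0001 immediately."
bad = "Analysts should patch CVE-2024-0001 and CVE-2099-9999."
```

A summary that introduces an identifier absent from its inputs can then be flagged for an analyst to review rather than shipped to a client as fact.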

In fact, Recorded Future’s AI performance is underpinned by the fact that its purpose is fairly narrow. The company’s AI capabilities primarily serve as summarization tools, and Truvé sees them as a complement to cybersecurity analysts.

“The challenge facing cybersecurity people is that there is too much information and too few people to process it,” Truvé said, adding that the tool seeks to address a serious analyst shortage.

