The startup is pioneering edge generative AI inference on small devices thanks to the efficiency of its GenAI v1 AI accelerator IP core.
Spain, June 4, 2024 — The company recently announced its first generative AI hardware accelerator, and is now going a step further to offer a turnkey solution: LLM inference is now available on a wide range of low-cost FPGA devices.

RaiderChip GenAI v1 running the Phi-2 LLM model on a Versal FPGA with a single memory controller
The RaiderChip GenAI v1 design uses 32-bit floating-point arithmetic. At full accuracy, it can directly use the weights of the original LLM model, without any modification or quantization. This preserves the full intelligence and inference capabilities of the raw LLM model, as intended by its creators.
This full accuracy is combined with real-time LLM inference speed. “The efficiency of our design allows our customers to run unquantized LLM models at full interactive speed, even on limited memory bandwidth where competitors are over 20% slower, and significantly faster than CPU-based inference solutions,” explains the RaiderChip team.
GenAI v1 IP cores are already available for FPGAs across all sub-families of the AMD Versal lineup and the previous UltraScale series. “Our IP cores are target independent and portable across different FPGA vendors' devices; we adapt them to customer requirements regarding logic resources and inference speed,” the team emphasizes.
A distinctive feature of RaiderChip's solution is its plug-and-play integration: the IP core uses a minimal number of industry-standard AXI interfaces, which makes GenAI v1 a simple peripheral, fully controllable from the customer's software.
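To illustrate what "a simple peripheral, fully controllable from software" typically looks like, here is a minimal, hypothetical sketch of driving an accelerator through an AXI-Lite register map. The register names and offsets are invented for illustration and are not taken from any RaiderChip documentation; the register file is backed by plain host memory so the example runs anywhere, whereas a real driver would obtain `base` by mapping the device's physical address.

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical register offsets (bytes) in an AXI-Lite control map.
 * These are illustrative only, not the actual GenAI v1 register map. */
enum {
    REG_CTRL   = 0x00,  /* bit 0: start */
    REG_STATUS = 0x04,  /* bit 0: done  */
    REG_SRC_LO = 0x08,  /* input buffer address, low 32 bits */
    REG_LEN    = 0x10,  /* input length in bytes */
};

/* 32-bit register accessors over a word-addressable base pointer. */
static void reg_write(volatile uint32_t *base, uint32_t off, uint32_t val) {
    base[off / 4] = val;
}

static uint32_t reg_read(volatile uint32_t *base, uint32_t off) {
    return base[off / 4];
}

/* Program the hypothetical registers to kick off one inference job:
 * set the source address and length, then set the start bit. */
static void start_job(volatile uint32_t *base, uint32_t src, uint32_t len) {
    reg_write(base, REG_SRC_LO, src);
    reg_write(base, REG_LEN, len);
    reg_write(base, REG_CTRL, 1);
    /* A real driver would then poll REG_STATUS for the done bit. */
}
```

Because the core sits behind standard AXI interfaces, this style of memory-mapped control is all the host software needs; no proprietary toolchain or driver stack is implied.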
The introduction of FPGAs for generative AI acceleration expands the options available for local AI inference of LLM models. Their reprogrammable nature makes them ideal amid the explosive innovation in the AI field, where new models and algorithm upgrades appear weekly: FPGAs allow in-field updates of systems that have already been deployed.
For more information, visit https://raiderchip.ai/technology/hardware-ai-accelerators.
