Moreh Unveils Highly Efficient Distributed Inference System on AMD and Announces Collaborations with Tenstorrent and SGLang
Seoul, Korea and Santa Clara, Calif., September 11, 2025 /PRNewswire/ -- Moreh, the AI infrastructure software company, announced a distributed inference system on AMD hardware and presented progress on its collaborations with Tenstorrent and SGLang at the AI Infra Summit 2025, held in Santa Clara from September 9 to 11.
Moreh and SGLang teamed up to showcase a distributed inference system on AMD at the AI Infra Summit 2025
The AI Infra Summit is the world's largest and most established AI conference specializing in the infrastructure layers of AI and machine learning. Launched as the AI Hardware Summit in 2018, it has evolved from a semiconductor-centric conference into a full-stack AI infrastructure event.
The 2025 summit brought together 3,500 participants and over 100 partners to build fast, efficient, and affordable AI, with content designed for hardware providers, hyperscalers, and enterprise IT and AI infrastructure specialists.
In an enterprise AI session on September 10, Moreh CEO Gangwon Jo presented benchmark results demonstrating that the company's distributed inference system runs the latest deep learning models, such as DeepSeek, more efficiently than NVIDIA's. He also unveiled next-generation AI semiconductor systems that combine Moreh's software with Tenstorrent's hardware, offering a range of cost-competitive alternatives to NVIDIA.
During the summit, Moreh co-hosted a presentation with SGLang, a leader in the deep learning inference software ecosystem, and ran a joint booth and networking session. This served as an opportunity to further strengthen Moreh's collaboration with the global AI ecosystem, particularly in the North American market. Moreh also plans to jointly develop an AMD-based distributed inference system with SGLang to accelerate its expansion into the rapidly growing deep learning inference market.
Moreh CEO Gangwon Jo said, "Moreh has the most powerful technical capabilities among AMD's global software partners and is currently running proof-of-concept (PoC) projects with several leading LLM companies. Through close collaboration with AMD, Tenstorrent, and SGLang, we aim to serve customers worldwide."
Moreh develops its own AI infrastructure engine and maintains comprehensive technical capabilities across the model domain through Motif Technologies, a subsidiary developing foundation LLMs. At the same time, the company is expanding into the global market through collaborations with key partners such as AMD, Tenstorrent, and SGLang.
Moreh CEO Gangwon Jo gave a presentation at the AI Infra Summit 2025 on the afternoon of September 10.
