
Tech giants begin RNGD sampling


Six weeks after unveiling RNGD (“Renegade”) at Hot Chips and just five months after receiving our first raw silicon samples from TSMC, we’re excited to share updates on the first proof-of-concept (PoC) evaluations now underway.

Earlier this month, Furiosa delivered servers equipped with RNGD, Furiosa’s second-gen chip for data center inference, to one of our early-access partners – a multinational technology corporation based in Seoul.

Several other customers are already sampling RNGD or preparing to receive RNGD servers for testing with their models. (We’ll share more about those evaluations soon.)

RNGD is built for high-performance, highly efficient data center inference with large language models (LLMs) and multimodal models. The chip went into production this summer and is now running advanced models like Meta’s Llama 3.1 70B at full speed using Furiosa’s RNGD software stack.

“I'm really proud of our team for the rapid RNGD bring-up, and I’m excited to kick off an important new chapter of customer sampling,” said Furiosa co-founder and CEO June Paik. “There’s much more to do to establish RNGD as the best inference solution, but we are racing forward quickly.” (Just look at June’s hair blowing in the wind in the photo below.)


Furiosa CEO June Paik and APAC Tech Sales Director Yungbum Jung.

Over the coming weeks and months, these customers will test RNGD’s performance in demanding real-world applications, where latency, throughput, and energy consumption are all crucial considerations.
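As a rough illustration of what these measurements involve, here is a minimal Python sketch that records single-request latency and generation throughput against an LLM serving endpoint. It assumes an OpenAI-compatible API at a local address; the base URL, port, and model identifier are illustrative placeholders, not details of Furiosa’s software stack.

    # Minimal latency/throughput probe for an LLM serving endpoint.
    # Assumption: the server exposes an OpenAI-compatible API; the base URL,
    # port, and model name below are placeholders for illustration only.
    import time

    from openai import OpenAI  # pip install openai

    client = OpenAI(
        base_url="http://localhost:8000/v1",   # hypothetical local endpoint
        api_key="not-needed-for-local-testing",
    )

    prompt = "Summarize the benefits of efficient data center inference in two sentences."

    start = time.perf_counter()
    response = client.chat.completions.create(
        model="meta-llama/Llama-3.1-70B-Instruct",  # placeholder identifier
        messages=[{"role": "user", "content": prompt}],
        max_tokens=128,
    )
    elapsed = time.perf_counter() - start

    generated = response.usage.completion_tokens
    print(f"Latency: {elapsed:.2f} s")
    print(f"Throughput: {generated / elapsed:.1f} tokens/s")

Production evaluations, of course, look at these numbers across many concurrent requests and alongside the power the hardware actually draws.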

Initial feedback has been very positive, but we have more work to do to demonstrate the chip’s capabilities. Our APAC Tech Sales Director, Yungbum Jung (he’s the one without the flowy hair in the photo), and his team will work closely with customers to onboard them smoothly and collect their feedback.


Accelerating development of our AI accelerator

Our goal has been to bring RNGD to market as quickly as possible, and we plan to keep up the pace.

Additionally, RNGD is now installed in two other data centers, at the Artificial Intelligence Industry Cluster Agency (AICA) and the National IT Industry Promotion Agency (NIPA), through the K-Cloud project led by the Korean government. And since cloud vendors play a key role in supplying advanced AI compute to businesses, we are preparing to work with leading providers to sample RNGD in their deployments. Earlier this month, Aramco announced the signing of an MoU with Furiosa.

We’ll provide updates on these evaluations soon, and our engineering team is preparing to share more technical details about RNGD, including third-party benchmarks.

Sign up here to get updates on RNGD or reach out to us to learn more.

Installing RNGD at our customer’s Seoul data center. One small push for a heavy rack-mounted server, one giant leap for efficient AI inference.
