D-Matrix Unveils JetStream I/O Accelerators for AI Inference
D-Matrix Corp. has announced the launch of its JetStream I/O accelerators, designed to enable ultra-low-latency AI inference at scale. According to the company's press release, JetStream, when combined with D-Matrix's Corsair accelerators and Aviator software, can deliver up to 10 times the speed, three times better cost-performance, and three times higher energy efficiency compared to traditional GPU-based solutions.
The JetStream I/O accelerator is a custom network interface card (NIC) that supports state-of-the-art models exceeding 100 billion parameters. Packaged in an industry-standard PCIe form factor and compatible with off-the-shelf Ethernet switches, it can be deployed in existing data centers without costly infrastructure replacements.
Samples of the JetStream NICs, which are full-height PCIe Gen5 cards with a maximum bandwidth of 400Gbps, are available now, with full production expected by the end of the year. This addition to D-Matrix's product portfolio positions the company as a comprehensive AI infrastructure provider, offering solutions that span compute, software, and networking.
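As a rough sanity check (not from the announcement, and assuming a full x16 slot, which D-Matrix has not specified), the sketch below compares the usable bandwidth of a PCIe Gen5 x16 link against the card's stated 400 Gbps line rate, showing the slot has comfortable headroom.

```python
# Back-of-envelope check: can a PCIe Gen5 x16 slot feed a 400 Gbps NIC?
# PCIe Gen5 runs at 32 GT/s per lane with 128b/130b line encoding.
# The x16 lane count below is an assumption, not a stated JetStream spec.

GEN5_GT_PER_LANE = 32             # gigatransfers per second, per lane
ENCODING_EFFICIENCY = 128 / 130   # 128b/130b encoding overhead
LANES = 16                        # assumed slot width for a full-height card

# Usable PCIe bandwidth per direction, in Gbps
pcie_gbps = GEN5_GT_PER_LANE * ENCODING_EFFICIENCY * LANES
nic_gbps = 400                    # JetStream's stated maximum bandwidth

print(f"PCIe Gen5 x{LANES}: ~{pcie_gbps:.0f} Gbps per direction")
print(f"NIC line rate: {nic_gbps} Gbps")
print(f"Headroom: ~{pcie_gbps - nic_gbps:.0f} Gbps")
```

Under these assumptions the slot provides roughly 504 Gbps per direction, so a single Gen5 x16 connection can sustain the card's 400 Gbps rate.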