Kumo Unveils KumoRFM-2, a Foundation Model Scaling to 500 Billion Rows
Kumo has announced in a press release the launch of KumoRFM-2, a foundation model for enterprise relational data that the company says surpasses fully supervised machine learning. The model is designed to eliminate manual feature engineering and custom model training, letting users query their data in plain English.
KumoRFM-2 is built on a new Relational Graph Transformer architecture that processes data at 5 GB per second and handles more than 20 million lookups per second. It operates directly on connected tables without flattening them and scales to more than 500 billion rows.
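To make "operating directly on connected tables without flattening" concrete, here is a minimal sketch of the general idea behind relational graph models: each table row becomes a typed node and each foreign-key match becomes an edge, rather than joining everything into one wide table. The table and column names are invented for illustration; this is not Kumo's API or architecture.

```python
# Hypothetical example: two related tables represented as a
# heterogeneous graph instead of a single flattened table.
# All names here are invented for illustration, not Kumo's schema.

customers = [
    {"customer_id": 1, "region": "EU"},
    {"customer_id": 2, "region": "US"},
]
orders = [
    {"order_id": 10, "customer_id": 1, "amount": 25.0},
    {"order_id": 11, "customer_id": 1, "amount": 40.0},
    {"order_id": 12, "customer_id": 2, "amount": 15.0},
]

# Each row becomes a typed node.
nodes = (
    [("customer", c["customer_id"]) for c in customers]
    + [("order", o["order_id"]) for o in orders]
)

# Each foreign-key reference becomes an edge between typed nodes.
edges = [
    (("order", o["order_id"]), ("customer", o["customer_id"]))
    for o in orders
]

print(len(nodes), len(edges))  # 5 nodes, 3 edges
```

A graph model can then pass messages along these edges, preserving the one-to-many relationship between customers and orders that a flattened join would duplicate or aggregate away.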
On benchmarks such as Stanford RelBenchV1, KumoRFM-2 outperformed its predecessor by 10% and exceeded top supervised machine learning models by 5% in classification and regression tasks. It also achieved state-of-the-art results on the SAP SALT benchmark, surpassing tabular model ensembles and other foundation models. Fine-tuning further improved performance by 13%, with the model demonstrating robustness to noise, missing data, and structural inconsistencies.