Databricks has invested in a new AI hardware company founded by Naveen Rao, a serial entrepreneur known for building Nervana (acquired by Intel) and MosaicML (acquired by Databricks). The company remains in stealth mode, but will focus on rethinking the foundations of computing for AI, with a special emphasis on energy efficiency and performance scalability.
A bold bet on AI infrastructure
The announcement was shared by Databricks CEO Ali Ghodsi in a LinkedIn post, where he expressed strong conviction about Rao’s ability to reshape the AI hardware landscape. “If anyone can pull this off, it’s Naveen,” he wrote, highlighting Rao’s leadership record and continued advisory role with Databricks.
The move comes as enterprises increasingly look to optimise cost, speed, and energy consumption in large-scale AI deployments. Rao’s company is expected to build machines that deliver what he calls “brain-scale efficiency,” targeting breakthroughs in both architecture and throughput per watt.
Context: Databricks’ growth and enterprise AI focus
Databricks recently closed a $10 billion Series J round, taking its valuation to $62 billion and positioning itself as a leader in the enterprise AI stack. The company’s revenue is approaching a $3 billion annual run rate, powered by adoption of the Databricks Data Intelligence Platform and its integrated GenAI accelerators.
By backing Rao’s new venture, Databricks is not just funding innovation — it’s also investing in the infrastructure layer that could unlock further gains in enterprise AI performance, especially as model sizes and inference demands continue to grow.
Reimagining the compute foundation
While details of the product roadmap remain undisclosed, the new company aims to deliver AI hardware that moves beyond incremental gains. The focus is on designing systems that drastically reduce energy requirements without sacrificing computational power — a critical challenge as training costs skyrocket and data centers face capacity and sustainability constraints.
The collaboration reflects a broader industry trend: cloud-native AI software players increasingly funding or partnering with hardware innovators to push performance boundaries at scale.
