Datacenter Architectures Face Shifting Demands
LIQID is set to showcase advancements in composable infrastructure, focusing on pooling and sharing GPU and CXL memory resources. The company argues that traditional fixed-configuration server designs cannot keep pace with the escalating demands of modern workloads, particularly in AI, IoT, cloud, and HPC environments.
LIQID's solutions aggregate PCIe devices, including GPU and FPGA accelerators, NVMe storage, and DPU-based network cards, into resource pools that can be dynamically allocated to multiple servers. Integration with platforms like VMware's vCenter underscores a move towards more flexible datacenter management. A key development highlighted is the anticipated use of Compute Express Link (CXL), an interconnect built on the PCIe physical layer, which extends resource sharing beyond PCIe devices by enabling RAM to be pooled across servers.
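To make the pooling model concrete, here is a minimal sketch of how a composable resource pool could be modeled: devices sit in a shared pool and are assigned to servers on demand, then returned. This is an illustrative toy, not LIQID's actual software; all class and method names (`Device`, `ResourcePool`, `allocate`, `release`) are hypothetical.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Device:
    """One pooled PCIe device, e.g. a GPU, NVMe drive, FPGA, or DPU."""
    dev_id: str
    kind: str                       # "gpu", "nvme", "fpga", "dpu", ...
    assigned_to: Optional[str] = None  # server currently holding it

class ResourcePool:
    """Toy composable pool: devices are handed to servers on demand
    and returned to the pool when released."""

    def __init__(self, devices):
        self.devices = {d.dev_id: d for d in devices}

    def allocate(self, server: str, kind: str) -> Optional[Device]:
        # Hand out the first free device of the requested kind.
        for d in self.devices.values():
            if d.kind == kind and d.assigned_to is None:
                d.assigned_to = server
                return d
        return None  # pool exhausted for this device kind

    def release(self, dev_id: str) -> None:
        # Return a device to the pool so another server can claim it.
        self.devices[dev_id].assigned_to = None
```

The point of the model is that a GPU released by one server becomes immediately claimable by another, without physically moving hardware; a real implementation would do this over a PCIe (or, for memory, CXL) fabric.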

Enhanced Performance Metrics Cited
The company points to significant performance improvements when its solutions are combined with technologies like NVIDIA's GPUDirect Storage (GDS). Benchmarks suggest that enabling GDS through peer-to-peer communication between PCIe devices can boost IOPS from 179,000 to 2.9 million, increase bandwidth from 9 GB/s to 49 GB/s, and cut latency from 583 microseconds to just 80 microseconds. This points to a strategy of optimizing data flow for high-performance computing tasks.
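Taken at face value, the cited figures imply roughly a 16x gain in IOPS, a 5x gain in bandwidth, and a 7x reduction in latency. A quick check of that arithmetic, using only the numbers quoted above:

```python
# Reported benchmark figures (GDS disabled -> GDS enabled), from the article.
iops = (179_000, 2_900_000)     # I/O operations per second
bandwidth_gbs = (9, 49)         # GB/s
latency_us = (583, 80)          # microseconds

iops_gain = iops[1] / iops[0]                  # higher is better
bw_gain = bandwidth_gbs[1] / bandwidth_gbs[0]  # higher is better
latency_cut = latency_us[0] / latency_us[1]    # factor by which latency drops

print(f"IOPS gain: {iops_gain:.1f}x")       # 16.2x
print(f"Bandwidth gain: {bw_gain:.1f}x")    # 5.4x
print(f"Latency reduction: {latency_cut:.1f}x")  # 7.3x
```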
The 'Composable Infrastructure' Premise
Founded by individuals with backgrounds in datacenter architecture and software design, LIQID champions the concept of 'composable infrastructure': constructing bespoke solutions from software and hardware components to deliver what the company describes as "complete agility of datacenter resources." Its go-to-market strategy pairs complementary product offerings with business development and targeted marketing campaigns.
The company also offers consulting services spanning pre-sales evaluation, solution design, and ongoing support. Its business model appears to rely on partners, including cloud service providers and managed service providers, to deliver these solutions across industry sectors. The target market includes clients engaged in artificial intelligence, IoT deployments, cloud environments, edge computing, high-performance computing (HPC), DevOps, and other emerging, high-value applications.