AIC will showcase its newest server and AI storage platforms this week at Supercomputing 2025 (SC25) in St. Louis, highlighting options for fast-growing AI, HPC, and enterprise workloads. Stop by AIC’s booth (#305) to see new Gen5 systems, partner-integrated solutions, and a sneak preview of AIC’s upcoming Ethernet JBOD (eBOD) platform for scalable, Ethernet-attached capacity. AIC will also be featured at several partners’ booths across the show floor.
At Micron’s booth (#3516), AIC and Micron will demonstrate an ultra-dense configuration pairing the AIC F2032-01-G5 with Micron 6600 ION 245TB E3.L SSDs—supporting 140+ PB per rack to meet the data demands of modern AI and HPC.
H3 Platform will be co-exhibiting with AIC at our booth, presenting a joint GPU-accelerated AI platform built on the Falcon 6048 system. The platform combines PCIe 6.0, high-density NVMe storage, and up to 6 GPUs, DPUs, or NICs to push NVMe performance to more than 200 million IOPS and eliminate storage bottlenecks for AI and HPC workloads. H3 is also showcasing its Falcon C5022 CXL memory pooling solution on AIC servers, enabling up to 5.5 TB of shared memory with simple monitoring and control so data centers can scale memory capacity more efficiently.
AIC has also teamed up with Seagate to give SC25 attendees a first look at AIC’s new Ethernet JBOD (eBOD) technology—a modern approach to building high-capacity, Ethernet-attached storage for AI workloads. In this collaboration, AIC provides the Ethernet-based JBOD and Seagate supplies the high-capacity HDDs. Leveraging a DPU and an HBA card, the eBOD functions as a disaggregated storage enclosure that enables NVMe-oF on SAS/SATA-based drives, especially high-capacity HDDs. Together, the design ingests data at network speed, accelerates resilient checkpoint tiers for training, and scales out over standard Ethernet—reducing data movement overhead and total cost of ownership for large AI pipelines.
“SC25 pushes AI infrastructure forward,” said Michael Liang, President & CEO of AIC. “We’re here to show real progress in storage and compute, reconnect with partners and customers, and see what’s next. I’m especially excited to see our new Ethernet-based JBOD (eBOD) running on the floor—a practical step toward simpler, faster data pipelines at scale.”
AIC will also highlight a compact integration with ATTO Technology, featuring ExpressSAS® 24Gb/s SAS HBAs linking an AIC server and an AIC JBOD for high-bandwidth, low-latency expansion. The setup delivers predictable latency and sustained bandwidth for scale-out AI training and HPC simulation workloads on AIC platforms.
“AIC’s showcase of our ExpressSAS at SC25 highlights its unmatched role in driving performance and reliability for HPC and AI,” said Tim Klein, president and CEO of ATTO Technology. “With advanced features and technology like ADS and ATTO360 support, ExpressSAS sets the standard for storage connectivity in high-performance computing environments.”
Key AIC systems on display at SC25 include:
- F2026-01-G5 — high-density storage server for AI data lakes, backup/restore, and analytics.
- SB102-CA — GPU-ready platform for AI inference and media acceleration.
- SB201-SU (Hybrid) — balanced compute and capacity for virtualization and analytics.
- HA2026-HC — resilient storage for mission-critical and video workloads.
Additionally, AIC will appear with partners Solidigm, Ma Labs, and VAST Data. To experience these systems firsthand, visit AIC at booth #305 or find AIC technology at partner booths across the show floor.
