AI Data Center Scale-Up Architecture with Optical I/O

Optical I/O boosts AI scale-up with higher bandwidth, increased power efficiency, and lower latency for enhanced AI performance.

Ayar Labs Optical I/O Meets the Demands of AI Scale-Up Infrastructure to Break Through the Limitations of AI Training and Inference

Large-scale AI training and inference workloads strain current computing infrastructure, leading to cost, power, and scalability challenges. Both scale-out (networking between clusters) and scale-up (communication within clusters) architectures are under increasing pressure to meet AI performance demands.

For both the AI data center and on-premises infrastructure, the solution is co-packaged optics (CPO): integrating photonics and electronics in a single package to boost AI performance and dramatically cut energy consumption. Network switches with CPO have already been announced to address scale-out needs, but scale-up is the larger challenge, requiring at least 10x the bandwidth and a 10x reduction in latency to overcome AI data center limitations.

Breaking through these AI performance barriers requires integrating CPO directly into the GPU package. Ayar Labs optical I/O overcomes bandwidth density, reach, and power limitations of electrical links, enabling scale-up architecture to enhance AI inference performance, interactivity, and profitability.

“Optical interconnects are needed to solve power density challenges in scale-up AI fabrics. We recognized early on the potential for co-packaged optics, which positioned us to drive adoption of optical solutions in AI applications. As we continue to push the boundaries of optical technologies, we’re also bringing together the supply chain, manufacturing, and testing and validation processes needed for customers to deploy these solutions at scale.”

Mark Wade
CEO and Co-Founder of Ayar Labs

See the AI System Architecture Tool in Action

Discover how the Ayar Labs optical I/O solution drives the profitability and interactivity of large AI workloads with our updated AI System Architecture Tool. The tool simulates performance and economics across GPU and network configurations for scenarios including agentic AI and mixture of experts (MoE) models. Visit our booth at the following events to experience the tool in person and see how GPU and network architecture choices affect throughput, interactivity, and profitability.

  • Supercomputing 2025: November 16-21, 2025

Figure 1. The Ayar Labs optical I/O solution has the potential to increase profitability by 20x while improving interactivity by 3-4x for future GPT-X models. Optical I/O also makes agentic AI possible (turquoise line in the purple region of the figure).

Understanding Scale-Up Architecture and Its Role in AI Infrastructure

Explore key concepts, technologies, and protocols behind scale-up, scale-out, and optical interconnects.

Blog

Let’s Get Serious: TeraPHY™ Optical Engine Passes the Test for AI Scale-Up at Volume

Blog

Addressing Frequently Asked Questions About Integrating Optical I/O into AI Product Designs

Blog

The Future of AI Infrastructure: A Path to Profitability with Optical I/O

Blog

AI and Optical I/O: Overcoming AI Scaling Challenges by Dispelling 3 Current Optical I/O Misconceptions

Blog

AI Scale-Up and Memory Disaggregation: Two Use Cases Enabled by UCIe and Optical I/O

Video

Improving AI Infrastructure Performance and TCO with Optical I/O-Based Scale-Up Fabrics

Video

World’s First UCIe Optical Chiplet for AI Scale-Up Architectures
The Ayar Labs TeraPHY™ optical I/O chiplet, one kind of optical engine, is the world’s first UCIe optical chiplet.

Press Release

Ayar Labs Unveils World’s First UCIe Optical Chiplet for AI Scale-Up Architectures

Glossary of Terms Related to AI Scaling and Optical Interconnects

Glossary

Scale-Up (AI/ML)

Glossary

Scale-Out (AI/ML)

Glossary

Co-Packaged Optics (CPO)

Glossary

In-Package Optical I/O

Glossary

AI Compute Cluster

Glossary

AI Compute Rack

Learn More about Optical I/O for AI Infrastructure

Contact us at [email protected] to learn more.