Exploring the Potential of AGI: Hardware and Software Synergy
Iris Coleman Dec 17, 2025 06:09
Dan Fu of together.ai argues that artificial general intelligence (AGI) remains computationally achievable: software-hardware co-design and better utilization of today's chips, he contends, can overcome what are often perceived as hard hardware limits.
The debate surrounding the potential for achieving artificial general intelligence (AGI) is intensifying, with Dan Fu, Vice President of Kernels at together.ai, providing an optimistic outlook. According to together.ai, Fu challenges the notion that advancements in AI are being stymied by hardware limitations. Instead, he posits that current chips are significantly underutilized and that a strategic approach to software-hardware co-design could unlock substantial performance improvements.
Current Limitations and Future Potential
As the AI landscape evolves, concerns about reaching the limits of digital computation are becoming more prevalent. Some experts suggest that hardware constraints, particularly in GPUs, might impede progress toward generally useful AI. In contrast, Fu offers a more hopeful perspective in his publication, "Yes, AGI Can Happen – A Computational Perspective," arguing that AI capabilities have not yet hit a computational ceiling.
Underutilization of Existing Hardware
Fu highlights that state-of-the-art training runs, such as those for DeepSeek-V3 or Llama-4, often achieve only about 20% Model FLOPs Utilization (MFU), with inference utilization sometimes in the single-digit percentages. These figures point to a significant opportunity to improve efficiency through tighter integration of software and hardware, along with innovations such as FP4 training.
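To make the MFU figure concrete, the sketch below shows how it is typically computed: the useful model FLOPs of a training run divided by the cluster's theoretical peak over the same wall-clock time. The model size, token count, GPU count, and peak throughput are illustrative assumptions for this example, not numbers from Fu's post.

```python
# Rough illustration of Model FLOPs Utilization (MFU).
# All numbers below are illustrative assumptions, not figures from Fu's post.

def training_mfu(params: float, tokens: float, wall_clock_s: float,
                 num_gpus: int, peak_flops_per_gpu: float) -> float:
    """MFU = useful model FLOPs per second / peak hardware FLOPs per second.

    Uses the common ~6 * params * tokens approximation for the combined
    forward and backward FLOPs of one pass over the training tokens.
    """
    model_flops = 6.0 * params * tokens                  # approximate useful work
    achieved_flops_per_s = model_flops / wall_clock_s    # what the run actually delivered
    peak_flops_per_s = num_gpus * peak_flops_per_gpu     # theoretical hardware ceiling
    return achieved_flops_per_s / peak_flops_per_s

# Hypothetical example: a 70B-parameter model trained on 10T tokens over
# 30 days on 8,192 GPUs, each rated at ~1e15 FLOP/s dense BF16-class peak.
print(training_mfu(params=70e9, tokens=10e12, wall_clock_s=30 * 86400,
                   num_gpus=8192, peak_flops_per_gpu=1e15))  # ~0.20, i.e. ~20% MFU
```

Read that way, a 20% MFU means roughly four-fifths of the cluster's theoretical compute is lost to memory movement, communication, and kernel overheads rather than spent on the model itself, which is the headroom Fu argues co-design can reclaim.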
Advancements in Computational Models
Today's frontier models were largely trained on previous-generation hardware, and the potential of newer accelerators has not been fully realized. Fu emphasizes that massive clusters of latest-generation GPUs, numbering over 100,000 chips, have yet to be fully brought to bear on AI development, pointing to a promising horizon for future advancements.
Present-Day Utility and Future Implications
Despite the perceived limitations, existing AI models are already transforming complex workflows, such as writing high-performance GPU kernels with humans in the loop. This underscores the immediate utility of AI technologies and hints at the vast potential for future applications.
For those interested in the intersection of systems engineering, hardware efficiency, and AI scaling, Fu's analysis provides valuable insights. The full analysis can be accessed on the together.ai website.