The optimal intelligent model could be disaggregation

Joseph Martins Thu 13 May 2021 // 07:30 UTC

Sponsored Over the last two decades, enterprises have got datacenter management down to a fine art. Standardization and automation mean improved efficiency, both in raw compute and in power consumption. Technologies such as virtualization and containerization let users and developers make more efficient use of resources, to the point of enabling self-service deployment.

However, the general-purpose x86 architectures that power modern datacenters are simply not well suited to running AI workloads. AI researchers got round this by repurposing GPU technology to accelerate AI operations, and this is what has fuelled the breakneck innovation in machine learning over the last decade or so.
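Why GPUs help is easy to see in practice: the dense matrix multiplications at the heart of neural networks parallelize naturally onto GPU hardware, where a general-purpose CPU works through them far more slowly. The sketch below is a rough illustration of that gap, not something from the article; it assumes PyTorch is installed and, optionally, a CUDA-capable GPU, and simply times the same matrix multiply on each device.

```python
import time
import torch  # assumes PyTorch is available; purely illustrative, not from the article


def time_matmul(device: str, n: int = 4096, repeats: int = 10) -> float:
    """Average time for an n x n matrix multiply on the given device."""
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)

    # Warm-up pass so one-off initialization cost is not measured
    torch.matmul(a, b)
    if device == "cuda":
        torch.cuda.synchronize()

    start = time.perf_counter()
    for _ in range(repeats):
        torch.matmul(a, b)
    if device == "cuda":
        torch.cuda.synchronize()  # wait for queued GPU work before stopping the clock
    return (time.perf_counter() - start) / repeats


if __name__ == "__main__":
    print(f"CPU: {time_matmul('cpu'):.4f} s per matmul")
    if torch.cuda.is_available():
        print(f"GPU: {time_matmul('cuda'):.4f} s per matmul")
    else:
        print("No CUDA-capable GPU detected")
```

On typical hardware the GPU figure comes out one to two orders of magnitude lower, which is the effect that made repurposed graphics silicon so attractive for machine learning training and inference.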