Modern data centers handle many different tasks, and their operators need hardware tailored to those specific needs — which is why nearly every major cloud provider now designs its own custom silicon. To stay competitive in the coming years, AMD plans to broaden its range of data center CPUs, building processors for distinct classes of workloads.
With its Zen 4-based 4th Generation EPYC chips, AMD already offered distinct processors for AI, cloud services, enterprise, the network edge, and entry-level hosting providers. With the Zen 5 family, however, the lineup narrowed somewhat. Now AMD indicates it will move toward more specialized EPYC products: CPUs designed for specific tasks, potentially with different core counts, cache sizes, and interconnects, as well as chips tailored for AI inference, control-node duties, low-latency AI, and GPU-dense systems. Lisa Su, AMD’s CEO, even hinted that this expansion extends beyond current plans, potentially into Zen 7 and Zen 8 designs.
“The industry will need a wide variety of CPUs; not all CPUs are the same,” said Lisa Su, AMD’s chief executive and chairman, during a call with financial analysts. “Honestly, you’ll need different CPUs depending on whether you’re talking about general tasks, main control nodes, or AI agent tasks.”
During the Q&A, Su repeatedly stressed that AMD no longer treats server CPUs as a single, uniform category. Instead, the company now views the market as split into many segments, each needing specific CPUs: general-purpose computing, CPU control nodes for accelerators (such as GPUs), and CPUs optimized for “agentic AI” workloads (AI that acts autonomously). AMD even plans to offer variations within these categories to meet customer demands precisely.
“We’ve focused on building not just one type, but… chips optimized for data throughput, performance per watt, performance per dollar, and AI infrastructure, as we’ve done with the Venice family,” Su explained.
For its 6th Generation EPYC processors, based on the Zen 6 design, AMD plans to offer its “Venice” CPU with up to 256 cores for general-purpose servers, along with “Verano” processors built specifically for AI infrastructure. (Previously, AMD had introduced Verano only as the processor that will power its next-generation rack-scale AI solutions.) It remains to be seen whether CPUs for agentic AI tasks will use new silicon or reuse existing designs.
“The Venice family covers a broad set of CPUs optimized for data throughput, performance per watt, and performance per dollar, including Verano, our first EPYC CPU built specifically for AI infrastructure,” Su said. Given that AMD now expects the server CPU market to grow by 35% per year and reach $120 billion by 2030, developing specialized models makes strategic sense — even though designing and manufacturing CPUs on leading-edge process nodes has become very expensive.
So, while AMD didn’t formally announce new CPU categories, its CEO clearly signaled that EPYC offerings will continue to grow and become more specialized, focusing on AI infrastructure and other market segments.