Server Processor Counts Will Rise to Levels That Strain the Ability of Current Software to Make Effective Use of That Capability
The relentless doubling of processors per microprocessor chip will drive the total processor counts of upcoming server generations to peaks well above the levels for which key software has been engineered, according to Gartner, Inc. Operating systems, middleware, virtualization tools and applications will all be affected, leaving organizations facing difficult decisions, hurried migrations to new versions and performance challenges as a consequence of this evolution.
"Looking at the specifications for these software products, it is clear that many will be challenged to support the hardware configurations possible today and those that will be accelerating in the future," said Carl Claunch, vice president and distinguished analyst at Gartner. "The impact is akin to putting a Ferrari engine in a go-cart; the power may be there, but design mismatches severely limit the ability to exploit it."
Each generation of microprocessor, arriving approximately every two years, doubles the processor count through some combination of more cores and more threads per core, turning the same number of sockets into twice as many processors. Thus, a 32-socket, high-end server with eight-core chips in the sockets would deliver 256 processors in 2009. In two years, with 16 processors per socket appearing on the market, the machine swells to 512 processors in total. Four years from now, with 32 processors per socket shipping, that machine would host 1,024 processors.
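The projection above reduces to a simple calculation; the socket count and per-socket figures below come from the example in the text, while the function name is illustrative:

```python
# Project total processor counts for a 32-socket, high-end server as
# processors per socket double each chip generation (~every two years).
SOCKETS = 32

def total_processors(per_socket: int) -> int:
    """Total processors in the box: sockets times processors per socket."""
    return SOCKETS * per_socket

for year, per_socket in [(2009, 8), (2011, 16), (2013, 32)]:
    print(f"{year}: {per_socket} processors/socket -> "
          f"{total_processors(per_socket)} total")
# 2009: 8 processors/socket -> 256 total
# 2011: 16 processors/socket -> 512 total
# 2013: 32 processors/socket -> 1024 total
```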
Gartner said that organizations need to take heed of the issue because there are real limits on the ability of the software to make use of all those processors. "Most virtualization software today cannot use all 64 processors, much less the 1,024 of the high-end box, and database software, middleware and applications all have their own limits on scalability," Mr. Claunch said. "There is a real risk that organizations will not be able to use all the processors that are thrust on them in only a few years' time."
Mr. Claunch said that the software that runs today's servers has both hard and soft limits on the number of processors it can effectively handle. Hard limits are often documented by the vendor or creator of the product and are therefore relatively easy to discover. They are determined by implementation details inside the software that stop it from handling more processors. For example, an operating system might use an eight-bit field to hold the processor number, imposing a hard limit of 256 processors. Soft limits, however, are uncovered only through word of mouth and real-world experience. They are caused by the characteristics of the software design, which may deliver poor incremental performance or, in many cases, yield a decrease in useful work as more processors are added.
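The eight-bit-field example can be made concrete with a small sketch; the function and field width below are hypothetical, chosen only to mirror the scenario in the text:

```python
# A hard limit arising from an implementation detail: if processor IDs are
# stored in an 8-bit field, only 2**8 distinct processors can be addressed.
FIELD_BITS = 8
HARD_LIMIT = 2 ** FIELD_BITS  # 256

def assign_processor_id(n: int) -> int:
    """Hypothetical ID assignment; fails once the hard limit is exceeded."""
    if n >= HARD_LIMIT:
        raise ValueError(
            f"processor {n} exceeds the {HARD_LIMIT}-processor hard limit")
    return n

print(assign_processor_id(255))  # 255 -- the last processor that fits
# assign_processor_id(256) would raise ValueError: the 257th processor
# simply cannot be represented, no matter how much hardware is present.
```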
Often the soft limit is noticeably below the hard limit for software, meaning that overheads and inefficiencies produce seriously diminished value for large processor counts that may technically be within the supported configurations of the software.
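One way to visualize a soft limit is with a scalability model; the release itself names no model, so the Universal Scalability Law (Gunther) is used here purely as an illustration, with made-up contention and coherency coefficients:

```python
# Illustrative only: the Universal Scalability Law models throughput as
#   C(N) = N / (1 + a*(N-1) + b*N*(N-1))
# where 'a' captures contention and 'b' coherency (cross-talk) overhead.
# With b > 0, throughput peaks and then *declines* as processors are
# added -- a soft limit that can sit well below any documented hard limit.
def usl_throughput(n: int, a: float = 0.01, b: float = 0.00005) -> float:
    """Relative useful work for n processors under assumed coefficients."""
    return n / (1 + a * (n - 1) + b * n * (n - 1))

for n in (64, 128, 256, 512, 1024):
    print(f"{n:5d} processors -> relative throughput {usl_throughput(n):6.1f}")
# With these coefficients, throughput rises from 64 to 128 processors,
# then falls at 256, 512 and 1024 -- added processors reduce useful work.
```

The exact numbers depend entirely on the assumed coefficients; the point is the shape of the curve, which matches the release's warning that more processors can yield a decrease in useful work.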
"There is little doubt that multicore microprocessor architectures are doubling the number of processors per server, which in theory opens up tremendous new processing power," concluded Mr. Claunch. "However, while hard limits are readily apparent, soft limits on the number of processors that server software can handle are learned only through trial and error, creating challenges for IT leaders. The net result will be hurried migrations to new operating systems in a race to help the software keep up with the processing power available on tomorrow's servers."
Additional information is available in the Gartner report "The Impact of Multicore Architectures on Server Scaling." The report is available on Gartner's Web site at http://www.gartner.com/DisplayDocument?ref=g_search&id=806412&subref=simplesearch.
Gartner, Inc. (NYSE: IT) is the world's leading information technology research and advisory company. Gartner delivers the technology-related insight necessary for its clients to make the right decisions, every day. From CIOs and senior IT leaders in corporations and government agencies, to business leaders in high-tech and telecom enterprises and professional services firms, to technology investors, Gartner is the valuable partner in over 13,000 distinct organizations. Through the resources of Gartner Research, Gartner Executive Programs, Gartner Consulting and Gartner Events, Gartner works with every client to research, analyze and interpret the business of IT within the context of their individual role. Founded in 1979, Gartner is headquartered in Stamford, Connecticut, U.S.A., and has 5,500 associates, including 1,400 research analysts and consultants, and clients in 85 countries. For more information, visit www.gartner.com.