IBM says it has developed the world’s first 2nm chip technology, which should improve the power efficiency and speed of future computer chips.
The industry has developed increasingly smaller chip designs; the most modern chipsets used by Samsung, Apple and Qualcomm, among others, are now built on 5nm technology.
IBM’s breakthrough is another step in the quest for miniaturization and is expected to achieve 45 percent better performance or 75 percent lower power consumption than today’s most advanced 7nm node chips.
The tech company said the potential benefits of the chips include quadrupling the battery life of cell phones and reducing the environmental footprint of data centers.
Data centers account for one percent of global energy consumption, a figure that is expected to increase dramatically in the coming years. Changing all of their servers to 2nm-based processors could potentially cut that number significantly, IBM said.
“The IBM innovation reflected in this new 2nm chip is essential to the entire semiconductor and IT industry,” said Darío Gil, director of IBM Research.
“It is the product of IBM’s approach to tackling tough technical challenges and a demonstration of how breakthroughs can result from sustained investment and a collaborative R&D ecosystem approach.”

By placing more transistors on a chip, processor designers also have more options to add innovative, specialized components to enhance capabilities for specific applications such as AI and cloud computing, as well as new avenues for hardware security and encryption.
IBM has achieved a number of semiconductor breakthroughs over the years, including the first implementation of 7nm and 5nm process technologies and single-cell DRAM.
It will likely take several years for the technology to hit the market. IBM used to be a major chip maker but now outsources its large-scale chip production to Samsung, while maintaining a chip manufacturing research center in Albany, New York, that produces test runs of chips.
The advance helps secure the continuation of Moore’s Law, an observation first made by Intel co-founder Gordon Moore in 1965 that the number of transistors in a dense integrated circuit (IC) would double about every two years.
While the law has largely held so far, innovation in this area is slowing as transistor sizes approach the near-atomic scale.
In 2017, the director of future silicon technology for ARM Research told E&T that time is running out for Moore’s Law and predicted that advancements in chip technologies would begin to slow rapidly.