Gadgets

China Just Shrunk A Transistor to 1 nm, And It Might Revolutionize AI

A Peking University team has engineered a 1-nanometre gate electrode in a ferroelectric transistor that runs at a fraction of conventional voltage, opening a new path for AI chips that are both faster and far less power-hungry.

Researchers have announced what they describe as the world’s smallest and most energy-efficient transistor, a device that could reshape the next generation of AI hardware at a time when the industry’s power demands are spiraling.

The transistors, called ferroelectric field-effect transistors (FeFETs), are built to mimic the human brain by combining memory and processing in a single unit. Conventional semiconductor chips keep data storage and computation in separate areas, creating efficiency bottlenecks. FeFETs eliminate that divide entirely.

The long-standing problem with FeFETs has been their operating voltage. Traditional FeFETs require about 1.5 volts to write and erase data, while modern logic circuits typically run below 0.7 volts. The researchers described this gap as being like a heavy door that is difficult to push open.

The Peking University team, led by Qiu Chenguang alongside Peng Lianmao of the Chinese Academy of Sciences, found a structural fix. Using advanced processing techniques, they scaled the gate electrode down to just 1 nanometre, achieving atomic-level precision. The result is a transistor that operates at as low as 0.6 volts, consumes roughly one-tenth of the lowest energy previously reported for any FeFET worldwide, and responds in as little as 1.6 nanoseconds. For context, a DNA molecule is approximately two nanometres wide.

Qiu said the in-memory computing capability of FeFETs “aligns closely with the future evolution of AI chips” and that the industry views them as among the most promising devices for brain-inspired neuromorphic computing. The research was published in the peer-reviewed journal Science Advances, and Peking University has already filed patents on the design.

Global data center electricity consumption is forecast to rise sharply through the decade, driven largely by AI workloads. Chips that deliver more compute per watt would ease that pressure considerably. Still, this is a laboratory demonstration, not a production chip: independent verification and a path to manufacturing at scale remain open questions.

Abdul Wasay

Abdul Wasay explores emerging trends across AI, cybersecurity, startups and social media platforms in a way anyone can easily follow.