From the punch-card-driven looms of the 1800s to today’s smartphones, data storage has always relied on a simple principle: an object that can switch between an “on” and an “off” state can be used to store information.
In modern computers, binary code — ones and zeroes — takes different physical forms. In a laptop, transistors represent these states by operating at either a high or a low voltage. On a compact disc, a “one” is marked by the transition between a microscopic pit and the surrounding flat surface, while a “zero” is a region with no transition.
Traditionally, the physical size of these binary components has limited how much data a device can store. Now, researchers at the University of Chicago’s Pritzker School of Molecular Engineering (UChicago PME) have developed a method to encode ones and zeroes using crystal defects — imperfections at the atomic level. This breakthrough could significantly enhance the storage capacity of classical computer memory.
Their findings were published on February 14 in Nanophotonics.