Like so many people under coronavirus quarantine right now, there's a good chance you are reading this on an LCD screen. After all, most laptops, phones and TVs have them. If so, you are gazing upon a strange and hallucinogenic miracle of modern science and engineering.
These may look like words, but they are actually a scintillating pattern of microscopic dots, each containing an equally tiny quantity of a substance not quite solid or liquid, a strange state of matter known as a liquid crystal. Dozens of times a second, some portion of those liquid-crystal-containing cells flickers on or off in an ultrafast physical process, realigning their long, skinny molecules into one configuration or another, flitting back and forth like the wings of a hummingbird. And those molecules are based not on silicon—like the microchips that drive our displays—but on carbon and hydrogen, the stuff of life.
Fifty years ago, in 1970, a handful of engineers, chemists, physicists and mathematicians nailed down the design for the world's first commercially viable liquid-crystal-display technology. You've seen the results of their handiwork a thousand times, from the earliest digital watches and portable calculators through today's big-screen televisions and smartphones. Though LCD technology has evolved, the basic principles that made those early displays possible still underlie the screens with which humanity has an almost perversely intimate relationship.
The story of how the LCD came to be is an unhappy one for the American businesses that missed an opportunity seized instead by foreign companies—as you can still see from most of the brands of LCD-equipped products you buy today. It's also an example of the choices that companies make, seemingly foolish in hindsight, when they can't benefit from expensive, large-scale, publicly funded research and development.
From The Wall Street Journal