Moore's Law, which observed that the number of transistors on a chip would double every two years, is reaching its limit, as it is becoming more and more difficult to pack more transistors into microchips, even at the nano level. As a result, engineers are scrambling to find new ways and methods to significantly boost computing power.
One emerging technology to increase computing power is optical computing (also known as photonic computing), which uses photons—light—instead of electrons to transfer data and perform tasks and calculations.
"Photonic computing is using the physics of light to do certain kinds of computational tasks more efficiently than what is possible with electronic signals," said Ryan Hamerly, Senior Scientist at NTT Research in Sunnyvale, CA, as well as a visiting scientist at the Department of Electrical Engineering and Computer Science Research Laboratory of Electronics at the Massachusetts Institute of Technology (MIT) in Cambridge, MA.
Hamerly explained that a light-based chip is one fabricated using standard semiconductor processes, but with a slightly different set of fabrication recipes that yield photonic structures rather than electronic ones. "With electronic structures, you're more interested in controlling the currents, and you care about resistances and capacitances, but in photonic structures, you care more about controlling the flow of light," Hamerly observed.
In traditional computing, electrons move through wires relatively slowly and data propagates at comparatively low frequencies (gigahertz, or tens of gigahertz at most), whereas optical photons oscillate at frequencies of several hundred terahertz, according to Hamerly. Typically, he said, photonic computing using lasers as a light source and fiber optics is much faster than traditional computing.
Hamerly pointed out that light-based chips are fabricated using some variant of complementary metal-oxide semiconductor (CMOS) processing, just like traditional semiconductors. "Even if it's not CMOS, you're using the same types of tools, such as deposition, lithography, and so forth," he said.
Hamerly said he believes we will see some photonic cores that are efficient at specific things start to take off as components of larger systems in the next couple of years, adding that he expects photonic computing will mostly remain in the realm of the datacenter, rather than migrating to edge devices. He thinks the adoption of light-based computing will be a gradual process, and not apparent to end-users. "The engineers designing the hardware and software understand these advances, but users will only notice that things are speeding up," Hamerly said.
Maurice Steinman, vice president of engineering at Lightelligence, a photonic computing company located in Boston, MA, pointed out that "photonic computing is a form of analog computing, so there is a noise component and inherent inaccuracy."
Currently, artificial intelligence (AI) is the main focus for photonic computing, according to Steinman. "The fact that AI computing workloads are statistical in nature means that if there are advantages in speed and power, any inaccuracies are well-tolerated, so it's a good fit with photonic computing," he said.
Nicholas Harris, co-founder and CEO of Lightmatter, a photonic computing company also based in Boston, observed that while traditional computing relies on transistors for computation, communication, and memory, "Photonic computing augments our toolkit by offering new ways to do addition, multiplication, and data movement. The upside of this new set of tools is continued computational progress—increased energy efficiency and throughput for both single chips and systems."
In addition, Harris said, photonics enables extremely low-latency computation, with a very high number of operations per second. "That's a hard combination to realize and is unique to photonics—enabled by the absence of parasitics including resistance, inductance, and capacitance, which set the characteristic time of electronic systems."
Harris said optical computing can be divided into two categories—computing and interconnect (data movement).
"There are many photonic computing accelerator architectures, and they share properties including a speed-of-light latency for completion of addition and multiplication for tensors, the ability to support extremely high clock frequencies, and also the ability to leverage multiple wavelengths of light simultaneously to parallelize the computation," Harris said.
On the interconnect side, he continued, photonics enables an extremely low-latency data transport, efficient communications between systems, disaggregation, and extremely large die sizes.
One of the key benefits of optical computing is its incredibly fast speed, as photons travel at the speed of light, far more quickly than electrons move through copper wires. For example, according to Lightelligence's Steinman, at the end of 2021 the company released its Photonic Arithmetic Computing Engine (PACE), which uses a vector-matrix multiplier architecture, an approach many other photonic computing devices also employ.
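The core operation a vector-matrix multiplier architecture accelerates can be stated simply. The sketch below (illustrative only, not Lightelligence code) computes y = Wx electronically; in a photonic engine, each multiply-accumulate would instead be performed by light passing through a mesh of optical elements.

```python
# Conceptual sketch of the vector-matrix product y = Wx, the kernel
# a photonic multiplier implements in the optical domain.

def matvec(W, x):
    """Multiply matrix W (a list of rows) by vector x."""
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) for row in W]

W = [[1.0, 2.0],
     [3.0, 4.0]]
x = [1.0, -1.0]
print(matvec(W, x))  # [-1.0, -1.0]
```

Because every output element is an independent sum of products, the whole operation can be carried out in parallel as light traverses the chip, which is where the latency advantage comes from.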
"We targeted PACE at the Ising problem," Steinman said, noting that Ising models often are used for difficult math problems characterized as computationally hard combinatorial optimizations with many different variables, such as traveling salesman problems (TSP). "We purpose-built the algorithm, it's not a programmable engine; it's hard-coded algorithm to do the Ising problem."
Steinman said when PACE was compared to an NVIDIA GPU, it was found to operate 800 times faster. "The key is finding the right algorithm; that's where we see the really dramatic speed-ups," Steinman said. "That's where the sweet spot is."
Steinman also said that with any electro-optical system, advanced packaging is a concern. "It's more than a standard CMOS chip or a flip chip in a package," Steinman said. "You've got the integration of an electrical chip, an optical chip, a light source, I/O, and power—it's everything possible, so advanced packaging is an area we spend a lot of time focusing on. This is one of the key technical barriers to photonic computing."
For the follow-up to PACE, Lightelligence is adding greater programmability so that, when it is released, developers can identify the applications on which it will work best for them, Steinman said.
Moving even further beyond classical computing, photonics can also be used for quantum computing. Christian Weedbrook, founder and CEO of Xanadu, a quantum computing company in Toronto, Canada, called quantum computers "the next generation of supercomputers."
Weedbrook explained that instead of working on a binary system like a classical computer that calculates digitally with transistors using 0s and 1s, quantum computers calculate using quantum bits (qubits) that can be 0 and 1 at the same time. In general, he said, computing power increases exponentially as the number of qubits available increases.
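The exponential scaling Weedbrook describes can be seen in what it costs a classical machine just to represent a quantum state. A minimal illustration (standard textbook fact, not Xanadu-specific):

```python
# An n-qubit register is described by 2**n complex amplitudes, so the
# classical storage needed to hold the state doubles with each added qubit.

def state_vector_size(n_qubits):
    """Number of complex amplitudes describing an n-qubit state."""
    return 2 ** n_qubits

for n in (1, 10, 50):
    print(n, state_vector_size(n))
# 1 qubit -> 2 amplitudes; 10 -> 1,024; 50 -> over 10**15
```

This doubling is why each additional usable qubit represents a substantial jump in potential computing power.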
Weedbrook said that while there are several ways to build a quantum computer, Xanadu uses a photonic approach, utilizing optical components and fiber optics to network photonic chips together. "The benefits of a photonic approach include being scalable and modular, easily manufactured, and engineered to operate at room temperature," Weedbrook said. In contrast, most supercomputers and quantum computers need to be kept at extremely low temperatures to operate at peak efficiency.
Lightmatter's Harris believes the world is going to get comfortable with photonics-augmented computing quickly, that photonics will silently power the growth of AI and computing in general and, if all goes well, no one will even notice it's there.
John Delaney is a freelance technology writer based in Manhattan, NY, USA.