Posted by rbanffy 5 days ago
This team has made a nonlinear lattice that relies on something they call "Joule-Thomson-like expansion." The Joule-Thomson effect is a real-gas refinement of the PV=nRT picture from beginning science: compression heats a gas and expansion cools it, but for throttled expansion the ideal gas law alone predicts no temperature change at all; the cooling (or heating) comes from intermolecular forces.
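To make the distinction concrete, here's a back-of-envelope sketch (my own numbers, nothing from the paper) using the low-density van der Waals approximation mu_JT ~ (2a/(RT) - b)/Cp; set a = b = 0 and an ideal gas has no Joule-Thomson effect at all:

    # Joule-Thomson coefficient of a van der Waals gas in the
    # low-density approximation. With a = b = 0 (ideal gas),
    # mu_JT = 0: pure PV=nRT predicts no throttling cooling.

    R = 8.314  # gas constant, J/(mol K)

    def mu_jt(a, b, T, Cp):
        """Approximate Joule-Thomson coefficient in K/Pa."""
        return (2 * a / (R * T) - b) / Cp

    # van der Waals constants: a in Pa m^6/mol^2, b in m^3/mol
    gases = {
        "ideal": (0.0,     0.0,     2.5 * R),
        "N2":    (0.137,   3.87e-5, 3.5 * R),
        "He":    (0.00346, 2.38e-5, 2.5 * R),
    }

    for name, (a, b, Cp) in gases.items():
        mu = mu_jt(a, b, 300.0, Cp)  # throttling at room temperature
        verdict = "cools" if mu > 0 else ("warms" if mu < 0 else "no effect")
        print(f"{name:5s}: mu_JT = {mu:+.2e} K/Pa -> expansion {verdict}")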
The reason they're studying the photonic equivalent [1] is that it focuses an array of inputs, "causing light to condense at a single spot, regardless of the initial excitation position." Usually the problem is that light is linear: two beams blissfully ignore each other. To do useful switching or compute, one of the beams has to be able to act as a control signal.
A photon gas doesn't conserve the number of particles (the n in PV=nRT) the way beginning physics would suggest. This lets the temperature of the gas control the output.
The temperature, driven by certain specific inputs, produces the nonlinear response. I didn't see a specific claim about what gain they achieved.
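For anyone digging into the theory side: the "optical thermodynamics" literature has the modal powers of a multimode nonlinear lattice settle into a Rayleigh-Jeans distribution, |c_i|^2 = T/(eps_i - mu). Here's a toy sketch of my own (made-up tight-binding mode spectrum, not the authors' code) showing how a low optical temperature condenses nearly all the power into a single mode:

    # Rayleigh-Jeans occupations: |c_i|^2 = T / (eps_i - mu).
    # At low optical temperature nearly all power condenses into
    # the lowest-eigenvalue mode -- the "light condenses at a
    # single spot" behaviour.
    import numpy as np

    M = 64  # number of lattice modes (made-up tight-binding band)
    eps = -2.0 * np.cos(np.pi * np.arange(1, M + 1) / (M + 1))

    def occupations(T, total_power=1.0):
        """Rayleigh-Jeans occupations, with mu found by bisection so
        the occupations sum to total_power (mu lies below the band)."""
        lo, hi = eps.min() - 1e6, eps.min() - 1e-12
        for _ in range(200):
            mu = 0.5 * (lo + hi)
            n = T / (eps - mu)
            if n.sum() > total_power:
                hi = mu  # too much power: push mu further below the band
            else:
                lo = mu
        return n

    for T in (1.0, 0.1, 0.001):
        n = occupations(T)
        print(f"T = {T:5.3f}: fraction of power in lowest mode = {n[0]/n.sum():.3f}")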
This paper is more on the theoretical end of photonics research. Compare it with practical work such as that at UBC Vancouver [2], where a device achieves a "weight update speed of 60 GHz" and, for clustering, handles "112 x 112-pixel images"; the tech doesn't compete well against electronics yet.
TSMC and Nvidia are attempting photonics plays too, but they're only achieving raw I/O with photons: attaching the fiber directly to the chip to save watts and boost speeds.
Basic physics gets in the way too. A photon's wavelength at near UV is 400 nanometers, but the transistors in a smartphone are measured at 7-ish nanometers. Electrical conductors can be made fundamentally smaller than a waveguide for light. Where light could maybe outshine electrons is in switching speed, but this research paper doesn't claim high switching speed.
This has interesting applications. For example, you can exploit this with dilute metal vapor seeded in an expanding helium gas to cool the metal vapor to very low temperature: the rapid free-jet expansion converts random thermal motion into directed flow, and collisions with the cold helium draw energy out of the metal vapor (under pure Joule-Thomson throttling, helium, being above its inversion temperature at room temperature, would actually warm slightly). If done in a vacuum chamber, then in the region before the shockwave formed by the helium, the supercooled metal atoms form small van der Waals clusters that can be spectroscopically probed in the jet. This was an interesting area of study back in the 80s that advanced our understanding of van der Waals forces.
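Supporting arithmetic for the helium aside (van der Waals constants from standard tables; the estimate is known to be rough):

    # van der Waals estimate of helium's Joule-Thomson inversion
    # temperature, T_inv = 2a / (R*b); above it, throttling warms.
    a_He, b_He = 0.00346, 2.38e-5  # Pa m^6/mol^2, m^3/mol
    R = 8.314
    print(f"He inversion temperature ~ {2 * a_He / (R * b_He):.0f} K "
          f"(measured value is around 40-45 K)")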
It's probably been six years since I looked at this space. The problem at the time for TSMC and several others was that their solutions worked fairly well for firing photons vertically out of the chip and not well at all for firing them horizontally through the chip. I don't know whether, in the short to mid term, an optical PCIe or memory bus buys more overall horsepower than faster cross-chip communication within CPUs. But the solutions they were chasing back then were good between chips, maybe between chiplets, which could still be an interesting compromise.
> 400 nanometers, but the transistors in a smartphone are measured at 7-ish nanometers
The best EM sensors need to be at least 1/10th the wavelength they are sending/receiving, right? 40 nm isn't awful, but it does suggest light for communication between functional units rather than for assembling them.
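Rough numbers, treating the factor of ten as the heuristic it is rather than a hard physical limit:

    # Minimum feature size suggested by the 1/10-wavelength rule
    # of thumb, for a few common wavelengths.
    wavelengths_nm = {"near-UV": 400, "green": 532, "telecom C-band": 1550}

    for name, lam in wavelengths_nm.items():
        print(f"{name:14s}: lambda = {lam:4d} nm -> lambda/10 = {lam / 10:5.1f} nm")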
Not really: "7 nm" is just a marketing name; the actual transistors are around 50 nm.
Probably not 7 nm small, but not the full 50 nm either.
The details are probably fiddly though.
- what is a "photon gas"? Is this a state of matter? What is the matter if photons aren't matter?
- ideal gas law, PV=nRT not obeyed? Due to ionization or something? Photon pressure?
- Joule-Thomson effect?
- Building computers out of light?
- Which thermodynamic properties or laws are being obeyed? Is this something like a Carnot cycle, but with photons?
If you can control the nonlinearity, you can control the wavelength change and thereby change properties such as the angle of refraction, changing where the light goes (as in a rainbow or a prism, where the blue light refracts more than the red).
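A toy example of that dispersion effect (the Cauchy coefficients below are approximate values for ordinary BK7 glass, nothing to do with the paper's lattice):

    # Wavelength-dependent refractive index via the Cauchy
    # approximation n(lambda) = A + B/lambda^2, then Snell's law.
    # Blue bends more than red: its angle ends up closer to the normal.
    import math

    A, B = 1.5046, 0.00420  # Cauchy coefficients for BK7 (B in um^2)

    def refraction_angle(theta_in_deg, lam_um):
        n = A + B / lam_um**2  # index at this wavelength
        return math.degrees(math.asin(math.sin(math.radians(theta_in_deg)) / n))

    for colour, lam in (("red", 0.650), ("blue", 0.450)):
        print(f"{colour:4s} ({lam * 1000:.0f} nm) at 45 deg incidence -> "
              f"{refraction_angle(45.0, lam):.2f} deg inside the glass")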
They let the photon gas move around inside a crystal where the photons behave nonlinearly, attracting or repelling each other to some degree, so that the gas behaves more like CO2 or a refrigerant than like helium.
It's yet another misleading use of the word "photon". A photon gas basically refers to the statistical behaviour of quantised oscillators (typically some atoms/molecules that can vibrate) in the walls of a closed box, called a "cavity". Since the oscillators de-excite by emitting radiation which stays inside the box until it's re-absorbed by the oscillators, you can sort of get away with thinking of it like the distribution of the modes of oscillations of the electromagnetic fields inside the cavity, which is what photons actually are.
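To put numbers on that cavity picture: with photon number unconserved (zero chemical potential), the mean occupation of each mode is Bose-Einstein, <n> = 1/(exp(h*nu/kT) - 1). A quick sketch of my own:

    # Mean photon number per cavity mode from Bose-Einstein
    # statistics with mu = 0 (photon number is not conserved).
    # A room-temperature cavity holds essentially zero optical photons.
    import math

    h = 6.626e-34  # Planck constant, J s
    k = 1.381e-23  # Boltzmann constant, J/K

    def mean_photons(nu_hz, T_kelvin):
        return 1.0 / math.expm1(h * nu_hz / (k * T_kelvin))

    for label, nu in (("100 GHz microwave", 1e11), ("green light", 5.5e14)):
        for T in (300, 3000):
            print(f"{label:17s} at {T:4d} K: <n> = {mean_photons(nu, T):.3e}")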
You can engineer a waveguide if you understand the nonlinear theory they propose. There's no heat exchange involved, which is easy to get confused about, because the article's writing doesn't really engage with what "optical thermodynamics" means.
>if the routing is dynamically changeable
At this point, probably not; it requires a finely engineered waveguide with a well-defined "ground state".
>it works in reverse, eg light coming in can be routed to one of several output ports
In theory it works in reverse, since everything in this system is time-reversible (i.e., the "optical thermodynamics" is just an analogy, not real thermodynamics, which would break time reversibility). This is demonstrated via a simulation in the SI, but they did not achieve it experimentally (it may be difficult; I am not an experimentalist, so I cannot comment).
What is even less clear than the above is how this is being used. Presumably it's not just about routing light to some fixed location, but rather allowing it to be switched, so perhaps(?!) the photonic lattice has multiple inputs that interact, resulting in light being steered to one of many outputs? Light being used to switch light?
I dunno - it was clear as mud. I'm basically just guessing here.
Electronics has topped out in the gigahertz range. We keep cramming more cores, more ALUs, and wider vectors onto a chip by shrinking it, and we keep making it more energy efficient, but it's not getting faster in terms of linear compute and hasn't for a while. We've started hitting physical limits there.
Optics could, AFAIK, run in the terahertz range. That's thousands of gigahertz. So wouldn't that be like thousands of electronic cores, but it would accelerate all code including non-parallelizable code?
I've wondered if this might not be a bigger deal for general purpose compute than quantum computing.
Or is my understanding way off?
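For what it's worth, here's the naive arithmetic in my head (illustrative assumptions, not measurements): a faster clock helps all code linearly, while extra cores are capped by Amdahl's law.

    # Hypothetical comparison: 1 THz optical logic vs a 5 GHz
    # electronic CPU and a 200-core chip. Made-up numbers.

    def amdahl_speedup(parallel_fraction, n_cores):
        """Amdahl's law: speedup from n_cores for a given parallel fraction."""
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_cores)

    clock_ratio = 1e12 / 5e9
    print(f"clock ratio: {clock_ratio:.0f}x, on serial and parallel code alike")

    for p in (0.5, 0.9, 0.99):
        print(f"{p:.0%} parallel on 200 cores: {amdahl_speedup(p, 200):.1f}x at best")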