cycomanic 3 days ago

People have done these sorts of "optical computing" demonstrations for decades, despite David Miller showing that fundamentally digital computing with optical photons will be immensely power hungry (I say digital here because there are some applications where analog computing can make sense, but it almost never relies on memory for bits).

Specifically, this paper is based on simulations. I've only skimmed it, but the power efficiency numbers sound great mainly because they quote 40 GHz read/write speeds, while the devices consume comparatively large power even when not reading or writing (the lasers have to be running constantly). I also don't think they included the contributions of the modulation and the required drivers (typically you need quite large voltages). Somebody already pointed out that the size of these cells is massive, and that's again fundamental.
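To put a number on the always-on laser issue, here's a rough back-of-envelope sketch; the laser and driver powers are assumptions for illustration, not figures from the paper:

    # Rough sketch: energy per bit when the laser stays on regardless of activity,
    # amortized over the access rate. Power numbers are assumed, not from the paper.
    laser_power_w = 10e-3    # assumed always-on laser (wall-plug) power
    driver_power_w = 5e-3    # assumed modulator/driver dissipation, often left out
    bit_rate_hz = 40e9       # the claimed 40 GHz read/write rate

    energy_per_bit = (laser_power_w + driver_power_w) / bit_rate_hz
    print(f"{energy_per_bit * 1e15:.0f} fJ/bit at 100% utilization")      # ~375 fJ/bit
    print(f"{energy_per_bit * 1e15 / 0.01:.0f} fJ/bit at 1% utilization")  # static power doesn't scale down

Even with favourable assumptions, the static power dominates as soon as the memory isn't being accessed at full rate.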

As someone working in the broad field, I really wish people would stop this type of publication. While these numbers might sound impressive at first glance, they really are completely unrealistic. There are lots of legitimate applications of optics and photonics; we don't need to resort to this sort of stuff.

embedding-shape 3 days ago | parent | next [-]

> showing that fundamentally digital computing with optical photons will be immensely power hungry

> they really are completely unrealistic

Unrealistic only because they're power hungry? That sounds like a temporary problem, kind of like the ML approaches we came up with in the 80s/90s that we couldn't actually run because of the hardware resources required, but that work fine today.

Maybe even if the solutions aren't useful today, they could be useful in the future? Or maybe these results will inspire more people to work specifically on the power usage?

"we don't need to resort to this sort of stuff" makes it sound like this is all so beneath you and not deserving of attention, but why are you then paying attention to it?

gsf_emergency_6 3 days ago | parent | next [-]

The Miller limit is fundamentally due to photons being bosons: not great for digital logic (switches), as opposed to carrying information.

There are promising avenues for using "bosonic" nonlinearity to overtake traditional fermionic computing, but they are basically not being explored by EE departments, despite (because of?) their outsized funding and attention.

cycomanic 2 days ago | parent | prev | next [-]

> > showing that fundamentally digital computing with optical photons will be immensely power hungry

> > they really are completely unrealistic

> Unrealistic only because they're power hungry? That sounds like a temporary problem, kind of like the ML approaches we came up with in the 80s/90s that we couldn't actually run because of the hardware resources required, but that work fine today.

> Maybe even if the solutions aren't useful today, they could be useful in the future? Or maybe these results will inspire more people to work specifically on the power usage?

No, they are fundamentally power hungry, because you essentially need a nonlinear response, i.e. photons need to interact with each other. However, photons are bosons and really dislike interacting with each other.

Same thing about the size of the circuits: it is determined by the wavelength of light, so they are fundamentally much larger than electronic circuits.
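For a sense of scale (rough, illustrative numbers on my part, not the paper's):

    # Photonic features are bounded by the wavelength of light in the material;
    # electronic features by lithography. All numbers below are rough assumptions.
    wavelength_nm = 1550          # typical telecom wavelength (vacuum)
    n_eff = 2.4                   # assumed effective index of a silicon waveguide mode
    optical_limit_nm = wavelength_nm / (2 * n_eff)   # diffraction-limited feature, ~320 nm

    transistor_pitch_nm = 50      # order of magnitude for a modern CMOS contacted gate pitch
    print(f"~{optical_limit_nm:.0f} nm optical limit vs ~{transistor_pitch_nm} nm electronics")
    # Real devices (rings, MZIs) are micrometres to hundreds of micrometres long,
    # so the per-bit footprint gap is orders of magnitude and doesn't shrink with process nodes.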

> "we don't need to resort to this sort of stuff" makes it sound like this is all so beneath you and not deserving of attention, but why are you then paying attention to it?

That's not what I said. In fact, they deserve my attention because they need to be called out, as the article clearly does not highlight the limitations.

scarmig 2 days ago | parent | prev | next [-]

Universities believe that constantly putting out pieces making some bit of research sound revolutionary and world-changing increases public support for science. It doesn't, because the vast majority of science is incremental, mostly learning about some weird, niche thing that probably won't translate into applications. This causes the public to misunderstand the role of scientific research and to lose faith in it when it doesn't deliver on its promises (promises made by the university press office, not the researchers).

gsf_emergency_6 3 days ago | parent | prev | next [-]

I only upvoted to send a message to the moderators not to upweight uni/company press releases :) Sadly, the energy of VC culture goes into refining wolf-crying, despite all the talk of due diligence, "thinking for yourself", and "understanding value".

The core section of the paper (linked below) is pp. 8-9.

2 mW for 100s of picoseconds is huge.
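Back-of-envelope (the SRAM figure here is a rough assumption of mine, not from the paper):

    # Energy of one access at the quoted power/duration, vs a rough CMOS SRAM figure.
    p_watts = 2e-3
    t_seconds = 200e-12                     # "100s of picoseconds" -- take ~200 ps
    optical_energy = p_watts * t_seconds    # = 0.4 pJ per access

    sram_energy = 5e-15                     # assumed: a few fJ per bit access in modern SRAM
    print(f"optical ~{optical_energy * 1e15:.0f} fJ/bit, ~{optical_energy / sram_energy:.0f}x an SRAM access")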

(Also GIANT voltages, if only to illustrate how coarse their simulations are):

> As shown in Figure 6(a), even with up to 1 V of noise on each Q and QB node (resulting in a 2 V differential between Q and QB), the pSRAM bitcell successfully regenerates the previously stored data. It is important to note that higher noise voltages increase the time required to restore the original state, but the bitcell continues to function correctly due to its regenerative behavior.
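To illustrate what "regenerative behavior" means here, a toy discrete-time model of a cross-coupled cell pulling Q/QB back to the stored state; the supply, gain and time step are invented for illustration, not taken from the paper:

    import numpy as np

    # Toy cross-coupled inverter pair: after a perturbation that doesn't flip the
    # sign of the Q/QB differential, positive feedback restores the stored bit.
    VDD, gain = 3.3, 5.0          # assumed supply and inverter gain (not the paper's values)

    def inverter(v):              # smooth static inverter transfer curve
        return VDD / (1.0 + np.exp(gain * (v - VDD / 2)))

    q, qb = VDD, 0.0              # stored state: Q high, QB low
    q, qb = q - 1.0, qb + 1.0     # inject 1 V of noise on each node

    for _ in range(40):           # relax toward the latch's stable point
        q, qb = q + 0.2 * (inverter(qb) - q), qb + 0.2 * (inverter(q) - qb)

    print(f"Q = {q:.2f} V, QB = {qb:.2f} V")   # back to ~VDD and ~0: the bit is restored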

fooker 2 days ago | parent | prev [-]

40 GHz memory/compute for 10-100x the power sounds like a great idea to me.

We are going to have abundant energy at some point.
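Rough math on that tradeoff (the electronic clock and the midpoint of the 10-100x range are my assumptions):

    # What "40 GHz for 10-100x power" means for energy per bit, per access.
    optical_clock_hz = 40e9
    electronic_clock_hz = 5e9      # assumed clock for fast on-chip SRAM/logic
    power_penalty = 30             # midpoint-ish of the quoted 10-100x range

    speedup = optical_clock_hz / electronic_clock_hz    # 8x
    print(f"{speedup:.0f}x faster per access, {power_penalty}x power, "
          f"{power_penalty / speedup:.1f}x more energy per bit")

So it's a latency/bandwidth trade rather than an efficiency one; whether that's a good deal depends on energy really becoming the cheap resource.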