207 points by rbanffy 1 day ago | 25 comments
patcon 1 day ago
Whenever I hear about neuromorphic computing, I think about the guy who wrote this article, who was working in the field:

Thermodynamic Computing https://knowm.org/thermodynamic-computing/

It's the most high-influence, low-exposure essay I've ever read. As far as I'm concerned, this dude is a silent, prescient genius working quietly for DARPA, and I had a sneak peek into future science when I read it. It's affected my thinking and trajectory for the past 8 years.

vpribish 35 minutes ago
I dunno, man. strikes me as crazy persecuted-genius syndrome. red flags go up from the very start.
iczero 23 hours ago
Isn't this just simulated annealing in hardware attached to a grandiose restatement of the second law of thermodynamics?
pclmulqdq 21 hours ago
Yes. This keeps showing up in hardware engineering labs, and never holds up in real tasks.
moralestapia 13 hours ago
It's not.

What an extremely uneducated guess.

saagarjha 9 hours ago
Educate, then.
moralestapia 1 hour ago
Happy to do it, mail in profile.

Rate is $1,000 USD/hour.

ahnick 21 hours ago
Is this what Extropic (https://www.extropic.ai/) is aiming to commercialize and bring to market?
andrepd 9 hours ago
This is literally a borderline crank article. "A new framework for physics and computing" turns out to be quantum annealing for SAT lol

The explanations about quantum mechanics are also imprecise and go nowhere towards the point of the article. Add a couple janky images and the "crank rant" impression is complete.

afarah1 1 day ago
Interesting read, more so than the OP. Thank you.
lo_zamoyski 21 hours ago
I will say that the philosophical remarks are pretty obtuse and detract from the post. For example...

"Physics–and more broadly the pursuit of science–has been a remarkably successful methodology for understanding how the gears of reality turn. We really have no other methods–and based on humanity’s success so far we have no reason to believe we need any."

Physics, which is to say, physical methods have indeed been remarkably successful...for the types of things physical methods select for! To say it is exhaustive not only begs the question, but the claim itself is not even demonstrable by these methods.

The second claim contains the same error, but with more emphasis. This is just off-the-shelf scientism, and scientism, apart from what withering refutations demonstrate, should be obviously self-refuting. Is the claim that "we have no other methods but physics" (where physics is the paradigmatic empirical science; substitute accordingly) a scientific claim? Obviously not. It is a philosophical claim. That already refutes the claim.

Thus, philosophy has entered the chat, and this is no small concession.

dekhn 2 hours ago
Show us a real example of something that your putative non-physics science can do that physics cannot, in a way that would be comprehensible to a sufficiently open-minded scientist.

It seems unlikely you could suggest a concrete alternative to physics that explains observable phenomena as well and makes generalizable predictions. Showing one would advance your theoretical philosophy. In the meantime, the rest of us will stick to physics, because nobody has offered a coherent alternative that explains our observations.

vlovich123 19 hours ago
I’m not sure I understand what you’re trying to say. It’s not really questionable that science and math are the only things to come out of philosophy or any other academic pursuit that have actually shown us how to objectively understand reality.

Now, physics vs. other scientific disciplines, sure. Physicists love to claim dominion, just like mathematicians do. It is generally true, however, that physics = math + reality, and that we don’t actually have any evidence of anything in this world existing outside a physical description (e.g. a lot of physics combined = chemistry, a lot of chemistry = biology, a lot of biology = sociology, etc.). Thus it’s reasonable to assume that the chemistry in this world is 100% governed by the laws of physics, and transitively this is true for sociology too (indeed, game theory is one way we quantifiably explain the physical reality of why people behave the way they do). We also see this in math, where different disciplines have different “bridges” between them. Does that mean they’re actually separate disciplines, or just that we’ve chosen to name features on the topology as such?

MantisShrimp90 13 hours ago
It's just not that simple. The best way I can dovetail with the author is that you are thinking in terms of the abstraction, but you have mistaken the abstraction for reality.

Physics, the biological sciences: these are tools the mind uses to make guesses about the future based on past events. But the abstraction isn't perfect, and it's questionable whether it could or should one day be.

The clear example is that large breakthroughs in science often come from rethinking this fundamental abstraction to explain problems the old implementation had trouble with. Case in point: quantum physics, which warped how we originally understood Newtonian physics. Einstein fucking hated quantum mechanics because he felt it undermined the idea of objective reality.

The reality (pun intended) is that it is much more complex than abstractions like science, and we would do well to remember that they are pragmatic tools, ultimately unconcerned with metaphysics, the study of the underlying nature of reality.

This all seems like philosophical rambling until we get to little lines like this. Scientism, the belief that science is the primary and only necessary lens for understanding the world, falls into the same trap as religion: thinking you have the answer to reality, so anything outside it is either unnecessary or even dangerous.

vlovich123 5 hours ago
I’m not sure I really understand this point. I believe that the scientific method (hypothesis, repeated tests, reality check) is the only successful method we’ve developed to advance our understanding of how the world works. I never claimed it’s perfect; that requirement is being injected into my position. A counterclaim has to show that there’s something better than the scientific method that humans have been using to attain a better understanding of reality.

Often such attempts try to place themselves wholly outside the realm of science, which I don’t think puts them on strong footing. Just as updates to standard models still have to explain our current understanding of quantum mechanics and relativity, alternate methodologies for observing reality have to hold up to scientific scrutiny.

But I claim ignorance here. What better mechanisms has humanity developed for observing and understanding reality?

evolextra 1 day ago
Man, this article is incredible. So many ideas resonate with me, but I could never formulate them myself. Thanks for sharing; all my friends have to read this.
epsilonic 23 hours ago
If you like this article, you’ll probably enjoy reading most publications from the Santa Fe Institute.
antonvs 9 hours ago
[flagged]
GregarianChild 1 day ago
I'd be interested to learn who paid for this machine!

Did Sandia pay list price? Or did SpiNNcloud Systems give it to Sandia for free (or at least at a heavily subsidised price)? I conjecture the latter. Maybe someone from Sandia is reading this and can provide details?

SpiNNcloud Systems is known for making misleading claims, e.g. their home page https://spinncloud.com/ lists DeepMind, DeepSeek, Meta and Microsoft as "Examples of algorithms already leveraging dynamic sparsity", giving the false impression that those companies use SpiNNcloud Systems machines, or the specific computer architecture SpiNNcloud Systems sells. Their claims about energy efficiency (like "78x more energy efficient than current GPUs") seem sketchy. How do they measure energy consumption and trade it off against compute capacity? E.g. a Raspberry Pi uses less absolute energy than an NVIDIA Blackwell, but is this a meaningful comparison?
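
To make that last point concrete: absolute wattage alone says little; energy per unit of work is the comparable quantity. A toy calculation, with entirely made-up numbers (nothing here is measured from real hardware):

  # Illustrative numbers only, not measurements: what matters is
  # joules per operation, not absolute watts.
  devices = {
      "small SBC (assumed)":      {"watts": 7.0,   "ops_per_sec": 5e9},
      "datacenter GPU (assumed)": {"watts": 700.0, "ops_per_sec": 2e15},
  }

  for name, d in devices.items():
      joules_per_op = d["watts"] / d["ops_per_sec"]
      print(f"{name}: {joules_per_op:.2e} J/op")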

I'd also like to know how to program this machine. Neuromorphic computers have so far been terribly difficult to program. E.g. have JAX, TensorFlow and PyTorch been ported to SpiNNaker 2? I doubt it.

SubiculumCode 2 hours ago
I don't know, but just wanted to say that my son got a job there as a mechanical engineer, and I couldn't be more proud. He can't tell me much because of classified status, but I can tell he loves his job and who he works with. Just sending praise to Sandia
floren 21 hours ago
As an ex-employee (and I even did some HPC) I am not aware of any instances of Sandia receiving computing hardware for free.
prpl 19 hours ago
No, but sometimes systems are provided for demonstration/evaluation, though that wouldn’t usually make a press release.
rbanffy 7 hours ago
Unless the manufacturer makes it.
DonHopkins 10 hours ago
Deep Mind (Google’s reinforcement learning lab), Deep Seek (High-Flyer’s LLM lab), Deep Crack (EFF’s DES cracker), Deep Blue (IBM’s chess computer), and Deep Thought (Douglas Adams’ universal brain) all set the stage...

So naturally, this thing should be called Deep Spike, Deep Spin, Deep Discount, or -- given its storage-free design -- Deep Void.

If it can accelerate nested 2D FORTRAN loops, you could even call it Deep DO DO, and the next deeper version would naturally be called Deep #2.

JD Vance and Peter Thiel could gang up, think long and hard, go all in, and totally get behind vigorously pushing and fully funding a sexy supercomputer with more comfortably upholstered, luxuriously lubricated, passively penetrable cushioned seating than even a Cray-1, called Deep Couch. And the inevitable jealous break-up would be more fun to watch than the Musk-Trump Bromance!

justinclift 4 hours ago
> or -- given its storage-free design -- Deep Void.

Sounds like the big brother of Dev Null? :)

bob1029 1 day ago
I question how viable these architectures are when considering that accurate simulation of a spiking neural network requires maintaining strict causality between spikes.

If you don't handle effects in precisely the correct order, the simulation ends up reflecting the host architecture, the network topology, and how race conditions happen to resolve rather than the model itself. We need to simulate the behavior of one spike preceding another in exactly the right way, or things like STDP will wildly misfire. The "online learning" promised land will turn into a slip & slide.

A priority queue using a quaternary min-heap implementation is approximately the fastest way I've found to serialize spikes on typical hardware. This obviously isn't how it works in biology, but we are trying to simulate biology on a different substrate so we must make some compromises.
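
For anyone curious what that looks like, here's a minimal sketch in Python. It uses heapq (a binary min-heap) rather than the quaternary heap described above, and the event fields are made up, but the ordering guarantee is the same: spikes are always delivered in non-decreasing timestamp order, with a sequence counter as a deterministic tie-breaker.

  import heapq
  import itertools

  class SpikeQueue:
      """Deliver spike events in strict time order (binary-heap sketch)."""

      def __init__(self):
          self._heap = []
          self._seq = itertools.count()  # deterministic tie-break for equal timestamps

      def push(self, time, source, target, weight):
          heapq.heappush(self._heap, (time, next(self._seq), source, target, weight))

      def pop(self):
          time, _, source, target, weight = heapq.heappop(self._heap)
          return time, source, target, weight

      def __bool__(self):
          return bool(self._heap)

  # Toy event loop: an STDP rule downstream would see pre/post spike
  # ordering exactly as intended, regardless of insertion order.
  q = SpikeQueue()
  q.push(1.0, source=3, target=7, weight=0.2)
  q.push(0.5, source=1, target=7, weight=0.4)
  while q:
      t, src, dst, w = q.pop()
      print(f"t={t}: neuron {src} -> neuron {dst} (w={w})")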

I wouldn't argue that you couldn't achieve wild success in a distributed & more non-deterministic architecture, but I think it is a very difficult battle that should be fought after winning some easier ones.

mycall 18 hours ago
Artem Kirsanov provides some insights into the neurochemistry and types of neurons in his latest analysis [0] of distinct synaptic plasticity rules that operate across dendritic compartments. When neurons are simulated in this more realistic way, the timing can be deterministic.

[0] https://www.youtube.com/watch?v=9StHNcGs-JM

Joel_Mckay 2 hours ago
Many neuromorphic computing platforms are robust at handling globally asynchronous, locally synchronous (GALS) solver hardware.

There are some counterintuitive design choices that emerge, and it inevitably leads to vector machines with arbitrary-order tensor capabilities.

Have a wonderful day =)

latchkey 1 hour ago
I was tasked with getting 20,000 PS5 APUs online: the ASRock BC-250. Each chassis had 12 independent blades and, to save money, we decided not to put SSDs/NVMe into them and to iPXE-boot them instead. Each APU had 16GB of memory, but the only issue was that it wasn't ECC, so they would randomly fail.

It took a lot of effort but it actually worked!

marsten 1 day ago
Interesting that they converged on a memory/network architecture similar to a rack of GPUs.

- 152 cores per chip, equivalent to ~128 CUDA cores per SM

- per-chip SRAM (20 MB) equivalent to SM high-speed shared memory

- per-board DRAM (96 GB across 48 chips) equivalent to GPU global memory

- boards networked together with something akin to NVLink

I wonder if they use HBM for the DRAM, or do anything like coalescing memory accesses.

mikewarot 22 hours ago
I see "storage-free"... and then learn it still has RAM (which IS storage) ugh.

John von Neumann's concept of the instruction counter was great for the short run, but eventually we'll all learn it was a premature optimization. All those transistors tied up as RAM, just waiting to be used most of the time: a huge waste.

In the end, high speed computing will be done on an evolution of FPGAs, where everything is pipelined and parallel as heck.

thyristan 22 hours ago
FPGAs are implemented as tons of lookup-tables (LUTs). Basically a special kind of SRAM.
mikewarot 22 hours ago
The thing about the LUT memory is that it's all accessed in parallel, not just 64 bits at a time or so.
thyristan 11 hours ago
Not all, not always. FPGAs usually have more memory than is accessible in parallel (because memory cells are a lot cheaper than routing grid), and most customers want some block RAM anyway. So with very high LUT usage, your synthesis tool will do input or output multiplexing, or even halve your effective clock and double your "number" of LUTs by doing multi-step lookups in what is then non-parallel memory.
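
For readers unfamiliar with LUTs, a rough software model (entirely illustrative, not how any toolchain represents them): each 4-input LUT is just a 16-entry truth table, and configuring the FPGA means filling thousands of such tables. In hardware every configured LUT produces its output every cycle, rather than being read one word at a time like ordinary RAM.

  def make_lut4(fn):
      """Build the 16-entry table for a 4-input boolean function."""
      return [fn((i >> 3) & 1, (i >> 2) & 1, (i >> 1) & 1, i & 1) for i in range(16)]

  def lut_read(table, a, b, c, d):
      """One lookup: the four input bits form the address into the table."""
      return table[(a << 3) | (b << 2) | (c << 1) | d]

  # Two LUTs configured as different gates. In silicon, both (and thousands
  # more) are evaluated concurrently every clock; here we just call them.
  and_lut = make_lut4(lambda a, b, c, d: int(a and b and c and d))
  xor_lut = make_lut4(lambda a, b, c, d: a ^ b ^ c ^ d)

  print(lut_read(and_lut, 1, 1, 1, 1))  # -> 1
  print(lut_read(xor_lut, 1, 0, 1, 0))  # -> 0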
timmg 1 day ago
Doesn't give a lot of information about what this is for or how it works :/
ymsodev 1 day ago
JKCalhoun 1 day ago
Love to see a simulator where you can at least run a plodding version of some code.
shrubble 1 day ago
You don’t have to write anything down if you can keep it in your memory…
realo 1 day ago
No storage? Wow!

Oh... 138240 Terabytes of RAM.

Ok.

crtasm 1 day ago
>In Sandia’s case, it has taken delivery of a 24 board, 175,000 core system

So a paltry 2,304 GB RAM

SbEpUBz2 1 day ago
Am I reading it wrong, or does the math not add up? Shouldn't it be 138240 GB, not 138240 TB?
divbzero 1 day ago
You’re right, OP got the math wrong. It should be:

  1,440 boards × 96 GB/board = 138,240 GB
CamperBob2 1 day ago
Either way, that doesn't exactly sound like a "storage-free" solution to me.
louthy 1 day ago
Just whatever you do, don't turn it off!
Nevermark 22 hours ago
"What does this button do?" Bmmmfff.

On the TRS-80 Model III, the reset button was a bright red recessed square to the right of the attached keyboard.

It was irresistible to anyone who had no idea what you were doing as you worked, lost in the flow, insensitive to the presence of another human being, until...

--

Then there was the Kaypro. Many of their systems had a bug, software or hardware, that would occasionally cause an unplanned reset the first time you tried writing to the disk after turning it on. Exactly the wrong moment.

DonHopkins 10 hours ago
Oh, the Apple ][ reset button beat the TRS-80 Model III "hands down" many years earlier, with its "Big Beautiful Reset Button" on the upper right corner of the keyboard.

It was comically vulnerable -- just begging to be pressed. The early models had one so soft and easy to trigger that your cat could reboot your Apple ][ with a single curious paw. Later revisions stiffened the spring a bit, but it was still a menace.

There was a whole cottage industry aftermarket of defensive accessories: plastic shields that slid over the reset key, mail-order kits to reroute it through an auxiliary switch, or firmware mods to require CTRL-RESET. You’d find those in the classified ads in Nibble or Apple Orchard magazines, nestled between ASCII art of wizards and promises to triple your RAM.

Because nothing says "I live dangerously" like writing your 6502 assembly in memory with the mini assembler without saving, then letting your little brother near the keyboard.

I got sweet sweet revenge by writing a "Flakey Keyboard Simulator" in assembly that hooked into the keyboard input vector, that drove him bonkers by occasionally missing, mistyping, and repeating keystrokes, indistinguishable from a real flakey keyboard or drunk typist.

rbanffy 7 hours ago
> Because nothing says "I live dangerously" like writing your 6502 assembly in memory with the mini assembler without saving, then letting your little brother near the keyboard.

RESET on the Apple II was a warm reset. You could set a value in the reset vector area (around $3F2-$3F4, on page three) so that pressing it caused a cold start (many apps did that), but, even then, the memory is not fully erased on startup, so you'd probably be kind of OK.

dekhn 2 hours ago
The article obscures the fact that it has no directly attached durable storage but is connected via a fabric to existing HPC hardware, which almost certainly loads data onto the SpiNNaker and stores the results.

At the end of the day, processors really just load data, process it, and store results back to durable storage (or generate some visible side effect).

Footpost 1 day ago
Well, since neuromorphic methods can show that 138240 = 0, should it come as a surprise that they enable blockchain on Mars?

https://cointelegraph.com/news/neuromorphic-computing-breakt...

jonplackett 1 day ago
Just don’t turn it off I guess…
rbanffy 1 day ago
At least not while it's computing something. It should be fine to turn it off after whatever results have been transferred to another computer.
rzzzt 1 day ago
I hear Georges Leclanché is getting close to a sort of electro-chemical discovery for this conundrum.
throwaway5752 1 day ago
I feel like there is a straightforward biological analogue for this.

But in this case, one wouldn't be subject to macro-scale nonlinear effects arising from the uncertainty principle when trying to "restore" the system.

HarHarVeryFunny 1 day ago
The original intent for this architecture was for modelling large spiking neural networks in real-time, although the hardware is really not that specialized - basically a bunch of ARM chips with high speed interconnect for message passing.

It's interesting that the article doesn't say that's what it's actually going to be used for - just event-driven (message-passing) simulations, with applications to defense.

Onavo 1 day ago
Probably Ising models, phase transitions, condensed matter stuff, all to help make a bigger boom.
anArbitraryOne 15 hours ago
I love bio-inspired stuff, but can we (collectively) temper our usage of it? A better name for this would probably be something like a distributed storage and computing architecture (or someone who really understands what this thing is, please come up with a better name). If they want to say that part of the architecture is bio-inspired, or mammalian-brain-inspired, then fine, but let's be parsimonious.
rahen 1 day ago
So if I understand correctly, the hardware paradigm is shifting to align with the now-dominant neural-based software model. This marks a major shift, from the traditional CPU + OS + UI stack to a fully neural-based architecture. Am I getting this right?
dedicate 1 day ago
I feel like we're just trading one bottleneck for another here. So instead of slow storage, we now have a system that's hyper-sensitive to any interruption and probably requires a dedicated power plant to run.

Cool experiment, but is this actually a practical path forward or just a dead end with a great headline? Someone convince me I'm wrong...

tokyolights2 1 day ago
Sandia National Labs is one of the few places in the country (on the planet?) doing blue-sky research. My first thought was similar to yours: if it doesn't have storage, what can I realistically even do with it!?

But sometimes you just have to let the academics cook for a few decades and then something fantastical pops out the other end. If we ever make something that is truly AGI, its architecture is probably going to look more like this SpiNNaker machine than anything we are currently using.

rbanffy 6 hours ago
> what can I realistically even do with it!?

It doesn't have built-in storage, but that doesn't mean it can't connect to external storage, or that its memory cannot be retrieved from a front-end computer.

fc417fc802 8 hours ago
HPC jobs generally don't stream data to disk in the first place. They write out (huge) snapshots periodically. So mount a network filesystem and be done with it. I don't see the issue.
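
A minimal sketch of that pattern (paths and intervals are hypothetical): keep the working set in memory and periodically dump a snapshot; in practice the snapshot directory would be a network mount (NFS, Lustre, etc.) rather than local disk.

  import pickle
  from pathlib import Path

  SNAPSHOT_DIR = Path("snapshots")   # stand-in for a network-mounted path
  SNAPSHOT_EVERY = 1000              # steps between snapshots
  SNAPSHOT_DIR.mkdir(exist_ok=True)

  state = {"step": 0, "values": [0.0] * 1024}   # toy in-memory job state
  for step in range(1, 3001):
      state["step"] = step
      state["values"][step % 1024] += 1.0       # placeholder for real work
      if step % SNAPSHOT_EVERY == 0:
          with open(SNAPSHOT_DIR / f"snapshot_{step:06d}.pkl", "wb") as f:
              pickle.dump(state, f)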
JumpCrisscross 1 day ago
> we're just trading one bottleneck for another

If you have two systems with opposite bottlenecks you can build a composite system with the bottlenecks reduced.

moralestapia 13 hours ago
Usually, you get a state with two bottlenecks ...
rzzzt 11 hours ago
Think of L2 cache (small access time, small capacity) vs. memory modules (larger access time, large capacity) on a motherboard. You get the large capacity and an access time somewhere in between, depending on the hit rate.
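
Back-of-the-envelope version of that, with assumed latencies (4 ns for an L2 hit, 100 ns for DRAM; real figures vary by part):

  # effective access time = hit_rate * cache_latency + (1 - hit_rate) * dram_latency
  CACHE_NS = 4.0    # assumed L2 hit latency
  DRAM_NS = 100.0   # assumed DRAM latency

  for hit_rate in (0.50, 0.90, 0.99):
      effective = hit_rate * CACHE_NS + (1 - hit_rate) * DRAM_NS
      print(f"hit rate {hit_rate:.0%}: ~{effective:.1f} ns average access")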
DonHopkins 10 hours ago
Sounds like you need a massively parallel hardware regexp accelerator (a RePU), so you can have two million problems!
mipsISA69 1 day ago
This smells like a VC-derived sentiment - that the only value comes from identifying the be-all-end-all solution.

There's plenty to learn from endeavors like this, even if this particular approach isn't the one that e.g. achieves AGI.

1970-01-01 22 hours ago
The pessimist in me thinks someone will just use it to mine bitcoin after all the official research is completed.
hackyhacky 18 hours ago
The title describes this machine as "brain-like" but the article doesn't support that conclusion. Why is it brain-like?

I also don't understand why this machine is interesting. It has a lot of RAM.... ok, and? I could get a consumer-grade PC with a large amount of RAM (admittedly not quite as much), put my applications in a ramdisk, e.g. tmpfs, and get the same benefit.

In short, what is the big deal?

fasteddie31003 1 day ago
How much did this cost? I'd rather have CUDA cores.
rbanffy 1 day ago
Part of their job is to evaluate novel technologies. I find this quite exciting. CUDA is well understood. This is not.
fintler 1 day ago
They already have CUDA cores in production. This is a lab that's looking for the next big thing.
bee_rider 1 day ago
Sandia’s business model is different from NVIDIA for sure.
colordrops 1 day ago
> this work will explore how neuromorphic computing can be leveraged for the nation’s nuclear deterrence missions.

Wasn't that the plot of the movie War Games?

groby_b 18 hours ago
Calling 138240 TB of DRAM "storage-free" is... impressive.
rbanffy 6 hours ago
It's volatile storage. It needs to connect to other systems in order to operate.
laidoffamazon 22 hours ago
If it doesn’t have an OS, how does it…run? Is it just connected to a host machine and used like a giant GPU?
rbanffy 6 hours ago
Do GPUs have OSs? Or is it the host computer that sets up memory with data and programs and starts the processing units running?
StressedDev 17 hours ago
You can run software on bare metal without an OS. The downside is you have to write everything. That means drivers, networking code, the process abstraction (if you need it), etc.

One thing to remember is an operating system is just another computer program.

laidoffamazon 14 hours ago
At that point, you'd call the software stack an OS and not declare it to not have an OS
2OEH8eoCRo0 21 hours ago
How does an OS "run"?
pier25 21 hours ago
Imagine if this is actually Skynet and the apocalyptic AI is called Sandia instead :)

(Sandia means watermelon in Spanish)

m463 21 hours ago
When I first learned that, the prestigious-sounding "sandia national labs" became "watermelon national labs" and I couldn't help but laugh.
1Sebastian 11 hours ago
[dead]
ChaoPrayaWave 15 hours ago
[dead]
throawayonthe 1 day ago
[dead]
curtisszmania 1 day ago
[dead]
isoprophlex 1 day ago
> the SpiNNaker 2’s highly parallel architecture has 48 SpiNNaker 2 chips per server board, each of which in turn carries 152 based cores and specialized accelerators.

NVIDIA step up your game. Now I want to run stuff on based cores.