Hacker News
A fresh approach to neuromorphic computing (nature.com)
49 points by digital55 on Jan 28, 2018 | 26 comments



Is this really going to be as big a deal as they say it is?


From what little I've gathered about the subject, I think so. Spiking neural networks would be orders of magnitude more energy efficient than deep neural networks for image and audio processing. The main hurdle lies in designing network structure and weights; the cost function of a classical neural network can be differentiated using basic calculus, but optimizing a spiking neural network is not as easy.
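To make the differentiation point concrete, here's a rough toy sketch (mine, not from the article): a leaky integrate-and-fire neuron spikes through a hard threshold, whose derivative is zero almost everywhere, so plain backprop has nothing to work with. The "surrogate gradient" function at the end is one common workaround; all parameter names and values here are made up for illustration.

    import numpy as np

    def lif_neuron(input_current, tau=20.0, v_th=1.0, dt=1.0):
        """Leaky integrate-and-fire neuron: returns voltages and a 0/1 spike train."""
        v, voltages, spikes = 0.0, [], []
        for i in input_current:
            v += dt / tau * (-v + i)          # leaky integration of the input current
            s = 1.0 if v >= v_th else 0.0     # hard threshold: not differentiable
            if s:
                v = 0.0                       # reset membrane after a spike
            voltages.append(v)
            spikes.append(s)
        return np.array(voltages), np.array(spikes)

    def surrogate_grad(v, v_th=1.0, beta=10.0):
        """Smooth stand-in for the step's derivative, as used in surrogate-gradient training."""
        return beta / (1.0 + beta * np.abs(v - v_th)) ** 2

    _, spike_train = lif_neuron(np.full(100, 1.5))  # constant drive above threshold
    print("spikes emitted:", int(spike_train.sum()))
    print("surrogate gradient near threshold:", round(float(surrogate_grad(0.95)), 3))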


I think you can bridge the gap with a bit of discrete mathematics used to develop some forms of error correction. Worst case, quantum codes even.


Can you elaborate a bit? The connection between error correction codes and spiking neural networks isn't obvious to me.


A spiking neural network can be considered a noisy spiking channel transmitting continuous real numbers.

Similar on the surface to multibit sigma-delta modulation. There are many papers on this topic, e.g. https://papers.nips.cc/paper/4694-neuronal-spike-generation-...

Once you describe the specific modulation and noise properties, you could recover the equivalent real-valued, continuous-time neural network from the spiking neural network. Then you can similarly map the training algorithms from gradient networks to "spike gradients".

The next necessary step is to use (potentially quantum) interference in addition to integration or summation to train the interconnections. You will likely end up with quite complicated Hessians in both time and value.
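A toy version of that channel picture (my own sketch, not taken from the linked NIPS paper): a first-order sigma-delta-style encoder turns a slowly varying real signal into a 0/1 spike train, and a moving-average decoder recovers an estimate of it; the residual plays the role of the channel noise.

    import numpy as np

    def spike_encode(signal):
        """First-order sigma-delta: spike whenever the accumulated error reaches 1."""
        acc, spikes = 0.0, []
        for x in signal:                 # signal values assumed to lie in [0, 1]
            acc += x
            if acc >= 1.0:
                spikes.append(1.0)
                acc -= 1.0
            else:
                spikes.append(0.0)
        return np.array(spikes)

    def rate_decode(spikes, window=50):
        """Estimate the original signal from the local spike rate."""
        return np.convolve(spikes, np.ones(window) / window, mode="same")

    t = np.linspace(0.0, 1.0, 1000)
    x = 0.5 + 0.3 * np.sin(2 * np.pi * 3 * t)   # slowly varying test signal in [0.2, 0.8]
    x_hat = rate_decode(spike_encode(x))
    print("mean abs reconstruction error:", round(float(np.abs(x - x_hat).mean()), 3))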


Study: Ultralow power artificial synapses using nanotextured magnetic Josephson junctions

Citation: Michael L. Schneider, Christine A. Donnelly, Stephen E. Russek, Burm Baek, Matthew R. Pufall, Peter F. Hopkins, Paul D. Dresselhaus, Samuel P. Benz, William H. Rippard. Science Advances 26 Jan 2018. Vol. 4, no. 1, e1701329.

Link: https://doi.org/10.1126/sciadv.1701329

DOI: 10.1126/sciadv.1701329

Abstract: Neuromorphic computing promises to markedly improve the efficiency of certain computational tasks, such as perception and decision-making. Although software and specialized hardware implementations of neural networks have made tremendous accomplishments, both implementations are still many orders of magnitude less energy efficient than the human brain. We demonstrate a new form of artificial synapse based on dynamically reconfigurable superconducting Josephson junctions with magnetic nanoclusters in the barrier. The spiking energy per pulse varies with the magnetic configuration, but in our demonstration devices, the spiking energy is always less than 1 aJ. This compares very favorably with the roughly 10 fJ per synaptic event in the human brain. Each artificial synapse is composed of a Si barrier containing Mn nanoclusters with superconducting Nb electrodes. The critical current of each synapse junction, which is analogous to the synaptic weight, can be tuned using input voltage spikes that change the spin alignment of Mn nanoclusters. We demonstrate synaptic weight training with electrical pulses as small as 3 aJ. Further, the Josephson plasma frequencies of the devices, which determine the dynamical time scales, all exceed 100 GHz. These new artificial synapses provide a significant step toward a neuromorphic platform that is faster, more energy-efficient, and thus can attain far greater complexity than has been demonstrated with other technologies.
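Back-of-envelope on those numbers, just to make the comparison explicit (using only the figures quoted in the abstract):

    # Figures quoted in the abstract, in joules.
    jj_spike_energy  = 1e-18    # < 1 aJ per spike for the Josephson synapse
    bio_spike_energy = 10e-15   # ~10 fJ per synaptic event in the human brain
    print("energy-per-event advantage: ~%.0fx" % (bio_spike_energy / jj_spike_energy))
    # The >100 GHz Josephson plasma frequency is a separate (speed) advantage,
    # since biological neurons fire at most on the order of hundreds of hertz.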


Not sure if you have this automated or are doing it by hand, but either way: great work, and please keep doing it! :)


Unfortunately, they didn't mention noise properties in the abstract. Are they discussed in the paper?

Edit: they are. The device is actually pretty difficult to implement, as it requires liquid-helium cooling. The noise is shown to be temperature-dependent, but a full error analysis is not present. Looking forward to further research and to attempts to adapt this quantum device to higher operating temperatures.


I meant 'randomdrake posting relevant DOIs and abstracts all over HN threads that discuss scientific papers. But thanks for the clarification on the content of this paper :).


Yes, let's compare a natural system using low energy and fuelled by food with a hungry silicon system fuelled by electricity. It's going to be a great comparison. The only problem is that biological brains don't just compute; they also construct themselves, unlike silicon chips. But we can ignore that as an insignificant detail. Let's also ignore that the brain has no clock or clearly separated layers. It doesn't matter that the brain can't do backprop and that we have no idea how it works, either. We can just compare neuronal speeds as if it's apples to apples. /s


Can you please not post snarky dismissals like this? It isn't that you're wrong, it's that the point is a shallow one. No one here likes linkbait titles, but ranting in response to them doesn't make for curious discussion either. (We've edited the title above to use a better phrase from the article itself.)

The following site guideline, applied at the article level, covers this situation: "Please respond to the strongest plausible interpretation of what someone says, not a weaker one that's easier to criticize." https://news.ycombinator.com/newsguidelines.html


The "/s" at the end means it was sarcastic but I usually only see that used on reddit


I saw that too, but it doesn't make a difference.


The sarcasm _was_ the snark. The /s is simply a useful marker for those who would otherwise not notice it.


The human brain uses about 20W, for everything.


> The synapses can fire up to one billion times per second — several orders of magnitude faster than human neurons — and use one ten-thousandth of the amount of energy used by a biological synapse.

So if this is true, and these chips function as expected, we should be able to simulate a human brain orders of magnitude faster than real time using less than 2 mW.
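The arithmetic behind that figure, for anyone checking (my own back-of-envelope, and it assumes synaptic events dominate the brain's ~20 W budget mentioned elsewhere in this thread):

    brain_power_w   = 20.0         # rough whole-brain power budget, in watts
    energy_fraction = 1.0 / 10000  # "one ten-thousandth" of the energy per synaptic event
    print("naive power estimate: %.1f mW" % (brain_power_w * energy_fraction * 1e3))
    # Ignores non-synaptic costs and, as the reply below notes, the cryogenic
    # cooling overhead.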


That requires about 200 W of liquid-helium cooling right now. The brain needs some 20-40 W of above-room-temperature water cooling. (Also evaporative.) The total efficiency equation is not yet favourable.


It is not true to say that we have "no idea" how the brain works.


There are processors without clocks, just saying :)


Foolish, puny human. You think your neuroplasticity is a good substitute for offsite backups and automated-nanobot onsite parts replacement?

Once your programmers figure out how to make threading actually easy to do, it'll be all over, my friend. Until then, we wait.


Random Conjecture: the most efficient way to build a new intelligent entity is through the process that we’ve been familiar with all this time — natural birth.

Intuition: to truly achieve all of the functions of the human brain and its neurons, the most efficient structure is one that is biological or simulates our biological composition. There is likely no process (energy- and material-wise) that can produce human cells at the same fidelity as simply having sex and giving birth.


Random Conjecture: the most efficient way to build a new flying entity is through the process that we’ve been familiar with all this time — natural birth.


This seems self-evidently true. How much does it cost to produce a gnat?


And like the intelligence manufacturing conjecture, it's both true and absolutely useless.


For me or for a gnat? Because for me the cost is pretty high.


Random observation: reading your comment made me think of Stanley Kubrick.



