Memristor – The missing circuit element (1971) [pdf] (cpmt.org)
84 points by dayve on June 16, 2017 | 26 comments


Oh man, if you like that paper, do I have the list for you. We used this list for literature review while doing research on HP's memristors: http://webee.technion.ac.il/people/skva/new_full_reference_l.... It covers a pretty wide range of topics related to memristors.


Beautiful!

I have a question for you: lots of people have been claiming memristors are right around the corner for almost half a century at this point. What do you think about the viability of the memristor going forward?


In the enterprise storage market I think they're probably quite close to being a volume product. The versions I was working with were from a few years back and admittedly used a different chemistry than today's devices. The issue they'll have in the short run is that they'll be quite expensive relative to NAND; I don't think you're going to have a memristor-based disk in your home PC for quite a while. I do know that some space systems already utilize memristors and ReRAM. The really novel applications (as in, not just big hunks of fast memory) are probably much further out, if they go anywhere at all.


So do memristors exist or not? Wiki claims they are still 'hypothetical'.


They 100% exist. We had a wafer's worth of cut-up die of the TiO_2-x type devices all the way back in 2007. HP officially published results for a physical device as well (http://www.hpl.hp.com/news/2008/apr-jun/memristor.html).


My understanding of the TiO2-x devices is that they are strongly non-linear, which makes them very useful for binary storage but less so for analogue, memristor-y uses.


Wouldn't non-linear memristors make pretty ideal neural nets? Most activation functions are modeled as non-linear functions anyway.

Edit: It appears some HP-associated labs had success with just such a chip! Surprised there hasn't been more success in that field.

https://www.technologyreview.com/s/537211/a-better-way-to-bu...


Because no one wants neuromorphic hardware; we want 16-bit hardware that lets us do math. Contrary to popular opinion, neural networks are more about math than biology. Having 1-bit or analog weights/activations is a royal pain.


That's a good point. Most of our current CS/hardware tech and experience relies on digital computing. Makes me wonder if, instead of using neuromorphic hardware as the deployment vehicle, it'd make sense to use it as a specialized high-speed trainer. Most of these research projects seem to assume they'd be deployed as endpoint/client devices.

The alternative would be to use the annoying/tedious analog hardware to (possibly) significantly reduce the time/energy cost of training a neural net, then have special equipment to measure the learned weights and convert them to 16-bit weights that'd be easier to deploy.
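
Here's a rough sketch of what that last conversion step might look like; the weight values and the symmetric int16 scheme below are purely illustrative assumptions, not any particular chip's read-out interface:

    import numpy as np

    def quantize_to_int16(analog_weights):
        # Map measured analog weights onto symmetric 16-bit integers.
        # Returns (q, scale) so that q * scale approximates the originals.
        w = np.asarray(analog_weights, dtype=np.float64)
        scale = max(np.max(np.abs(w)), 1e-12) / 32767.0
        q = np.round(w / scale).astype(np.int16)
        return q, scale

    # Pretend these values were read off the analog array after training.
    measured = [0.73, -1.20, 0.05, 2.41]
    q, scale = quantize_to_int16(measured)
    print(q, q * scale)  # int16 weights and their reconstructed values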

Facebook/Google are showing that custom chips for deep learning make economic sense. That could tip the scales toward specialized neuromorphic training chips that deal with analog circuitry but offset those costs and complexities by driving efficiency and/or speed, while still deploying via traditional digital chips.

Of course, this path relies on the assumption that an analog circuit could be much faster or more energy efficient. That's not too far-fetched, though, given current training times and energy costs versus those of biological systems.

[edit: grammar]


Therein lies your problem: I've been telling anyone who'll listen that data movement, even on-chip, is our energy hog, not computation or even reading from the memory banks. How does analog hardware deal with that?


In the case I described above, the data (i.e. the learned weights) only needs to be read out once, at the end of training. It would require more expensive instrumentation of the memristors, which is bad for a general-purpose computation unit.

However, comparing the energy needed to modify the neural-net data in situ via analog signals versus shuffling that same data in digital form back and forth repeatedly to simulate the analog process seems to give a viable use case. A quick mental check leans toward back-propagation being "cheaper" in the analog process, since the data doesn't need to be moved: the calculation happens as part of the same signal propagation, via the properties of the analog circuit.

In other words, for this particular case it's cheaper to move the computation "units" to the data than to move the data to the computation engine. Performing the training digitally requires shuffling all the weights for _each_ training iteration, and that process is expensive. Luckily, the inherent nature of the back-propagation algorithm adapts to the "sloppiness" of analog circuitry. Transferring the final weights to digital form might require a final, light post-processing training pass to remove particulars of the underlying analog circuits.
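
To put some (entirely made-up) numbers on that intuition - the model size, iteration count, and picojoules-per-byte figure below are assumptions chosen only to show the shape of the argument, not measurements of any real system:

    # Digital training moves every weight on every iteration; the analog
    # array only gets read out once at the end. All numbers are assumed.
    n_weights = 10_000_000          # model size
    bytes_per_weight = 2            # 16-bit weights
    iterations = 1_000_000          # training steps
    energy_per_byte = 10e-12        # assumed ~10 pJ per byte moved

    digital_joules = n_weights * bytes_per_weight * iterations * energy_per_byte
    analog_joules = n_weights * bytes_per_weight * energy_per_byte  # one read-out

    print(digital_joules, "J of weight traffic vs", analog_joules, "J for one read-out")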

But replicating and distributing the final trained model would be more efficient, since applying the neural net to get an answer only requires shuffling the weight data once. Applying the trained model via standard digital means should be cheaper/easier for all the reasons you mentioned previously.

[edit: grammar & clarity]


Analog circuits are usually less accurate and more expensive to build than their digital equivalents. They also usually can't be as dense, since they are more prone to crosstalk.


Basically, in the science world, there is a lot of debate going on about exact definitions (memristive device vs. memristor). I can see if I can find the relevant papers if there is interest. Prof. Chua has updated and loosened his criteria a bit in response to the criticisms.

In the engineering world, if it talks like a memristor, and walks like a memristor if you squint a bit, we are damn well going to use it like a memristor. Hence we are happily building and testing nonvolatile storage (MRAM, ReRAM, PCRAM) and other applications - I'm gonna start research on applications as ML accelerators in May :-)
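
(For anyone wondering what an ML accelerator looks like in this context: the usual idea is a crossbar of memristors whose conductances act as weights, so Ohm's law and Kirchhoff's current law do an analog matrix-vector multiply in one step. A minimal, idealized numerical sketch, ignoring wire resistance, device non-linearity, and read noise - the conductance and voltage values are made up:)

    import numpy as np

    # Idealized memristor crossbar: cell (i, j) stores conductance G[i][j].
    # Driving the columns with voltages V and summing currents on each row
    # gives I[i] = sum_j G[i][j] * V[j], i.e. a matrix-vector product.
    G = np.array([[1.0e-4, 2.0e-4, 0.5e-4],
                  [3.0e-4, 1.0e-4, 2.5e-4]])   # conductances in siemens (assumed)
    V = np.array([0.2, 0.1, 0.3])              # read voltages in volts (assumed)

    I = G @ V   # row currents that an ADC would digitize in a real accelerator
    print(I)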


My understanding is that the current 'memristor' technologies are essentially combinations of materials which satisfy the linkage between flux and charge, thereby meeting the definition.

From the same wikipedia article:

> There is no such thing as a standard memristor. Instead, each device implements a particular function, wherein the integral of voltage determines the integral of current, and vice versa. A linear time-invariant memristor, with a constant value for M, is simply a conventional resistor.[1] Manufactured devices are never purely memristors (ideal memristor), but also exhibit some capacitance and resistance.

Just reading the 'Background' section and the following 'Memristor definition and criticism' section gives a super clear and lucid answer that even a goober like me can mostly digest.

https://en.wikipedia.org/wiki/Memristor#Background
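
To make the quoted definition concrete, here's a tiny numerical sketch of an ideal memristor, where the resistance depends on the charge that has flowed through the device. The particular M(q) below is made up purely for illustration; a constant M would reduce to an ordinary resistor:

    import numpy as np

    def M(q):
        # Memristance (ohms) as a function of accumulated charge.
        # Arbitrary smooth, positive function chosen only for illustration.
        return 100.0 + 50.0 * np.tanh(1e3 * q)

    t = np.linspace(0.0, 2.0, 10_000)
    i = 1e-3 * np.sin(2 * np.pi * t)      # sinusoidal drive current (A)
    q = np.cumsum(i) * (t[1] - t[0])      # q(t) = integral of the current
    v = M(q) * i                          # v(t) = M(q(t)) * i(t)

    # Plotting v against i traces the pinched hysteresis loop that's usually
    # taken as the fingerprint of memristive behaviour.
    print(v[:5])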


They absolutely exist. The question is whether they can be reliably fabricated in volume without loads of faulty units and, hopefully, without being ridiculously expensive.


There's another, similar element, which has a somewhat confusingly similar name:

https://en.wikipedia.org/wiki/Memistor

It was developed by Bernard Widrow in 1960, whereas the memristor was envisioned and named in 1971 - but not found to actually exist until 2008.

The memistor was most "famously" used to develop a couple of artificial intelligence (neural network) systems, ADALINE and MADALINE:

https://en.wikipedia.org/wiki/ADALINE

Note that if you try to google "memistor", you'll maddeningly get suggestions and results for "memristor" instead - even when you tell Google to show only results for memistors; this makes finding information about them online difficult. It's best to add "adaline", "madaline", and/or "widrow" to your search.

The main difference between the two devices is that the memistor is a three-terminal device, whereas the memristor is a two-terminal device. You can think of the memistor as a "memory transistor" and the memristor as a "memory resistor". This is a gross simplification, of course.

One other interesting thing about ADALINE is how simple the memistors are to construct; you can effectively re-create ADALINE at home, as Widrow's paper shows:

http://www-isl.stanford.edu/~widrow/papers/t1960anadaptive.p...

On the topic of memristors - they can be DIY'd as well:

http://sparkbangbuzz.com/memristor/memristor.htm

Finally - if you're interested in this kind of thing (that is, implementing hardware analogs of brain functionality), google "neuromorphic computing"...


Fun fact: Leon Chua is Amy Chua's (of Tiger Mom fame) father.


Anybody know what happened to HP's "memristor"? Was it only vaporware?


Still in the lab; "The Machine" switched to using DRAM until memristors get cheap enough to scale. It's really mind-boggling how much we demand of new technologies - we've refined magnetic storage over many decades. These things take at least a decade to go from laboratory proof-of-existence to commercially competitive product.


As far as I can tell, HP's memristor-based device isn't the only storage-level, random-access, non-volatile memory technology stuck in R&D pipelines right now. Eventually (or perhaps hopefully) they'll make it to market, and I think they'll make for a more interesting state of computer hardware than there has been in some time.


We are asking a brand-new technology to catch up with stuff that has had an entire industry optimizing it for decades. It's going to take decades for a single company to get things up to speed.


Thanks for the update.

But it was HP which promised commercial products soon, not me :)


Brains have the equivalent of memristors. Changing the value of a memristor gets harder the more charge it has. That is why spaced repetition works as a memory technique.

When we sleep, the brain organizes and decides what to keep/forget. Artificial intelligence probably needs to emulate sleep.


Ok, I get that capacitors and inductors are opposites, but I've yet to understand the significance of why the memristor is the opposite of a resistor... Chalk it up to my primitive knowledge of analog design.

Couldn't there theoretically be a memcapacitor and a meminductor as well?
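
(Having skimmed the paper again, my rough reading is that it's bookkeeping over the four circuit variables v, i, q = integral of i dt, and phi = integral of v dt: each classical element ties one pair of them together, and the q-phi pair was the one left without an element. In differential form, roughly:)

    \begin{aligned}
      \text{resistor:}  &\quad \mathrm{d}v       = R\,\mathrm{d}i \\
      \text{capacitor:} &\quad \mathrm{d}q       = C\,\mathrm{d}v \\
      \text{inductor:}  &\quad \mathrm{d}\varphi = L\,\mathrm{d}i \\
      \text{memristor:} &\quad \mathrm{d}\varphi = M(q)\,\mathrm{d}q
    \end{aligned}

If that reading is right, the same trick with a history-dependent C or L would be the "memcapacitor"/"meminductor" idea, which has apparently been proposed but seems far less developed than the memristor.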



Here's a nice talk about memristors:

https://www.youtube.com/watch?v=bKGhvKyjgLY




