[D] What happened with the Neural Turing Machine / Differentiable Neural Computer line of work?

Recently I saw some renewed interest in Algorithmic Reasoning by Petar Veličković, which essentially augments traditional "discrete" algorithms with the "continuous" pattern recognition of DL. It reminds me of the Neural Turing Machine / Differentiable Neural Computer line of work, spearheaded mostly by Alex Graves, which I believe shares the same motivation as the Algorithmic Reasoning approach.

I haven't heard much about any major new work since the Differentiable Neural Computer and was wondering why. I was (and still am) fascinated by the idea and this research, but since I am not working on the topic, I'm not sure where the challenges and pitfalls are.

I was aware of some instability in implementations, but I thought the open-source code would have helped there.

Does anybody have insights on why this direction has not been explored more in recent years? Or is this one of those Schmidhuber-style ideas that was too far ahead of its time, and once people have squeezed all the internal memory capacity out of the Transformer, will this idea of external dynamic memory bounce back?

πŸ‘︎ 73
πŸ’¬︎
πŸ‘€︎ u/lkhphuc
πŸ“…︎ Nov 18 2021
🚨︎ report
[R] Neural Turing Machines and their Relation to Bayesian Inference (The Open Review of Artificial Intelligence) openreview.net/forum?id=r…
πŸ‘︎ 3
πŸ’¬︎
πŸ“…︎ Feb 21 2021
🚨︎ report
[P] Implementing Neural Turing Machines in PyTorch

I've been working on a Neural Turing Machine implementation in PyTorch for a while, and I'm happy to make some results available:
https://clemkoa.github.io/paper/2020/05/27/neural-turing-machines-pytorch.html

NTMs are a special type of neural network that can read and write to a memory bank. They were introduced by Graves et al. from DeepMind in 2014. NTMs can perform better than LSTMs on some algorithmic tasks, and because they have fewer parameters, they are also able to converge faster!
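For readers unfamiliar with the mechanism, here is a minimal numpy sketch (mine, not taken from the linked repo) of the two core operations from the paper: the attention-weighted read and the erase-then-add write over a memory bank.

```python
import numpy as np

def soft_read(memory, w):
    """Attention-weighted read: a convex combination of memory rows."""
    return w @ memory  # (N,) @ (N, M) -> (M,)

def soft_write(memory, w, erase, add):
    """Blended erase-then-add write, following the paper's write equations."""
    memory = memory * (1 - np.outer(w, erase))  # erase, gated by attention
    return memory + np.outer(w, add)            # add new content

memory = np.zeros((4, 3))                   # 4 slots, 3 values per slot
w = np.array([0.0, 1.0, 0.0, 0.0])          # attention fully on slot 1
memory = soft_write(memory, w, erase=np.ones(3), add=np.array([1., 2., 3.]))
print(soft_read(memory, w))                 # recovers [1. 2. 3.]
```

Because every operation is a smooth function of `w`, gradients flow through the memory accesses, which is the whole trick.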

The implementation is available on github here: https://github.com/clemkoa/ntm

Any feedback is welcome!

πŸ‘︎ 67
πŸ’¬︎
πŸ‘€︎ u/Clemkoa
πŸ“…︎ May 28 2020
🚨︎ report
[D] What might be the possible uses of Neural Turing Machines (NTM) or the Differentiable Neural Computer (DNC)?

After reading their papers, I noticed that most articles either discuss their implementation or how hard they are to train. But none mention their added value. So what would the NTM or DNC bring to the table? And what are their possible uses?

πŸ‘︎ 7
πŸ’¬︎
πŸ‘€︎ u/maroxtn
πŸ“…︎ Aug 28 2020
🚨︎ report
What are the more advanced variants of Neural Turing Machine, and what have they learned?

A Neural Turing Machine is basically a neural net (normally an LSTM) that gradually reads and writes scalars in an array. If there are 10 scalars per Turing-tape cell, then it jumps -10 or +10 array indices, depending on one of the LSTM's other outputs. Some LSTM outputs choose how much to decay each of those 10 scalars toward an LSTM output, and as usual in an LSTM it chooses how much to read those 10, and there is normalization of those 10 scalars to a constant sum. It learned to do simple things including copying, and sorting a list of integers whose binary digits are in the array. That was hill-climbed by scoring it on how sorted the array is.

The Neural Turing Machine proves that neural nets can operate a general computing process, but a Turing machine is far slower than how normal computers work. Various kinds of external memories were tried. Where did that all lead?

πŸ‘︎ 11
πŸ’¬︎
πŸ‘€︎ u/BenRayfield
πŸ“…︎ May 03 2020
🚨︎ report
Does the Neural Turing Machine access memory after every time step in a sequence?

Paper: https://arxiv.org/pdf/1410.5401.pdf

From these implementations it looks like the controller runs over the whole sequence before writing and reading memory. Shouldn't there be a memory access after each time step, or did I misunderstand the code?
https://github.com/MarkPKCollier/NeuralTuringMachine/blob/master/ntm.py#L49
https://github.com/loudinthecloud/pytorch-ntm/blob/master/ntm/ntm.py#L74
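For reference, the paper describes the controller and the memory interacting at every step. A minimal sketch of that per-step loop (the controller and heads here are toy stand-ins of my own, not code from either linked repo):

```python
import numpy as np

def run_ntm(inputs, memory, controller, heads):
    """Per-time-step interleaving as described in the paper: the controller
    sees each input together with the previous read vector, and the heads
    touch memory once per step, before the next input is consumed."""
    read = np.zeros(memory.shape[1])
    outputs = []
    for x in inputs:
        out = controller(x, read)           # controller step
        memory, read = heads(out, memory)   # write, then read, every step
        outputs.append(out)
    return outputs, memory

# Hypothetical toy stand-ins, just to make the loop runnable:
ctrl = lambda x, r: x + r                   # echoes input plus last read
def toy_heads(out, mem):
    mem = mem.copy()
    mem[0] = out                            # hard write to slot 0
    return mem, mem[0]                      # read slot 0 back

outs, mem = run_ntm([np.ones(2), np.ones(2)], np.zeros((4, 2)), ctrl, toy_heads)
```

The key point is that `read` at step t feeds the controller at step t+1, so memory access cannot be deferred until after the whole sequence without changing the model.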

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/johnnydues
πŸ“…︎ Aug 12 2020
🚨︎ report
Are Neural Turing Machines even being used for any purpose?
πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/porygon93
πŸ“…︎ Aug 27 2020
🚨︎ report
[D] Any subsequent work on Neural Turing Machines (NTM and DNC)?

I recall that training was very slow. I wonder if there has been subsequent work.

πŸ‘︎ 31
πŸ’¬︎
πŸ‘€︎ u/so_tiredso_tired
πŸ“…︎ Feb 18 2019
🚨︎ report
Implementing Neural Turing Machines in pytorch (r/MachineLearning) reddit.com/r/MachineLearn…
πŸ‘︎ 4
πŸ’¬︎
πŸ‘€︎ u/Peerism1
πŸ“…︎ May 29 2020
🚨︎ report
[1807.08518] Implementing Neural Turing Machines arxiv.org/abs/1807.08518
πŸ‘︎ 49
πŸ’¬︎
πŸ‘€︎ u/ihaphleas
πŸ“…︎ Jul 24 2018
🚨︎ report
Expect a likely far jump ahead toward the "singularity" soon, based on what any LSTM Q-learning/reinforcement-learning system could do with any kind of external memory (such as a neural Turing machine, or lambda-function slots in an LSTM)

So far it appears there is AGI of a sort (such as OpenAI playing the Dota video game and various other tasks which required planning and general logic and statistics, and others using similar algorithms, which might have taken far fewer LSTM nodes with an external memory, and I'm not completely sure it didn't have one) BUT only of finite thinking ability, since it can only remember what fits in, for example, a few thousand LSTM nodes and a weight between each pair. It cannot, for example, build new software with more than that many pieces of logic, since it would be pigeonholed into forgetting some of them. But if such a finite AGI had an external memory added to its Q-learning actions and LSTM read view, its thinking at any one time would still be finite, but it could put any part of a program it is building together with other parts that it remembers, finding them somewhere in the storage that the other stored things organize (as lambdas form a forest of lambdas and can reach any huge depth). Such an AGI, if someone asked it the right question in the form of a Q-score or goal function, could probably easily cure any disease, given the molecules and the differential equations of the ratios by which molecules increase or decrease other molecules, and other statistics. It could figure out how to do nearly anything. But it will appear not so smart until someone asks it the right question, since until then it has no reason to quickly change the world, or knowledge that there is such a world.

EDIT: not a neural Turing machine though, because while it can do all the same things, it is quadratically slower, and that builds up fast when it is building a program/thought that keeps expanding. It is technically the same level of AGI but would find such solutions quadratically later, if you didn't run out of memory first.

πŸ‘︎ 5
πŸ’¬︎
πŸ‘€︎ u/BenRayfield
πŸ“…︎ Aug 26 2019
🚨︎ report
[R] Evolving Neural Turing Machines for Reward-based Learning

Abstract: An unsolved problem in neuroevolution (NE) is to evolve artificial neural networks (ANN) that can store and use information to change their behavior online. While plastic neural networks have shown promise in this context, they have difficulties retaining information over longer periods of time and integrating new information without losing previously acquired skills. Here we build on recent work by Graves et al., who extended the capabilities of an ANN by combining it with an external memory bank trained through gradient descent. In this paper, we introduce an evolvable version of their Neural Turing Machine (NTM) and show that such an approach greatly simplifies the neural model, generalizes better, and does not require accessing the entire memory content at each time-step. The Evolvable Neural Turing Machine (ENTM) is able to solve a simple copy task and, for the first time, the continuous version of the double T-Maze, a complex reinforcement-like learning problem. In the T-Maze learning task the agent uses the memory bank to display adaptive behavior that normally requires a plastic ANN, thereby suggesting a complementary and effective mechanism for adaptive behavior in NE.

pdf

https://dl.acm.org/citation.cfm?id=2908930

πŸ‘︎ 21
πŸ’¬︎
πŸ‘€︎ u/milaworld
πŸ“…︎ Jul 01 2019
🚨︎ report
[D] Explanation of DeepMind's Neural Turing Machine rylanschaeffer.github.io/…
πŸ‘︎ 147
πŸ’¬︎
πŸ‘€︎ u/RSchaeffer
πŸ“…︎ Apr 10 2017
🚨︎ report
Google's Secretive DeepMind Startup Unveils a "Neural Turing Machine" technologyreview.com/view…
πŸ‘︎ 331
πŸ’¬︎
πŸ‘€︎ u/ion-tom
πŸ“…︎ Nov 24 2014
🚨︎ report
Google's Secretive DeepMind Startup Unveils a "Neural Turing Machine" | MIT Technology Review technologyreview.com/view…
πŸ‘︎ 115
πŸ’¬︎
πŸ‘€︎ u/xamdam
πŸ“…︎ Oct 30 2014
🚨︎ report
Implementation of Neural Turing Machines github.com/shawntan/neura…
πŸ‘︎ 37
πŸ’¬︎
πŸ‘€︎ u/mlalma
πŸ“…︎ Nov 14 2014
🚨︎ report
Neural Turing Machine in pure numpy. Implements all 5 tasks from paper. github.com/DoctorTeeth/di…
πŸ‘︎ 114
πŸ’¬︎
πŸ‘€︎ u/doctorteeth2
πŸ“…︎ Oct 12 2015
🚨︎ report
A Neural Turing Machine implementation using Torch github.com/kaishengtai/to…
πŸ‘︎ 57
πŸ’¬︎
πŸ‘€︎ u/algebroic
πŸ“…︎ Feb 05 2015
🚨︎ report
[P] Neural Turing Machines (NTM) implemented in PyTorch github.com/loudinthecloud…
πŸ‘︎ 23
πŸ’¬︎
πŸ‘€︎ u/getlasterror
πŸ“…︎ Oct 04 2017
🚨︎ report
[P] The Neural Turing Machine as a Keras recurrent layer

Hello! I would like to plug my own project: a backend-neutral version of the Neural Turing Machine as a Keras recurrent layer.

The project resulted from the need for at least one working implementation of the NTM in a modern high-level library like TensorFlow or Keras, as I had to give a talk about it for a seminar. Frustrated, I tried to fix the problem.

My implementation has a few features:

  • It allows you to pass arbitrary Keras models as controllers (as long as they adhere to some rules described in the API). If no model is passed, a single dense layer is used (maybe an LSTM in the future).
  • A lot of time went into optimising robustness, with some degree of success.
  • It is accompanied by, I think, quite a lot of documentation, including a 6-page PDF roughly explaining the principle and discussing implementation difficulties, and a comment-to-code ratio of about 1.
  • It's very easy to use.

How easy? Well, you can just use it as a recurrent layer, and the hyperparameters are more or less only the width and depth of the memory and the controller architecture. To be honest, I try to sell as a feature the shortcoming that it has no support for an arbitrary number of read and write heads yet (currently one of each) ;)

I see a future for it as an enhanced replacement for the LSTM, so please enjoy!

https://github.com/flomlo/ntm_keras/tree/master

Please note that I would be happy about any kind of constructive critique of the documentation. I've worked on the project for the last month, and I still have a hard time figuring out what is trivial to newcomers and what is not.

Edit:

A new version which supports multiple heads and more freedom in the activation function has been released. Yay!

πŸ‘︎ 30
πŸ’¬︎
πŸ‘€︎ u/fetter_oml
πŸ“…︎ Aug 21 2017
🚨︎ report
[R] Neural Turing Machines: Perils and Promise blog.talla.com/neural-tur…
πŸ‘︎ 63
πŸ’¬︎
πŸ‘€︎ u/raithism
πŸ“…︎ Dec 16 2016
🚨︎ report
Can one construct an explicit isomorphism between Turing Machines and neural networks?

Hi Compsci,

I'm having a hard time visualising how a neural network could be Turing complete (if it is; I'm not entirely sure), since it feels so fuzzy compared to the strict step-by-step processing that a Turing machine performs.

How could I construct an isomorphism between these two seemingly disparate objects?

πŸ‘︎ 19
πŸ’¬︎
πŸ‘€︎ u/Firetaffer
πŸ“…︎ Aug 19 2015
🚨︎ report
Google's secretive DeepMind Start-up unveils a "Neural Turing Machine", a neural network that can access an external memory like a conventional Turing machine. The result is a computer that mimics the short-term memory of the human brain technologyreview.com/view…
πŸ‘︎ 119
πŸ’¬︎
πŸ‘€︎ u/nastratin
πŸ“…︎ Oct 30 2014
🚨︎ report
Neural Turing Machines

Hey all,

I've been reading about "Neural Turing Machines" lately, but information on them seems pretty sparse, so I was hoping some of you could shed some light on the details. From my understanding, it's a system with your typical neural network setup (input, hidden layers, output), but the output is connected to memory, creating a sort of von Neumann architecture. The network can move its read/write heads based on its output, which feeds back into the network. A few questions: Could such a system be used to control executable memory, as opposed to just stored data? Why is this model not more widely used?

πŸ‘︎ 25
πŸ’¬︎
πŸ‘€︎ u/normally_i_lurk
πŸ“…︎ Apr 08 2016
🚨︎ report
NTM-Lasagne: A Library for Neural Turing Machines in Lasagne β€” Snips Blog medium.com/snips-ai/ntm-l…
πŸ‘︎ 27
πŸ’¬︎
πŸ‘€︎ u/oulipo
πŸ“…︎ Feb 24 2016
🚨︎ report
Demis Hassabis on adding memory to neural networks, the "Neural Turing Machine" youtube.com/watch?v=08Cl7…
πŸ‘︎ 21
πŸ’¬︎
πŸ‘€︎ u/5ives
πŸ“…︎ May 02 2016
🚨︎ report
Fancy Addressing for Neural Turing Machines (with code) doctorteeth.github.io/201…
πŸ‘︎ 15
πŸ’¬︎
πŸ‘€︎ u/doctorteeth2
πŸ“…︎ Nov 17 2015
🚨︎ report
Best of 2014: Google's Secretive DeepMind Startup Unveils a "Neural Turing Machine" technologyreview.com/view…
πŸ‘︎ 70
πŸ’¬︎
πŸ‘€︎ u/StoryClerk
πŸ“…︎ Dec 30 2014
🚨︎ report
Are neural nets more computationally powerful than Turing machines?

I've been Googling around, but I was hoping to get a short and clear answer here. My motivation for asking is related to the philosophy of mind. That is, if the mind is physical, and if TMs can't think, then what exactly is the mind doing that makes it able to think?

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/WhackAMoleE
πŸ“…︎ Nov 24 2015
🚨︎ report
Implementing Neural Turing Machines awawfumin.blogspot.de/201…
πŸ‘︎ 12
πŸ’¬︎
πŸ‘€︎ u/iori42
πŸ“…︎ May 18 2015
🚨︎ report
The evolution of read/write in a Lie Access Neural Turing Machine (over the course of learning) is strangely satisfying
πŸ‘︎ 9
πŸ’¬︎
πŸ‘€︎ u/jinpanZe
πŸ“…︎ Mar 06 2016
🚨︎ report
Rational-weighted Neural Networks can implement a Turing Machine. [PDF] math.rutgers.edu/~sontag/…
πŸ‘︎ 36
πŸ’¬︎
πŸ‘€︎ u/wtfftw
πŸ“…︎ Jan 05 2010
🚨︎ report
Neural Turing Machine

An NTM takes an input and output, adjusts weights between neurons, and trains like a regular neural network, but it also has the capability to access memory (not a particular memory location, but the entire memory at once, according to the weights generated). It can 'learn' an algorithm by looking at inputs and outputs. Can someone explain all this in the context of the copy task? What exactly is being written to the memory, what is being read, and how are the weights assigned? [In reference to the paper by Alex Graves]
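To make the copy task concrete, here is a sketch of how examples for it are typically generated (the binary-sequence-plus-delimiter-channel convention follows the paper; the helper name and exact shapes are my own):

```python
import numpy as np

def copy_example(seq_len=5, width=8, seed=0):
    """One copy-task pair: the input is a random binary sequence plus a
    delimiter flag on an extra channel; the target is the same sequence,
    which the network must emit after seeing the delimiter."""
    rng = np.random.default_rng(seed)
    seq = rng.integers(0, 2, size=(seq_len, width)).astype(float)
    inp = np.zeros((seq_len + 1, width + 1))
    inp[:seq_len, :width] = seq
    inp[seq_len, width] = 1.0   # delimiter: "now reproduce the sequence"
    return inp, seq

inp, target = copy_example()
```

During training, the paper's analysis suggests the write head stores each input vector in successive memory locations, and after the delimiter the read head walks those same locations back; it is this read-out, not the memory contents directly, that gets compared to the target to compute the loss.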

πŸ‘︎ 4
πŸ’¬︎
πŸ‘€︎ u/____jasmine____
πŸ“…︎ Sep 30 2018
🚨︎ report
Neural Turing Machines arxiv.org/abs/1410.5401
πŸ‘︎ 13
πŸ’¬︎
πŸ‘€︎ u/alecradford
πŸ“…︎ Oct 21 2014
🚨︎ report
Any Open Source Implementations of Neural Turing Machines or Memory Networks?

I would like to play around with memory networks (Jason Weston) or Neural Turing Machines (a similar concept) for a difficult classification task I am working on. Does anyone know of any good open-source implementations out there?

πŸ‘︎ 8
πŸ’¬︎
πŸ‘€︎ u/simonhughes22
πŸ“…︎ Jun 14 2015
🚨︎ report
Could a Neural Turing Machine be used to learn how to design better neural networks?

In designing neural networks we compose various types of blocks, structures, and hyperparameters. Can we design a system that automates the search for the optimal design? Maybe it can find surprising solutions. What would be the minimal neural net that learns a given task and also learns to improve itself (a minimal Gödel machine)? It would be as if we endowed neural nets with the power of procreation.

πŸ‘︎ 6
πŸ’¬︎
πŸ‘€︎ u/visarga
πŸ“…︎ May 14 2016
🚨︎ report
Implementation of Neural Turing Machines github.com/fumin/ntm#neur…
πŸ‘︎ 12
πŸ’¬︎
πŸ‘€︎ u/cypherx
πŸ“…︎ Aug 10 2015
🚨︎ report
Demis Hassabis on adding memory to neural networks, the "Neural Turing Machine" youtube.com/watch?v=08Cl7…
πŸ‘︎ 27
πŸ’¬︎
πŸ‘€︎ u/5ives
πŸ“…︎ May 02 2016
🚨︎ report
Google's secretive DeepMind Start-up unveils a "Neural Turing Machine", a neural network that can access an external memory like a conventional Turing machine. The result is a computer that mimics the short-term memory of the human brain technologyreview.com/view…
πŸ‘︎ 63
πŸ’¬︎
πŸ‘€︎ u/nastratin
πŸ“…︎ Oct 30 2014
🚨︎ report
Help me understand the difference between Neural Turing Machines and LSTMs

Are these two concepts, long-term dependence and coupling an NN to an external memory, somewhat related? Any help or resources appreciated.

πŸ‘︎ 5
πŸ’¬︎
πŸ‘€︎ u/dil8
πŸ“…︎ Nov 20 2015
🚨︎ report
Neural Turing Machines: How are all the parts connected?

I've been studying Neural Turing Machines (NTMs) recently, and am having a hard time understanding the overall flow of the model.

The paper (https://arxiv.org/abs/1410.5401) describes really well how individual components of the Turing machine have been substituted to make their parameters differentiable. However, I'm not able to understand how the read/write heads are connected to the controller. My guess is that the read and write heads are like two feed-forward networks connected to the same layer in the controller (if the controller is feed-forward). Also, once the controller issues a write operation, is it the updated contents of the memory or the value of the emitted write head that is compared to the desired output to calculate the error?

Another part of the model that I don't understand is the interpolation step. Why does the current memory read/write vector depend on the previous step?
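On the interpolation question: the gate is exactly what makes the current weighting depend on the previous step, so a head can keep iterating from wherever it was focused instead of re-finding its place by content every step. A minimal numpy rendering of the four-stage addressing pipeline from section 3.3 of the paper (a sketch, not production code):

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def address(memory, key, beta, g, s, gamma, w_prev):
    """NTM addressing: content focus -> interpolate -> shift -> sharpen."""
    # 1. content addressing: cosine similarity, sharpened by beta
    sim = memory @ key / (np.linalg.norm(memory, axis=1)
                          * np.linalg.norm(key) + 1e-8)
    w_c = softmax(beta * sim)
    # 2. interpolation: g=1 uses pure content focus, g=0 keeps last step's
    w_g = g * w_c + (1 - g) * w_prev
    # 3. circular-convolution shift by the distribution s
    n = len(w_g)
    w_s = np.array([sum(w_g[j] * s[(i - j) % n] for j in range(n))
                    for i in range(n)])
    # 4. sharpening
    w = w_s ** gamma
    return w / w.sum()

# With g=0 and an identity shift, the head simply stays where it was:
w_prev = np.array([0.0, 1.0, 0.0, 0.0])
s_id = np.array([1.0, 0.0, 0.0, 0.0])   # all probability mass on shift 0
w = address(np.eye(4), np.ones(4), beta=1.0, g=0.0,
            s=s_id, gamma=1.0, w_prev=w_prev)
```

Setting `g` closer to 1 instead re-anchors the head by content lookup, so the controller can choose per step between "continue from here" and "jump to what matches this key".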

Can anybody please help me with this?

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/PrithviJC
πŸ“…︎ Jun 09 2016
🚨︎ report
Reinforcement Learning Neural Turing Machines (paper+code) gitxiv.com/posts/ZuSYfufo…
πŸ‘︎ 18
πŸ’¬︎
πŸ‘€︎ u/samim23
πŸ“…︎ Jan 04 2016
🚨︎ report
From the latest Trends in Cognitive Sciences; "The human Turing machine: a neural framework for mental programs" [PDF] neurociencia.df.uba.ar/pa…
πŸ‘︎ 26
πŸ’¬︎
πŸ‘€︎ u/Burnage
πŸ“…︎ Jul 06 2011
🚨︎ report
Symposium: Deep Learning, Neural Turing Machines - Alex Graves youtube.com/watch?v=_H0i0…
πŸ‘︎ 8
πŸ’¬︎
πŸ‘€︎ u/nigh8w0lf
πŸ“…︎ Jun 21 2016
🚨︎ report
SmoothTapeNet - open-source wave-based neural Turing machine with 1 tape per node and 4 edge types (move, writeValue, writeDecay, stdDev) - moves and bends like signal processing instead of being quantized

The prototype is a fully connected (times 4 edge types) neural net of 8 tapes with random edges. The edges don't change in this version.

In each tape there are 4 vectors, each a set of edges from all the tapes (a variable-size bellcurve view of its center). Each of these is weighted-summed, then passed through a sigmoid, then scaled into a range. MOVE controls direction and speed. WRITEVALUE is the target value to decay the center toward. WRITEDECAY is how much to decay toward that target. STDDEV defines a bellcurve to read and write at. The decay is actually the bellcurve height times WRITEDECAY.
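As I read the WRITEVALUE/WRITEDECAY description above, the continuous write window could be sketched like this (the function names and the normalization of the window are my assumptions, not code from the repo):

```python
import numpy as np

def bellcurve_weights(n, center, stddev):
    """Gaussian window over tape positions, normalized to sum to 1
    (my reading of the post's variable-size bellcurve view of a center)."""
    x = np.arange(n)
    w = np.exp(-0.5 * ((x - center) / stddev) ** 2)
    return w / w.sum()

def decay_write(tape, window, target, decay):
    """Decay each cell toward `target` by the bellcurve height at that
    cell times the decay gate (the post's WRITEVALUE and WRITEDECAY)."""
    step = window * decay
    return tape * (1 - step) + target * step

tape = np.zeros(8)
window = bellcurve_weights(8, center=3.0, stddev=1.0)
tape = decay_write(tape, window, target=1.0, decay=1.0)
# the cell under the bellcurve's center moves furthest toward the target
```

A fractional `center` and `stddev` make both the position and the width of the access continuous, which is what lets the tape "move and bend like signal processing" rather than jumping whole cells.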

Video: https://www.youtube.com/watch?v=u5NjaUbjxYk

Download then doubleclick this file to run the interactive prototype seen in the video: https://github.com/benrayfield/smoothtapenet/releases/download/v0.1/SmoothTapeNet_0.1_doubleClickToRun_or_unzipToGetSource.jar

Code: https://github.com/benrayfield/smoothtapenet

Hold the mouse button to pull the tape's center toward the mouse height, with bellcurve density left/right. Moving the mouse left/right slides a tape. The 2 lines show 1 stdDev. If you draw random curves in some of the nodes, you will see it create nonlinear curves and movements back and forth. It normally stabilizes on a single direction, or vibrates back and forth per node, since it's a random neural net. It will sometimes look like a heartbeat. For longer Turing completeness it has to be trained.

I'm looking for help designing a learning algorithm for time series. This is a new kind of AI that's well defined in how it runs but not in how to adjust the weights. This could be an AGI.

πŸ‘︎ 5
πŸ’¬︎
πŸ‘€︎ u/BenRayfield
πŸ“…︎ Mar 18 2017
🚨︎ report
[1607.00036] Dynamic Neural Turing Machine with Soft and Hard Addressing Schemes arxiv.org/abs/1607.00036
πŸ‘︎ 15
πŸ’¬︎
πŸ‘€︎ u/cesarsalgado
πŸ“…︎ Jul 04 2016
🚨︎ report
Neural Turing Machine

Hi folks, I have a few questions about NTMs. Are there any extensions to these models? And are there any implementations of these models in Torch? Thanks

πŸ‘︎ 4
πŸ’¬︎
πŸ‘€︎ u/hassanzadeh
πŸ“…︎ Jun 18 2016
🚨︎ report
Google's Secretive DeepMind Start-up Unveils A "Neural Turing Machine" technologyreview.com/view…
πŸ‘︎ 21
πŸ’¬︎
πŸ‘€︎ u/SimUnit
πŸ“…︎ Oct 30 2014
🚨︎ report
Real Valued Neural Networks more powerful than Turing Machines gpickard.wordpress.com/20…
πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/werg
πŸ“…︎ May 30 2008
🚨︎ report
Introduction to neural Turing machines

Can anybody recommend a paper or (blog) article as an introduction to neural Turing machines?

πŸ‘︎ 6
πŸ’¬︎
πŸ‘€︎ u/themoosemind
πŸ“…︎ May 24 2016
🚨︎ report
