Faster Neural Networks

ReLU is a switch: it either passes its input through directly or turns it off. The only other operation in a basic neural network is the dot product (a weighted sum). There is then no reason not to substitute other dot-product algorithms, such as the FFT, where they are more efficient; the FFT computes a whole bank of dot products with fixed sinusoidal basis vectors in O(n log n) time rather than the O(n^2) of a dense weight matrix. If you are then left with nothing to parameterize, you can parameterize the activation functions themselves, as in the sketch after the link below.
https://ai462qqq.blogspot.com/2019/11/artificial-neural-networks.html
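Here is a minimal sketch of that idea, under some assumptions of mine: numpy, a real-input FFT standing in for the fixed weighted-sum layer, and a two-slope "switch" activation whose slopes are the learned parameters. It is only an illustration of the combination described above, not the exact construction from the linked post.

import numpy as np

def fft_layer(x):
    # The FFT is the "weighted sum" here: n dot products against fixed
    # sinusoidal basis vectors, computed in O(n log n) instead of O(n^2).
    f = np.fft.fft(x)
    # Interleave real and imaginary parts to stay in the real domain.
    return np.concatenate([f.real, f.imag]) / np.sqrt(len(x))

def switch_activation(x, slope_pos, slope_neg):
    # ReLU generalized: instead of "pass through or off", each element is
    # switched between two learnable slopes depending on its sign.
    return np.where(x >= 0.0, slope_pos * x, slope_neg * x)

rng = np.random.default_rng(0)
n = 8
x = rng.standard_normal(n)

# Parameterizing the activation functions themselves:
# one learnable slope pair per output element.
slope_pos = rng.standard_normal(2 * n)
slope_neg = rng.standard_normal(2 * n)

h = switch_activation(fft_layer(x), slope_pos, slope_neg)
print(h)

Stacking several such fixed-transform-plus-parameterized-activation layers gives a network whose cost per layer is O(n log n), which is where the speed-up over dense layers comes from.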

You might also be interested in this associative memory, which you could use with Neural Turing Machines (neural networks plus an external memory bank); a small sketch of a content-addressable read follows the link.
https://discourse.processing.org/t/experiment-with-associative-memory/9966
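Below is a minimal sketch of the kind of lookup such an associative memory (or a Neural Turing Machine read head) performs: a cosine-similarity, content-addressed read over an external memory matrix. The function names, sizes, and the sharpness constant are my own illustrative choices, not the implementation from the linked experiment.

import numpy as np

def cosine_similarity(key, memory):
    # Similarity of the query key against every stored memory row.
    num = memory @ key
    den = np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8
    return num / den

def associative_read(key, memory, sharpness=10.0):
    # A softmax over similarities gives a soft address; the read is a
    # similarity-weighted blend of the stored rows, so a noisy or partial
    # key still recalls the closest stored pattern.
    sim = sharpness * cosine_similarity(key, memory)
    weights = np.exp(sim - sim.max())
    weights /= weights.sum()
    return weights @ memory

rng = np.random.default_rng(1)
memory = rng.standard_normal((16, 32))   # 16 stored patterns of width 32

stored = memory[3]
noisy_key = stored + 0.3 * rng.standard_normal(32)
recalled = associative_read(noisy_key, memory)

# The recall should be dominated by the stored pattern closest to the
# noisy key (typically index 3 here).
print(cosine_similarity(recalled, memory).argmax())

Because the read is differentiable, the same mechanism can sit behind a neural network as the external memory bank an NTM uses.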