ReLU is a switch: it either passes its input through directly or is off. The only other thing in a basic neural network is the dot product (weighted sum). There is then no reason not to throw in other dot-product algorithms, such as the FFT, if they are more efficient in some way. And if you are stuck for something to parameterize, you can parameterize the activation functions themselves.
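A rough NumPy sketch of those points (the function names and the per-unit slope parameter are my own illustration, not from any particular library):

```python
import numpy as np

# ReLU as a switch: either pass the input through (x > 0) or turn it off,
# i.e. multiply by a 0/1 gate decided by the sign of the input.
def relu(x):
    return np.where(x > 0, x, 0.0)

# The FFT is also dot products: each output bin k is the dot product of
# the signal with a fixed complex-exponential basis vector, but the whole
# set is computed in O(n log n) instead of O(n^2).
n = 8
x = np.arange(n, dtype=float)
k = 3
basis = np.exp(-2j * np.pi * k * np.arange(n) / n)
assert np.allclose(np.fft.fft(x)[k], np.dot(x, basis))

# Parameterizing the activation itself (a hypothetical sketch): a learned
# negative-side slope `a` makes the nonlinearity trainable.
def parametric_relu(x, a):
    return np.where(x > 0, x, a * x)
```

The `parametric_relu` above is essentially the PReLU idea: the switch still flips at zero, but what "off" does is now a parameter the optimizer can adjust.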
Also, you might be interested in this associative memory, which you could use with Neural Turing Machines (neural networks plus an external memory bank).