NeuralArithmetic.NAC - Type
NAC(in::Int, out::Int; initW=glorot_uniform, initM=glorot_uniform)

Neural Accumulator as proposed in https://arxiv.org/abs/1808.00508
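
A minimal usage sketch, assuming the layer follows the Flux callable-layer convention (features × batch input); the input values are illustrative only:

using Flux, NeuralArithmetic

nac = NAC(2, 1)              # 2 inputs -> 1 output
x = Float32[1 2 3; 4 5 6]    # 2×3 batch (features × samples)
nac(x)                       # 1×3 output; a trained NAC approximates sums/differences of its inputs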

NeuralArithmetic.NALU - Type
NALU(in::Int, out::Int; initNAC=glorot_uniform, initG=glorot_uniform, initb=glorot_uniform)

Neural Arithmetic Logic Unit. A layer capable of learning multiplication, division, power functions, addition, and subtraction. As proposed in https://arxiv.org/abs/1808.00508
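
A hedged training sketch, assuming a Flux version with implicit parameters (Flux.params), as was common around this package's era; the target function and hyperparameters are illustrative:

using Flux, NeuralArithmetic

nalu = NALU(2, 1)
x = rand(Float32, 2, 100) .+ 1f0   # positive inputs; the log-space path is ill-behaved near 0
y = sum(x, dims=1)                 # target: x1 + x2

ps = Flux.params(nalu)
opt = ADAM(1e-2)
for _ in 1:1000
    gs = gradient(() -> Flux.Losses.mse(nalu(x), y), ps)
    Flux.Optimise.update!(opt, ps, gs)
end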

NeuralArithmetic.NAU - Type
NAU(in::Int, out::Int; init=glorot_uniform)

Neural addition unit. Can represent additions and subtractions between inputs.

As suggested in https://openreview.net/pdf?id=H1gNOeHKPS
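
A minimal sketch of the intended use, assuming the layer acts as a plain Flux-style linear map whose ideal weights lie in {-1, 0, 1}:

using Flux, NeuralArithmetic

nau = NAU(3, 1)
x = Float32[1, 2, 3]
nau(x)   # computes W*x; with ideal weights [1 1 -1] this is x1 + x2 - x3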

NeuralArithmetic.NMU - Type
NMU(in::Int, out::Int; init=rand)

Neural multiplication unit. Can represent multiplications between inputs. Weights are clipped to [0,1].

As introduced in https://openreview.net/pdf?id=H1gNOeHKPS
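
A minimal forward-pass sketch. Per the paper, the NMU computes y_j = prod_i (W[j,i]*x[i] + 1 - W[j,i]), so a weight of 1 includes an input as a factor and a weight of 0 gates it out; the values below are illustrative:

using Flux, NeuralArithmetic

nmu = NMU(2, 1)
x = Float32[2, 3]
nmu(x)   # with both weights at 1 this approaches x1*x2 = 6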

NeuralArithmetic.NPU - Type
NPU(in::Int, out::Int; initRe=glorot_uniform, initIm=Flux.zeros, initg=init05)

Neural Power Unit that can learn arbitrary power functions by using complex weights. Uses gating on inputs to simplify learning. In 1D the layer looks like:

g = min(max(g, 0), 1)
r = abs(x) + eps(T)
r = g*r + (1-g)*T(1)
k = x < 0 ? pi : 0.0
exp(Re*log(r) - Im*k) * cos(Re*k + Im*log(r))
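
The pseudocode above translates into a runnable scalar sketch. The function name npu1d is illustrative, and the line k = g*k (gating the phase term as well) follows the NPU paper rather than the pseudocode above, so treat it as an assumption:

# Scalar sketch of the 1D NPU forward pass.
function npu1d(Re::T, Im::T, g::T, x::T) where T<:Real
    g = min(max(g, zero(T)), one(T))    # clip the gate to [0,1]
    r = abs(x) + eps(T)                 # magnitude, kept away from 0 for the log
    r = g*r + (1 - g)*one(T)            # gated-out inputs contribute the neutral factor 1
    k = x < zero(T) ? T(pi) : zero(T)   # phase flag marking negative inputs
    k = g*k                             # phase gating as in the NPU paper (assumption, see above)
    exp(Re*log(r) - Im*k) * cos(Re*k + Im*log(r))
end

npu1d(2.0, 0.0, 1.0, 3.0)   # ≈ 9.0: with Re=2, Im=0 the unit computes x^2
npu1d(3.0, 0.0, 1.0, -2.0)  # ≈ -8.0: the phase term recovers the sign of odd powers
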
NeuralArithmetic.NaiveNPU - Type
NaiveNPU(in::Int, out::Int; initRe=glorot_uniform, initIm=zeros)

NPU without the relevance gating mechanism.

NeuralArithmetic.RealNPU - Type
RealNPU(in::Int, out::Int; initRe=glorot_uniform, initg=init05)

NPU without imaginary weights.
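
Both reduced variants are constructed like the full NPU; a minimal sketch:

using Flux, NeuralArithmetic

naive   = NaiveNPU(2, 2)   # complex weights, no relevance gate
realnpu = RealNPU(2, 2)    # relevance gate, real weights only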

NeuralArithmetic.iNALU - Type
iNALU(in::Int, out::Int; initNAC=glorot_uniform, initG=glorot_uniform, ϵ=1e-7, ω=20)

Improved NALU that can process negative numbers by recovering the sign of the multiplication. Implemented as suggested in https://arxiv.org/abs/2003.07629
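
A minimal forward-pass sketch, assuming the same callable-layer convention as the other units; note the mixed-sign inputs, which the plain NALU cannot handle:

using Flux, NeuralArithmetic

inalu = iNALU(2, 1)
x = Float32[-2 4; 3 -5]   # mixed-sign inputs
inalu(x)                  # sign recovery lets the multiplicative path return negative values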