Functional Layers

Note

These functions expose the backend of Lux.jl. In the long term, we plan to move these into NNlib.

Lux.dropout — Function
dropout(rng::AbstractRNG, x, p, q, dims, ::Val{training})
dropout(rng::AbstractRNG, x, mask, p, q, dims, t::Val{training}, ::Val{update_mask})

If training is true, dropout is applied to x with probability p along dims. If a mask is passed and update_mask is false, the provided mask is reused; if update_mask is true, a new mask is generated and used instead.
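In standard (inverted) dropout, q is the rescaling factor 1 / (1 - p) applied to the kept entries so that the expected value of the output matches the input. Below is a minimal sketch of the semantics described above under that assumption; the helper names (_dropout_mask, _dropout) are hypothetical and a plain Bool stands in for the Val arguments. This is not the Lux.jl implementation.

using Random

function _dropout_mask(rng::AbstractRNG, x::AbstractArray, p, q, dims)
    # The mask has size 1 along dimensions not listed in dims, so it broadcasts over them.
    msize = dims === Colon() ? size(x) :
            ntuple(i -> i in dims ? size(x, i) : 1, ndims(x))
    # Keep each entry with probability 1 - p and rescale kept entries by q.
    return (rand(rng, eltype(x), msize) .> p) .* q
end

function _dropout(rng::AbstractRNG, x, p, q, dims, training::Bool;
                  mask=nothing, update_mask::Bool=true)
    if !training
        return x, mask                             # inference: pass x through unchanged
    end
    if mask === nothing || update_mask
        mask = _dropout_mask(rng, x, p, q, dims)   # generate a fresh mask
    end
    return x .* mask, mask                         # apply the (possibly reused) mask
end

# Example usage of the sketch:
rng = Random.default_rng()
x = randn(Float32, 4, 3)
p = 0.5f0
y, mask = _dropout(rng, x, p, inv(1 - p), Colon(), true)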

Lux.normalization — Function
normalization(x, running_mean, running_var, scale, bias, activation, reduce_dims,
              ::Val{training}, momentum, epsilon)

Performs BatchNorm/GroupNorm/InstanceNorm depending on the input configuration.

Note

Detailed docs are WIP
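
Until the detailed docs land, the following is a rough sketch of what a generic normalization pass of this shape typically computes, not the Lux.jl implementation: batch statistics are taken over reduce_dims during training and blended into the running statistics with momentum, the stored running statistics are used at inference, and the normalized result is scaled, shifted, and passed through activation. The helper name _normalization is hypothetical, training is a plain Bool in place of the Val argument, and the running statistics are assumed to be stored in shapes that broadcast against the batch statistics.

using Statistics

function _normalization(x, running_mean, running_var, scale, bias, activation,
                        reduce_dims, training::Bool, momentum, epsilon)
    if training || running_mean === nothing
        # Batch statistics over the reduction dimensions.
        mu = mean(x; dims=reduce_dims)
        v  = mean(abs2, x .- mu; dims=reduce_dims)
        if running_mean !== nothing
            # Exponential moving average of the running statistics.
            running_mean = (1 - momentum) .* running_mean .+ momentum .* mu
            running_var  = (1 - momentum) .* running_var  .+ momentum .* v
        end
    else
        # Inference: use the stored running statistics.
        mu, v = running_mean, running_var
    end
    x_norm = (x .- mu) ./ sqrt.(v .+ epsilon)
    y = scale === nothing ? x_norm : scale .* x_norm .+ bias
    return activation.(y), running_mean, running_var
end

# Example: an InstanceNorm-like call with no running statistics or affine parameters.
x = randn(Float32, 8, 8, 4, 2)
y, _, _ = _normalization(x, nothing, nothing, nothing, nothing, identity,
                         (1, 2), true, 0.1f0, 1.0f-5)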