AugmentedGPLikelihoods.StudentTLikelihoodType
StudentTLikelihood(ν::Real, σ::Real)

Likelihood based on the Student-T distribution:

$$$p(y|f,\sigma, \nu) = \frac{\Gamma\left(\frac{\nu+1}{2}\right)}{\Gamma\left(\frac{\nu}{2}\right)\sqrt{\pi\nu}\sigma}\left(1 + \frac{1}{\nu}\left(\frac{y-f}{\sigma}\right)^2\right)^{-\frac{\nu+1}{2}}.$$$

Arguments

• ν::Real, number of degrees of freedom; must be larger than 0.5 to be able to compute moments.
• σ::Real, scale parameter.
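As a sanity check, the density above can be evaluated directly in log form. A minimal sketch, assuming SpecialFunctions.jl is available for `loggamma` (it is not part of Julia's standard library):

```julia
using SpecialFunctions: loggamma  # log Γ; assumed dependency

# Log-density of the Student-T likelihood p(y | f, ν, σ) given above.
function studentt_loglik(y, f, ν, σ)
    z = (y - f) / σ
    return loggamma((ν + 1) / 2) - loggamma(ν / 2) -
           log(sqrt(π * ν) * σ) - (ν + 1) / 2 * log1p(z^2 / ν)
end
```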
AugmentedGPLikelihoods.aug_loglikFunction
aug_loglik(lik::Likelihood, Ω, y, f) -> Real

Return the augmented log-likelihood with the given parameters. The augmented log-likelihood is of the form

$$$\log p(y,\Omega|f) = \log l(y,\Omega,f) + \log p(\Omega|y, f).$$$

To obtain only the $p(\Omega|y, f)$ part, see aux_prior; for $\log l(y, \Omega, f)$, see logtilt.

A generic fallback exists based on logtilt and aux_prior but specialized implementations are encouraged.

AugmentedGPLikelihoods.auglik_potentialFunction
auglik_potential(lik::Likelihood, Ω, y) -> Tuple

Given the augmented likelihood $l(\Omega,y,f) \propto \exp(\beta(\Omega,y) f + \frac{\gamma(\Omega,y)}{2}f^2)$, return the potential $\beta(\Omega,y)$. Note that this is equivalent to the shift of the first natural parameter $\eta_1 = \Sigma^{-1}\mu$. The Tuple contains a Vector for each latent.

AugmentedGPLikelihoods.auglik_precisionFunction
auglik_precision(lik::Likelihood, Ω, y) -> Tuple

Given the augmented likelihood $l(\Omega,y,f) \propto \exp(\beta(\Omega,y) f + \frac{\gamma(\Omega,y)}{2}f^2)$, return the precision $\gamma(\Omega,y)$. Note that this is equivalent to the shift of the precision $\Lambda = \Sigma^{-1}$. The Tuple contains a Vector for each latent.
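To see where these two quantities enter, note that combining the augmented likelihood with a Gaussian prior $\mathcal{N}(f|0, K)$ gives a Gaussian conditional with precision $K^{-1} - \mathrm{Diag}(\gamma)$ and mean $\Sigma\beta$. A sketch with placeholder numbers (β, γ, and K below are made up, not package output):

```julia
using LinearAlgebra

# Placeholder values standing in for auglik_potential / auglik_precision.
β = [0.5, -0.2, 1.0]            # potential β(Ω, y)
γ = [-0.8, -1.2, -0.5]          # precision shift γ(Ω, y); negative values
K = Symmetric([1.0 0.3 0.1;     # GP prior covariance
               0.3 1.0 0.3;
               0.1 0.3 1.0])

# Gaussian conditional p(f | y, Ω) ∝ N(f | 0, K) exp(βᵀf + fᵀ Diag(γ) f / 2):
Σ = inv(inv(K) - Diagonal(γ))   # covariance (K⁻¹ - Γ)⁻¹
μ = Σ * β                       # mean
```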

AugmentedGPLikelihoods.aux_full_conditionalFunction
aux_full_conditional(lik::Likelihood, y, f::Real) -> Distribution

Given the observation y and latent f, returns the full conditional on the auxiliary variables Ω.

AugmentedGPLikelihoods.aux_kldivergenceFunction
aux_kldivergence(lik::Likelihood, qΩ::For, pΩ::For) -> Real
aux_kldivergence(lik::Likelihood, qΩ::For, y) -> Real

Compute the analytical KL divergence between the posterior over the auxiliary variables $q(\Omega)$, obtained with aux_posterior, and the prior $p(\Omega)$, obtained with aux_prior.

AugmentedGPLikelihoods.aux_posteriorFunction
aux_posterior(lik::Likelihood, y, qf) -> AbstractProductMeasure

Compute the optimal posterior of the auxiliary variables in a new AbstractProductMeasure (For by default).

See aux_posterior! for more details

AugmentedGPLikelihoods.aux_posterior!Function
aux_posterior!(qΩ, lik::Likelihood, y, qf) -> AbstractProductMeasure

Compute the optimal posterior of the auxiliary variables $q^*(\Omega)$ given the marginal distributions qf by updating the variational parameters in-place using the formula

$$$q^*(\Omega) \propto \exp\left(E_{q(f)}\left[\log p(\Omega|f,y)\right]\right).$$$
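In a full coordinate-ascent variational loop, this update alternates with the Gaussian update of $q(f)$. A hedged sketch, where `gaussian_update` and the surrounding variables are hypothetical user code, not part of the package:

```julia
# Hypothetical CAVI loop: alternate the closed-form q(Ω) update with
# a Gaussian q(f) update built from the expected potential/precision.
qΩ = init_aux_posterior(lik, length(y))
for _ in 1:niter
    aux_posterior!(qΩ, lik, y, qf)                  # q*(Ω) given q(f)
    qf = gaussian_update(expected_auglik_potential(lik, qΩ, y),
                         expected_auglik_precision(lik, qΩ, y))
end
```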

AugmentedGPLikelihoods.aux_sampleFunction
aux_sample([rng::AbstractRNG], lik::Likelihood, y, f) -> TupleVector

Sample and allocate the auxiliary variables Ω in a TupleVector based on the full-conditional associated with the likelihood.

See aux_sample! for an in-place version.

AugmentedGPLikelihoods.aux_sample!Function
aux_sample!([rng::AbstractRNG], Ω, lik::Likelihood, y, f) -> TupleVector

Sample the auxiliary variables Ω in-place based on the full-conditional aux_full_conditional associated with the augmented likelihood:

$$$p(\Omega|y,f) \propto p(\Omega,y|f).$$$

See also aux_sample for an allocating version and aux_posterior! for variational inference.
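For inference by sampling, the two conditionals form a blocked Gibbs sampler. A hedged sketch; `sample_gaussian_conditional` and the surrounding variables are hypothetical user code, not part of the package:

```julia
# Hypothetical blocked Gibbs sampler alternating Ω | y, f and f | y, Ω.
Ω = init_aux_variables(lik, length(y))
for _ in 1:nsamples
    aux_sample!(rng, Ω, lik, y, f)                  # Ω | y, f
    f = sample_gaussian_conditional(rng,            # f | y, Ω is Gaussian,
        auglik_potential(lik, Ω, y),                # shifted by the potential
        auglik_precision(lik, Ω, y), K)             # and the precision
end
```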

AugmentedGPLikelihoods.expected_aug_loglikFunction
expected_aug_loglik(lik::Likelihood, qΩ, y, qf) -> Real

Return the expected augmented log-likelihood with the given parameters. The expected augmented log-likelihood is of the form

$$$E_{q(\Omega,f)}\left[\log p(y,\Omega|f)\right]$$$

To obtain only the $p(\Omega|y)$ part, see aux_prior; for $\log l(y, \Omega, f)$, see logtilt.

A generic fallback exists based on expected_logtilt and aux_kldivergence but specialized implementations are encouraged.

AugmentedGPLikelihoods.expected_auglik_potentialFunction
expected_auglik_potential(lik::Likelihood, qΩ, y) -> Tuple

Given the augmented likelihood $l(\Omega,y,f) \propto \exp(\beta(\Omega,y) f + \frac{\gamma(\Omega,y)}{2}f^2)$, return the expected potential $E_{q(\Omega)}[\beta(\Omega,y)]$. Note that this is equivalent to the shift of the first variational natural parameter $\eta_1 = \Sigma^{-1}\mu$. The Tuple contains a Vector for each latent.

AugmentedGPLikelihoods.expected_auglik_precisionFunction
expected_auglik_precision(lik::Likelihood, qΩ, y) -> Tuple

Given the augmented likelihood $l(\Omega,y,f) \propto \exp(\beta(\Omega,y) f + \frac{\gamma(\Omega,y)}{2}f^2)$, return the expected precision $E_{q(\Omega)}[\gamma(\Omega,y)]$. Note that this is equivalent to the shift of the variational precision $\Lambda = \Sigma^{-1}$. The Tuple contains a Vector for each latent.

AugmentedGPLikelihoods.expected_logtiltMethod
expected_logtilt(lik::Likelihood, qΩ, y, qf) -> Real

Compute the expectation under $q(\Omega, f)$ of the part of the augmented log-likelihood that is quadratic in $f$:

$$$E_{q(\Omega,f)}\left[\log C(\Omega, y) + \alpha(\Omega,y) + \beta(\Omega,y)f + \frac{\gamma(\Omega,y)}{2}f^2\right].$$$
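Assuming the standard variational decomposition (not stated explicitly in this section), these pieces combine into the ELBO of the augmented model:

$$$\mathcal{L} = E_{q(\Omega,f)}\left[\log l(y,\Omega,f)\right] - \operatorname{KL}\left(q(\Omega)\|p(\Omega)\right) - \operatorname{KL}\left(q(f)\|p(f)\right),$$$

where the first term is given by expected_logtilt and the second by aux_kldivergence.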

AugmentedGPLikelihoods.init_aux_posteriorFunction
init_aux_posterior([T::DataType], lik::Likelihood, n::Int) -> AbstractProductMeasure

Initialize a collection of n (independent) posteriors for the auxiliary variables in the context of variational inference. n should be the number of data points used at each iteration. The real variational parameters will be given type T (Float64 by default).

AugmentedGPLikelihoods.init_aux_variablesFunction
init_aux_variables([rng::AbstractRNG], ::Likelihood, n::Int) -> TupleVector

Initialize a collection of n auxiliary variables in a TupleVector, to be used in the context of sampling. n should be the number of data inputs.

AugmentedGPLikelihoods.logtiltFunction
logtilt(lik::Likelihood, Ω, y, f) -> Real

Compute the part of the augmented log-likelihood that is quadratic in $f$:

$$$\log C(\Omega, y) + \alpha(\Omega,y) + \beta(\Omega,y)f + \frac{\gamma(\Omega,y)}{2}f^2.$$$

AugmentedGPLikelihoods.SpecialDistributions.AbstractNTDistType

AbstractNTDist is an abstract type for wrappers around measure(s) and distribution(s). The main idea is that, instead of the usual outputs of rand, mean, and other statistical functions, wrapped objects return NamedTuples (or a TupleVector when working with a collection of them).

The following API has to be implemented: Given π::AbstractNTDist and Π::AbstractVector{<:AbstractNTDist}

Necessary

• ntrand(rng, π) -> NamedTuple
• ntmean(π) -> NamedTuple
• MeasureBase.logdensity_def(π, x::NamedTuple) -> Real
• Distributions.kldivergence(π₀, π₁) -> Real

Optional

• tvrand(rng, Π) -> TupleVector
• tvmean(Π) -> TupleVector
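A toy sketch of the necessary API for a wrapper around a univariate normal on $\omega$ (the type `NTNormal` is made up for illustration; `logdensity_def` would normally extend MeasureBase.logdensity_def):

```julia
using Random

# Toy AbstractNTDist-style wrapper around a normal distribution on ω.
struct NTNormal
    μ::Float64
    σ::Float64
end

# Samples and statistics come back as NamedTuples, per the API above.
ntrand(rng::AbstractRNG, π::NTNormal) = (; ω = π.μ + π.σ * randn(rng))
ntmean(π::NTNormal) = (; ω = π.μ)

# Log-density evaluated on a NamedTuple.
logdensity_def(π::NTNormal, x::NamedTuple) =
    -((x.ω - π.μ) / π.σ)^2 / 2 - log(π.σ) - log(2π) / 2
```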
AugmentedGPLikelihoods.SpecialDistributions.NegativeMultinomialType
NegativeMultinomial(x₀::Real, p::AbstractVector)

Negative Multinomial distribution defined as

$$$p(\boldsymbol{x}|x_0, \boldsymbol{p}) = \Gamma\left(\sum_{i=0}^M x_i \right)\frac{p_0^{x_0}}{\Gamma(x_0)}\prod_{i=1}^M \frac{p_i^{x_i}}{x_i!}$$$

where $p_0= 1-\sum_{i=1}^M p_i$.

For a detailed understanding of this distribution, see "Negative multinomial distribution" - Sibuya et al. - 1964
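The log-pmf follows directly from the definition, using $x_i! = \Gamma(x_i + 1)$; a sketch assuming SpecialFunctions.jl for `loggamma` (not part of Julia's standard library):

```julia
using SpecialFunctions: loggamma  # assumed dependency, not in Base

# Log-pmf of NegativeMultinomial(x₀, p) at x = (x₁, …, x_M),
# with p₀ = 1 - Σᵢ pᵢ as defined above.
function negmultinomial_logpdf(x₀, p, x)
    p₀ = 1 - sum(p)
    return loggamma(x₀ + sum(x)) - loggamma(x₀) + x₀ * log(p₀) +
           sum(x .* log.(p)) - sum(loggamma.(x .+ 1))
end
```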

AugmentedGPLikelihoods.SpecialDistributions.PolyaGammaType
PolyaGamma(b::Real, c::Real) <: ContinuousUnivariateDistribution

Arguments

• b::Real, shape parameter
• c::Real, exponential tilting parameter

Create a PolyaGamma sampler with parameters b and c. Note that the sampling algorithm differs depending on whether b is an Int or a more general Real.
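For reference, the first moment of $\operatorname{PG}(b, c)$ has the well-known closed form $E[\omega] = \frac{b}{2c}\tanh\left(\frac{c}{2}\right)$, with limit $b/4$ as $c \to 0$:

```julia
# Mean of a Pólya-Gamma PG(b, c) variable (closed form).
pg_mean(b, c) = iszero(c) ? b / 4 : b / (2c) * tanh(c / 2)
```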

AugmentedGPLikelihoods.SpecialDistributions.PolyaGammaNegativeMultinomialType
PolyaGammaNegativeMultinomial(y::BitVector, c::AbstractVector{<:Real}, p::AbstractVector{<:Real})

A multivariate distribution, used as a hierarchical prior:

$$$p(\boldsymbol{\omega}, \boldsymbol{n}) = \operatorname{NM}(\boldsymbol{n}|1, \boldsymbol{p})\prod_{i=1}^K\operatorname{PG}(\omega_i|y_i + n_i, c_i).$$$

Random samples, as well as statistics of the distribution, are returned as a NamedTuple: (;ω, n).

This structured distribution is needed for the CategoricalLikelihood with a LogisticSoftMaxLink.

AugmentedGPLikelihoods.SpecialDistributions.PolyaGammaPoissonType
PolyaGammaPoisson(y::Real, c::Real, λ::Real)

A bivariate distribution, used as a hierarchical prior:

$$$p(\omega, n) = \operatorname{PG}(\omega|y + n, c)\operatorname{Po}(n|\lambda).$$$

Random samples, as well as statistics of the distribution, are returned as a NamedTuple: (;ω, n).

This structured distribution is needed, for example, for the PoissonLikelihood.
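Sampling from this hierarchical prior is ancestral: draw $n \sim \operatorname{Po}(\lambda)$, then $\omega \sim \operatorname{PG}(y + n, c)$. A sketch where `rand_pg` is a hypothetical Pólya-Gamma sampler (a simple Knuth-style Poisson sampler stands in for a library call):

```julia
using Random

# Knuth's Poisson sampler; adequate for moderate λ.
function rand_poisson(rng::AbstractRNG, λ)
    L = exp(-λ)
    k, p = 0, 1.0
    while true
        p *= rand(rng)
        p < L && return k
        k += 1
    end
end

# Ancestral sample from p(ω, n) = PG(ω | y + n, c) Po(n | λ),
# returned as a NamedTuple. `rand_pg` is hypothetical, not defined here.
function rand_pgpoisson(rng::AbstractRNG, y, c, λ)
    n = rand_poisson(rng, λ)
    ω = rand_pg(rng, y + n, c)
    return (; ω, n)
end
```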