Developer API

Summary

This page describes how to create a custom AbstractMuseProblem type. You might want to do this if you are creating a new interface between MuseInference and some PPL package that is not currently supported, or if you have a problem which SimpleMuseProblem cannot handle. You do not need to do this if you have a model you can describe with a supported PPL package (Turing or Soss), or if SimpleMuseProblem is sufficient for you.

As a reminder, MUSE works on joint posteriors of the form,

\[\mathcal{P}(x,z\,|\,\theta) \mathcal{P}(\theta)\]

where $x$ represents one or more observed variables, $z$ represents one or more latent variables, and $\theta$ represents one or more hyperparameters which will be estimated by MUSE. The interface below more or less involves mapping these variables from your original problem to the form expected by MuseInference. The minimum functions you need to implement to get MUSE working are standardizeθ, sample_x_z, ∇θ_logLike, and logLike_and_∇z_logLike, each documented in the Contents section below; the remaining functions have defaults which you can optionally override.

The $(x,z,\theta)$ can be any types which support basic arithmetic.
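
To make the rest of this page concrete, the sketches below build up a single hypothetical toy problem. The FunnelProblem type, its field names, and the model itself are illustrative assumptions and not part of MuseInference: an N-dimensional latent vector with prior $z_i \sim {\rm Normal}(0, e^{\theta/2})$ is observed with unit noise, $x_i \sim {\rm Normal}(z_i, 1)$, and the scalar $\theta$ is the hyperparameter to be estimated.

using MuseInference, Random

# Hypothetical toy model used throughout the sketches on this page:
#   θ        scalar hyperparameter (log-variance of z)
#   zᵢ ~ Normal(0, exp(θ/2))   for i = 1…N
#   xᵢ ~ Normal(zᵢ, 1)
# Dropping constants independent of z and θ, the joint log-likelihood is
#   log P(x,z|θ) = -½ Σᵢ(xᵢ-zᵢ)² - ½ exp(-θ) Σᵢ zᵢ² - ½ N θ
struct FunnelProblem <: MuseInference.AbstractMuseProblem
    N :: Int    # dimensionality of x and z
end

The later snippets on this page assume these imports and this struct definition.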

Internally, the default MuseInference.ẑ_at_θ performs a maximization over $z$ using logLike_and_∇z_logLike and Optim.jl's LBFGS solver. If you'd like, you can customize the entire maximization by implementing ẑ_at_θ yourself, in which case you do not need to implement logLike_and_∇z_logLike at all.

MuseInference assumes $z$ and $\theta$ have support on $(-\infty,\infty)$. For some problems, this may not be the case, e.g. if you have a $\theta \sim {\rm LogNormal}$, then $\theta$ only has support on $(0,\infty)$. If this is the case for your problem, you have three options:

  • If none of the internal solvers "bump up" against the edges of the support, then you don't need to do anything else.

  • Outside of MuseInference, you can perform a change-of-variables for $\theta$ and/or $z$ such that the transformed variables have support on $(-\infty,\infty)$, and implement the interface functions in terms of the transformed variables. In this case, MuseInference never knows (or needs to know) about the transformation, and the returned estimate of $\theta$ will be an estimate of the transformed $\theta$ (which, if desired, you can transform back outside of MuseInference). A minimal sketch of this approach is given just after this list.

  • If you would like MuseInference itself to return an estimate of the untransformed $\theta$, then you can implement transform_θ and inv_transform_θ, along with the θ_space-aware forms of ∇θ_logLike and logPriorθ, all documented in the Contents section below (a sketch follows the inv_transform_θ entry).

MuseInference doesn't provide an estimate of $z$, so if necessary, you should handle transforming it to $(-\infty,\infty)$ outside of MuseInference.
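
As a minimal sketch of the second option (the ScaleProblem type and its model are purely illustrative and not part of MuseInference): suppose the natural hyperparameter of a model is a positive scale σ. You can hand MuseInference the unconstrained φ = log σ and write every interface function in terms of φ, reusing the imports from the sketch above:

# Hypothetical model: zᵢ ~ Normal(0, σ), xᵢ ~ Normal(zᵢ, 1), with σ > 0.
# MuseInference only ever sees φ = log(σ), which has support on (-∞,∞).
struct ScaleProblem <: MuseInference.AbstractMuseProblem
    N :: Int
end

function MuseInference.sample_x_z(prob::ScaleProblem, rng::AbstractRNG, φ)
    σ = exp(φ)                      # map back to the constrained scale
    z = σ .* randn(rng, prob.N)
    x = z .+ randn(rng, prob.N)
    (x=x, z=z)
end

# ∇θ_logLike, logLike_and_∇z_logLike, etc. would likewise be written in terms
# of φ, and muse would then return an estimate of φ = log(σ).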

Once you define the custom AbstractMuseProblem, you can use MuseInference.check_self_consistency to run some self-consistency checks on it.

Contents

MuseInference.transform_θ (Function)
transform_θ(prob::AbstractMuseProblem, θ)

Map θ to a space where its domain is $(-\infty,\infty)$. Defaults to the identity function.

MuseInference.inv_transform_θ (Function)
inv_transform_θ(prob::AbstractMuseProblem, θ)

Map θ from the space where its domain is $(-\infty,\infty)$ back to the original space. Defaults to the identity function.
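
As a sketch of how these two functions might be overloaded together (the PositiveScaleProblem type is hypothetical): for a problem whose hyperparameter only has support on $(0,\infty)$, an elementwise log/exp pair is often sufficient:

# Hypothetical problem whose hyperparameter θ (e.g. a scale) lives on (0,∞).
struct PositiveScaleProblem <: MuseInference.AbstractMuseProblem end

# MuseInference works internally with log(θ), which is unconstrained...
MuseInference.transform_θ(prob::PositiveScaleProblem, θ) = log.(θ)

# ...and maps back with exp, so the returned estimate is in the original space.
MuseInference.inv_transform_θ(prob::PositiveScaleProblem, θ) = exp.(θ)

With these defined, ∇θ_logLike and logPriorθ should also accept the θ_space argument described below.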

MuseInference.sample_x_z (Function)

Return a tuple (x,z) with data x and latent space z which are a sample from the joint likelihood, given θ. The signature of the function should be:

sample_x_z(prob::AbstractMuseProblem, rng::AbstractRNG, θ)

Random numbers generated internally should use rng.

The θ argument to this function will always be in the untransformed θ space.
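
For the hypothetical FunnelProblem sketched in the Summary (and assuming the imports and struct defined there), this could look like:

function MuseInference.sample_x_z(prob::FunnelProblem, rng::AbstractRNG, θ)
    z = exp(θ / 2) .* randn(rng, prob.N)   # zᵢ ~ Normal(0, exp(θ/2))
    x = z .+ randn(rng, prob.N)            # xᵢ ~ Normal(zᵢ, 1)
    (x=x, z=z)
end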

MuseInference.∇θ_logLike (Function)

Return the gradient of the joint log likelihood with respect to hyperparameters θ, evaluated at data x and latent space z. The signature of the function should be:

∇θ_logLike(prob::AbstractMuseProblem, x, z, θ)

If the problem needs a transformation of θ to map its domain to $(-\infty,\infty)$, then it should instead implement:

∇θ_logLike(prob::AbstractMuseProblem, x, z, θ, θ_space)

where θ_space will be either Transformedθ() or UnTransformedθ(). In this case, the θ argument will be passed in the space given by θ_space and the gradient should be with respect to θ in that space.

z must have domain $(-\infty,\infty)$. If a transformation is required to make this the case, that should be handled internally by this function, and z will always refer to the transformed z.
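
For the hypothetical FunnelProblem, θ is already unconstrained, so the four-argument form suffices. Differentiating the toy model's joint log-likelihood (written out in the Summary sketch) with respect to θ gives:

# ∂/∂θ [ -½ Σ(xᵢ-zᵢ)² - ½ exp(-θ) Σzᵢ² - ½ N θ ]
function MuseInference.∇θ_logLike(prob::FunnelProblem, x, z, θ)
    exp(-θ) * sum(abs2, z) / 2 - prob.N / 2
end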

MuseInference.logLike_and_∇z_logLike (Function)

Return a tuple (logLike, ∇z_logLike) giving the log likelihood and its gradient with respect to the latent space z, evaluated at hyperparameters θ and data x. The signature of the function should be:

logLike_and_∇z_logLike(prob::AbstractMuseProblem, x, z, θ)

z must have domain $(-\infty,\infty)$. If a transformation is required to make this the case, that should be handled internally by this function, and z will always refer to the transformed z.

The θ argument to this function will always be in the untransformed θ space.

Note

Alternatively, custom problems can implement ẑ_at_θ directly and forego this method. The default ẑ_at_θ runs LBFGS with Optim.jl using logLike_and_∇z_logLike.
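
Continuing the hypothetical FunnelProblem from the Summary, both quantities are available in closed form:

function MuseInference.logLike_and_∇z_logLike(prob::FunnelProblem, x, z, θ)
    logLike    = -(sum(abs2, x .- z) + exp(-θ) * sum(abs2, z) + prob.N * θ) / 2
    ∇z_logLike = (x .- z) .- exp(-θ) .* z
    (logLike, ∇z_logLike)
end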

MuseInference.logPriorθ (Function)

Return the log-prior at θ. The signature of the function should be:

logPriorθ(prob::AbstractMuseProblem, θ)

If the problem needs a transformation of θ to map its domain to $(-\infty,\infty)$, then it should instead implement:

logPriorθ(prob::AbstractMuseProblem, θ, θ_space)

where θ_space will be either Transformedθ() or UnTransformedθ(). In this case, the θ argument will be passed in the space given by θ_space.

Defaults to zero log-prior.
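
For the hypothetical FunnelProblem, suppose (purely for this sketch) a θ ~ Normal(0, 3) prior; dropping the θ-independent constant:

MuseInference.logPriorθ(prob::FunnelProblem, θ) = -θ^2 / (2 * 3^2)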

MuseInference.ẑ_at_θ (Function)

Return the best-fit latent space z given data x and parameters θ. The signature of the function should be:

ẑ_at_θ(prob::AbstractMuseProblem, x, z₀, θ; ∇z_logLike_atol)

The return value should be (ẑ, info) where info can be any extra diagnostic info which will be saved in the MUSE result.

The θ argument to this function will always be in the untransformed θ space.

The z₀ argument should be used as a starting guess for the solution.

z must have domain $(-\infty,\infty)$. If a transformation is required to make this the case, that should be handled internally by this function, and the return value should refer to the transformed z.

The default implementation of this method uses logLike_and_∇z_logLike and Optim.jl's LBFGS to iteratively maximize the log likelihood. Custom problems are free to override this default if desired, in which case logLike_and_∇z_logLike does not need to be implemented.
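
For the hypothetical FunnelProblem the maximization over z has a closed-form solution, so this method can be overridden directly, making logLike_and_∇z_logLike optional for that problem. In this sketch z₀ and ∇z_logLike_atol are simply ignored because no iterative solve is needed:

function MuseInference.ẑ_at_θ(prob::FunnelProblem, x, z₀, θ; ∇z_logLike_atol=nothing)
    ẑ = x ./ (1 + exp(-θ))   # argmax over z of -½Σ(xᵢ-zᵢ)² - ½exp(-θ)Σzᵢ²
    (ẑ, nothing)             # no extra diagnostic info to report
end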

MuseInference.standardizeθ (Function)

Pre-process a user-provided θ into the data structure used internally in the computation. For example, this allows the user to pass a NamedTuple to functions like muse or get_J! while it is internally converted to a ComponentVector. The signature of the function should be:

standardizeθ(prob::AbstractMuseProblem, θ)
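
For the hypothetical FunnelProblem, θ is a single scalar, so standardizing amounts to promoting whatever the user passes to a floating-point number:

MuseInference.standardizeθ(prob::FunnelProblem, θ) = float(θ)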

MuseInference.check_self_consistency (Function)
check_self_consistency(
    prob, 
    θ;
    fdm = central_fdm(3, 1),
    atol = 1e-3,
    rng = Random.default_rng(),
    has_volume_factor = true
)

Checks the self-consistency of a defined problem at a given θ, e.g. that inv_transform_θ(prob, transform_θ(prob, θ)) ≈ θ. This is mostly useful as a diagnostic when implementing a new AbstractMuseProblem.

A random x and z are sampled from rng. Finite differences are computed using fdm, and atol sets the tolerance for the approximate-equality checks. has_volume_factor determines whether the transformation includes the log-determinant of its Jacobian in the likelihood.
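
A sketch of running these checks on the hypothetical FunnelProblem built up on this page, with an explicit rng for reproducibility:

prob = FunnelProblem(512)
MuseInference.check_self_consistency(prob, 1.0; rng = Random.MersenneTwister(0))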