mig package

The mig package provides utilities for kernel density estimation of random vectors using the multivariate inverse Gaussian (MIG) distribution, defined over the halfspace \(\mathcal{H}_d(\boldsymbol{\beta}) = \{\boldsymbol{x} \in \mathbb{R}^d: \boldsymbol{\beta}^\top \boldsymbol{x} > 0\}\) with location vector \(\boldsymbol{\xi}\) and positive definite scale matrix \(\boldsymbol{\Omega}\). Its density is \[\begin{align*} \frac{\boldsymbol{\beta}^\top\boldsymbol{\xi}}{(2\pi)^{d/2}|\boldsymbol{\Omega}|^{1/2}}(\boldsymbol{\beta}^\top\boldsymbol{x})^{-d/2-1}\exp \left\{-\frac{(\boldsymbol{x} - \boldsymbol{\xi})^\top \boldsymbol{\Omega}^{-1}(\boldsymbol{x}-\boldsymbol{\xi})}{2\boldsymbol{\beta}^\top\boldsymbol{x}}\right\}, \qquad \boldsymbol{x} \in \mathcal{H}_d(\boldsymbol{\beta}). \end{align*}\]
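As a sketch of how the density can be evaluated directly from the formula, the expression translates line by line into base R; the determinant enters as \(|\boldsymbol{\Omega}|^{1/2}\), which recovers the usual univariate inverse Gaussian density when \(d = 1\). The helper name `dmig_manual` below is illustrative, not a function exported by the package.

```r
# Evaluate the MIG density at a point x by plugging into the formula above.
# 'dmig_manual' is an illustrative name, not part of the mig package.
dmig_manual <- function(x, beta, xi, Omega) {
  bx <- sum(beta * x)
  if (bx <= 0) {
    return(0) # outside the halfspace, the density is zero
  }
  d <- length(x)
  # Quadratic form (x - xi)' Omega^{-1} (x - xi)
  quad <- c(crossprod(x - xi, solve(Omega, x - xi)))
  sum(beta * xi) / ((2 * pi)^(d / 2) * sqrt(det(Omega))) *
    bx^(-d / 2 - 1) * exp(-quad / (2 * bx))
}
```

For \(d = 1\) with \(\beta = 1\), this reduces to the inverse Gaussian density with mean \(\xi\) and shape \(\xi^2/\Omega\), which gives a quick consistency check.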

Random number generation

Minami (2003) provides a constructive characterization of the multivariate inverse Gaussian as the hitting time of a particular hyperplane by a correlated Brownian motion. Simulating through this representation requires discretizing the Brownian path, and more accurate draws come at increased computational cost.

Let \(\boldsymbol{\beta} \in \mathbb{R}^d\) be the vector defining the halfspace and consider a \((d-1) \times d\) matrix \(\mathbf{Q}_2\) such that \(\mathbf{Q}_2\boldsymbol{\beta} = \boldsymbol{0}_{d-1}\) and \(\mathbf{Q}_2\mathbf{Q}_2^\top = \mathbf{I}_{d-1}\). Theorem 1 (3) of Minami (2003) implies that, for \(\mathbf{Q} = (\boldsymbol{\beta}, \mathbf{Q}_2^\top)^{\top}\) and \[\begin{align*} Z_1 &\sim \mathsf{MIG}(\boldsymbol{\beta}^\top\boldsymbol{\xi}, \boldsymbol{\beta}^\top\boldsymbol{\Omega}\boldsymbol{\beta}) \\ \boldsymbol{Z}_2 \mid Z_1 = z_1 &\sim \mathsf{Norm}_{d-1}\left[\mathbf{Q}_2\{\boldsymbol{\xi} + \boldsymbol{\Omega}\boldsymbol{\beta}/(\boldsymbol{\beta}^\top\boldsymbol{\Omega}\boldsymbol{\beta})(z_1-\boldsymbol{\beta}^\top\boldsymbol{\xi})\}, z_1(\mathbf{Q}_2\boldsymbol{\Omega}^{-1}\mathbf{Q}_2^\top)^{-1}\right], \end{align*}\] the vector \(\boldsymbol{Z} = (Z_1, \boldsymbol{Z}_2^\top)^\top\) satisfies \(\mathbf{Q}^{-1}\boldsymbol{Z} \sim \mathsf{MIG}(\boldsymbol{\beta}, \boldsymbol{\xi}, \boldsymbol{\Omega})\). The first component \(Z_1\) is a univariate inverse Gaussian with mean \(\boldsymbol{\beta}^\top\boldsymbol{\xi}\) and shape \((\boldsymbol{\beta}^\top\boldsymbol{\xi})^2/(\boldsymbol{\beta}^\top\boldsymbol{\Omega}\boldsymbol{\beta})\).

Consider the symmetric projection matrix \(\mathbf{M}_{\boldsymbol{\beta}}=\mathbf{I}_d - \boldsymbol{\beta}\boldsymbol{\beta}^\top/(\boldsymbol{\beta}^\top\boldsymbol{\beta})\), which maps onto the orthogonal complement of \(\boldsymbol{\beta}\) and therefore has rank \(d-1\). We build \(\mathbf{Q}_2\) from the \(d-1\) eigenvectors associated with the nonzero eigenvalues of \(\mathbf{M}_{\boldsymbol{\beta}}\). We can then perform forward sampling of \(Z_1\) and \(\boldsymbol{Z}_2 \mid Z_1\) and map the resulting vectors back through \(\mathbf{Q}^{-1}\).

# Create projection matrix onto the orthogonal complement of beta
d <- 5L # dimension of vector
n <- 1e4L # number of simulations
beta <- rexp(d)
xi <- rexp(d)
Omega <- matrix(0.5, d, d) + diag(d)
# Project onto orthogonal complement of vector
Mbeta <- (diag(d) - tcrossprod(beta)/(sum(beta^2)))
# Mbeta is rank-deficient: compute its eigendecomposition and drop the
# eigenvector associated with the zero eigenvalue
Q2 <- t(eigen(Mbeta, symmetric = TRUE)$vectors[,-d])
# Check Q2 satisfies the conditions
all.equal(rep(0, d-1), c(Q2 %*% beta)) # check orthogonality
#> [1] TRUE
all.equal(diag(d-1), tcrossprod(Q2)) # check basis is orthonormal
#> [1] TRUE
Qmat <- rbind(beta, Q2)
covmat <- solve(Q2 %*% solve(Omega) %*% t(Q2))
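As an aside, the basis \(\mathbf{Q}_2\) need not come from an eigendecomposition: a complete Householder QR factorization of \(\boldsymbol{\beta}\) yields an orthonormal basis of its orthogonal complement directly, without forming the \(d \times d\) projection matrix. A self-contained base-R sketch (the name `Q2_qr` is illustrative):

```r
d <- 5L
beta <- rexp(d) # vector defining the halfspace, as above
# The first column of the complete Q factor is proportional to beta; the
# remaining d - 1 columns form an orthonormal basis of its orthogonal
# complement.
Qfull <- qr.Q(qr(matrix(beta, ncol = 1)), complete = TRUE)
Q2_qr <- t(Qfull[, -1, drop = FALSE])
all.equal(rep(0, d - 1), c(Q2_qr %*% beta)) # orthogonal to beta
#> [1] TRUE
all.equal(diag(d - 1), tcrossprod(Q2_qr))   # orthonormal rows
#> [1] TRUE
```

Both constructions give a valid \(\mathbf{Q}_2\); the QR route is cheaper for large \(d\).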

# Compute mean and variance for Z1
mu <- sum(beta*xi)
omega <- sum(beta * c(Omega %*% beta))
Z1 <- rmig(n = n, xi = mu, Omega = omega) # univariate inverse Gaussian, mean mu and shape mu^2/omega (via statmod)
# Generate the conditional Gaussian vectors in two steps (vectorized operations)
Z2 <- sweep(TruncatedNormal::rtmvnorm(n = n, mu = rep(0, d-1), sigma = covmat), 1, sqrt(Z1), "*")
Z2 <- sweep(Z2, 2, c(Q2 %*% xi), "+") + tcrossprod(Z1 - mu, c(Q2 %*% c(Omega %*% beta)/omega))
# Compute the inverse of Q (invertible, since its rows span R^d)
samp <- t(solve(Qmat) %*% t(cbind(Z1, Z2)))
# Sanity check: compare maximum likelihood estimates to the true parameters
mle <- mig::fit_mig(samp, beta = beta)
max(abs(mle$xi - xi))
#> [1] 0.04395097
norm(mle$Omega - Omega, type = "f")
#> [1] 0.1189646
max(abs(1 - mle$Omega/Omega))
#> [1] 0.08561322
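To see the whole construction end to end without the rmig or TruncatedNormal calls, the sampler can also be sketched in base R: the univariate inverse Gaussian draw uses the Michael, Schucany and Haas (1976) transformation, and the conditional Gaussian draw a Cholesky factor. All function and variable names below are illustrative, and the inverse Gaussian is parametrized by mean and shape as noted above.

```r
set.seed(2024)
# Inverse Gaussian sampler via the Michael, Schucany and Haas (1976)
# transformation (mean/shape parametrization)
rinvgauss_msh <- function(n, mean, shape) {
  y <- rnorm(n)^2
  root <- mean + mean^2 * y / (2 * shape) -
    mean / (2 * shape) * sqrt(4 * mean * shape * y + (mean * y)^2)
  smaller <- runif(n) <= mean / (mean + root)
  ifelse(smaller, root, mean^2 / root)
}
d <- 5L; n <- 1e4L
beta <- rexp(d); xi <- rexp(d)
Omega <- matrix(0.5, d, d) + diag(d)
# Orthonormal basis of the orthogonal complement of beta, as above
Mbeta <- diag(d) - tcrossprod(beta) / sum(beta^2)
Q2 <- t(eigen(Mbeta, symmetric = TRUE)$vectors[, -d])
Qmat <- rbind(beta, Q2)
covmat <- solve(Q2 %*% solve(Omega) %*% t(Q2))
mu <- sum(beta * xi)
omega <- sum(beta * c(Omega %*% beta))
# Forward sampling: Z1, then Z2 given Z1
Z1 <- rinvgauss_msh(n, mean = mu, shape = mu^2 / omega)
cholf <- chol(covmat) # upper triangular, crossprod(cholf) equals covmat
Z2 <- sweep(matrix(rnorm(n * (d - 1)), nrow = n) %*% cholf, 1, sqrt(Z1), "*")
Z2 <- sweep(Z2, 2, c(Q2 %*% xi), "+") +
  tcrossprod(Z1 - mu, c(Q2 %*% c(Omega %*% beta) / omega))
samp <- t(solve(Qmat, t(cbind(Z1, Z2))))
# beta' Q^{-1} Z = Z1 > 0, so every draw lies in the halfspace
all(samp %*% beta > 0)
#> [1] TRUE
```

The support check holds by construction: the first row of \(\mathbf{Q}\) is \(\boldsymbol{\beta}^\top\), so \(\boldsymbol{\beta}^\top\mathbf{Q}^{-1}\boldsymbol{Z} = Z_1\), which is strictly positive.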

References

Minami, M. 2003. “A Multivariate Extension of Inverse Gaussian Distribution Derived from Inverse Relationship.” Communications in Statistics - Theory and Methods 32 (12): 2285–2304.