pymc.variational.operators.KSD
class pymc.variational.operators.KSD(approx, temperature=1)
Operator based on Kernelized Stein Discrepancy.
Input: A target distribution with density function \(p(x)\) and a set of initial particles \(\{x^0_i\}^n_{i=1}\).
Output: A set of particles \(\{x_i\}^n_{i=1}\) that approximates the target distribution.
\[\begin{split}x_i^{l+1} \leftarrow x_i^{l} + \epsilon_l \hat{\phi}^{*}(x_i^l) \\ \hat{\phi}^{*}(x) = \frac{1}{n}\sum^{n}_{j=1}\left[k(x^l_j,x) \nabla_{x^l_j} \log p(x^l_j)/\text{temp} + \nabla_{x^l_j} k(x^l_j,x)\right]\end{split}\]
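The update above translates directly into code. The following is a minimal NumPy sketch of one such particle update, not PyMC's implementation: it assumes a standard-normal target, a fixed-bandwidth RBF kernel, and illustrative values for the step size `eps`, bandwidth `h`, and `temp`.

```python
import numpy as np

def rbf(x, y, h=1.0):
    """RBF kernel k(x, y) = exp(-(x - y)^2 / h) and its gradient wrt x."""
    k = np.exp(-((x - y) ** 2) / h)
    grad_x = -2.0 * (x - y) / h * k
    return k, grad_x

def grad_logp(x):
    # Target p(x) = N(0, 1), so grad log p(x) = -x.
    return -x

def svgd_step(particles, eps=0.1, temp=1.0):
    """One application of the update: x_i <- x_i + eps * phi*(x_i)."""
    n = len(particles)
    phi = np.zeros(n)
    for i, x in enumerate(particles):
        for xj in particles:
            k, grad_k = rbf(xj, x)  # grad_k is the gradient wrt x_j
            # Kernel-weighted score term plus repulsive kernel-gradient term.
            phi[i] += k * grad_logp(xj) / temp + grad_k
    return particles + eps * phi / n

rng = np.random.default_rng(0)
particles = rng.normal(loc=3.0, scale=0.5, size=50)
for _ in range(500):
    particles = svgd_step(particles)
print(particles.mean(), particles.std())  # roughly 0 and 1
```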
Parameters:
- approx : Approximation
  Approximation used for inference.
- temperature : float
  Temperature for the Stein gradient.
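In practice KSD is consumed indirectly: it is the operator underlying SVGD, so user code typically reaches it through pm.SVGD or pm.fit rather than constructing it by hand. A minimal sketch, assuming a toy model; the observed data and iteration counts are illustrative, and argument names may differ slightly across PyMC versions:

```python
import numpy as np
import pymc as pm

with pm.Model():
    mu = pm.Normal("mu", 0.0, 1.0)
    pm.Normal("obs", mu, 1.0, observed=np.array([0.1, -0.3, 0.2]))

    # "svgd" selects the inference built on the KSD operator.
    approx = pm.fit(n=2000, method="svgd")

# Draw samples from the fitted particle approximation.
trace = approx.sample(1000)
```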
References
Qiang Liu, Dilin Wang (2016). Stein Variational Gradient Descent: A General Purpose Bayesian Inference Algorithm. arXiv:1608.04471.
Methods
- KSD.__init__(approx[, temperature])
- KSD.apply(f): Operator itself.
Attributes
T, datalogp, datalogp_norm, has_test_function, inputs, logp, logp_norm, logq, logq_norm, model, require_logq, returns_loss, supports_aevb, varlogp, varlogp_norm