Finite-Particle Rates for Regularized Stein Variational Gradient Descent

📄 Abstract (translated from Chinese)

This paper derives finite-particle convergence rates for the Regularized Stein Variational Gradient Descent (R-SVGD) algorithm proposed by He et al. (2024). R-SVGD corrects the constant-order bias of standard SVGD by applying a resolvent-type preconditioner to the kernelized Wasserstein gradient. For the resulting interacting N-particle system, explicit non-asymptotic bounds are established for the time-averaged (annealed) empirical measure, proving its convergence. These results provide a solid theoretical foundation for understanding the performance of R-SVGD in practice, particularly with a finite number of particles, and demonstrate the algorithm's effectiveness for sampling from complex probability distributions.

📄 English Summary

Finite-Particle Rates for Regularized Stein Variational Gradient Descent

This paper derives finite-particle convergence rates for the Regularized Stein Variational Gradient Descent (R-SVGD) algorithm, which corrects the constant-order bias of SVGD by applying a resolvent-type preconditioner to the kernelized Wasserstein gradient. For the resulting interacting N-particle system, explicit non-asymptotic convergence bounds are established for the time-averaged (annealed) empirical measure.
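To make the idea above concrete, here is a minimal, hypothetical sketch of one particle update: the standard kernelized (SVGD) direction is computed first, then preconditioned by a resolvent of the Gram matrix. The specific form `(nu*I + (1-nu)*K/N)^{-1}`, the RBF bandwidth, the step size, and the Gaussian target are all illustrative assumptions, not the authors' exact scheme.

```python
import numpy as np

# Hypothetical R-SVGD-style sketch (assumed resolvent form, 1-D standard
# Gaussian target, fixed-bandwidth RBF kernel) -- not the paper's exact scheme.

def rbf_kernel(x, h=1.0):
    """Gram matrix K[i,j] = k(x_i, x_j) and d/dx_j k(x_j, x_i) for 1-D particles."""
    diff = x[:, None] - x[None, :]                 # diff[i, j] = x_i - x_j
    K = np.exp(-diff**2 / (2 * h**2))
    grad_j_K = diff / h**2 * K                     # gradient in the second argument
    return K, grad_j_K

def r_svgd_step(x, step=0.1, nu=0.5):
    N = len(x)
    K, grad_j_K = rbf_kernel(x)
    score = -x                                     # grad log p(x) for N(0, 1)
    # Kernelized Wasserstein gradient: driving term + repulsion (standard SVGD).
    phi = (K @ score + grad_j_K.sum(axis=1)) / N
    # Resolvent-type preconditioner (assumed form): solve (nu I + (1-nu) K/N) v = phi.
    v = np.linalg.solve(nu * np.eye(N) + (1 - nu) * K / N, phi)
    return x + step * v

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=0.5, size=50)        # particles start far from target
for _ in range(500):
    x = r_svgd_step(x)
```

With `nu -> 1` the update reduces (up to scaling) to plain SVGD's kernelized direction; smaller `nu` pushes it toward the un-kernelized Wasserstein gradient, which is the interpolation the regularization is meant to achieve.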

