JAWS: Enhancing Long-term Rollout of Neural Operators via Spatially-Adaptive Jacobian Regularization

📄 Summary
Data-driven surrogate models enhance the efficiency of simulating continuous dynamical systems; however, their autoregressive rollouts are often constrained by instability and spectral blow-up. While global regularization techniques can enforce contractive dynamics, they uniformly dampen high-frequency features, leading to a contraction-dissipation dilemma. Additionally, long-horizon trajectory optimization methods that explicitly correct drift face bottlenecks due to memory constraints. JAWS (Jacobian-Adaptive Weighting for Stability) is proposed as a probabilistic regularization strategy to address these limitations. By framing operator learning as Maximum A Posteriori (MAP) estimation with spatially heteroscedastic uncertainty, JAWS offers a novel solution to enhance stability and efficiency over extended time horizons.
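The abstract describes framing operator learning as MAP estimation with spatially heteroscedastic uncertainty, so that the contraction penalty is applied adaptively per location rather than uniformly. The sketch below is a hypothetical illustration of that idea, not the paper's actual formulation: `jaws_loss`, its arguments, and the particular form of the penalty (a hinge on the local Jacobian norm, weighted by the learned inverse variance) are all assumptions made for clarity.

```python
import numpy as np

def jaws_loss(pred, target, log_var, jac_norm, lam=0.1):
    """Hypothetical JAWS-style MAP objective (illustrative sketch only).

    pred, target : predicted and reference fields for one rollout step,
                   shape (H, W)
    log_var      : per-location log-variance (spatially heteroscedastic
                   uncertainty), shape (H, W)
    jac_norm     : per-location estimate of the local Jacobian norm of
                   the learned operator, shape (H, W)
    lam          : strength of the contraction penalty (assumed)
    """
    inv_var = np.exp(-log_var)
    # Heteroscedastic Gaussian negative log-likelihood: the MAP data
    # term down-weights residuals where predicted uncertainty is high.
    nll = 0.5 * (inv_var * (pred - target) ** 2 + log_var)
    # Spatially adaptive Jacobian penalty: push the local Jacobian norm
    # toward <= 1 (contractive dynamics), but only where the model is
    # confident (high inv_var), so high-frequency, high-uncertainty
    # regions are not uniformly dissipated.
    jac_pen = lam * inv_var * np.maximum(jac_norm - 1.0, 0.0) ** 2
    return float(np.mean(nll + jac_pen))
```

In practice `jac_norm` would come from an autodiff or power-iteration estimate of the operator's local Jacobian; the point of the sketch is only the per-location weighting that trades off contraction against dissipation.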