No More DeLuLu: Physics-Inspired Kernel Networks for Geometrically-Grounded Neural Computation

📄 Summary

This research introduces a novel kernel operator, the yat-product, which combines quadratic alignment with inverse-square proximity. The operator is proven to be a Mercer kernel that is analytic, Lipschitz on bounded domains, and self-regularizing, and it induces a unique embedding into a reproducing kernel Hilbert space (RKHS). Neural Matter Networks (NMNs) use the yat-product as their sole non-linearity, replacing the conventional linear-activation-normalization block with a single geometrically grounded operation. This architectural simplification preserves universal approximation while shifting normalization into the kernel itself, via its denominator, rather than relying on separate normalization layers. Empirical results show that NMN-based classifiers match linear baselines on MNIST while exhibiting bounded prototype evolution and robustness under superposition.
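To make the summary concrete, here is a minimal sketch of what a kernel "combining quadratic alignment with inverse-square proximity" could look like, and how it could serve as a layer's sole non-linearity. The exact formula, the `eps` stabilizer, and the `YatLayer` class are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def yat_product(x, w, eps=1e-6):
    """Hedged sketch of a yat-product-style kernel: squared dot product
    (quadratic alignment) divided by squared distance (inverse-square
    proximity). The eps term, an assumption, keeps the value finite
    when x == w and acts as the built-in normalization via the
    denominator mentioned in the summary."""
    align = np.dot(x, w) ** 2             # quadratic alignment
    prox = np.sum((x - w) ** 2) + eps     # inverse-square proximity term
    return align / prox

class YatLayer:
    """Toy 'neural matter' layer: each unit holds a prototype w_j, and
    the unit's output is the kernel response yat_product(x, w_j); no
    separate activation or normalization layer is used. Illustrative
    only; the real NMN architecture may differ."""
    def __init__(self, rng, in_dim, units, eps=1e-6):
        self.W = rng.normal(size=(units, in_dim))  # one prototype per unit
        self.eps = eps

    def __call__(self, x):
        align = (self.W @ x) ** 2                        # per-prototype alignment
        prox = np.sum((self.W - x) ** 2, axis=1) + self.eps
        return align / prox                              # kernel response per unit
```

Note that the sketch is symmetric (`yat_product(x, w) == yat_product(w, x)`) and non-negative, consistent with a Mercer kernel, and its output stays bounded near `x == w` thanks to the stabilizer in the denominator.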

Powered by Cloudflare Workers + Payload CMS + Claude 3.5

Data sources: OpenAI, Google AI, DeepMind, AWS ML Blog, HuggingFace, and others