On De-Individuated Neurons: Continuous Symmetries Enable Dynamic Topologies

📄 English Summary

On De-Individuated Neurons: Continuous Symmetries Enable Dynamic Topologies

A novel methodology for dynamic networks is introduced by leveraging a symmetry-principled class of primitives known as isotropic activation functions. This approach enables architectures to grow and shrink neurons in real time in response to task demands. Structural changes to the network are invariant under symmetry reparameterizations, ensuring identical computations during neurogenesis and well-approximated computations during neurodegeneration. By exploiting the basis independence of isotropic primitives, the methodology eliminates the individuated neurons implicit in the elementwise functional form. Isotropy thus allows greater freedom in the basis in which layers are decomposed and interpreted.
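The basis independence described above can be illustrated with a small sketch. The norm-based activation below is a hypothetical example of an isotropic primitive (the paper's exact construction is not given in this summary): it acts on a hidden vector only through its norm and direction, so no single coordinate plays the role of an individuated neuron. Two checks follow: the activation commutes with an orthogonal change of hidden basis, and adding a neuron with zero fan-in and fan-out leaves the computation identical, mirroring the "identical computations during neurogenesis" claim.

```python
import numpy as np

def isotropic_act(z):
    """Hypothetical isotropic activation: rescale z by a scalar
    function of its norm. The output depends on z only through its
    length and direction, so no coordinate (neuron) is special."""
    r = np.linalg.norm(z)
    return z if r == 0 else np.tanh(r) / r * z

rng = np.random.default_rng(0)
x = rng.standard_normal(4)            # input
W = rng.standard_normal((8, 4))       # input -> hidden weights
v = rng.standard_normal(8)            # hidden -> output readout

# 1) Basis independence: the activation commutes with any orthogonal
#    reparameterization Q of the hidden layer, since ||Q z|| = ||z||.
Q, _ = np.linalg.qr(rng.standard_normal((8, 8)))
assert np.allclose(isotropic_act(Q @ W @ x), Q @ isotropic_act(W @ x))

# 2) Neurogenesis: growing the hidden layer by one unit with zero
#    fan-in and fan-out leaves the network's output identical,
#    because the appended zero coordinate does not change the norm.
W_big = np.vstack([W, np.zeros((1, 4))])
v_big = np.concatenate([v, [0.0]])
y_small = v @ isotropic_act(W @ x)
y_big = v_big @ isotropic_act(W_big @ x)
assert np.allclose(y_small, y_big)
```

Because both properties hold exactly, a grown layer can afterwards be rotated into any convenient basis, which is one reading of the freedom the summary attributes to decomposing and interpreting layers.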
