LWM-Temporal: Sparse Spatio-Temporal Attention for Wireless Channel Representation Learning

📄 Abstract

LWM-Temporal is a new member of the Large Wireless Models (LWM) family, focused on the spatio-temporal characteristics of wireless channels. As a task-agnostic foundation model, LWM-Temporal learns universal channel embeddings that capture mobility-induced evolution and can be reused across a variety of downstream tasks. To this end, LWM-Temporal operates in the angle-delay-time domain and introduces Sparse Spatio-Temporal Attention (SSTA), a propagation-aligned attention mechanism that restricts interactions to physically plausible neighborhoods, reducing attention complexity by an order of magnitude while preserving geometry-consistent dependencies. LWM-Temporal is pretrained in a self-supervised manner using physics-informed masking.
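The core idea behind SSTA, restricting attention to a neighborhood of physically plausible tokens rather than all pairs, can be illustrated with a minimal sketch. This is not the authors' implementation: the actual SSTA neighborhood is defined by propagation geometry in the angle-delay-time domain, whereas the toy version below uses a fixed 1-D window over token indices purely to show the masking mechanics.

```python
import numpy as np

def neighborhood_attention(x, radius=2):
    """Toy neighborhood-restricted (sparse) self-attention.

    x: (n_tokens, d) array of channel-patch embeddings, assumed ordered so
       that nearby indices are neighbors on the angle-delay-time grid.
    radius: each token attends only to tokens within `radius` positions,
       a stand-in for SSTA's physically plausible neighborhoods.
    """
    n, d = x.shape
    scores = x @ x.T / np.sqrt(d)                         # pairwise scores
    idx = np.arange(n)
    mask = np.abs(idx[:, None] - idx[None, :]) <= radius  # sparse neighborhood
    scores = np.where(mask, scores, -np.inf)              # block distant pairs
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)         # row-wise softmax
    return weights @ x                                    # attended embeddings

rng = np.random.default_rng(0)
out = neighborhood_attention(rng.standard_normal((16, 8)), radius=2)
```

With a fixed neighborhood size, each token attends to O(radius) others instead of O(n), which is the source of the order-of-magnitude complexity reduction claimed for SSTA.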
