📄 English Summary

Decentralized Attention Fails Centralized Signals: Rethinking Transformers for Medical Time Series

Accurate analysis of medical time series (MedTS) data, such as electroencephalography (EEG) and electrocardiography (ECG), is crucial for healthcare applications, particularly in diagnosing brain and heart diseases. MedTS data typically exhibit two critical patterns: temporal dependencies within individual channels and channel dependencies across multiple channels. Recent advancements in deep learning have enabled Transformer-based models to effectively capture temporal dependencies; however, they struggle with modeling channel dependencies. This limitation arises from a structural mismatch: MedTS signals are inherently centralized, while the Transformer's attention mechanism is decentralized, rendering it less effective in capturing channel dependencies.
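The contrast drawn above, temporal dependencies within a channel versus dependencies across channels, comes down to which axis of the data the attention mechanism treats as its token axis. As a rough illustration only (not the paper's proposed method), the NumPy sketch below applies plain scaled dot-product self-attention to a toy multichannel recording twice: once with time steps as tokens (temporal dependencies) and once with channels as tokens (channel dependencies). All names and shapes here are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    """Scaled dot-product self-attention over the first axis of x.

    x: (tokens, features). Each token (row) attends to every other
    token, so whichever axis plays the token role defines which
    dependencies the mechanism can model.
    """
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)       # (tokens, tokens) pairwise affinities
    return softmax(scores, axis=-1) @ x

# Toy multichannel recording: 4 channels x 16 time steps.
rng = np.random.default_rng(0)
signal = rng.standard_normal((4, 16))

# Temporal attention: transpose to (time, channels) so time steps are
# the tokens and attention mixes information across time.
temporal_out = self_attention(signal.T).T    # back to (4, 16)

# Channel attention: channels are the tokens, so attention mixes
# information across channels -- the dependency the abstract says
# standard Transformers struggle to capture on MedTS data.
channel_out = self_attention(signal)         # (4, 16)

print(temporal_out.shape, channel_out.shape)
```

The same attention operation captures either pattern depending only on the axis it runs over; the abstract's claim is that this choice, and the decentralized way attention weights are formed, is poorly matched to the centralized structure of MedTS channel dependencies.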
