Knowledge Graph and Hypergraph Transformers with Repository Attention and Journey-Based Role Transport
📄 Summary
This research presents a concise architecture for joint training on sentences and structured data while keeping knowledge and language representations separable. The model treats knowledge graphs and hypergraphs as structured instances with role slots and encodes them into a key-value repository that a language transformer can attend to. Attention is conditioned on journey-based role transport, which unifies edge-labeled knowledge-graph traversal, hyperedge traversal, and sentence structure under one mechanism. The study outlines a dual-stream architecture; hierarchical layer groups with instance-local, neighborhood, and global mixing attention; retrieval over a separate repository; and multi-task objectives spanning masked language modeling, link prediction, and role-consistency denoising.
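To make the repository-attention idea concrete, here is a minimal PyTorch sketch of one cross-attention layer from language tokens to a key-value repository of role-slot instances. Everything here is an illustrative assumption rather than the paper's actual parameterization: the class name `RepositoryAttention`, the slot encoding as entity-plus-role embeddings, and the `transport_bias` argument, which stands in for journey-based role transport as a simple additive attention bias.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class RepositoryAttention(nn.Module):
    """Cross-attention from language-token queries to a key-value
    repository built from structured instances with role slots.
    A sketch under assumed shapes and names, not the paper's design."""

    def __init__(self, d_model: int, n_entities: int, n_roles: int):
        super().__init__()
        # Hypothetical slot encoding: entity embedding + role-slot embedding.
        self.entity_emb = nn.Embedding(n_entities, d_model)
        self.role_emb = nn.Embedding(n_roles, d_model)
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5

    def forward(self, tokens, entity_ids, role_ids, transport_bias=None):
        # tokens:         (batch, seq_len, d_model) language-stream states
        # entity_ids:     (batch, n_slots) entity filling each role slot
        # role_ids:       (batch, n_slots) role label of each slot
        # transport_bias: (batch, seq_len, n_slots) additive logit bias,
        #                 standing in for journey-based role transport
        slots = self.entity_emb(entity_ids) + self.role_emb(role_ids)
        q = self.q_proj(tokens)
        k = self.k_proj(slots)
        v = self.v_proj(slots)
        logits = torch.einsum("bqd,bkd->bqk", q, k) * self.scale
        if transport_bias is not None:
            logits = logits + transport_bias  # condition attention on transport
        attn = F.softmax(logits, dim=-1)
        # Residual update keeps the language stream separable from the
        # repository: dropping this term recovers the pure language model.
        return tokens + torch.einsum("bqk,bkd->bqd", attn, v)


layer = RepositoryAttention(d_model=64, n_entities=1000, n_roles=8)
tokens = torch.randn(2, 16, 64)              # 2 sentences, 16 tokens each
entity_ids = torch.randint(0, 1000, (2, 5))  # 5 filled role slots
role_ids = torch.randint(0, 8, (2, 5))
out = layer(tokens, entity_ids, role_ids)    # (2, 16, 64)
```

Keeping the repository interaction as a residual cross-attention term is one way to realize the separability the summary emphasizes: the repository can be swapped, retrieved over, or detached entirely without retraining the language stream.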