<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>Natural Language Processing on Awen's Paper Libarary</title>
    <link>https://a23wen.github.io/paper-libarary/categories/%E8%87%AA%E7%84%B6%E8%AF%AD%E8%A8%80%E5%A4%84%E7%90%86/</link>
    <description>Recent content in Natural Language Processing on Awen's Paper Libarary</description>
    <generator>Hugo</generator>
    <language>zh-cn</language>
    <lastBuildDate>Fri, 10 Apr 2026 17:54:53 +0800</lastBuildDate>
    <atom:link href="https://a23wen.github.io/paper-libarary/categories/%E8%87%AA%E7%84%B6%E8%AF%AD%E8%A8%80%E5%A4%84%E7%90%86/index.xml" rel="self" type="application/rss+xml"/>
    <item>
      <title>DeepSeek-R1: Incentivizing Reasoning Capability in LLMs via Reinforcement Learning</title>
      <link>https://a23wen.github.io/paper-libarary/papers/deepseek-r1/</link>
      <pubDate>Fri, 10 Apr 2026 17:50:00 +0800</pubDate>
      <guid>https://a23wen.github.io/paper-libarary/papers/deepseek-r1/</guid>
      <description>Introduces DeepSeek-R1-Zero and DeepSeek-R1, showing that large language models can spontaneously develop long-chain reasoning, reflection, and verification abilities through large-scale reinforcement learning on verifiable tasks, and further transfers strong reasoning ability to smaller models via multi-stage training and distillation.</description>
    </item>
    <item>
      <title>Attention Is All You Need</title>
      <link>https://a23wen.github.io/paper-libarary/papers/attention-is-all-you-need/</link>
      <pubDate>Mon, 15 Jan 2024 10:00:00 +0800</pubDate>
      <guid>https://a23wen.github.io/paper-libarary/papers/attention-is-all-you-need/</guid>
      <description>Proposes the Transformer, an architecture based entirely on attention mechanisms that dispenses with recurrence and convolutions; it achieved state-of-the-art performance on machine translation and became a cornerstone of modern NLP.</description>
    </item>
  </channel>
</rss>