{"id":3018,"date":"2025-12-10T00:21:20","date_gmt":"2025-12-09T16:21:20","guid":{"rendered":"https:\/\/www.tiptinker.com\/attention-is-all-you-need-a-visual-guide-to-the-transformer-architecture\/"},"modified":"2025-12-10T21:35:01","modified_gmt":"2025-12-10T13:35:01","slug":"attention-is-all-you-need-a-visual-guide-to-the-transformer-architecture","status":"publish","type":"post","link":"https:\/\/www.tiptinker.com\/zh-hans\/attention-is-all-you-need-a-visual-guide-to-the-transformer-architecture\/","title":{"rendered":"\u56fe\u89e3 Transformer\uff1a\u5f7b\u5e95\u641e\u61c2 AI \u65f6\u4ee3\u7684\u5960\u57fa\u4e4b\u4f5c (Attention Is All You Need)"},"content":{"rendered":"<p>ChatGPT\u3001Claude\u3001Gemini\u2014\u2014\u8fd9\u4e9b\u6539\u53d8\u4e16\u754c\u7684 AI \u6a21\u578b\u80cc\u540e\uff0c\u90fd\u7ad9\u7740\u540c\u4e00\u4e2a\u5de8\u4eba\uff1a2017 \u5e74 Google \u56e2\u961f\u53d1\u8868\u7684\u8bba\u6587 <strong>\u300aAttention Is All You Need\u300b<\/strong>\u3002<\/p>\n<p>\u5728\u8fd9\u7bc7\u6587\u7ae0\u53d1\u8868\u4e4b\u524d\uff0c\u81ea\u7136\u8bed\u8a00\u5904\u7406\uff08NLP\uff09\u662f\u5faa\u73af\u795e\u7ecf\u7f51\u7edc\uff08RNN\uff09\u548c LSTM \u7684\u5929\u4e0b\u3002\u7136\u800c\uff0c\u8fd9\u4e9b\u65e7\u67b6\u6784\u50cf\u662f\u4e00\u6761\u5355\u884c\u9053\uff0c\u5904\u7406\u957f\u6587\u672c\u65f6\u6548\u7387\u4f4e\u4e0b\u4e14\u5bb9\u6613\u201c\u9057\u5fd8\u201d\u3002Transformer \u7684\u51fa\u73b0\uff0c\u5f7b\u5e95\u6253\u7834\u4e86\u987a\u5e8f\u5904\u7406\u7684\u67b7\u9501\uff0c\u5f00\u542f\u4e86\u5e76\u884c\u8ba1\u7b97\u7684 AI \u9ec4\u91d1\u65f6\u4ee3\u3002<\/p>\n<p>\u672c\u6587\u5c06\u5265\u79bb\u590d\u6742\u7684\u6570\u5b66\u516c\u5f0f\uff0c\u901a\u8fc7\u53ef\u89c6\u5316\u56fe\u89e3\u548c\u6838\u5fc3\u4ee3\u7801\uff0c\u5e26\u4f60\u5f7b\u5e95\u770b\u61c2 Transformer \u7684\u5185\u90e8\u6784\u9020\u3002<\/p>\n<h2>\u6838\u5fc3\u6982\u5ff5\uff1a\u4e3a\u4ec0\u4e48\u662f Transformer\uff1f<\/h2>\n<p>\u5728 Transformer \u51fa\u73b0\u4e4b\u524d\uff0cNLP \u6a21\u578b\u5904\u7406\u53e5\u5b50\u50cf\u662f\u5728\u8bfb\u4e00\u672c\u4e66\uff1a\u5fc5\u987b\u8bfb\u5b8c\u4e0a\u4e00\u9875\uff0c\u624d\u80fd\u8bfb\u4e0b\u4e00\u9875\uff08\u987a\u5e8f\u5904\u7406\uff09\u3002\u5982\u679c\u4f60\u8bfb\u5230\u7b2c 100 \u9875\u65f6\u5fd8\u4e86\u7b2c 1 \u9875\u7684\u5185\u5bb9\uff0c\u6a21\u578b\u5c31\u5d29\u584c\u4e86\u3002<\/p>\n<p>Transformer \u505a\u51fa\u7684\u6700\u5927\u6539\u53d8\u662f<strong>\u5e76\u884c\u5316\uff08Parallelization\uff09<\/strong>\u3002\u5b83\u80fd\u50cf\u62e5\u6709\u201c\u4e0a\u5e1d\u89c6\u89d2\u201d\u4e00\u6837\uff0c\u540c\u65f6\u770b\u5230\u53e5\u5b50\u4e2d\u7684\u6bcf\u4e00\u4e2a\u8bcd\uff0c\u5e76\u77ac\u95f4\u8ba1\u7b97\u51fa\u8bcd\u4e0e\u8bcd\u4e4b\u95f4\u7684\u5173\u8054\u5f3a\u5ea6\u3002\u8fd9\u79cd\u673a\u5236\u7684\u6838\u5fc3\uff0c\u5c31\u662f <strong>\u81ea\u6ce8\u610f\u529b\u673a\u5236\uff08Self-Attention\uff09<\/strong>\u3002<\/p>\n<h3>Transformer \u5b8f\u89c2\u67b6\u6784\u56fe<\/h3>\n<div class=\"easy-mermaid-wrapper\">\n<pre><code class=\"language-mermaid\">graph TD\r\n    subgraph \"Encoder (\u7f16\u7801\u5668)\"\r\n        A[\"\u8f93\u5165 (Inputs)\"] --&gt; B[\"\u8f93\u5165\u5d4c\u5165 (Input Embeddings)\"]\r\n        B --&gt; C[\"\u4f4d\u7f6e\u7f16\u7801 (Positional Encoding)\"]\r\n        C --&gt; D[\"\u591a\u5934\u6ce8\u610f\u529b (Multi-Head Attention)\"]\r\n        D --&gt; E[\"\u5c42\u5f52\u4e00\u5316 (Add &amp; Norm)\"]\r\n        E --&gt; F[\"\u524d\u9988\u795e\u7ecf\u7f51\u7edc (Feed Forward)\"]\r\n        F --&gt; G[\"\u5c42\u5f52\u4e00\u5316 (Add &amp; 
Norm)\"]\r\n    end\r\n\r\n    subgraph \"Decoder (\u89e3\u7801\u5668)\"\r\n        H[\"\u8f93\u51fa (Outputs)\"] --&gt; I[\"\u8f93\u51fa\u5d4c\u5165 (Output Embeddings)\"]\r\n        I --&gt; J[\"\u4f4d\u7f6e\u7f16\u7801 (Positional Encoding)\"]\r\n        J --&gt; K[\"\u63a9\u7801\u591a\u5934\u6ce8\u610f\u529b (Masked Multi-Head Attention)\"]\r\n        K --&gt; L[\"\u5c42\u5f52\u4e00\u5316 (Add &amp; Norm)\"]\r\n        L --&gt; M[\"\u591a\u5934\u6ce8\u610f\u529b (Multi-Head Attention)\"]\r\n        M --&gt; N[\"\u5c42\u5f52\u4e00\u5316 (Add &amp; Norm)\"]\r\n        N --&gt; O[\"\u524d\u9988\u795e\u7ecf\u7f51\u7edc (Feed Forward)\"]\r\n        O --&gt; P[\"\u5c42\u5f52\u4e00\u5316 (Add &amp; Norm)\"]\r\n    end\r\n    \r\n    G --&gt; M\r\n    P --&gt; Q[\"\u7ebf\u6027\u5c42 &amp; Softmax (Linear &amp; Softmax)\"]\r\n    Q --&gt; R[\"\u6700\u7ec8\u8f93\u51fa\u6982\u7387 (Output Probabilities)\"]\r\n<\/code><\/pre>\n<\/div>\n<p>\u4e0a\u9762\u7684\u6d41\u7a0b\u56fe\u5c55\u793a\u4e86 Transformer \u7684\u7ecf\u5178 Encoder-Decoder\uff08\u7f16\u7801\u5668-\u89e3\u7801\u5668\uff09\u7ed3\u6784\u3002<\/p>\n<h2>\u6838\u5fc3\u7ec4\u4ef6\u89e3\u6790\uff1aQ, K, V \u7684\u79d8\u5bc6<\/h2>\n<p>Transformer \u4e2d\u6700\u62bd\u8c61\u4e5f\u6700\u91cd\u8981\u7684\u6982\u5ff5\u662f <strong>Query (\u67e5\u8be2)<\/strong>\u3001<strong>Key (\u952e)<\/strong> \u548c <strong>Value (\u503c)<\/strong>\u3002<\/p>\n<p>\u60f3\u8c61\u4f60\u5728\u56fe\u4e66\u9986\u627e\u8d44\u6599\uff1a<\/p>\n<ol>\n<li><strong>Query (Q)<\/strong>\uff1a\u4f60\u624b\u4e2d\u7684\u4e66\u5355\uff08\u4f60\u60f3\u627e\u4ec0\u4e48\uff09\u3002<\/li>\n<li><strong>Key (K)<\/strong>\uff1a\u4e66\u810a\u4e0a\u7684\u6807\u7b7e\uff08\u4e66\u7684\u5206\u7c7b\u4fe1\u606f\uff09\u3002<\/li>\n<li><strong>Value (V)<\/strong>\uff1a\u4e66\u91cc\u7684\u5b9e\u9645\u5185\u5bb9\uff08\u4f60\u771f\u6b63\u9700\u8981\u7684\u77e5\u8bc6\uff09\u3002<\/li>\n<\/ol>\n<p>\u6ce8\u610f\u529b\u673a\u5236\u5c31\u662f\u8ba1\u7b97 <strong>Query<\/strong> \u548c <strong>Key<\/strong> \u7684\u5339\u914d\u5ea6\uff08\u76f8\u4f3c\u6027\uff09\u3002\u5339\u914d\u5ea6\u8d8a\u9ad8\uff0c\u4f60\u5bf9\u8fd9\u672c\u4e66\u7684 <strong>Value<\/strong> \u6295\u5165\u7684\u6ce8\u610f\u529b\u5c31\u8d8a\u591a\u3002<\/p>\n<h3>\u6570\u5b66\u516c\u5f0f\uff08\u53ef\u89c6\u5316\u7406\u89e3\uff09<\/h3>\n<p>\u6ce8\u610f\u529b\u5206\u6570\u7684\u8ba1\u7b97\u516c\u5f0f\u5982\u4e0b\uff1a<\/p>\n<div class=\"easy-katex-wrapper easy-katex-block\" id=\"katex-1\" data-formula=\"\\text{Attention}(Q, K, V) = \\text{softmax}\\left(\\frac{QK^T}{\\sqrt{d_k}}\\right)V\" data-display=\"true\"><\/div>\n<ul>\n<li><strong>$QK^T$<\/strong>\uff1a\u8ba1\u7b97\u67e5\u8be2\u4e0e\u6240\u6709\u952e\u7684\u76f8\u4f3c\u5ea6\uff08\u70b9\u79ef\uff09\u3002<\/li>\n<li><strong>$\\sqrt{d_k}$<\/strong>\uff1a\u7f29\u653e\u56e0\u5b50\uff0c\u9632\u6b62\u68af\u5ea6\u6d88\u5931\u3002<\/li>\n<li><strong>Softmax<\/strong>\uff1a\u5c06\u5206\u6570\u5f52\u4e00\u5316\u4e3a\u6982\u7387\uff08\u6240\u6709\u5206\u6570\u52a0\u8d77\u6765\u7b49\u4e8e 1\uff09\u3002<\/li>\n<li><strong>$V$<\/strong>\uff1a\u6839\u636e\u6982\u7387\u52a0\u6743\u6c42\u548c\uff0c\u63d0\u53d6\u91cd\u8981\u4fe1\u606f\u3002<\/li>\n<\/ul>\n<h3>\u6838\u5fc3\u4ee3\u7801\u5b9e\u73b0 (PyTorch)<\/h3>\n<p>\u4e0d\u8981\u88ab\u7406\u8bba\u5413\u5012\uff0c\u7528 Python \u5b9e\u73b0\u6838\u5fc3\u7684\u201c\u7f29\u653e\u70b9\u79ef\u6ce8\u610f\u529b\u201d\u5176\u5b9e\u975e\u5e38\u7b80\u6d01\uff1a<\/p>\n<pre><code class=\"language-python\">import torch\r\nimport torch.nn.functional as 
### Core Implementation (PyTorch)

Don't let the theory scare you off; the core "scaled dot-product attention" is remarkably concise in Python:

```python
import math

import torch
import torch.nn.functional as F

def scaled_dot_product_attention(query, key, value, mask=None):
    """
    The core attention computation.
    query, key, value have shape: (batch_size, num_heads, seq_len, depth)
    """
    d_k = query.size(-1)

    # 1. Dot product of Q and K (QK^T)
    scores = torch.matmul(query, key.transpose(-2, -1))

    # 2. Scale
    scores = scores / math.sqrt(d_k)

    # 3. Apply the mask (used in the decoder to hide future tokens)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, -1e9)

    # 4. Normalize the scores with softmax
    attention_weights = F.softmax(scores, dim=-1)

    # 5. Multiply by V to get the final output
    output = torch.matmul(attention_weights, value)

    return output, attention_weights
```
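Continuing in the same session as the function above, a quick smoke test with random tensors (the shapes here are illustrative, not from the original paper):

```python
batch_size, num_heads, seq_len, depth = 2, 8, 10, 64

query = torch.randn(batch_size, num_heads, seq_len, depth)
key = torch.randn(batch_size, num_heads, seq_len, depth)
value = torch.randn(batch_size, num_heads, seq_len, depth)

output, weights = scaled_dot_product_attention(query, key, value)

print(output.shape)         # torch.Size([2, 8, 10, 64])
print(weights.shape)        # torch.Size([2, 8, 10, 10])
print(weights.sum(dim=-1))  # every row sums to 1, thanks to softmax
```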
## Key Mechanisms, Step by Step

### 1. Multi-Head Attention

With a single set of Q, K, and V, the model may capture only one kind of relationship (say, syntax). **Multi-head attention** is like having 8 experts read the same sentence from different angles:

- Expert A tracks grammatical structure.
- Expert B tracks coreference (for example, who "it" refers to).
- Expert C tracks sentiment.
- Finally, the 8 experts' results are concatenated and fused through a linear layer (see the first sketch at the end of this section).

### 2. Positional Encoding

Because the Transformer processes all words in parallel, it has no built-in way to tell "I love you" apart from "you love me". **Positional encoding** adds a unique "positional fingerprint" (built from sine and cosine functions) to each word's vector, so the model knows where each word sits in the sentence (see the second sketch at the end of this section).

### 3. Residual Connections and Normalization (Add & Norm)

- **Residual connection**: the input is added directly onto the output ($x + \text{Layer}(x)$), like building a highway for information so it doesn't get lost in a deep network.
- **Layer normalization**: stabilizes training and speeds up convergence.
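Here is a minimal sketch of multi-head attention wired up with the Add & Norm step, reusing the `scaled_dot_product_attention` function from earlier. The class and parameter names (`MultiHeadAttention`, `d_model`, `num_heads`) are illustrative, not code from the original paper:

```python
import torch
import torch.nn as nn

class MultiHeadAttention(nn.Module):
    """Split d_model into num_heads subspaces, attend in each, then fuse."""

    def __init__(self, d_model=512, num_heads=8):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.depth = d_model // num_heads
        # One projection per role (Q, K, V), plus the final fusion layer
        self.w_q = nn.Linear(d_model, d_model)
        self.w_k = nn.Linear(d_model, d_model)
        self.w_v = nn.Linear(d_model, d_model)
        self.w_o = nn.Linear(d_model, d_model)
        self.norm = nn.LayerNorm(d_model)

    def split_heads(self, x):
        # (batch, seq_len, d_model) -> (batch, num_heads, seq_len, depth)
        batch, seq_len, _ = x.shape
        return x.view(batch, seq_len, self.num_heads, self.depth).transpose(1, 2)

    def forward(self, x, mask=None):
        q = self.split_heads(self.w_q(x))
        k = self.split_heads(self.w_k(x))
        v = self.split_heads(self.w_v(x))

        # Each "expert" (head) attends independently
        out, _ = scaled_dot_product_attention(q, k, v, mask)

        # Concatenate the heads, then fuse them with a linear layer
        batch, _, seq_len, _ = out.shape
        out = out.transpose(1, 2).reshape(batch, seq_len, -1)
        out = self.w_o(out)

        # Add & Norm: residual connection followed by layer normalization
        return self.norm(x + out)
```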
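And a sketch of the sinusoidal positional encoding. The interleaving of `sin` on even dimensions and `cos` on odd dimensions follows the original paper, though this particular implementation is only one way to write it:

```python
import math
import torch

def positional_encoding(seq_len, d_model):
    """Build the (seq_len, d_model) table of positional 'fingerprints'."""
    position = torch.arange(seq_len).unsqueeze(1)  # (seq_len, 1)
    div_term = torch.exp(
        torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model)
    )  # (d_model / 2,)
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(position * div_term)  # even dimensions
    pe[:, 1::2] = torch.cos(position * div_term)  # odd dimensions
    return pe

# The encoding is simply added to the word embeddings:
# embeddings = embeddings + positional_encoding(seq_len, d_model)
```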
## Head to Head: RNN vs Transformer

To make the Transformer's advantages concrete, here is a side-by-side comparison:

| Feature | RNN / LSTM (the old era) | Transformer (the new era) |
|:--|:--|:--|
| **Computation** | Sequential | Parallel |
| **Long-range dependencies** | Weak (the farther apart, the easier to forget) | Strong (any two words are one step apart) |
| **Training speed** | Slow (cannot fully exploit the GPU) | Fast (extremely GPU-friendly) |
| **Context understanding** | Unidirectional or pseudo-bidirectional | Truly bidirectional |
| **Representative models** | Seq2Seq, Google Translate (old) | BERT, GPT-4, Llama |

## Expert Advice

If you are studying the Transformer or trying to reproduce it, keep these points in mind (sketches for the first two follow this list):

1. **The warmup schedule is critical**: the Transformer is very sensitive to the learning rate. Early in training you need a `Warmup` schedule: increase the learning rate linearly, then decay it with the inverse square root of the step; otherwise the model struggles to converge.
2. **Get the masking right**: when training the decoder you must use a `Look-ahead Mask` so that when predicting token $t$ the model can see only the tokens before $t$; it must never "peek" at the answer.
3. **Embedding dimension**: the original paper uses $d_{model}=512$, but for small fine-tuning tasks a lower dimension (such as 256 or 128) often prevents overfitting and trains faster.
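The warmup schedule from the original paper fits in a few lines; `warmup_steps=4000` matches the paper's default, while the function name here is just for illustration:

```python
def transformer_lr(step, d_model=512, warmup_steps=4000):
    """Linear warmup followed by inverse-square-root decay (Vaswani et al.)."""
    step = max(step, 1)  # avoid 0 ** -0.5 at step 0
    return d_model ** -0.5 * min(step ** -0.5, step * warmup_steps ** -1.5)

# The peak learning rate is reached exactly at step == warmup_steps:
# print(transformer_lr(4000))  # ~0.0007 for d_model=512
```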
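And here is one way to build the look-ahead mask so that it plugs into the `mask` parameter of the `scaled_dot_product_attention` function above (which blocks out positions where `mask == 0`):

```python
import torch

def look_ahead_mask(seq_len):
    """Lower-triangular mask: position t may attend only to positions <= t."""
    return torch.tril(torch.ones(seq_len, seq_len, dtype=torch.int))

print(look_ahead_mask(4))
# tensor([[1, 0, 0, 0],
#         [1, 1, 0, 0],
#         [1, 1, 1, 0],
#         [1, 1, 1, 1]], dtype=torch.int32)
```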
&#8211;&gt; B[&#8220;\u8f93\u5165\u5d4c\u5165 (Input Embeddings)&#8221;] B &#8211;&gt; C[&#8220;\u4f4d\u7f6e\u7f16\u7801 (Positional Encoding)&#8221;] C &#8211;&gt; D[&#8220;\u591a\u5934\u6ce8\u610f\u529b (Multi-Head [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":3017,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[55],"tags":[],"class_list":["post-3018","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-tips-tutorials"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/www.tiptinker.com\/zh-hans\/wp-json\/wp\/v2\/posts\/3018","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.tiptinker.com\/zh-hans\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.tiptinker.com\/zh-hans\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.tiptinker.com\/zh-hans\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.tiptinker.com\/zh-hans\/wp-json\/wp\/v2\/comments?post=3018"}],"version-history":[{"count":0,"href":"https:\/\/www.tiptinker.com\/zh-hans\/wp-json\/wp\/v2\/posts\/3018\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.tiptinker.com\/zh-hans\/wp-json\/wp\/v2\/media\/3017"}],"wp:attachment":[{"href":"https:\/\/www.tiptinker.com\/zh-hans\/wp-json\/wp\/v2\/media?parent=3018"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.tiptinker.com\/zh-hans\/wp-json\/wp\/v2\/categories?post=3018"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.tiptinker.com\/zh-hans\/wp-json\/wp\/v2\/tags?post=3018"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}