GPT-2
| Original author | OpenAI |
| --- | --- |
| Initial release | February 14, 2019 |
| Current version | 1.5B release[1] |
| Repository | https://github.com/openai/gpt-2 |
| Predecessor | GPT-1 |
| Successor | GPT-3 |
| Type | Transformer language model |
| License | |
| Website | openai.com |
Generative Pre-trained Transformer 2 (GPT-2) is an open-source artificial intelligence model created by OpenAI in February 2019.[2] [3] [4] [5] GPT-2 can translate text, answer questions, summarize passages,[6] and generate text output. While its output is sometimes human-like,[7] it can become repetitive or nonsensical when generating long passages.[8] GPT-2 is a general-purpose learner, not specifically trained to perform any particular task,[9] [6] and was created as a "direct scale-up" of OpenAI's 2018 GPT model,[10] with a tenfold increase in both its parameter count and the size of its training dataset.[5]
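To illustrate the kind of open-ended text generation described above, here is a minimal sketch using the Hugging Face `transformers` library to load GPT-2 and sample a continuation. Note the assumptions: the original OpenAI release at github.com/openai/gpt-2 is a TensorFlow codebase, whereas this example uses the third-party `transformers` API; the checkpoint name `"gpt2"` (the smallest, 124M-parameter variant) and the sampling parameters are illustrative choices, not settings from OpenAI's release.

```python
# Minimal GPT-2 text-generation sketch via the Hugging Face
# "transformers" library (not the original OpenAI TensorFlow code).
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")  # 124M-parameter variant
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Encode a prompt and sample a continuation. The top-k value and
# length are illustrative, not taken from the GPT-2 paper.
inputs = tokenizer("The history of artificial intelligence", return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=50,         # total tokens, prompt included
    do_sample=True,        # sample rather than greedy-decode
    top_k=50,              # restrict sampling to the 50 most likely tokens
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because decoding is sampled rather than greedy, repeated runs produce different continuations; over long generations the repetitive or nonsensical output mentioned above becomes more likely.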
References

1. https://openai.com/blog/gpt-2-1-5b-release/
2. Piper, Kelsey. A poetry-writing AI has just been unveiled. It's ... pretty good. Vox. 15 May 2019 [19 December 2020]. (Archived from the original on 7 November 2020).
3. Johnson, Khari. OpenAI releases curtailed version of GPT-2 language model. VentureBeat. 20 August 2019 [19 December 2020]. (Archived from the original on 18 December 2020).
4. Vincent, James. OpenAI has published the text-generating AI it said was too dangerous to share. The Verge. 7 November 2019 [19 December 2020]. (Archived from the original on 11 June 2020).
5. Better Language Models and Their Implications. OpenAI. 14 February 2019 [19 December 2020]. (Archived from the original on 19 December 2020).
6. Hegde. Unsupervised Paraphrase Generation using Pre-trained Language Models. arXiv:2006.05477.
7. Kaiser, Caleb. Too big to deploy: How GPT-2 is breaking servers. Towards Data Science. 31 January 2020 [27 February 2021]. (Archived from the original on 15 February 2020).
8. Hern, Alex. New AI fake text generator may be too dangerous to release, say creators. The Guardian. 14 February 2019 [19 December 2020]. (Archived from the original on 14 February 2019).
9. Radford, Alec; Wu, Jeffrey; Child, Rewon; Luan, David; Amodei, Dario; Sutskever, Ilya. Language models are unsupervised multitask learners (PDF) 1 (8). 14 February 2019 [19 December 2020]. (Archived (PDF) from the original on 6 February 2021).
10. Radford, Alec; Narasimhan, Karthik; Salimans, Tim; Sutskever, Ilya. Improving Language Understanding by Generative Pre-Training (PDF). OpenAI: 12. 11 June 2018 [23 January 2021]. (Archived (PDF) from the original on 26 January 2021).