GPT-3 (Q95726734): Difference between revisions
Created claim: Know Your Meme ID (P6760): 40858. Tag: Wikidata user interface
Added [uk] description: Модель мови 2020, що генерує текст ("2020 language model that generates text"). Tags: Mobile edit, Mobile app edit, Android app edit, Suggested Edits

description / uk:
+ Модель мови 2020, що генерує текст (2020 language model that generates text)
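
The claim created above uses an external-identifier property (P6760), and claims like it can be read back from the public Wikidata Query Service. A minimal sketch in Python, assuming only the item ID (Q95726734) and property ID (P6760) shown on this page; the endpoint URL is the standard one, and the User-Agent string is a placeholder:

```python
# Look up the Know Your Meme ID (P6760) claim on GPT-3 (Q95726734) via the
# public Wikidata Query Service. Item and property IDs come from the edit
# summaries above; the User-Agent value is only a placeholder.
import requests

query = "SELECT ?kym WHERE { wd:Q95726734 wdt:P6760 ?kym . }"
resp = requests.get(
    "https://query.wikidata.org/sparql",
    params={"query": query, "format": "json"},
    headers={"User-Agent": "wikidata-example/0.1"},
    timeout=30,
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["kym"]["value"])  # expected: 40858
```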
Revision as of 20:06, 2 August 2022
Language | Label | Description | Also known as
---|---|---|---
English | GPT-3 | 2020 Transformer-based language model | Generative Pre-trained Transformer 3; Generative Pretrained Transformer 3; GPT3
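
The label, description, and alias terms in the table above (including the Ukrainian description added in this revision) map directly onto the standard Wikibase `wbgetentities` API module. A minimal sketch, assuming the public www.wikidata.org endpoint:

```python
# Fetch the English and Ukrainian terms (label, description, aliases) of
# Q95726734 through the standard MediaWiki/Wikibase API.
import requests

resp = requests.get(
    "https://www.wikidata.org/w/api.php",
    params={
        "action": "wbgetentities",
        "ids": "Q95726734",
        "props": "labels|descriptions|aliases",
        "languages": "en|uk",
        "format": "json",
    },
    timeout=10,
)
resp.raise_for_status()
entity = resp.json()["entities"]["Q95726734"]
print(entity["labels"]["en"]["value"])        # GPT-3
print(entity["descriptions"]["en"]["value"])  # 2020 Transformer-based language model
print(entity["descriptions"]["uk"]["value"])  # Модель мови 2020, що генерує текст
for alias in entity["aliases"]["en"]:
    print(alias["value"])
```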
Statements
28 May 2020
0 references
Variant | Name | Parameters | References
---|---|---|---
125M | Small | 125,000,000 | 1 reference
350M | Medium | 350,000,000 | 1 reference
760M | Large | 760,000,000 | 1 reference
1.3B | | 1,300,000,000 | 1 reference
175B | GPT-3 | 175,000,000,000 | 1 reference
To study the dependence of ML performance on model size, we train 8 different sizes of model, ranging over three orders of magnitude from 125 million parameters to 175 billion parameters, with the last being the model we call GPT-3. (English)
22 July 2020
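
The quoted reference says the eight model sizes range "over three orders of magnitude"; the parameter counts recorded in the statements above bear that out, as a quick check shows (counts copied from the table; the script itself is only illustrative):

```python
# Sanity-check the "three orders of magnitude" claim against the parameter
# counts recorded in the statements above.
import math

params = {
    "125M (Small)": 125_000_000,
    "350M (Medium)": 350_000_000,
    "760M (Large)": 760_000_000,
    "1.3B": 1_300_000_000,
    "175B (GPT-3)": 175_000_000_000,
}
ratio = params["175B (GPT-3)"] / params["125M (Small)"]
print(f"175B / 125M = {ratio:,.0f}x, log10 = {math.log10(ratio):.2f}")
# -> 1,400x, i.e. a bit over three orders of magnitude
```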