GPT-3 (Q95726734): Difference between revisions
Edit summary: Created claim: social media followers (P8687): 408,995 ("add reddit sub count"); Added qualifier: point in time (P585): 23 May 2023 ("update subreddit data")

Property / social media followers: 408,995
Property / social media followers: 408,995 / rank: Preferred rank
Property / social media followers: 408,995 / qualifier: point in time: 23 May 2023
Revision as of 16:44, 23 May 2023
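The statement added in this revision follows the Wikibase JSON data model: a main snak carrying the quantity value, a rank, and a list of qualifier snaks. A minimal sketch, assuming the standard claim serialization (field subset only, not the full record), of how this statement would look and be read in Python:

```python
# Sketch of the claim from this revision, in the shape the Wikibase
# JSON data model uses (only the fields relevant here are shown).
claim = {
    "mainsnak": {
        "snaktype": "value",
        "property": "P8687",  # social media followers
        "datavalue": {
            "value": {"amount": "+408995", "unit": "1"},
            "type": "quantity",
        },
    },
    "rank": "preferred",
    "qualifiers": {
        "P585": [{  # point in time
            "snaktype": "value",
            "property": "P585",
            "datavalue": {
                "value": {"time": "+2023-05-23T00:00:00Z", "precision": 11},
                "type": "time",
            },
        }],
    },
}

def follower_count(claim: dict) -> int:
    """Read the quantity value of the main snak (leading '+' is fine for int())."""
    return int(claim["mainsnak"]["datavalue"]["value"]["amount"])

def point_in_time(claim: dict) -> str:
    """Return the P585 qualifier timestamp, or an empty string if absent."""
    quals = claim.get("qualifiers", {}).get("P585", [])
    return quals[0]["datavalue"]["value"]["time"] if quals else ""

print(follower_count(claim), claim["rank"], point_in_time(claim))
```

The preferred rank marks this as the value consumers should pick when multiple follower counts with different point-in-time qualifiers exist on the item.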
Language | Label | Description | Also known as
---|---|---|---
English | GPT-3 | 2020 Transformer-based language model | Generative Pre-trained Transformer 3; Generative Pretrained Transformer 3; GPT3
Statements
28 May 2020
0 references
Parameter counts (each statement qualified with a size shorthand and, where given, a model name):

Size | Name | Parameters
---|---|---
125M | Small | 125,000,000
350M | Medium | 350,000,000
760M | Large | 760,000,000
1.3B | | 1,300,000,000
175B | GPT-3 | 175,000,000,000

1 reference (for each of the five statements):
To study the dependence of ML performance on model size, we train 8 different sizes of model, ranging over three orders of magnitude from 125 million parameters to 175 billion parameters, with the last being the model we call GPT-3. (English)
22 July 2020
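The cited reference says the model sizes "range over three orders of magnitude from 125 million parameters to 175 billion parameters". A quick check of that arithmetic against the parameter-count statements above (this page lists five of the eight sizes; the 1.3B model has no name qualifier here, so it is keyed by size alone):

```python
import math

# Parameter counts taken from the statements on this item.
sizes = {
    "Small": 125_000_000,
    "Medium": 350_000_000,
    "Large": 760_000_000,
    "1.3B": 1_300_000_000,
    "GPT-3": 175_000_000_000,
}

# Ratio of the largest to the smallest size, in orders of magnitude:
# log10(175e9 / 125e6) = log10(1400), a bit over 3, matching the
# reference's "three orders of magnitude" claim.
span = math.log10(sizes["GPT-3"] / sizes["Small"])
print(f"range spans {span:.2f} orders of magnitude")
```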
Identifiers
Generative Pretrained Transformers
17 July 2020
23 May 2023
0 references
Sitelinks
Wikipedia (25 entries)
- arwiki جي بي تي-3
- bgwiki GPT-3
- cawiki GPT-3
- ckbwiki جی-پی-تی-٣
- cswiki GPT-3
- enwiki GPT-3
- eswiki GPT-3
- etwiki GPT-3
- euwiki GPT-3
- fawiki جیپیتی-۳
- fiwiki GPT-3
- frwiki GPT-3
- hewiki GPT-3
- hiwiki जीपीटी3
- itwiki GPT-3
- jawiki GPT-3
- kowiki GPT-3
- nlwiki GPT-3
- ptwiki GPT-3
- quwiki GPT-3
- ruwiki GPT-3
- svwiki GPT-3
- trwiki GPT-3
- ukwiki GPT-3
- zhwiki GPT-3