GPT-3 (Q95726734): Difference between revisions

From Wikidata
BorkedBot (talk | contribs)
Added qualifier: point in time (P585): 23 May 2023, update subreddit data
BorkedBot (talk | contribs)
Created claim: social media followers (P8687): 408,995, add reddit sub count
Property / social media followers
  Amount: 408,995
  Unit: 1
Property / social media followers: 408,995 / rank
  Preferred rank
Property / social media followers: 408,995 / qualifier
  point in time: 23 May 2023
  Timestamp: +2023-05-23T00:00:00Z
  Timezone: +00:00
  Calendar: Gregorian
  Precision: 1 day
  Before: 0
  After: 0
Revision as of 16:44, 23 May 2023
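The diff above records one Wikibase claim: a quantity value with a preferred rank and a point-in-time qualifier. A minimal sketch of how that claim looks in the item's JSON, assuming the standard Wikibase data-model layout (mainsnak/datavalue/qualifiers field names); the JSON here is reconstructed from the diff, not fetched from the API:

```python
# Reconstruction of the claim added in this revision, in Wikibase JSON layout.
# Assumption: field names follow the documented Wikibase data model.
claim = {
    "mainsnak": {
        "snaktype": "value",
        "property": "P8687",  # social media followers
        "datavalue": {
            "value": {"amount": "+408995", "unit": "1"},  # unit "1" = dimensionless
            "type": "quantity",
        },
    },
    "type": "statement",
    "rank": "preferred",
    "qualifiers": {
        "P585": [{  # point in time
            "snaktype": "value",
            "property": "P585",
            "datavalue": {
                "value": {
                    "time": "+2023-05-23T00:00:00Z",
                    "timezone": 0,
                    "before": 0,
                    "after": 0,
                    "precision": 11,  # 11 = day precision in Wikibase time values
                    "calendarmodel": "http://www.wikidata.org/entity/Q1985727",  # Gregorian
                },
                "type": "time",
            },
        }],
    },
}

# Quantity amounts are signed strings; int() accepts the leading "+".
amount = int(claim["mainsnak"]["datavalue"]["value"]["amount"])
print(amount, claim["rank"])
```

The "Precision: 1 day" line in the diff corresponds to the numeric precision code 11 in the stored time value.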

Language: English
Label: GPT-3
Description: 2020 Transformer-based language model
Also known as:
  • Generative Pre-trained Transformer 3
  • Generative Pretrained Transformer 3
  • GPT3

Statements

28 May 2020
GPT-3.5 (multiple languages)
125M: 125,000,000 parameters
350M: 350,000,000 parameters
760M: 760,000,000 parameters
1.3B: 1,300,000,000 parameters
2.7B: 2,700,000,000 parameters
6.7B: 6,700,000,000 parameters
13B: 13,000,000,000 parameters
175B: 175,000,000,000 parameters
1 reference
To study the dependence of ML performance on model size, we train 8 different sizes of model, ranging over three orders of magnitude from 125 million parameters to 175 billion parameters, with the last being the model we call GPT-3. (English)
22 July 2020
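The statement above lists the eight GPT-3 model sizes, and the quoted reference says they range over three orders of magnitude. A small check of that claim, with the names and parameter counts taken directly from the list (this script is illustrative and not part of the Wikidata item):

```python
import math

# The eight GPT-3 model sizes recorded in the item, keyed by short name.
gpt3_sizes = {
    "125M": 125_000_000,
    "350M": 350_000_000,
    "760M": 760_000_000,
    "1.3B": 1_300_000_000,
    "2.7B": 2_700_000_000,
    "6.7B": 6_700_000_000,
    "13B": 13_000_000_000,
    "175B": 175_000_000_000,
}

# Span in orders of magnitude between the smallest and largest model.
span = math.log10(gpt3_sizes["175B"] / gpt3_sizes["125M"])
print(len(gpt3_sizes), round(span, 2))  # 8 models, ~3.15 orders of magnitude
```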
408,995

Identifiers

Generative Pretrained Transformers
17 July 2020
23 May 2023