GPT-3 (Q95726734): Difference between revisions

Edit summary (previous revision): Created claim: Know Your Meme ID (P6760): 40858
Tag: Wikidata user interface

Edit summary (this revision): Added [uk] description: Модель мови 2020, що генерує текст ("2020 language model that generates text")
Tags: Mobile edit, Mobile app edit, Android app edit, Suggested Edits edit

description / uk
  + Модель мови 2020, що генерує текст ("2020 language model that generates text")

Revision as of 20:06, 2 August 2022

Language: English
Label: GPT-3
Description: 2020 Transformer-based language model
Also known as:
  • Generative Pre-trained Transformer 3
  • Generative Pretrained Transformer 3
  • GPT3

Statements

28 May 2020
0 references
125M: 125,000,000 parameters
350M: 350,000,000 parameters
760M: 760,000,000 parameters
1.3B: 1,300,000,000 parameters
2.7B: 2,700,000,000 parameters
6.7B: 6,700,000,000 parameters
13B: 13,000,000,000 parameters
175B: 175,000,000,000 parameters
1 reference:
"To study the dependence of ML performance on model size, we train 8 different sizes of model, ranging over three orders of magnitude from 125 million parameters to 175 billion parameters, with the last being the model we call GPT-3." (English)
22 July 2020
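
The statements above are also available in machine-readable form. As a minimal sketch, assuming only the Python standard library and the public wbgetentities endpoint of the Wikidata API, the item's claims and their reference counts can be listed like this:

import json
import urllib.request

# Fetch all statements for GPT-3 (Q95726734) from the public Wikidata API.
# wbgetentities returns the claims grouped by property ID (e.g. P6760).
URL = (
    "https://www.wikidata.org/w/api.php"
    "?action=wbgetentities&ids=Q95726734&props=claims&format=json"
)

with urllib.request.urlopen(URL) as resp:
    claims = json.load(resp)["entities"]["Q95726734"]["claims"]

for prop, statements in claims.items():
    for st in statements:
        # "novalue"/"somevalue" snaks carry no datavalue; skip those safely.
        value = st["mainsnak"].get("datavalue", {}).get("value")
        refs = st.get("references", [])
        print(f"{prop}: {value!r} ({len(refs)} reference(s))")

Quantity values such as the parameter counts come back as dicts with a signed "amount" string (e.g. "+175000000000" for the 175B row above), so any downstream consumer should parse that field rather than the display label.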

Identifiers

Know Your Meme ID: 40858
0 references
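
External identifiers like the one above resolve to URLs through the property's formatter URL (P1630). A small sketch, assuming the P6760 property page carries a P1630 statement; the claims_of helper is introduced here purely for illustration:

import json
import urllib.request

API = "https://www.wikidata.org/w/api.php?action=wbgetentities&props=claims&format=json"

def claims_of(entity_id):
    # Property pages (P6760) are entities too, so the same endpoint works.
    with urllib.request.urlopen(f"{API}&ids={entity_id}") as resp:
        return json.load(resp)["entities"][entity_id]["claims"]

kym_id = "40858"  # value of the claim created in the revision above
formatter = claims_of("P6760")["P1630"][0]["mainsnak"]["datavalue"]["value"]
print(formatter.replace("$1", kym_id))  # substitute the ID into the URL pattern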