intermediate_source/dynamic_quantization_bert_tutorial.rst
3 additions & 3 deletions
@@ -79,7 +79,7 @@ Mac:

 .. code:: shell

-  yes y | pip uninstall torch tochvision
+  yes y | pip uninstall torch torchvision
   yes y | pip install --pre torch -f https://download.pytorch.org/whl/nightly/cu101/torch_nightly.html

@@ -206,7 +206,7 @@ in `examples <https://github.com/huggingface/transformers/tree/master/examples#m
206
206
--save_steps 100000 \
207
207
--output_dir $OUT_DIR
208
208
209
-
We provide the fined-tuned BERT model for MRPC task `here <https://download.pytorch.org/tutorial/MRPC.zip>`_.
209
+
We provide the fine-tuned BERT model for MRPC task `here <https://download.pytorch.org/tutorial/MRPC.zip>`_.
210
210
To save time, you can download the model file (~400 MB) directly into your local folder ``$OUT_DIR``.
211
211
212
212
2.1 Set global configurations
@@ -273,7 +273,7 @@ We load the tokenizer and fine-tuned BERT sequence classifier model
 2.3 Define the tokenize and evaluation function
 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

-We reuse the tokenize and evaluation function from `Huggingface<https://github.com/huggingface/transformers/blob/master/examples/run_glue.py>`_.
+We reuse the tokenize and evaluation function from `HuggingFace<https://github.com/huggingface/transformers/blob/master/examples/run_glue.py>`_.
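For context on the download step touched in the second hunk, a minimal shell sketch of fetching the fine-tuned MRPC model into ``$OUT_DIR`` might look like the following (assuming ``$OUT_DIR`` is already set and ``wget``/``unzip`` are available; the archive's internal layout is not specified in the diff):

.. code:: shell

   # Download the ~400 MB fine-tuned BERT model for MRPC (URL from the tutorial)
   wget https://download.pytorch.org/tutorial/MRPC.zip
   # Unpack it into the output directory the fine-tuning script would otherwise write to
   mkdir -p $OUT_DIR
   unzip MRPC.zip -d $OUT_DIR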