This repository was archived by the owner on Dec 6, 2023. It is now read-only.

Commit 4cf9dd7

fix API reference link in the introduction (#178)
1 parent 8dae0f9 commit 4cf9dd7

File tree

1 file changed: +1 -1 lines changed


doc/intro.rst

+1 -1
@@ -70,7 +70,7 @@ AdaGrad
 Stochastic averaged gradient (SAG and SAGA)
 -------------------------------------------
 
-:class:`classification.classification.SAGClassifier`, :class:`classification.SAGAClassifier`, :class:`regression.SAGRegressor`, :class:`regression.SAGARegressor`
+:class:`classification.SAGClassifier`, :class:`classification.SAGAClassifier`, :class:`regression.SAGRegressor`, :class:`regression.SAGARegressor`
 
 - Main idea: instead of using the full gradient (average of sample-wise gradients), compute gradient for a randomly selected sample and use out-dated gradients for other samples
 - Non-smooth losses: Yes (:class:`classification.SAGAClassifier` and :class:`regression.SAGARegressor`)
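The corrected references resolve to lightning's public estimators. A minimal usage sketch, assuming the scikit-learn-contrib lightning package is installed and importable as `lightning`; the toy data and default hyperparameters are illustrative only:

```python
# Minimal usage sketch for the classes named in the hunk above; assumes
# lightning (scikit-learn-contrib) is installed. Data are illustrative.
import numpy as np
from lightning.classification import SAGClassifier, SAGAClassifier

rng = np.random.RandomState(0)
X = rng.randn(100, 5)                                      # toy feature matrix
y = np.sign(X[:, 0] + 0.1 * rng.randn(100)).astype(int)   # toy binary labels

clf = SAGAClassifier()   # SAGClassifier() follows the same fit/predict API
clf.fit(X, y)
print(clf.score(X, y))
```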

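For context on the "Main idea" bullet in the hunk: a minimal NumPy sketch of the SAG update for L2-regularized least squares, assuming a constant step size; the function name and hyperparameters are hypothetical, not part of lightning's API.

```python
# Sketch of the SAG idea: keep one (possibly out-dated) gradient per sample
# and refresh only the gradient of the randomly chosen sample each step.
import numpy as np

def sag_least_squares(X, y, alpha=0.01, step_size=0.001, n_iter=10000, seed=0):
    n_samples, n_features = X.shape
    rng = np.random.default_rng(seed)
    w = np.zeros(n_features)
    grad_memory = np.zeros((n_samples, n_features))  # last gradient seen per sample
    grad_sum = np.zeros(n_features)                  # running sum of memorized gradients
    for _ in range(n_iter):
        i = rng.integers(n_samples)
        g_new = (X[i] @ w - y[i]) * X[i]    # fresh gradient for sample i (squared loss)
        grad_sum += g_new - grad_memory[i]  # swap the stale gradient out of the sum
        grad_memory[i] = g_new
        w -= step_size * (grad_sum / n_samples + alpha * w)  # average grad + L2 penalty
    return w
```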
0 commit comments
