Commit d786052

Pushing the docs to dev/ for branch: main, commit 6169cc9b383c7dc15d9039767ed28eb6fd944f06
1 parent 3320985 commit d786052

File tree

736 files changed: +2265 −2265 lines changed


Diff for: dev/_downloads/4b8d8f0d50e5aa937ac9571a35eadc28/plot_kmeans_stability_low_dim_dense.py

+1 −1

@@ -10,7 +10,7 @@
 
 The first plot shows the best inertia reached for each combination
 of the model (``KMeans`` or ``MiniBatchKMeans``), and the init method
-(``init="random"`` or ``init="kmeans++"``) for increasing values of the
+(``init="random"`` or ``init="k-means++"``) for increasing values of the
 ``n_init`` parameter that controls the number of initializations.
 
 The second plot demonstrates one single run of the ``MiniBatchKMeans``
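The hunk above documents the ``n_init`` parameter: a stochastic initialization is run ``n_init`` times and the run with the lowest inertia is kept. A minimal pure-Python sketch of that best-of-``n_init`` logic, under the assumption of 1-D data (this is not scikit-learn's implementation; ``kmeans_1d`` and ``best_of_n_init`` are hypothetical names for illustration):

```python
import random

def kmeans_1d(points, k, rng, n_iter=20):
    # Start from k distinct points chosen at random (init="random").
    centers = rng.sample(points, k)
    for _ in range(n_iter):
        # Assignment step: each point joins its nearest center.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: (p - centers[i]) ** 2)
            clusters[j].append(p)
        # Update step: move each center to its cluster mean
        # (keep the old center if its cluster went empty).
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    # Inertia: sum of squared distances to the nearest center.
    inertia = sum(min((p - c) ** 2 for c in centers) for p in points)
    return centers, inertia

def best_of_n_init(points, k, n_init, seed=0):
    # Run the stochastic algorithm n_init times, keep the lowest inertia.
    rng = random.Random(seed)
    return min((kmeans_1d(points, k, rng) for _ in range(n_init)),
               key=lambda run: run[1])

points = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2, 10.0, 10.1, 10.2]
centers, inertia = best_of_n_init(points, k=3, n_init=10)
print(sorted(centers), inertia)
```

Because the runs share one seeded generator, the best of ten runs can never have higher inertia than a single run with the same seed, which is the stability effect the plots above measure.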

Diff for: dev/_downloads/5a87b25ba023ee709595b8d02049f021/plot_kmeans_digits.py

+1 −1

@@ -114,7 +114,7 @@ def bench_k_means(kmeans, name, data, labels):
 #
 # We will compare three approaches:
 #
-# * an initialization using `kmeans++`. This method is stochastic and we will
+# * an initialization using `k-means++`. This method is stochastic and we will
 #   run the initialization 4 times;
 # * a random initialization. This method is stochastic as well and we will run
 #   the initialization 4 times;

Diff for: dev/_downloads/6bf322ce1724c13e6e0f8f719ebd253c/plot_kmeans_digits.ipynb

+1 −1

@@ -58,7 +58,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"## Run the benchmark\n\nWe will compare three approaches:\n\n* an initialization using `kmeans++`. This method is stochastic and we will\n run the initialization 4 times;\n* a random initialization. This method is stochastic as well and we will run\n the initialization 4 times;\n* an initialization based on a :class:`~sklearn.decomposition.PCA`\n projection. Indeed, we will use the components of the\n :class:`~sklearn.decomposition.PCA` to initialize KMeans. This method is\n deterministic and a single initialization suffice.\n\n"
+"## Run the benchmark\n\nWe will compare three approaches:\n\n* an initialization using `k-means++`. This method is stochastic and we will\n run the initialization 4 times;\n* a random initialization. This method is stochastic as well and we will run\n the initialization 4 times;\n* an initialization based on a :class:`~sklearn.decomposition.PCA`\n projection. Indeed, we will use the components of the\n :class:`~sklearn.decomposition.PCA` to initialize KMeans. This method is\n deterministic and a single initialization suffice.\n\n"
 ]
 },
 {
Diff for: dev/_downloads/77b640d8771f7ecdeb2dbc948ebce90a/plot_kmeans_plusplus.ipynb

+1 −1

@@ -26,7 +26,7 @@
 },
 "outputs": [],
 "source": [
-"from sklearn.cluster import kmeans_plusplus\nfrom sklearn.datasets import make_blobs\nimport matplotlib.pyplot as plt\n\n# Generate sample data\nn_samples = 4000\nn_components = 4\n\nX, y_true = make_blobs(\n n_samples=n_samples, centers=n_components, cluster_std=0.60, random_state=0\n)\nX = X[:, ::-1]\n\n# Calculate seeds from kmeans++\ncenters_init, indices = kmeans_plusplus(X, n_clusters=4, random_state=0)\n\n# Plot init seeds along side sample data\nplt.figure(1)\ncolors = [\"#4EACC5\", \"#FF9C34\", \"#4E9A06\", \"m\"]\n\nfor k, col in enumerate(colors):\n cluster_data = y_true == k\n plt.scatter(X[cluster_data, 0], X[cluster_data, 1], c=col, marker=\".\", s=10)\n\nplt.scatter(centers_init[:, 0], centers_init[:, 1], c=\"b\", s=50)\nplt.title(\"K-Means++ Initialization\")\nplt.xticks([])\nplt.yticks([])\nplt.show()"
+"from sklearn.cluster import kmeans_plusplus\nfrom sklearn.datasets import make_blobs\nimport matplotlib.pyplot as plt\n\n# Generate sample data\nn_samples = 4000\nn_components = 4\n\nX, y_true = make_blobs(\n n_samples=n_samples, centers=n_components, cluster_std=0.60, random_state=0\n)\nX = X[:, ::-1]\n\n# Calculate seeds from k-means++\ncenters_init, indices = kmeans_plusplus(X, n_clusters=4, random_state=0)\n\n# Plot init seeds along side sample data\nplt.figure(1)\ncolors = [\"#4EACC5\", \"#FF9C34\", \"#4E9A06\", \"m\"]\n\nfor k, col in enumerate(colors):\n cluster_data = y_true == k\n plt.scatter(X[cluster_data, 0], X[cluster_data, 1], c=col, marker=\".\", s=10)\n\nplt.scatter(centers_init[:, 0], centers_init[:, 1], c=\"b\", s=50)\nplt.title(\"K-Means++ Initialization\")\nplt.xticks([])\nplt.yticks([])\nplt.show()"
 ]
 }
 ],

Diff for: dev/_downloads/7e9cf82b8b60275dd7851470d151af5f/plot_kmeans_stability_low_dim_dense.ipynb

+1 −1

@@ -15,7 +15,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"\n# Empirical evaluation of the impact of k-means initialization\n\nEvaluate the ability of k-means initializations strategies to make\nthe algorithm convergence robust, as measured by the relative standard\ndeviation of the inertia of the clustering (i.e. the sum of squared\ndistances to the nearest cluster center).\n\nThe first plot shows the best inertia reached for each combination\nof the model (``KMeans`` or ``MiniBatchKMeans``), and the init method\n(``init=\"random\"`` or ``init=\"kmeans++\"``) for increasing values of the\n``n_init`` parameter that controls the number of initializations.\n\nThe second plot demonstrates one single run of the ``MiniBatchKMeans``\nestimator using a ``init=\"random\"`` and ``n_init=1``. This run leads to\na bad convergence (local optimum), with estimated centers stuck\nbetween ground truth clusters.\n\nThe dataset used for evaluation is a 2D grid of isotropic Gaussian\nclusters widely spaced.\n"
+"\n# Empirical evaluation of the impact of k-means initialization\n\nEvaluate the ability of k-means initializations strategies to make\nthe algorithm convergence robust, as measured by the relative standard\ndeviation of the inertia of the clustering (i.e. the sum of squared\ndistances to the nearest cluster center).\n\nThe first plot shows the best inertia reached for each combination\nof the model (``KMeans`` or ``MiniBatchKMeans``), and the init method\n(``init=\"random\"`` or ``init=\"k-means++\"``) for increasing values of the\n``n_init`` parameter that controls the number of initializations.\n\nThe second plot demonstrates one single run of the ``MiniBatchKMeans``\nestimator using a ``init=\"random\"`` and ``n_init=1``. This run leads to\na bad convergence (local optimum), with estimated centers stuck\nbetween ground truth clusters.\n\nThe dataset used for evaluation is a 2D grid of isotropic Gaussian\nclusters widely spaced.\n"
 ]
 },
 {

Diff for: dev/_downloads/fa03fd57e0f1a2cd66f3693283f7a6b3/plot_kmeans_plusplus.py

+1 −1

@@ -23,7 +23,7 @@
 )
 X = X[:, ::-1]
 
-# Calculate seeds from kmeans++
+# Calculate seeds from k-means++
 centers_init, indices = kmeans_plusplus(X, n_clusters=4, random_state=0)
 
 # Plot init seeds along side sample data
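For reference, the seeding that ``kmeans_plusplus`` performs in the hunk above can be sketched in pure Python. This is an illustrative sketch of the D² ("k-means++") sampling idea, not scikit-learn's optimized implementation; ``kmeans_plusplus_sketch`` is a hypothetical name:

```python
import random

def kmeans_plusplus_sketch(points, n_clusters, seed=0):
    # k-means++ seeding: after a uniform first center, each new center
    # is drawn with probability proportional to its squared distance
    # to the nearest center chosen so far (D^2 sampling).
    rng = random.Random(seed)
    centers = [rng.choice(points)]  # first center: uniform at random
    while len(centers) < n_clusters:
        # Squared distance of every point to its nearest chosen center.
        d2 = [min((px - cx) ** 2 + (py - cy) ** 2 for cx, cy in centers)
              for px, py in points]
        # Weighted draw: far-away points are more likely to be picked.
        r = rng.random() * sum(d2)
        acc = 0.0
        for p, w in zip(points, d2):
            acc += w
            if acc >= r:
                centers.append(p)
                break
    return centers

points = [(0.0, 0.0), (0.1, 0.0), (5.0, 5.0), (5.1, 5.0),
          (10.0, 0.0), (10.1, 0.0), (0.0, 10.0), (0.1, 10.0)]
centers = kmeans_plusplus_sketch(points, n_clusters=4)
print(centers)
```

Because already-chosen points have zero weight, the sketch tends to pick one seed per well-separated blob, which is why this initialization converges more reliably than ``init="random"``.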

Binary files changed (size delta only; content not shown):

Diff for: dev/_downloads/scikit-learn-docs.zip (2.25 KB)
Diff for: dev/_images/sphx_glr_plot_anomaly_comparison_001.png (214 Bytes)
Diff for: dev/_images/sphx_glr_plot_calibration_curve_002.png (17 Bytes)
Diff for: dev/_images/sphx_glr_plot_cluster_comparison_001.png (-415 Bytes)
Diff for: dev/_images/sphx_glr_plot_coin_segmentation_001.png (33 Bytes)
Diff for: dev/_images/sphx_glr_plot_coin_segmentation_002.png (227 Bytes)
Diff for: dev/_images/sphx_glr_plot_coin_segmentation_003.png (86 Bytes)
Diff for: dev/_images/sphx_glr_plot_dict_face_patches_001.png (17 Bytes)
Diff for: dev/_images/sphx_glr_plot_digits_pipe_001.png (27 Bytes)
Diff for: dev/_images/sphx_glr_plot_digits_pipe_thumb.png (13 Bytes)
Diff for: dev/_images/sphx_glr_plot_gmm_init_001.png (67 Bytes)
Diff for: dev/_images/sphx_glr_plot_gmm_init_thumb.png (13 Bytes)
Diff for: dev/_images/sphx_glr_plot_image_denoising_002.png (-114 Bytes)
Diff for: dev/_images/sphx_glr_plot_image_denoising_003.png (171 Bytes)
Diff for: dev/_images/sphx_glr_plot_image_denoising_004.png (57 Bytes)
Diff for: dev/_images/sphx_glr_plot_image_denoising_005.png (201 Bytes)
Diff for: dev/_images/sphx_glr_plot_learning_curve_002.png (-826 Bytes)
Diff for: dev/_images/sphx_glr_plot_learning_curve_003.png (1.05 KB)
Diff for: dev/_images/sphx_glr_plot_linkage_comparison_001.png (-408 Bytes)
Diff for: dev/_images/sphx_glr_plot_lle_digits_004.png (172 Bytes)
Diff for: dev/_images/sphx_glr_plot_lle_digits_005.png (36 Bytes)
Diff for: dev/_images/sphx_glr_plot_lle_digits_006.png (-161 Bytes)
Diff for: dev/_images/sphx_glr_plot_lle_digits_007.png (179 Bytes)
Diff for: dev/_images/sphx_glr_plot_lle_digits_008.png (16 Bytes)
Diff for: dev/_images/sphx_glr_plot_lle_digits_009.png (76 Bytes)
Diff for: dev/_images/sphx_glr_plot_lle_digits_010.png (-47 Bytes)
Diff for: dev/_images/sphx_glr_plot_lle_digits_011.png (99 Bytes)
Diff for: dev/_images/sphx_glr_plot_lle_digits_012.png (1 Byte)
Diff for: dev/_images/sphx_glr_plot_lle_digits_013.png (295 Bytes)
Diff for: dev/_images/sphx_glr_plot_lle_digits_014.png (-70 Bytes)
Diff for: dev/_images/sphx_glr_plot_manifold_sphere_001.png (-163 Bytes)
Diff for: dev/_images/sphx_glr_plot_manifold_sphere_thumb.png (85 Bytes)
Diff for: dev/_images/sphx_glr_plot_mini_batch_kmeans_001.png (-164 Bytes)
Diff for: dev/_images/sphx_glr_plot_prediction_latency_001.png (152 Bytes)
Diff for: dev/_images/sphx_glr_plot_prediction_latency_002.png (651 Bytes)
Diff for: dev/_images/sphx_glr_plot_prediction_latency_003.png (1.24 KB)
Diff for: dev/_images/sphx_glr_plot_prediction_latency_004.png (1.97 KB)
Diff for: dev/_images/sphx_glr_plot_sgd_early_stopping_002.png (811 Bytes)
Diff for: dev/_images/sphx_glr_plot_stack_predictors_001.png (14 Bytes)
Diff for: dev/_images/sphx_glr_plot_stack_predictors_thumb.png (8 Bytes)
Diff for: dev/_images/sphx_glr_plot_theilsen_001.png (36 Bytes)
Diff for: dev/_images/sphx_glr_plot_theilsen_002.png (-31 Bytes)
Diff for: dev/_images/sphx_glr_plot_theilsen_thumb.png (10 Bytes)
Diff for: dev/_sources/auto_examples/applications/plot_cyclical_feature_engineering.rst.txt (+1 −1)

Diff for: dev/_sources/auto_examples/applications/plot_digits_denoising.rst.txt (+1 −1)

Diff for: dev/_sources/auto_examples/applications/plot_face_recognition.rst.txt (+5 −5)

Diff for: dev/_sources/auto_examples/applications/plot_model_complexity_influence.rst.txt (+15 −15)

0 commit comments