Model Code
from sklearn.metrics import accuracy_score, recall_score, f1_score

y_test_pred = model.predict(X_test)
print(f"Accuracy = {accuracy_score(y_test, y_test_pred)}")
print(f"Recall = {recall_score(y_test, y_test_pred)}")
# The F1, precision, and recall metrics require establishing the convention
# of which class is the positive one (1)
print(f"F1 score = {f1_score(y_test, y_test_pred)}")
from sklearn.model_selection import KFold, cross_val_score

# 10-fold cross-validation using F1 as the scoring metric
kf = KFold(n_splits=10)
scores = cross_val_score(model, X, y, cv=kf, scoring="f1")
print(f"Cross-validation scores \n{scores.round(2)}")
print("Cross-validation mean", scores.mean().round(2))
# Function to obtain the AUC metric: area under the ROC curve
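The AUC helper itself is not shown in this section; a minimal sketch using scikit-learn's roc_auc_score, assuming the fitted model exposes predict_proba:

from sklearn.metrics import roc_auc_score

# Probability of the positive class (column 1) for each test sample;
# predict_proba is an assumption about the model's interface
y_test_proba = model.predict_proba(X_test)[:, 1]
print(f"AUC = {roc_auc_score(y_test, y_test_proba)}")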
# Check the class balance in the train and test splits
y_train.value_counts()
y_test.value_counts()
# Models
# Random Forest Classifier model
model_randomforest
Accuracy = 0.8993516659606211
Recall = 0.8608809339667808
F1 score = 0.8953471670289473
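The instantiation and training of model_randomforest are not shown above; a minimal scikit-learn sketch, where the hyperparameters are assumptions:

from sklearn.ensemble import RandomForestClassifier

# Hypothetical configuration; the original hyperparameters are not shown
model_randomforest = RandomForestClassifier(n_estimators=100, random_state=42)
model_randomforest.fit(X_train, y_train)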
import pandas as pd
import matplotlib.pyplot as plt

# Plot the feature importances as a horizontal bar chart
importances = model_randomforest.feature_importances_
features = pd.Series(importances, index=X_train.columns)
plt.figure(figsize=(10, 25))
features.plot(kind="barh")
plt.show()
from tensorflow.keras.layers import Input, Dense, BatchNormalization, LeakyReLU

# Define the encoder
n_inputs = X_train.shape[1]
visible = Input(shape=(n_inputs,))
# encoder level 1
e = Dense(n_inputs*2)(visible)
e = BatchNormalization()(e)
e = LeakyReLU()(e)
# encoder level 2
e = Dense(n_inputs)(e)
e = BatchNormalization()(e)
e = LeakyReLU()(e)
# bottleneck (kept at the input dimension here)
n_bottleneck = n_inputs
bottleneck = Dense(n_bottleneck)(e)
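The decoder and the model construction are not shown; a minimal sketch that mirrors the layer sequence implied by the parameter counts in the summary below (the optimizer and loss are assumptions):

from tensorflow.keras.models import Model

# decoder level 1 (mirrors encoder level 2)
d = Dense(n_inputs)(bottleneck)
d = BatchNormalization()(d)
d = LeakyReLU()(d)
# decoder level 2 (mirrors encoder level 1)
d = Dense(n_inputs*2)(d)
d = BatchNormalization()(d)
d = LeakyReLU()(d)
# output layer reconstructs the input
output = Dense(n_inputs)(d)
# Autoencoder model; "adam" and MSE loss are assumptions
model_RN = Model(inputs=visible, outputs=output)
model_RN.compile(optimizer="adam", loss="mse")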
model_RN.summary()
Model: "model"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer) [(None, 86)] 0
_________________________________________________________________
dense (Dense) (None, 172) 14964
_________________________________________________________________
batch_normalization (BatchNo (None, 172) 688
_________________________________________________________________
leaky_re_lu (LeakyReLU) (None, 172) 0
_________________________________________________________________
dense_1 (Dense) (None, 86) 14878
_________________________________________________________________
batch_normalization_1 (Batch (None, 86) 344
_________________________________________________________________
leaky_re_lu_1 (LeakyReLU) (None, 86) 0
_________________________________________________________________
dense_2 (Dense) (None, 86) 7482
_________________________________________________________________
dense_3 (Dense) (None, 86) 7482
_________________________________________________________________
batch_normalization_2 (Batch (None, 86) 344
_________________________________________________________________
leaky_re_lu_2 (LeakyReLU) (None, 86) 0
_________________________________________________________________
dense_4 (Dense) (None, 172) 14964
_________________________________________________________________
batch_normalization_3 (Batch (None, 172) 688
_________________________________________________________________
leaky_re_lu_3 (LeakyReLU) (None, 172) 0
_________________________________________________________________
dense_5 (Dense) (None, 86) 14878
=================================================================
Total params: 76,712
Trainable params: 75,680
Non-trainable params: 1,032
_________________________________________________________________
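The training call is not shown; a sketch consistent with the 10-epoch log below (verbose=2 matches the one-line-per-epoch output; the batch size and validation data are assumptions):

# Hypothetical fit call: an autoencoder learns to reconstruct its own input
history = model_RN.fit(X_train, X_train,
                       epochs=10,
                       verbose=2,
                       validation_data=(X_test, X_test))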
Epoch 1/10
15836/15836 - 41s - loss: 0.0135 - val_loss: 0.0040
Epoch 2/10
15836/15836 - 39s - loss: 0.0070 - val_loss: 0.0023
Epoch 3/10
15836/15836 - 39s - loss: 0.0049 - val_loss: 0.0021
Epoch 4/10
15836/15836 - 40s - loss: 0.0034 - val_loss: 0.0015
Epoch 5/10
15836/15836 - 40s - loss: 0.0026 - val_loss: 0.0011
Epoch 6/10
15836/15836 - 39s - loss: 0.0022 - val_loss: 0.0011
Epoch 7/10
15836/15836 - 41s - loss: 0.0020 - val_loss: 0.0013
Epoch 8/10
15836/15836 - 39s - loss: 0.0018 - val_loss: 0.0011
Epoch 9/10
15836/15836 - 40s - loss: 0.0016 - val_loss: 7.6036e-04
Epoch 10/10
15836/15836 - 44s - loss: 0.0015 - val_loss: 7.6939e-04