
Untitled66 - Jupyter Notebook

The notebook loads the required libraries and the MNIST dataset, then builds a convolutional neural network with an Inception-style module to classify handwritten digits. It defines an input layer, applies convolutional, max-pooling, Inception, and dense layers, and compiles the model. The model is trained for 5 epochs on the MNIST training set, with the test set used for validation, showing decreasing loss and increasing accuracy on both the training and validation sets.


In [1]: # Loading libraries


import numpy as np
import matplotlib.pyplot as plt
import seaborn as sns
from tensorflow.keras.datasets import mnist
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.utils import to_categorical
from sklearn.metrics import confusion_matrix, accuracy_score

In [2]: # Functional-API building blocks, imported from tensorflow.keras to match
# cell [1] (mixing bare `keras` with `tensorflow.keras` can fail under
# standalone-Keras installs)


from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Conv2D, MaxPooling2D, Dropout, Flatten, Dense, concatenate

In [3]: # Load the MNIST dataset


(train_images, train_labels), (test_images, test_labels) = mnist.load_data()
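
At this point the images are raw 28×28 uint8 arrays and the labels are integers. Since matplotlib is already imported in cell [1], a quick sanity check (illustrative, not part of the original run) could look like this:

# Sanity check (not in the original notebook): inspect shapes and one sample
print(train_images.shape, train_labels.shape)  # (60000, 28, 28) (60000,)
print(test_images.shape, test_labels.shape)    # (10000, 28, 28) (10000,)
plt.imshow(train_images[0], cmap='gray')
plt.title(f"Label: {train_labels[0]}")
plt.show()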

In [4]: # Preprocess the data


train_images = train_images.reshape((60000, 28, 28, 1)).astype('float32') / 255
test_images = test_images.reshape((10000, 28, 28, 1)).astype('float32') / 255

train_labels = to_categorical(train_labels)
test_labels = to_categorical(test_labels)
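
The reshape adds a single channel dimension, the division scales pixel values to [0, 1], and to_categorical one-hot encodes the labels. A hedged verification cell (not in the original) would confirm the result:

# Verify preprocessing (illustrative, not in the original run)
print(train_images.shape)                      # (60000, 28, 28, 1)
print(train_images.min(), train_images.max())  # 0.0 1.0
print(train_labels.shape)                      # (60000, 10) after one-hot encoding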

In [5]: # Define the input


input_layer = Input(shape=(28, 28, 1))

In [6]: # First convolutional layer


conv1 = Conv2D(64, (7, 7), activation='relu', padding='same')(input_layer)
pool1 = MaxPooling2D((2, 2))(conv1)

# Second convolutional layer
conv2 = Conv2D(64, (1, 1), activation='relu', padding='same')(pool1)
conv2 = Conv2D(192, (3, 3), activation='relu', padding='same')(conv2)
pool2 = MaxPooling2D((2, 2))(conv2)

# Inception module 1
inception1_1 = Conv2D(64, (1, 1), activation='relu', padding='same')(pool2)
inception1_2 = Conv2D(96, (1, 1), activation='relu', padding='same')(pool2)
inception1_2 = Conv2D(128, (3, 3), activation='relu', padding='same')(inception1_2)
inception1_3 = Conv2D(16, (1, 1), activation='relu', padding='same')(pool2)
inception1_3 = Conv2D(32, (5, 5), activation='relu', padding='same')(inception1_3)
inception1_4 = MaxPooling2D((3, 3), strides=(1, 1), padding='same')(pool2)
inception1_4 = Conv2D(32, (1, 1), activation='relu', padding='same')(inception1_4)
inception1 = concatenate([inception1_1, inception1_2, inception1_3, inception1_4], axis=-1)

# Flatten the output and add fully connected layers
flatten = Flatten()(inception1)
dense1 = Dense(128, activation='relu')(flatten)
output = Dense(10, activation='softmax')(dense1)
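
For reference, the concatenation stacks the four branch outputs along the channel axis: 64 + 128 + 32 + 32 = 256 feature maps. The two 2×2 poolings shrink the spatial size 28 → 14 → 7, so Flatten produces 7 × 7 × 256 = 12544 features feeding dense1. A quick way to confirm (illustrative, not part of the original run):

# Optional check of intermediate tensor shapes
print(inception1.shape)  # (None, 7, 7, 256)
print(flatten.shape)     # (None, 12544)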


In [7]: # Create the model


model = Model(inputs=input_layer, outputs=output)

# Compile the model
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])

# Train the model
model.fit(train_images, train_labels, epochs=5, batch_size=64, validation_data=(test_images, test_labels))

Epoch 1/5
938/938 [==============================] - 123s 128ms/step - loss: 0.1174 - accuracy: 0.9624 - val_loss: 0.0419 - val_accuracy: 0.9857
Epoch 2/5
938/938 [==============================] - 117s 124ms/step - loss: 0.0374 - accuracy: 0.9884 - val_loss: 0.0291 - val_accuracy: 0.9897
Epoch 3/5
938/938 [==============================] - 116s 124ms/step - loss: 0.0275 - accuracy: 0.9913 - val_loss: 0.0220 - val_accuracy: 0.9926
Epoch 4/5
938/938 [==============================] - 117s 125ms/step - loss: 0.0214 - accuracy: 0.9933 - val_loss: 0.0223 - val_accuracy: 0.9926
Epoch 5/5
938/938 [==============================] - 116s 124ms/step - loss: 0.0182 - accuracy: 0.9941 - val_loss: 0.0250 - val_accuracy: 0.9917

Out[7]: <keras.src.callbacks.History at 0x2bdb296cc10>
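
The confusion_matrix, accuracy_score, and seaborn imports from cell [1] go unused in the notebook; a natural follow-up cell (a sketch, not part of the original run) would use them to evaluate the trained model on the test set:

# Evaluate on the test set (illustrative follow-up, not in the original notebook)
pred_probs = model.predict(test_images)
pred_labels = np.argmax(pred_probs, axis=1)
true_labels = np.argmax(test_labels, axis=1)  # undo the one-hot encoding

print("Test accuracy:", accuracy_score(true_labels, pred_labels))

cm = confusion_matrix(true_labels, pred_labels)
sns.heatmap(cm, annot=True, fmt='d', cmap='Blues')
plt.xlabel('Predicted')
plt.ylabel('True')
plt.show()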

