EXP5 VGG16v2
import numpy as np
from tensorflow.keras.applications import VGG16
from tensorflow.keras.layers import Flatten, Dense, Dropout
from tensorflow.keras.models import Model
from tensorflow.keras.optimizers import Adam

# Pretrained VGG16 base without the top classifier; freeze its weights
base_model = VGG16(weights='imagenet', include_top=False, input_shape=(224, 224, 3))
base_model.trainable = False

# Custom classification head for CIFAR-10
x = Flatten()(base_model.output)
x = Dense(256, activation='relu')(x)
x = Dropout(0.5)(x)
x = Dense(128, activation='relu')(x)
x = Dropout(0.5)(x)
outputs = Dense(10, activation='softmax')(x)

model = Model(inputs=base_model.input, outputs=outputs)
model.compile(optimizer=Adam(learning_rate=0.0001),
              loss='categorical_crossentropy', metrics=['accuracy'])
Key Points:
• Input Size: CIFAR-10 images are 32x32, but VGG16 requires 224x224, so images must be upsampled before being fed to the network.
• Final Layers: Fully connected layers adapted for CIFAR-10 (10 classes).
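The input-size adjustment above can be sketched as follows — a minimal example (using a random batch standing in for CIFAR-10 images) that upsamples 32x32 images to the 224x224 resolution VGG16 expects:

```python
import numpy as np
import tensorflow as tf

# Stand-in batch of 4 CIFAR-10-sized images (32x32, 3 channels)
images = np.random.rand(4, 32, 32, 3).astype('float32')

# Upsample to VGG16's expected 224x224 input resolution
resized = tf.image.resize(images, (224, 224))
print(resized.shape)  # (4, 224, 224, 3)
```

In practice the resizing can also be done inside the model with a `Resizing` layer, which avoids storing the enlarged dataset in memory.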
Implementation
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense, Dropout
from tensorflow.keras.optimizers import Adam

def build_vgg16():
    model = Sequential()
    # Block 1
    model.add(Conv2D(64, (3, 3), activation='relu', padding='same', input_shape=(224, 224, 3)))
    model.add(Conv2D(64, (3, 3), activation='relu', padding='same'))
    model.add(MaxPooling2D((2, 2), strides=(2, 2)))
    # Block 2
    model.add(Conv2D(128, (3, 3), activation='relu', padding='same'))
    model.add(Conv2D(128, (3, 3), activation='relu', padding='same'))
    model.add(MaxPooling2D((2, 2), strides=(2, 2)))
    # Block 3
    model.add(Conv2D(256, (3, 3), activation='relu', padding='same'))
    model.add(Conv2D(256, (3, 3), activation='relu', padding='same'))
    model.add(Conv2D(256, (3, 3), activation='relu', padding='same'))
    model.add(MaxPooling2D((2, 2), strides=(2, 2)))
    # Block 4
    model.add(Conv2D(512, (3, 3), activation='relu', padding='same'))
    model.add(Conv2D(512, (3, 3), activation='relu', padding='same'))
    model.add(Conv2D(512, (3, 3), activation='relu', padding='same'))
    model.add(MaxPooling2D((2, 2), strides=(2, 2)))
    # Block 5
    model.add(Conv2D(512, (3, 3), activation='relu', padding='same'))
    model.add(Conv2D(512, (3, 3), activation='relu', padding='same'))
    model.add(Conv2D(512, (3, 3), activation='relu', padding='same'))
    model.add(MaxPooling2D((2, 2), strides=(2, 2)))
    # Classifier
    model.add(Flatten())
    model.add(Dense(4096, activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(4096, activation='relu'))
    model.add(Dropout(0.5))
    model.add(Dense(10, activation='softmax'))
    return model

# Create model
model = build_vgg16()
model.compile(optimizer=Adam(learning_rate=0.0001),
              loss='categorical_crossentropy', metrics=['accuracy'])
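Since the model is compiled with `categorical_crossentropy`, the CIFAR-10 integer labels must be one-hot encoded before training. A minimal sketch (using a small hypothetical label array in place of the real dataset):

```python
import numpy as np
from tensorflow.keras.utils import to_categorical

# Hypothetical CIFAR-10-style integer class labels
y = np.array([3, 0, 9, 1])

# categorical_crossentropy expects one-hot targets with 10 columns
y_onehot = to_categorical(y, num_classes=10)
print(y_onehot.shape)  # (4, 10)
```

Alternatively, compiling with `sparse_categorical_crossentropy` would accept the integer labels directly.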
Explanation:
• Fully Connected Layers: Two Dense layers of 4096 neurons each, each followed by
Dropout (0.5) to prevent overfitting.
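The Dropout behaviour described above can be verified directly — a small sketch showing that Keras Dropout zeroes roughly half the activations during training (scaling survivors by 1/(1 - rate) = 2) and acts as an identity at inference:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras.layers import Dropout

layer = Dropout(0.5)
x = tf.ones((1, 10))

# At inference time (training=False) Dropout passes inputs through unchanged
inference_out = layer(x, training=False).numpy()

# During training, each unit is either zeroed or scaled by 2 (inverted dropout)
training_out = layer(x, training=True).numpy()
```

This is why the same model can be used for both training and prediction without removing the Dropout layers.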
Improvements: