AlexNet Structure

The document describes the structure of AlexNet, an early convolutional neural network, as a Keras Sequential model. It details the layers of the network, including convolutional, max-pooling, fully connected, and dropout layers. The network takes images of shape 227x227x3 as input and outputs predictions for 1000 classes.


The full Keras definition of the network is given below.

# Standalone Keras imports (use the tensorflow.keras equivalents if preferred)
import numpy as np
import keras
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Activation, Flatten, Dense, Dropout

image_shape = (227, 227, 3)
np.random.seed(1000)

# Instantiate an empty model
model = Sequential()

# 1st Convolutional Layer: 96 filters, 11 x 11 kernel, stride 4, ReLU activation.
# The input shape is 227 x 227 x 3.
model.add(Conv2D(filters=96, input_shape=image_shape,
                 kernel_size=(11,11), strides=(4,4), padding='valid'))
model.add(Activation('relu'))

# Max Pooling
model.add(MaxPooling2D(pool_size=(3,3), strides=(2,2), padding='valid'))

# 2nd Convolutional Layer
model.add(Conv2D(filters=256, kernel_size=(5,5),
                 strides=(1,1), padding='valid'))
model.add(Activation('relu'))

# Max Pooling
model.add(MaxPooling2D(pool_size=(3,3), strides=(2,2), padding='valid'))

# 3rd Convolutional Layer
model.add(Conv2D(filters=384, kernel_size=(3,3),
                 strides=(1,1), padding='valid'))
model.add(Activation('relu'))

# 4th Convolutional Layer
model.add(Conv2D(filters=384, kernel_size=(3,3),
                 strides=(1,1), padding='valid'))
model.add(Activation('relu'))

# 5th Convolutional Layer
model.add(Conv2D(filters=256, kernel_size=(3,3),
                 strides=(1,1), padding='valid'))
model.add(Activation('relu'))

# Max Pooling
model.add(MaxPooling2D(pool_size=(3,3), strides=(2,2), padding='valid'))

# Flatten before passing the features to the Fully Connected layers
model.add(Flatten())

# 1st Fully Connected Layer has 4096 neurons
# (the input size is inferred from the Flatten output, so no input_shape is needed here)
model.add(Dense(4096))
model.add(Activation('relu'))

# Add Dropout to prevent overfitting
model.add(Dropout(0.4))

# 2nd Fully Connected Layer
model.add(Dense(4096))
model.add(Activation('relu'))

# Add Dropout
model.add(Dropout(0.4))

# Output Layer
model.add(Dense(1000))
model.add(Activation('softmax'))
model.summary()
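
The shapes reported by model.summary() can be checked by hand. Since every convolution and pooling layer above uses 'valid' padding, each output size is floor((input - kernel) / stride) + 1. The short sketch below is illustrative only (the out_size helper is not part of the original listing); it traces the feature-map sizes for the hyper-parameters used above:

def out_size(in_size, kernel, stride):
    # Output size of a 'valid' convolution or pooling layer
    return (in_size - kernel) // stride + 1

size = 227
size = out_size(size, 11, 4)   # conv1 -> 55  (55 x 55 x 96)
size = out_size(size, 3, 2)    # pool1 -> 27  (27 x 27 x 96)
size = out_size(size, 5, 1)    # conv2 -> 23  (23 x 23 x 256)
size = out_size(size, 3, 2)    # pool2 -> 11  (11 x 11 x 256)
size = out_size(size, 3, 1)    # conv3 -> 9   (9 x 9 x 384)
size = out_size(size, 3, 1)    # conv4 -> 7   (7 x 7 x 384)
size = out_size(size, 3, 1)    # conv5 -> 5   (5 x 5 x 256)
size = out_size(size, 3, 2)    # pool3 -> 2   (2 x 2 x 256, flattened to 1024)
print(size)                    # 2

Note that the original AlexNet paper pads the 5x5 and 3x3 convolutions ('same'-style padding), which leaves a larger 6 x 6 x 256 map before Flatten; with the 'valid' padding used in this listing the maps shrink to 2 x 2 x 256 as traced above.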

# Compile the model
model.compile(loss=keras.losses.categorical_crossentropy,
              optimizer='adam', metrics=["accuracy"])
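
As a rough usage sketch (not part of the original document), the compiled model can be exercised on randomly generated placeholder data; real training would use batches of 227 x 227 x 3 images with one-hot labels over the 1000 classes:

# Illustrative only: random placeholder data standing in for real images and labels
x_dummy = np.random.rand(8, 227, 227, 3).astype('float32')
y_dummy = keras.utils.to_categorical(np.random.randint(0, 1000, size=8),
                                     num_classes=1000)

model.fit(x_dummy, y_dummy, batch_size=8, epochs=1)   # one tiny pass, just to check shapes
preds = model.predict(x_dummy)                        # shape (8, 1000), softmax probabilities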
