LAB SHEET 1 Basics
Deep learning frameworks are software libraries that help you build and train
deep learning models. These frameworks handle the complex math and computations
behind neural networks, so you can focus on designing your model. Here are some of the
most popular ones explained simply:
1. TensorFlow
What it’s good for: Works for beginners and advanced users. Great for building large,
scalable models.
Why it’s useful: It provides pre-built tools (like Keras) for easy use, but you can also
dive deep into advanced features.
2. PyTorch
What it’s good for: Research and quick experimentation. It’s very user-friendly and
works well for small and medium-sized projects.
Why it’s useful: It feels natural to code in and is popular among researchers and
developers.
3. Keras
What it’s good for: Beginners who want to start with deep learning quickly.
Why it’s useful: It’s simple and intuitive, making it great for fast prototyping.
4. MXNet
What it’s good for: Large-scale training spread across many machines or GPUs.
Why it’s useful: It’s efficient for distributed computing and often used in cloud
environments.
5. Caffe
What it’s good for: Image processing tasks like recognizing objects in photos.
Why it’s useful: It’s optimized for speed and works well for computer vision
applications.
6. JAX
What it’s good for: Advanced computations and machine learning research.
Why it’s useful: Combines ease of use with powerful tools for faster execution.
7. Theano
What it’s good for: Learning the basics of deep learning (though it’s no longer
actively developed).
In summary: all of these frameworks handle the same core math; choose the one that
matches your experience level and the scale of your project. The code examples in this
lab use Keras (through TensorFlow).
Summary of Workflow:
Deep learning usually starts with data. Most labs use preloaded datasets. For
example:
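A minimal sketch using the built-in MNIST digit dataset in Keras (28x28 grayscale images
of handwritten digits, matching the input shape used later in this sheet); any similar
preloaded dataset would work:
# Load MNIST: 60,000 training and 10,000 test images with integer labels 0-9
from tensorflow.keras.datasets import mnist
(x_train, y_train), (x_test, y_test) = mnist.load_data()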
Neural networks usually train better when input values are scaled to a small range, such
as 0 to 1. Normalize your data:
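MNIST pixel values run from 0 to 255, so dividing by 255 scales them into the 0-1 range
(a minimal sketch, assuming the x_train and x_test arrays loaded above):
# Scale pixel values from [0, 255] down to [0, 1]
x_train = x_train / 255.0
x_test = x_test / 255.0
Next, define the network architecture: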
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Flatten, Dense

model = Sequential([
    Flatten(input_shape=(28, 28)),   # turn each 28x28 image into a flat vector of 784 values
    Dense(128, activation='relu'),   # hidden layer with 128 neurons
    Dense(10, activation='softmax')  # output layer: one probability per digit class (0-9)
])
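To check that the architecture looks right, you can print a summary of the layers and
their parameter counts:
model.summary()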
Tell your model how to measure its mistakes (loss function) and how to update its
weights to improve (optimizer):
# 'adam' is a solid default optimizer; sparse_categorical_crossentropy suits integer labels
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
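Before the model can make useful predictions, it has to be trained. A minimal sketch,
assuming the x_train and y_train arrays loaded above (5 epochs is an arbitrary choice
here):
# Train the model: one epoch is one full pass over the training data
model.fit(x_train, y_train, epochs=5)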
Visualize Results:
# Predict class probabilities for the first test image (shape: 1 image, 10 probabilities)
prediction = model.predict(x_test[0:1])
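To actually see the result, a small sketch (assuming matplotlib is installed) that shows
the test image alongside the digit the model predicted:
import matplotlib.pyplot as plt

plt.imshow(x_test[0], cmap='gray')                     # display the first test image
plt.title(f"Predicted digit: {prediction.argmax()}")   # class with the highest probability
plt.show()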