Hands-On Guide: Running DeepSeek LLMs Locally on Janus-Pro-1B
Overview of DeepSeek Models
DeepSeek offers a diverse range of models, each
optimized for different tasks. The breakdown below
shows which model best suits your needs:
*If you have downloaded the PDF, you can click on the link.
You can explore other DeepSeek models available on
Ollama here: Ollama Model Search.
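Pulling a model through the Ollama CLI looks roughly like the following. This is a sketch: the `deepseek-r1:1.5b` tag is an assumption — substitute whichever tag you picked from the Ollama model search.

```shell
# Download a DeepSeek model from the Ollama registry.
# The tag below is an assumption; replace it with the model you chose.
ollama pull deepseek-r1:1.5b

# After the download completes, start an interactive session with it:
ollama run deepseek-r1:1.5b
```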
This step may take some time, so wait for the download
to complete.
Running DeepSeek-Janus-Pro-1B
on Google Colab
In this section, we’ll try out DeepSeek-Janus-Pro-1B
using Google Colab. Before starting, make sure to set the
runtime to T4 GPU for optimal performance.
Step 1: Clone the DeepSeek-Janus Repository
Run the following command in a Colab notebook:
Step 2: Install Dependencies
Navigate to the cloned directory and install the required
packages:
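A sketch of this step, assuming the repository was cloned as `Janus` in the previous step (in Colab, use `%cd Janus` and prefix the install command with `!`):

```shell
# Clone first if you skipped the previous step.
git clone https://github.com/deepseek-ai/Janus.git
# Enter the cloned repository.
cd Janus
# Install the project and its dependencies in editable mode.
pip install -e .
```

Installing in editable mode (`-e`) lets the notebook import the package while keeping the cloned source editable in place.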
[Figure: Input Image]
Initializing the Prompt and System Role
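A minimal sketch of the conversation structure used in the DeepSeek-Janus examples. The role tags (`<|User|>`, `<|Assistant|>`) and the `<image_placeholder>` token are assumptions based on the repository's chat format; the image path and question are hypothetical placeholders.

```python
# Hypothetical question and image path for illustration.
question = "Describe this image."
image_path = "input.jpg"

# Conversation list in the Janus-style chat format: a user turn that
# references the input image, followed by an empty assistant turn that
# the model will fill in.
conversation = [
    {
        "role": "<|User|>",
        "content": f"<image_placeholder>\n{question}",
        "images": [image_path],
    },
    {"role": "<|Assistant|>", "content": ""},
]
```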