Ollama AI Colab - ipynb


{

"cells": [
{
"cell_type": "code",
"execution_count": null,
"id": "93f59dcb-c588-41b8-a792-55d88ade739c",
"metadata": {},
"outputs": [],
"source": [
"# Download and install ollama on the system\n",
"!curl -fsSL https://ollama.ai/install.sh | sh"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "658c147e-c7f8-490e-910e-62b80f577dda",
"metadata": {},
"outputs": [],
"source": [
"!pip install aiohttp pyngrok\n",
"\n",
"import os\n",
"import asyncio\n",
"\n",
"# Set LD_LIBRARY_PATH so the system NVIDIA libraries are found\n",
"os.environ.update({'LD_LIBRARY_PATH': '/usr/lib64-nvidia'})\n",
"\n",
"async def run_process(cmd):\n",
" print('>>> starting', *cmd)\n",
" p = await asyncio.subprocess.create_subprocess_exec(\n",
" *cmd,\n",
" stdout=asyncio.subprocess.PIPE,\n",
" stderr=asyncio.subprocess.PIPE,\n",
" )\n",
"\n",
" async def pipe(lines):\n",
" async for line in lines:\n",
" print(line.strip().decode('utf-8'))\n",
"\n",
" await asyncio.gather(\n",
" pipe(p.stdout),\n",
" pipe(p.stderr),\n",
" )\n",
"\n",
"# Register an account at ngrok.com, create an authtoken and place it here\n",
"await asyncio.gather(\n",
" run_process(['ngrok', 'config', 'add-authtoken','your-auth-token'])\n",
")\n",
"\n",
"await asyncio.gather(\n",
" run_process(['ollama', 'serve']),\n",
"    run_process(['ngrok', 'http', '--log', 'stderr', '11434', '--host-header', 'localhost:11434'])\n",
")"
]
},
{
"cell_type": "markdown",
"source": [
"Ngrok exposes a URL, which you then have to export as OLLAMA_HOST:\n",
"\n",
"`export OLLAMA_HOST=https://fd90-34-125-15-193.ngrok.io/`\n",
"\n",
"After that, we can use Ollama on the remote instance from our local machine."
],
"metadata": {
"collapsed": false
},
"id": "626cebf7be1f841b"
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.6"
}
},
"nbformat": 4,
"nbformat_minor": 5
}
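
Once OLLAMA_HOST points at the ngrok tunnel, the remote instance can be queried over Ollama's REST API (`POST /api/generate`). A minimal sketch of building such a request from the local machine; the ngrok URL and model name here are placeholders, so substitute the values from your own session:

```python
import json
import os
import urllib.request

# Placeholder: replace with the URL printed by the ngrok cell above,
# or export it as OLLAMA_HOST before running.
OLLAMA_HOST = os.environ.get("OLLAMA_HOST", "https://your-ngrok-id.ngrok.io")

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # return a single JSON response instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_HOST.rstrip("/") + "/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )

req = build_generate_request("llama2", "Why is the sky blue?")
print(req.full_url)
# To actually send it once the tunnel is up:
#   with urllib.request.urlopen(req) as resp:
#       print(json.load(resp)["response"])
```

The actual `urlopen` call is left commented out because it requires the Colab tunnel to be live.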
