How to use Cursor with Ollama on SuperPOD
Overview
Teaching: 20 min
Exercises: 0 min
Questions
How to use Cursor with Ollama
Objectives
Use Cursor with Ollama on an HPC system: SuperPOD
Ollama
- Ollama is an open-source framework that enables users to run, create, and manage large language models (LLMs) locally on their own computers and on HPC systems
- How do you use a local Ollama LLM with Cursor and a Jupyter Notebook file on SuperPOD?
Step by step:
- Use the Jump Host from step 2 to open a Cursor instance connected to SuperPOD
- In the terminal, run the following commands:
$ export OLLAMA_HOST=0.0.0.0
$ ollama serve &
$ ssh -R 80:127.0.0.1:11434 localhost.run
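In outline, these three commands expose the Ollama server and tunnel it to a public URL. The same sequence with explanatory comments (a sketch; the localhost.run hostname is assigned per session, and 11434 is Ollama's default port):

```shell
# Bind the Ollama server to all network interfaces rather than only
# 127.0.0.1, so forwarded connections can reach it.
export OLLAMA_HOST=0.0.0.0

# Start the Ollama server in the background (it listens on port 11434).
ollama serve &

# Reverse tunnel: requests to the public localhost.run URL (port 80)
# are forwarded to the local Ollama server on 127.0.0.1:11434.
ssh -R 80:127.0.0.1:11434 localhost.run
```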
Output similar to the following appears:
** your connection id is f567406f-1ffc-4793-8911-532f0b155ded, please mention it if you send me a message about an issue. **
authenticated as anonymous user
0b1c8d2351fa97.lhr.life tunneled with tls termination, https://0b1c8d2351fa97.lhr.life
create an account and add your key for a longer lasting domain name. see https://localhost.run/docs/forever-free/ for more information.
Copy the https link above and append /v1 to the end:
https://0b1c8d2351fa97.lhr.life/v1
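The /v1 suffix points clients at Ollama's OpenAI-compatible API. If you are scripting this step, the base URL can be derived from the tunnel URL (the hostname below is the example from above; yours will differ):

```shell
# Append /v1 to the tunnel URL, stripping any trailing slash first.
TUNNEL_URL="https://0b1c8d2351fa97.lhr.life"
BASE_URL="${TUNNEL_URL%/}/v1"
echo "$BASE_URL"
# https://0b1c8d2351fa97.lhr.life/v1
```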
Cursor
- Back in Cursor, go to Cursor Settings: API Keys:
- Enable OpenAI API Key and enter ollama as the key (Ollama ignores the key value, but Cursor requires one)
- Enable Override OpenAI Base URL and paste in the link copied above
- In Add Custom Model, add gpt-oss:20b, qwen3.5:9b, llama3.1:8b (these models support tool calling)
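Cursor can only use models that the Ollama server actually has installed. A sketch, assuming you pull the models in the SuperPOD terminal before adding them in Cursor (shown for two of the models above; repeat for any others):

```shell
# Download the models on the machine running `ollama serve`;
# the names must match exactly what you enter in Cursor.
ollama pull gpt-oss:20b
ollama pull llama3.1:8b

# Confirm they are installed.
ollama list
```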
- Now you are ready to use Cursor with AI support on SuperPOD
Key Points
Cursor, Ollama, SuperPOD, localhost.run