
How to use PaperQA with SuperPOD

Overview

Teaching: 20 min
Exercises: 0 min
Questions
  • How do I use PaperQA on SuperPOD?

Objectives
  • Install and run paperqa with a locally served Ollama model on SuperPOD

PaperQA

Open-source version

Requirement:

Ollama installed on SuperPOD

How to install and use paperqa on SuperPOD

Step 1: Request a compute node with one GPU (replace Allocation with the name of your allocation):

$ srun -A Allocation -N1 -G1 --mem=64gb --time=12:00:00 --pty $SHELL
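Once the job starts, your shell moves onto the compute node. If you would like to confirm that the GPU was allocated, a quick check from Python is to call nvidia-smi (a minimal sketch, assuming nvidia-smi is on the PATH of the compute node):

import subprocess

# Print the allocated GPU's name and total memory.
result = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)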

Step 2: Load the Ollama module:

$ module load ollama

Step 3: Export the path to the Ollama models

Here we use the Ollama models stored in the STARS project storage. Please let me know if you need access to that location.

$ export OLLAMA_MODELS=/projects/tuev/oit_rts_star/oit_rts_star_storage/Ollama_models

Step 4: Serve Ollama

$ ollama serve &
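The trailing & keeps the server running in the background. Before moving on, you can confirm it is up by hitting the server's root endpoint, which answers with a short status message (a minimal sketch, assuming the requests package is available and the server is on its default port 11434):

import requests

# The root endpoint replies "Ollama is running" when the server is up.
response = requests.get("http://localhost:11434")
print(response.status_code, response.text)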

Step 5: Ollama is now loaded and serving. List the local models:

$ ollama list

You should see output like this: a table of the local models, with NAME, ID, SIZE, and MODIFIED columns.

If there are any other models that you want us to download, please email me: tuev@smu.edu
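You can also list the local models programmatically: /api/tags is the REST endpoint behind ollama list (a sketch, again assuming requests is available):

import requests

# /api/tags returns the locally available models as JSON.
tags = requests.get("http://localhost:11434/api/tags").json()
for model in tags["models"]:
    print(model["name"])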

Step 6: Download an Ollama model

You can pull any model that you want to chat with, for example:

$ ollama pull llama3:70b
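Once the pull finishes, a quick smoke test is a single non-streaming generation request against Ollama's /api/generate endpoint (a sketch assuming requests is available and the llama3:70b download completed):

import requests

# Send one prompt to the pulled model and print its reply.
reply = requests.post(
    "http://localhost:11434/api/generate",
    json={"model": "llama3:70b", "prompt": "Say hello.", "stream": False},
).json()
print(reply["response"])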

Step 7: Install paperqa with pip

paperqa can be installed from PyPI with pip:

$ pip install "paper-qa>=5"

(The quotes keep the shell from treating >= as a redirection.)
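To confirm which version was installed, you can read the package metadata from Python (note that the PyPI distribution is named paper-qa, while the import name is paperqa):

from importlib.metadata import version

# The distribution name on PyPI is "paper-qa".
print(version("paper-qa"))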

Step 8: Run paper-qa using the CLI

$ pqa ask 'What manufacturing challenges are unique to bispecific antibodies?'
$ pqa --temperature 0.5 ask 'What manufacturing challenges are unique to bispecific antibodies?'

pqa indexes papers from the current working directory by default, so run it from the folder containing your PDFs. CLI flags such as --temperature correspond to fields of the Settings object used in Step 9.

Step 9: Use paperqa as a Python library

from paperqa import Settings, ask
from paperqa.settings import AgentSettings
import os

# Placeholder key: no real OpenAI key is needed for local models, but the
# variable must be set for the underlying client libraries.
os.environ['OPENAI_API_KEY'] = "ollama"

# Route every model call to the local Ollama server (default port 11434).
local_llm_config = dict(
    model_list=[
        dict(
            model_name='ollama/llama3:70b',
            litellm_params=dict(
                model='ollama/llama3:70b',
                api_base="http://localhost:11434",
            ),
        )
    ]
)

answer = ask(
    "How do marketing activities drive firm revenues?",
    settings=Settings(
        llm='ollama/llama3:70b',              # main answer-generation model
        llm_config=local_llm_config,
        summary_llm='ollama/llama3:70b',      # model used to summarize evidence
        summary_llm_config=local_llm_config,
        embedding='ollama/mxbai-embed-large', # local embedding model
        agent=AgentSettings(
            agent_llm='ollama/llama3:70b',    # model that drives the agent loop
            agent_llm_config=local_llm_config
        ),
        paper_directory="net_pdfs"            # folder containing the PDFs to index
    ),
)
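ask returns a response object; printing it is the simplest way to inspect the result (the exact fields vary across paper-qa 5.x releases, so check your installed version). Also note that the embedding model above, mxbai-embed-large, must be present in Ollama as well; pull it like the chat model if ollama list does not show it.

# Inspect the returned response, which carries the generated answer.
print(answer)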

Step 10: Stop the Ollama server

$ killall ollama
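To double-check that the server is gone, the health check from Step 4 should now fail to connect (a sketch assuming requests is available):

import requests

try:
    requests.get("http://localhost:11434", timeout=2)
    print("Ollama is still running")
except requests.exceptions.ConnectionError:
    print("Ollama server has stopped")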

Key Points

  • paperqa can answer questions over your own PDFs entirely on SuperPOD by pairing it with LLMs served locally through Ollama.