This lesson is being piloted (Beta version)

Using LLaMA3

Overview

Teaching: 20 min
Exercises: 0 min
Questions
  • How do I use LLaMA3 on SuperPOD?

Objectives
  • Learn how to run LLaMA3 on SuperPOD

LLaMA

Models:

All LLaMA models can be found on the Hugging Face Hub:

How to use LLaMA3 on SuperPOD

Step 1: Request a compute node & Load the conda environment

$ srun -N1 -c10 -G1 --mem=64gb --time=12:00:00 --pty $SHELL
$ module load conda gcc/11.2.0
$ module load cuda/11.8.0-vbvgppx cudnn
$ conda activate ~/pytorch_1.13
$ jupyter lab --ip=0.0.0.0 --no-browser --allow-root
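JupyterLab running on the compute node is not directly reachable from your laptop, so a common pattern is an SSH tunnel from your local machine. A minimal sketch, where `NODE`, `LOGIN_HOST`, and `username` are placeholders — substitute the compute node your `srun` job landed on, your cluster's login hostname, and your account name:

```shell
# On your LOCAL machine: forward local port 8888 to JupyterLab's port
# on the compute node. NODE and LOGIN_HOST are placeholders for the
# node assigned by srun and the SuperPOD login hostname.
ssh -N -L 8888:NODE:8888 username@LOGIN_HOST
# Then open http://localhost:8888 in a local browser and paste the
# token printed by "jupyter lab".
```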

Step 2: Request LLaMA3 access


Approval can sometimes take a day.

Step 3: Install HuggingFace Hub

Follow the guideline here to install the Hugging Face Hub client into your SuperPOD home folder.
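If your conda environment does not already include it, the hub client can typically be installed into your home directory with pip (a sketch; the package name follows the Hugging Face documentation):

```shell
# Install the Hugging Face Hub client library (and its CLI)
# into your home folder rather than the shared environment.
pip install --user huggingface_hub
```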

Step 4: Create HuggingFace Token
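Once the token is created, one way to authenticate from Python is the `login()` helper in `huggingface_hub`. In this sketch the token is read from an `HF_TOKEN` environment variable — the variable name is our own convention, not something the library requires:

```python
# Sketch: authenticate to the Hugging Face Hub with an access token.
# HF_TOKEN is an environment variable we assume you exported yourself.
import os

token = os.environ.get("HF_TOKEN")
if token:
    # Import lazily so the snippet also runs where no token is set
    # and huggingface_hub is not installed.
    from huggingface_hub import login
    login(token=token)
else:
    print("Set HF_TOKEN to the access token from your Hugging Face settings.")
```

Alternatively, `huggingface-cli login` prompts for the token interactively and caches it in your home folder.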

Step 5: Get ready to load the LLaMA3 model in your port-forwarded JupyterLab environment

Following the example from Hugging Face, the model produces a generated completion in the notebook.
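The Hugging Face example boils down to a text-generation pipeline. A hedged sketch: the model id comes from the model card, the generation settings are illustrative, and the heavy imports are deferred into the function so it only needs a GPU (and the approved, downloaded weights) when actually called:

```python
# Sketch of loading the gated LLaMA3 model with the transformers pipeline.
# Run this in the JupyterLab session on the compute node, after your
# access request is approved and you are logged in to the Hub.

MODEL_ID = "meta-llama/Meta-Llama-3-8B-Instruct"  # id from the model card

def generate(prompt, max_new_tokens=128):
    """Load the model (downloaded on first call) and complete a prompt."""
    import torch
    import transformers  # provided by the pytorch conda environment

    pipe = transformers.pipeline(
        "text-generation",
        model=MODEL_ID,
        model_kwargs={"torch_dtype": torch.bfloat16},  # halve memory use
        device_map="auto",  # place the model on the allocated GPU
    )
    return pipe(prompt, max_new_tokens=max_new_tokens)[0]["generated_text"]
```

The first call downloads tens of gigabytes of weights into your Hugging Face cache, so expect it to take a while.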

Key Points

  • LLaMA3 is Meta's openly available LLM family; after requesting gated access, you can run it on SuperPOD through Hugging Face.