How to run LLAMA3 (2024) on a crappy laptop

Published: 22 April 2024
on the channel: Draupner Data

Anyone with a crappy old laptop can run Meta's Llama3 locally and for free!
This is especially important for those of us who are not allowed to access Meta AI's webpage at all. Let me show you how, and what the limitations are.

I will use ollama, and show how to install this tool on Linux, Mac and Windows.
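For reference, this is roughly what the install looks like on Linux (the one-line script is the method documented in the ollama repository; on macOS and Windows you download an installer instead):

```shell
# Linux: official one-line install script (downloads the ollama binary
# and sets up a background service).
curl -fsSL https://ollama.com/install.sh | sh

# macOS and Windows: grab the installer from https://ollama.com/download
# instead of running the script.

# Verify the install worked:
ollama --version
```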

Then I evaluate llama2, llama3 and gemma for usability and execution time.
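The pull-and-compare workflow from the video can be sketched like this (the prompt is just an example; model tags come from the ollama library page linked below):

```shell
# Pull the models being compared. Plain "llama3" resolves to the
# default 8b variant.
ollama pull llama2
ollama pull llama3
ollama pull gemma

# Ask each model the same question and time it. The --verbose flag
# makes ollama print token counts and eval rate after the answer.
time ollama run llama3 --verbose "Explain TCP slow start in two sentences."

# The 70b variant also runs on a CPU-only machine, just very slowly:
ollama pull llama3:70b
time ollama run llama3:70b "Explain TCP slow start in two sentences."
```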

I am running this test on a 2018 Dell XPS 13 with no dedicated graphics card!

Will I choose the local, free llama3, or go back to paying OpenAI for ChatGPT?
What will you choose?

Links:
Ollama/install:
https://github.com/ollama/ollama/tree...
Ollama/models:
https://ollama.com/library

00:00 Any old junk
00:35 Install
02:00 Pulling Models
03:00 Usage
04:30 Answers
05:50 Execution time 7b
07:10 Execution time 70b
09:40 Evaluation
10:40 Next