Python Gradio UI for Fine-tuning LLM (OpenAI GPT) with Custom Data

Published: 01 January 1970
on channel: 650 AI Lab

This video is a hands-on lab on creating a Python application with a Gradio UI to fine-tune an OpenAI LLM using the LlamaIndex (formerly GPT Index) library.

Fine-tuning LLM: An LLM can be extended by fine-tuning it with a custom dataset to provide Q&A, summarization, and many other ChatGPT-like functions. In this video we use LlamaIndex (GPT Index), which provides a central interface to connect your LLMs with external data.
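As a rough sketch (not the exact code from the video or repository), connecting an LLM to custom documents with a recent LlamaIndex release might look like the following. The directory name `data` and the helper names `build_index` / `query_index` are illustrative assumptions, and actually running a query requires an `OPENAI_API_KEY`:

```python
# Sketch: index custom documents with LlamaIndex, then query them.
# Assumes a recent llama_index release; the video used the older
# gpt-index API, whose class names differ.

def build_index(data_dir: str):
    """Load every document in data_dir and build a vector index over it."""
    # Imported lazily so this module loads even without llama_index installed.
    from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
    documents = SimpleDirectoryReader(data_dir).load_data()
    return VectorStoreIndex.from_documents(documents)


def query_index(index, prompt: str) -> str:
    """Answer a prompt from the indexed documents (calls the OpenAI API)."""
    return str(index.as_query_engine().query(prompt))


if __name__ == "__main__":
    import os
    # Queries consume OpenAI API credits, so only run with a key configured.
    if os.environ.get("OPENAI_API_KEY"):
        index = build_index("data")
        print(query_index(index, "Summarize the documents."))
```

The lazy imports keep the module importable in environments where LlamaIndex is absent; the guard at the bottom avoids accidental API spend.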

The full code for the Gradio UI is hosted here:
https://github.com/prodramp/DeepWorks...

== Video Timeline ==
(00:00) Content Intro
(00:10) Part 1 Recap
(01:14) Gradio UI Design
(05:55) Python Coding Start
(06:46) Tab 1- OpenAI Config UI
(08:15) Tab 2- Fine-tune LLM UI
(08:55) Tab 3- Query Prompt UI
(12:30) Testing UI (without action)
(13:05) Tab 1- Click Action Code
(15:15) Tab 2- Click Action Code
(23:05) Fine-tuning with OpenAI UI
(24:50) Fine-tuning with OpenAI Run
(28:28) Tab 3- Click Action(s) Code
(33:05) Query with Index UI
(35:30) Full testing with Text Document
(36:18) API Billing and Cost
(38:30) Code Repo and GitHub
(39:04) Summary

Please visit:
https://prodramp.com | @prodramp

Content Creator:
Avkash Chauhan (@avkashchauhan)

Tags:
#llm #chatgpt #finetunellm #openai #python #ai