This hands-on workshop shows how to use the LangChain LLM software framework with the Chroma embedding database to fine-tune an OpenAI GPT-3.5-Turbo LLM model on web data. The final solution provides a ChatGPT-like interface to your custom web data.
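The ingestion side of that pipeline can be sketched roughly as follows (a minimal sketch, assuming the classic langchain, chromadb, and openai Python packages; the URL, chunk size, and persist directory are placeholder values, not the exact ones used in the workshop):

import os
from langchain.document_loaders import WebBaseLoader
from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Chroma

# The OpenAI API key is read from the environment by the LangChain wrappers
os.environ["OPENAI_API_KEY"] = "sk-..."  # placeholder key

# Load the web page(s) that will serve as the source content
loader = WebBaseLoader("https://example.com/docs")  # placeholder URL
documents = loader.load()

# Split the pages into overlapping chunks suitable for embedding
splitter = RecursiveCharacterTextSplitter(chunk_size=1000, chunk_overlap=100)
chunks = splitter.split_documents(documents)

# Create OpenAI embeddings and persist them in a local Chroma database
embeddings = OpenAIEmbeddings()
vectordb = Chroma.from_documents(chunks, embedding=embeddings, persist_directory="./chroma_db")
vectordb.persist()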
You will also learn:
– Why you need to fine-tune LLM models with ad-hoc data
– How to use open-source libraries, i.e. LangChain and ChromaDB
The workshop code:
– https://github.com/prodramp/DeepWorks/blob/main/ChatGPT/LangChainOpenAI.md
== Video Timeline ==
(00:00) Content Intro
(00:58) The Problem
(04:12) The Solution
(06:13) Working Solution Demo
(09:10) Understanding the Solution
(12:12) Open-source libs
(14:22) Web Data as source content
(12:30) Testing UI (without action)
(15:58) Python Libs Installed
(17:08) Python Coding Begins
(17:27) Setting OpenAI API Key
(18:16) Setting Embeddings
(20:12) Setting Chunk Splitter
(21:02) Setting Embeddings Model
(21:43) Create & Persist Embeddings
(23:30) Test Embeddings Code
(26:08) Setting Langchain App
(27:30) Adding Query to KB DB
(29:18) Testing Query with KB
(30:24) Continuous Queries (see the query-loop sketch after this timeline)
(32:08) Open-source Libs Overview
(32:41) OpenAI Billing
(33:34) Source code
(34:00) Recap
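The query side covered from (26:08) onward can be sketched along these lines (again a minimal sketch under the same assumptions; the persist directory, model name, and prompt text are placeholders):

from langchain.chat_models import ChatOpenAI
from langchain.chains import RetrievalQA
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import Chroma

# Re-open the persisted Chroma knowledge base created during ingestion
embeddings = OpenAIEmbeddings()
vectordb = Chroma(persist_directory="./chroma_db", embedding_function=embeddings)

# Build a retrieval chain backed by GPT-3.5-Turbo
qa = RetrievalQA.from_chain_type(
    llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0),
    chain_type="stuff",
    retriever=vectordb.as_retriever(),
)

# Continuous queries: keep answering questions against the knowledge base
while True:
    question = input("Ask a question (or type 'exit'): ")
    if question.strip().lower() == "exit":
        break
    print(qa.run(question))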
=== Open-source Libraries Used: ===
– https://github.com/chroma-core/chroma
– https://github.com/hwchase17/chat-langchain
– https://github.com/openai
Please go to:
https://prodramp.com | @prodramp
https://www.linkedin.com/company/prodramp
Content Creator:
Avkash Chauhan (@avkashchauhan)
https://www.linkedin.com/in/avkashchauhan
Tags:
#llm #chatgpt #finetunellm #openai #python #ai #langchain #chromadb