
Chat with OpenAI in LangChain – #5



With the advent of OpenAI’s ChatGPT API endpoint (ChatCompletion), LangChain quickly added support for it. Unlike previous LLM endpoints, the chat endpoint takes multiple inputs, so it has its own unique set of objects and methods.

OpenAI’s `ChatCompletion` endpoint consumes three types of input:

– System message — this acts as an initial prompt to “set up” the behavior of the chat completion.
– Human messages — these are human prompts (both current and past) that are fed into the model.
– AI messages — past AI responses to the human prompts.
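As a rough sketch, these three message types map onto the roles in a raw ChatCompletion request like this (the message contents here are purely illustrative):

```python
# A ChatCompletion-style conversation: one system message to set behavior,
# followed by alternating human ("user") and AI ("assistant") messages.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hi AI, how are you today?"},
    {"role": "assistant", "content": "I'm great, thank you! How can I help?"},
    {"role": "user", "content": "Tell me about LangChain."},
]

# The model would generate the next "assistant" message in this sequence.
print(len(messages))  # 4
```

LangChain wraps these roles in dedicated message classes rather than raw dictionaries, as covered below.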

The prior `Completion` endpoint used for other models, like OpenAI’s `text-davinci-003`, only accepted a single `input` field. Everything (instructions, chat history, and the latest query) had to be written into that one field.

Because of this difference, we now have the LangChain `ChatOpenAI` object alongside several new prompt templates and “message” objects. We’ll explore how these are used in this chapter.

Code notebook:
https://github.com/pinecone-io/examples/blob/master/learn/generation/langchain/handbook/04-langchain-chat.ipynb

LangChain ebook:
https://pinecone.io/learn/langchain/

AI Dev Studio:
https://aurelio.ai/

Subscribe for Article and Video Updates!
https://jamescalam.medium.com/subscribe
https://medium.com/@jamescalam/membership

Discord:
https://discord.gg/c5QtDB9RAP

00:00 LangChain’s new Chat modules
02:09 New LangChain chat in Python
03:14 Using LangChain's ChatOpenAI object
04:36 Chat messages in LangChain
06:43 New chat prompt templates
09:05 LangChain human message prompt template
13:18 Using multiple chat prompt templates
17:42 F-strings vs. LangChain prompt templates
19:23 Where to use LangChain chat features?

#artificialintelligence #nlp #openai #deeplearning #langchain
