1. Prerequisites
To get started, you should have:
- Basic understanding of Python and API development.
- Access to OpenAI's API.
- Familiarity with LangChain.
2. Tools and Libraries
We will use the following tools:
- OpenAI API: For using GPT models.
- LangChain: For managing conversations, memory, and interactions.
- Requests: For making HTTP requests to external APIs.
You can install the necessary libraries by running:

```bash
pip install openai langchain requests
```
3. Overview of the Architecture
The chatbot we build will perform three key functions:
- Understand user inputs and process them via the OpenAI GPT model.
- Use LangChain to manage conversation history and handle complex workflows.
- Function calling via APIs: When a specific function is requested, the chatbot will make an external API call to perform actions such as retrieving weather information, querying a database, or booking an appointment.
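The three responsibilities above can be sketched as a single routing function. Every name here is a placeholder that later sections replace with the real GPT, LangChain, and API-calling implementations:

```python
# Architecture skeleton: each helper below is a placeholder that the rest
# of this guide fills in with a real implementation.

def needs_external_action(user_input):
    # Placeholder routing rule (Section 6 implements real function calling)
    return "weather" in user_input.lower()

def call_external_api(user_input):
    # Placeholder for an external API call such as a weather lookup
    return "external API result"

def conversational_model(user_input):
    # Placeholder for the GPT model managed by LangChain
    return "model response"

def chatbot(user_input):
    # Route: specific actions go to external APIs, everything else to the model
    if needs_external_action(user_input):
        return call_external_api(user_input)
    return conversational_model(user_input)
```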
4. Setting Up OpenAI GPT Model
First, let’s configure the OpenAI API to integrate GPT into our chatbot.
4.1 API Configuration
You need to get your API key from the OpenAI Dashboard and configure the Python script:
```python
import openai

# Set up your OpenAI API key
openai.api_key = 'YOUR_OPENAI_API_KEY'

def generate_gpt_response(prompt):
    # GPT-4 and GPT-3.5 are chat models, so use the ChatCompletion endpoint
    response = openai.ChatCompletion.create(
        model="gpt-4",  # or "gpt-3.5-turbo"
        messages=[{"role": "user", "content": prompt}],
        max_tokens=150,
        temperature=0.7,
    )
    return response.choices[0].message["content"].strip()
```
4.2 Test the GPT Response
You can now run a simple test by calling the function:
```python
prompt = "What is the capital of France?"
response = generate_gpt_response(prompt)
print(response)
```
5. Introducing LangChain for Conversation Management
LangChain allows us to structure the chatbot's conversation flow. You can think of it as the memory manager that tracks user inputs and maintains the context of the conversation.
5.1 Setting Up LangChain
To integrate LangChain, we will first initialize a conversation chain with memory management.
```python
from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory

# Initialize the OpenAI LLM
llm = OpenAI(openai_api_key=openai.api_key)

# Initialize ConversationChain with buffer memory
conversation_chain = ConversationChain(llm=llm, memory=ConversationBufferMemory())

def handle_conversation(user_input):
    return conversation_chain.run(user_input)
```
Now, when the user interacts with the chatbot, LangChain will handle the conversation flow and keep track of the user’s queries.
5.2 Testing Conversation with Memory
```python
user_input = "Hello, what is the weather today?"
response = handle_conversation(user_input)
print(response)
```
At this point, we have the OpenAI GPT model responding to user queries and LangChain handling the conversation flow. But what if the user asks for information that requires an external API call, like fetching live weather data? That’s where function calling comes in.
6. Function Calling Using External API
Let’s say we want the chatbot to fetch weather information using an external weather API. To do this, we’ll make an API call based on the user’s request.
6.1 Define a Function for API Calls
We’ll use the requests library to call an external weather API (e.g., OpenWeatherMap).
```python
import requests

# Function to call the weather API
def get_weather(city):
    api_key = "YOUR_WEATHER_API_KEY"
    url = f"http://api.openweathermap.org/data/2.5/weather?q={city}&appid={api_key}"
    response = requests.get(url)
    if response.status_code == 200:
        data = response.json()
        weather_description = data['weather'][0]['description']
        temp = data['main']['temp']
        # OpenWeatherMap returns temperatures in kelvin by default
        return f"The weather in {city} is {weather_description} with a temperature of {temp}K."
    else:
        return "Sorry, I couldn't retrieve the weather information."
```
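Live HTTP calls can also hang or fail in ways the status-code check alone doesn't cover. As a sketch, a hedged variant of the same function adds a request timeout and catches network and parsing errors, so the chatbot degrades to an apology instead of crashing (the endpoint and placeholder key are the same as above):

```python
import requests

def get_weather_safe(city, api_key="YOUR_WEATHER_API_KEY", timeout=5):
    # Same endpoint as get_weather, with a timeout and broad error handling
    url = f"http://api.openweathermap.org/data/2.5/weather?q={city}&appid={api_key}"
    try:
        response = requests.get(url, timeout=timeout)
        response.raise_for_status()  # turn 4xx/5xx responses into exceptions
        data = response.json()
        description = data["weather"][0]["description"]
        temp = data["main"]["temp"]
    except (requests.RequestException, ValueError, KeyError, IndexError):
        # Covers timeouts, connection errors, bad status codes, and bad JSON
        return "Sorry, I couldn't retrieve the weather information."
    return f"The weather in {city} is {description} with a temperature of {temp}K."
```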
6.2 Integrate Function Calling in Chatbot
We now modify our conversation handler to recognize specific function calls and delegate them to the appropriate API function.
```python
def chatbot_with_function_calling(user_input):
    if "weather" in user_input.lower():
        # Extract the city name naively (assume the last word is the city)
        # and strip trailing punctuation such as "?"
        city = user_input.split()[-1].strip("?!.,")
        return get_weather(city)
    else:
        # For non-weather-related queries, use the GPT model
        return handle_conversation(user_input)
```
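The last-word heuristic breaks on multi-word cities such as "New York". A slightly more robust sketch looks for the phrase "weather in <city>" with a regular expression; this is still a heuristic, not real entity extraction:

```python
import re

def extract_city(user_input):
    # Prefer "weather in <city>" / "weather for <city>", which also handles
    # multi-word city names such as "New York"
    match = re.search(r"weather (?:in|for) ([A-Za-z ]+)", user_input, re.IGNORECASE)
    if match:
        return match.group(1).strip()
    # Fall back to the last word, minus trailing punctuation
    return user_input.split()[-1].strip("?!.,")
```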
6.3 Test Function Calling
```python
user_input = "What is the weather in Paris?"
response = chatbot_with_function_calling(user_input)
print(response)
```
The chatbot should now detect when the user asks for weather information and call the external API to fetch the weather data for the specified city.
7. Advanced Example: Multiple API Function Calls
You can extend this functionality to handle different kinds of API calls. For instance, the chatbot could also book appointments, query databases, or fetch stock prices. Here’s an example of adding multiple function calls:
```python
def chatbot_with_multiple_api_calls(user_input):
    if "weather" in user_input.lower():
        city = user_input.split()[-1].strip("?!.,")
        return get_weather(city)
    elif "stock price" in user_input.lower():
        # get_stock_price is left for you to implement against a stock API
        company = user_input.split()[-1].strip("?!.,")
        return get_stock_price(company)
    else:
        return handle_conversation(user_input)
```
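As the number of actions grows, the if/elif chain becomes hard to maintain. One common alternative is a keyword-to-handler dispatch table. The handlers below are stubs (this guide does not define `get_stock_price`), shown only to illustrate the routing pattern:

```python
# Stub handlers standing in for real API calls
def get_weather_stub(city):
    return f"weather for {city}"

def get_stock_price_stub(company):
    return f"stock price for {company}"

# Keyword-to-handler dispatch table: adding an action is one new row,
# not another elif branch
HANDLERS = [
    ("stock price", get_stock_price_stub),
    ("weather", get_weather_stub),
]

def route(user_input):
    text = user_input.lower()
    for keyword, handler in HANDLERS:
        if keyword in text:
            argument = user_input.split()[-1].strip("?!.,")
            return handler(argument)
    return None  # no keyword matched: fall through to the conversational model
```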
8. Conclusion
By combining OpenAI's GPT with LangChain, we have built a chatbot that can handle general conversational tasks while also performing specific actions via function calling with APIs. This architecture can be extended to handle various user interactions like querying databases, interacting with third-party services, and more.
9. Full Code Example
Here is the complete chatbot with function calling:
```python
import openai
import requests
from langchain.chains import ConversationChain
from langchain.llms import OpenAI
from langchain.memory import ConversationBufferMemory

# API key setup
openai.api_key = 'YOUR_OPENAI_API_KEY'

# Initialize the OpenAI LLM and the LangChain conversation
llm = OpenAI(openai_api_key=openai.api_key)
conversation_chain = ConversationChain(llm=llm, memory=ConversationBufferMemory())

# Function to call the weather API
def get_weather(city):
    api_key = "YOUR_WEATHER_API_KEY"
    url = f"http://api.openweathermap.org/data/2.5/weather?q={city}&appid={api_key}"
    response = requests.get(url)
    if response.status_code == 200:
        data = response.json()
        weather_description = data['weather'][0]['description']
        temp = data['main']['temp']
        return f"The weather in {city} is {weather_description} with a temperature of {temp}K."
    else:
        return "Sorry, I couldn't retrieve the weather information."

# Chatbot with function calling
def chatbot_with_function_calling(user_input):
    if "weather" in user_input.lower():
        city = user_input.split()[-1].strip("?!.,")
        return get_weather(city)
    else:
        return conversation_chain.run(user_input)

# Example usage
user_input = "What is the weather in London?"
response = chatbot_with_function_calling(user_input)
print(response)
```