This tutorial demonstrates building a versatile conversational AI agent using LangChain, a powerful framework that integrates Large Language Models (LLMs) with external tools and APIs. This agent can perform diverse tasks, from generating random numbers and offering philosophical musings to dynamically retrieving and processing information from webpages. The combination of pre-built and custom tools enables real-time, context-aware, and informative responses.
*This article is part of the Data Science Blogathon.*
The synergy of LangChain, OpenAI, and DuckDuckGo allows for sophisticated conversational AI. OpenAI's LLMs provide natural language processing, while DuckDuckGo offers a privacy-focused search API. This combination enables the AI to generate contextually relevant responses and retrieve real-time data, enhancing its adaptability and accuracy. This powerful toolkit is ideal for creating intelligent chatbots or virtual assistants capable of handling diverse user inquiries.
Begin by installing required Python packages using pip:
<code>!pip -q install langchain==0.3.4 openai
!pip -q install duckduckgo-search</code>
Verify LangChain's installation:
<code>!pip show langchain</code>
Obtain your OpenAI API key and set it as an environment variable:
<code>import os

os.environ["OPENAI_API_KEY"] = "your_openai_key_here"</code>
Replace "your_openai_key_here" with your actual key. This is required to authenticate your requests to OpenAI's models.
Establish a connection to OpenAI's model using LangChain:
<code>from langchain.chat_models import ChatOpenAI
from langchain.chains.conversation.memory import ConversationBufferWindowMemory

# Configure the GPT-4o chat model
turbo_llm = ChatOpenAI(
    temperature=0,
    model_name='gpt-4o'
)</code>
A low temperature (temperature=0) ensures consistent responses.
Enhance your agent's capabilities by adding the DuckDuckGo search tool:
<code>from langchain.tools import DuckDuckGoSearchRun
from langchain.agents import Tool

search = DuckDuckGoSearchRun()

# Wrap the search utility as a Tool so the agent knows when to use it
search_tool = Tool(
    name="search",
    func=search.run,
    description="Best for questions about current events. Use precise queries."
)</code>
This tool, described as ideal for current events, is added to the agent's toolkit.
Extend your agent's functionality with custom tools:
This function provides a playful response to the question of life's meaning:
<code>def meaning_of_life(input=""):
    return 'The meaning of life is 42 (approximately!)'

life_tool = Tool(
    name='Meaning of Life',
    func=meaning_of_life,
    description="Use for questions about the meaning of life. Input: 'MOL'"
)</code>
This tool generates random integers between 0 and 5:
<code>import random

def random_num(input=""):
    return random.randint(0, 5)

random_tool = Tool(
    name='Random number',
    func=random_num,
    description="Use to get a random number. Input: 'random'"
)</code>
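Before wiring these functions into the agent, it is worth sanity-checking them on their own. A minimal standalone sketch (the function bodies are repeated here so the snippet runs by itself):

```python
import random

def meaning_of_life(input=""):
    # Fixed, playful answer regardless of input
    return 'The meaning of life is 42 (approximately!)'

def random_num(input=""):
    # Uniform random integer between 0 and 5, inclusive
    return random.randint(0, 5)

print(meaning_of_life("MOL"))  # The meaning of life is 42 (approximately!)
value = random_num("random")
print(0 <= value <= 5)         # True
```

The agent will only ever pass these functions a short string input, which both functions ignore; the tool's `description` is what steers the LLM toward calling them.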
Creating a conversational agent with custom tools allows for highly tailored interactions.
Import initialize_agent and define the tools:
<code>from langchain.agents import initialize_agent

tools = [search, random_tool, life_tool]</code>
Implement memory using ConversationBufferWindowMemory:
<code>from langchain.chains.conversation.memory import ConversationBufferWindowMemory

memory = ConversationBufferWindowMemory(
    memory_key='chat_history',
    k=3,
    return_messages=True
)</code>
With k=3, the agent recalls the last three conversational exchanges; older turns are dropped from the prompt.
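The windowing behavior can be illustrated without LangChain at all. Here is a stdlib-only sketch of a k=3 sliding window over (human, ai) exchange pairs; this illustrates the idea, not LangChain's actual implementation:

```python
from collections import deque

# Keep only the last 3 exchanges, analogous to ConversationBufferWindowMemory(k=3)
window = deque(maxlen=3)

for turn in ["hi", "what is 2+2?", "tell me a joke", "what did I ask first?"]:
    window.append((turn, f"reply to: {turn}"))

# Only the three most recent exchanges remain; "hi" has been evicted
print([human for human, ai in window])
```

A small window keeps the prompt short and cheap, at the cost of the agent forgetting anything said more than k exchanges ago.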
Initialize the agent:
<code>conversational_agent = initialize_agent(
    agent='chat-conversational-react-description',
    tools=tools,
    llm=turbo_llm,
    verbose=True,
    max_iterations=3,
    early_stopping_method='generate',
    memory=memory
)</code>
The parameters specify the agent type, tools, LLM, verbosity, iteration limit, early stopping, and memory.
Interact with the agent:
<code>conversational_agent("What time is it in London?")
conversational_agent("Can you give me a random number?")
conversational_agent("What is the meaning of life?")</code>
Refine the agent's behavior by adjusting the system prompt:
<code># Inspect the current system prompt
conversational_agent.agent.llm_chain.prompt.messages[0].prompt.template</code>
<code>fixed_prompt = '''Assistant is a large language model... [modified prompt instructing the agent to use tools appropriately]'''</code>
Apply the modified prompt:
<code>conversational_agent.agent.llm_chain.prompt.messages[0].prompt.template = fixed_prompt</code>
Retest the agent with the earlier queries to confirm that the new prompt changes how it uses its tools.
Tool Class for Web Scraping

Create a custom tool to extract plain text from webpages:
<code>from bs4 import BeautifulSoup
import requests
from langchain.agents import Tool

def stripped_webpage(webpage):
    # Fetch the page and strip it down to plain text
    response = requests.get(webpage)
    soup = BeautifulSoup(response.text, 'html.parser')
    # Remove script and style blocks before extracting text
    for element in soup(['script', 'style']):
        element.decompose()
    # Collapse whitespace and keep the prompt small: limit to 4000 characters
    text = ' '.join(soup.get_text().split())
    return text[:4000]

web_scraper_tool = Tool(
    name='Web Scraper',
    func=stripped_webpage,
    description="Fetches and cleans webpage text (limited to 4000 characters)."
)</code>
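To see the cleaning step in isolation, and without the bs4 dependency, here is a stdlib-only sketch of the same idea using html.parser. The class name CleanTextParser is ours, not part of any library:

```python
from html.parser import HTMLParser

class CleanTextParser(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0  # depth counter for script/style nesting

    def handle_starttag(self, tag, attrs):
        if tag in ('script', 'style'):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ('script', 'style') and self._skip > 0:
            self._skip -= 1

    def handle_data(self, data):
        if self._skip == 0:
            self.parts.append(data)

def clean_html(html, limit=4000):
    parser = CleanTextParser()
    parser.feed(html)
    # Collapse whitespace and truncate, as the bs4 version does
    return ' '.join(' '.join(parser.parts).split())[:limit]

sample = "<html><head><style>p{color:red}</style></head><body><p>Hello</p> <p>world</p></body></html>"
print(clean_html(sample))  # Hello world
```

The 4000-character cap matters because the scraped text is injected into the LLM prompt, which has a finite context window.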
Add web_scraper_tool to the tools list and reinitialize the agent to make it available.
WebPageTool Class

A more robust solution is to subclass BaseTool and create a custom WebPageTool class:
<code>from langchain.tools import BaseTool
from bs4 import BeautifulSoup
import requests

class WebPageTool(BaseTool):
    name = "Get Webpage"
    description = "Useful for when you need to get the content from a specific webpage"

    def _run(self, webpage: str):
        # Fetch the page and strip scripts/styles, mirroring stripped_webpage
        response = requests.get(webpage)
        soup = BeautifulSoup(response.text, 'html.parser')
        for element in soup(['script', 'style']):
            element.decompose()
        stripped_string = ' '.join(soup.get_text().split())
        return stripped_string[:4000]

    def _arun(self, webpage: str):
        raise NotImplementedError("This tool does not support async")

page_getter = WebPageTool()</code>
Reinitialize the agent with the new tool and updated system prompt. Test with examples like:
<code>conversational_agent.run("Is there an article about Clubhouse on https://techcrunch.com/? today")
conversational_agent.run("What are the top stories on www.cbsnews.com/?")</code>
This tutorial demonstrates building a highly adaptable conversational agent using LangChain. The modular design allows for easy expansion and customization. This agent showcases the power of combining AI with real-time data access.