When building AI agents, one of their most powerful aspects is the ability to manage and execute tools (function calls). Tools can help an agent perform tasks like scraping data, summarizing content, or even handling complex workflows. But as your AI agent grows in size and capabilities, it becomes increasingly difficult to manage and maintain multiple tools.
In this tutorial, we’ll focus on using the Toolhouse SDK to demonstrate how to manage tools effectively and how we can track every single tool call using the platform.
For this example, we’ll build a very simple interface where a user can input a URL and a prompt, and an AI agent will use tools to scrape the webpage and process the data.
AI agents are nothing without Tools; they're like the arms and legs of the agent. Each Tool is a specialized skill or function that the AI relies on to complete a specific task.
User-facing AI agents need to execute their tasks flawlessly. And writing AI tools from scratch, implementing API integrations or web-scraping logic yourself, is like reinventing a wheel that your dev team then has to maintain over the long run.
These problems are taken care of by Toolhouse. It helps you to:
- plug in pre-built, production-ready Tools instead of writing them from scratch
- organize Tools into Bundles so related Tools stay together
- execute Tool calls through a single SDK
- monitor every Tool call in Execution Logs
These capabilities simplify your tool management and let you focus on building smarter AI agents instead of worrying about building and maintaining Tools.
Alright, so let's build an AI-powered web scraper. Sounds fancy, but it's just a single-page app that lets you input a URL to scrape and an optional prompt to run against the scraped data.
Here's what you'll need:
- Node.js and npm installed
- An OpenAI API key
- A free Toolhouse account
- A code editor and a terminal
We’ll use React to create a simple frontend for managing tool calls. Make sure you have create-react-app installed, which we'll use to initialize a new React application. If you don't have it installed, you can do so by running:
npm install -g create-react-app
Open your favorite code editor and inside the terminal type the following:
npx create-react-app ai-scraper
Once it's done creating a new app, change into the project directory:
cd ai-scraper
If you expand the ai-scraper folder, it should look like this:
Great! Let's now start the server:
npm start
It should automatically start a new app at localhost:3000:
Neat! Let's install all the essential libraries now.
These SDKs will let our app interact with the Toolhouse platform and OpenAI models:
npm install @toolhouseai/sdk openai
Create a new .env file inside the project folder ai-scraper and add the following API keys:
REACT_APP_OPENAI_API_KEY=your_openai_api_key
REACT_APP_TOOLHOUSE_API_KEY=your_toolhouse_api_key
You can find your OpenAI API key at platform.openai.com/api-keys. In the .env file, replace "your_openai_api_key" with your actual OpenAI key.
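Note that Create React App only exposes environment variables that start with the REACT_APP_ prefix, and you read them in code through process.env (restart the dev server after editing .env):

// Available anywhere in the React app
const openaiKey = process.env.REACT_APP_OPENAI_API_KEY;
const toolhouseKey = process.env.REACT_APP_TOOLHOUSE_API_KEY;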
Let's now see how to set up our Toolhouse account for our AI web-scraping app. To get your Toolhouse API key, you'll first need to create an account at Toolhouse.ai.
Once you've signed up, go to the API Keys page, which should look something like this:
Clicking the eye icon reveals your API key. Copy it and paste it into your .env file in place of "your_toolhouse_api_key".
This is what your Dashboard looks like:
In the left menu, click "Bundles". This will take you to a new page where you can create a Bundle. Bundles organize your AI Tools into groups or packs.
Once it's created, you'll be taken to this page, where you can browse pre-made Tools and add them to your Bundle:
If you scroll further down, you'll find a Tool named Tavily web search. Enable this Tool and it will be added to your Bundle:
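Every Tool you enable here becomes available to your code through a single SDK call. As a minimal sketch, assuming you named your Bundle ai-scraper (use whatever name you picked above):

// Hypothetical Bundle name "ai-scraper": returns OpenAI-formatted
// tool definitions for every Tool in the Bundle
const tools = await toolhouse.getTools('ai-scraper');

We'll use this same call in the component below.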
Coming back to our app, we’ll now create a simple React component to showcase how tools are managed and executed. Go to your App.js file (or App.tsx if you're using TypeScript) inside the src folder and replace its entire contents with the code below.
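Here's a minimal version of the component. It follows the Toolhouse SDK quickstart pattern (getTools fetches your Bundle's Tools, runTools executes the model's tool calls); the Bundle name ai-scraper and the model gpt-4o-mini are placeholders, so adjust them to your setup. Calling OpenAI straight from the browser is fine for a local demo, but never ship API keys in client-side code.

import { useState } from 'react';
import OpenAI from 'openai';
import { Toolhouse } from '@toolhouseai/sdk';

// Demo-only setup: both clients run in the browser here, so the keys are
// visible to anyone using the page. Don't do this in production.
const openai = new OpenAI({
  apiKey: process.env.REACT_APP_OPENAI_API_KEY,
  dangerouslyAllowBrowser: true,
});
const toolhouse = new Toolhouse({
  apiKey: process.env.REACT_APP_TOOLHOUSE_API_KEY,
  provider: 'openai',
});

const MODEL = 'gpt-4o-mini'; // any tool-calling OpenAI model works

function App() {
  const [url, setUrl] = useState('');
  const [prompt, setPrompt] = useState('');
  const [result, setResult] = useState('');
  const [loading, setLoading] = useState(false);

  const handleSubmit = async (event) => {
    event.preventDefault();
    setLoading(true);
    setResult('');
    try {
      // Pull the Tools from our Bundle (swap in your own Bundle name)
      const tools = await toolhouse.getTools('ai-scraper');
      const messages = [{
        role: 'user',
        content: `Get the contents of ${url}, then: ${prompt || 'summarize the page'}`,
      }];

      // First pass: the model decides which Tools to call
      const completion = await openai.chat.completions.create({
        model: MODEL,
        messages,
        tools,
      });

      // Toolhouse executes the Tool calls and returns the results as messages
      const toolResults = await toolhouse.runTools(completion);

      // Second pass: the model answers using the Tool results
      const finalCompletion = await openai.chat.completions.create({
        model: MODEL,
        messages: [...messages, ...toolResults],
        tools,
      });
      setResult(finalCompletion.choices[0].message.content ?? '');
    } catch (error) {
      setResult(`Something went wrong: ${error.message}`);
    } finally {
      setLoading(false);
    }
  };

  return (
    <div style={{ maxWidth: 640, margin: '2rem auto', fontFamily: 'sans-serif' }}>
      <h1>AI Web Scraper</h1>
      <form onSubmit={handleSubmit}>
        <input
          value={url}
          onChange={(e) => setUrl(e.target.value)}
          placeholder="https://example.com"
          required
        />
        <input
          value={prompt}
          onChange={(e) => setPrompt(e.target.value)}
          placeholder="What should the agent do with the page? (optional)"
        />
        <button type="submit" disabled={loading}>
          {loading ? 'Working…' : 'Run'}
        </button>
      </form>
      <pre style={{ whiteSpace: 'pre-wrap' }}>{result}</pre>
    </div>
  );
}

export default App;

The component makes two model calls: the first lets the model pick Tools, Toolhouse executes them, and the second turns the Tool results into the final answer shown on the page.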
Stop the React server if it's already running (Ctrl+C in the terminal), then run the following command to start it again so the new environment variables are loaded:
npm start
This is how your app should look:
You can enter any URL and a prompt, and our AI agent will scrape the URL and summarize the webpage. Note that some websites, like microsoft.com, don't allow scraping, so the scraper will fail on those; make sure the URLs you try allow scraping.
Here's me playing around with the scraper:
You can also monitor every single Tool call made to the Tools hosted on Toolhouse. This helps you track how many Tool calls your agent makes and optimize them to save time and money.
Here's what the Execution Logs look like:
In the Execution Logs you'll find the exact time of each Tool call as well as its output.
That's about it for this tutorial. If you want to learn more about building AI agents, feel free to follow me here or on LinkedIn.