
🍕 Your Personal AI Pizza Guide: Building a Local Agent with LangGraph and Ollama

Have you ever craved pizza while traveling and wished for an AI assistant to navigate the local pizza scene? This breakdown explores how to build your own pizza-recommending AI agent using LangGraph and Ollama, all running locally on your computer!

🧠 Understanding the AI Agent Architecture

Imagine your AI agent as a clever friend who helps you find the perfect pizza. This friend utilizes:

  • The Oracle (LLM): This is the brain of the operation, using Llama 3.1 (an 8-billion-parameter language model) to decide, on each turn, whether to search for more information or give the final answer.
  • Search Tool: This tool scours Reddit, acting as a source of local knowledge for the best pizza places.
  • Final Answer Tool: This tool crafts a user-friendly response with the pizza recommendation, incorporating information gathered from the search.
  • Agent State: This acts as the friend’s memory, storing your initial request (e.g., “best pizza in Rome”) and any relevant context (e.g., “I’m currently in Rome”).

The magic happens as these components interact: You provide a request, the Oracle decides whether to search for information or directly answer, and the Final Answer Tool delivers the pizza verdict.
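
As a rough sketch of how that shared memory might look in code (field names here are illustrative assumptions, not necessarily the exact ones used in the video), the agent state can be a typed dictionary whose tool steps accumulate as the graph runs:

```python
# Minimal sketch of the agent's shared state for a LangGraph-style agent.
# Field names are illustrative assumptions, not the exact code from the video.
import operator
from typing import Annotated, TypedDict

from langchain_core.agents import AgentAction
from langchain_core.messages import BaseMessage


class AgentState(TypedDict):
    input: str                       # the user's request, e.g. "best pizza in Rome"
    chat_history: list[BaseMessage]  # any prior conversation, for extra context
    # tool calls append to this list on every step instead of overwriting it
    intermediate_steps: Annotated[list[AgentAction], operator.add]
```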

🧰 Building the AI Toolkit: Ollama and LangGraph

To create this AI friend, we use two powerful open-source tools:

  • Ollama: This tool lets you run large language models (like Llama 3.1) locally on your computer. Think of it as providing the physical space for your AI friend to exist.
  • LangGraph: This tool lets you build the agent’s decision-making process as a graph. Imagine it as creating a flowchart for your AI friend to follow (a minimal connection sketch appears right after this list).
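
Connecting the two can be as small as pointing a LangChain chat model at the local Ollama server. A minimal sketch, assuming the langchain-ollama integration package is installed and the model has been pulled with `ollama pull llama3.1`:

```python
# Minimal sketch: talk to a locally running Ollama instance from Python.
# Assumes `pip install langchain-ollama` and `ollama pull llama3.1` have been run.
from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="llama3.1",  # the local Llama 3.1 8B model
    temperature=0,     # deterministic output helps when the model must pick a tool
)

print(llm.invoke("Name one classic Roman pizza style.").content)
```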

🧱 Constructing Your AI Friend: Step-by-Step

  1. Setting Up Ollama: Download and install Ollama, then pull the Llama 3.1 8B model (for example, `ollama pull llama3.1`).
  2. Creating the Reddit Search Tool: Use the Reddit API to fetch relevant posts and comments about pizza in a specific location.
  3. Designing the Final Answer Tool: Format the output into a clear pizza recommendation, including details like the restaurant name and why it’s worth trying (a sketch of both tools follows this list).
  4. Building the Agent Graph: Connect the Oracle, Search Tool, and Final Answer Tool into a logical flow using LangGraph, so the agent knows when to search for more information and when to answer (the graph wiring is sketched further below).
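
To make steps 2 and 3 concrete, here is one possible sketch of the two tools. It uses Reddit’s public `search.json` endpoint rather than the authenticated API, and the function names and fields are illustrative assumptions, not the exact code from the video:

```python
# Sketch of the two tools. Using Reddit's public JSON search endpoint (instead of
# the authenticated API) and these function/field names are assumptions.
import requests
from langchain_core.tools import tool


@tool
def search_reddit(query: str) -> str:
    """Search Reddit and return the titles of the top matching posts."""
    resp = requests.get(
        "https://www.reddit.com/search.json",
        params={"q": query, "limit": 5, "sort": "relevance"},
        headers={"User-Agent": "pizza-agent/0.1"},  # Reddit rejects requests without a User-Agent
        timeout=10,
    )
    posts = resp.json()["data"]["children"]
    return "\n".join(p["data"]["title"] for p in posts)


@tool
def final_answer(restaurant: str, reason: str) -> str:
    """Return the final pizza recommendation in a user-friendly sentence."""
    return f"Try {restaurant}: {reason}"
```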

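For step 4, the sketch below shows one plausible wiring; it reuses `llm`, `AgentState`, `search_reddit`, and `final_answer` from the earlier sketches, and the node names and routing logic are again assumptions rather than the video’s exact code. The Oracle is the entry point, a conditional edge routes to whichever tool it picked, search results loop back to the Oracle, and the final answer ends the run.

```python
# One plausible graph layout; reuses llm, AgentState, search_reddit and final_answer
# from the sketches above. Node names and routing are illustrative assumptions.
from langchain_core.agents import AgentAction
from langgraph.graph import END, StateGraph

tools = {"search_reddit": search_reddit, "final_answer": final_answer}


def run_oracle(state: AgentState) -> dict:
    """Let the LLM pick the next tool, given the request and anything gathered so far."""
    gathered = "\n".join(step.log for step in state["intermediate_steps"])
    decision = llm.bind_tools(list(tools.values())).invoke(
        f"{state['input']}\n\nInformation gathered so far:\n{gathered}"
    )
    call = decision.tool_calls[0]  # assumes the model always chooses one of the tools
    return {"intermediate_steps": [AgentAction(tool=call["name"], tool_input=call["args"], log="pending")]}


def run_tool(state: AgentState) -> dict:
    """Execute the tool the Oracle just chose and record its output in the state."""
    action = state["intermediate_steps"][-1]
    result = tools[action.tool].invoke(action.tool_input)
    return {"intermediate_steps": [AgentAction(tool=action.tool, tool_input=action.tool_input, log=str(result))]}


graph = StateGraph(AgentState)
graph.add_node("oracle", run_oracle)
graph.add_node("search_reddit", run_tool)
graph.add_node("final_answer", run_tool)
graph.set_entry_point("oracle")
graph.add_conditional_edges("oracle", lambda state: state["intermediate_steps"][-1].tool)
graph.add_edge("search_reddit", "oracle")  # search results flow back to the Oracle
graph.add_edge("final_answer", END)        # the final answer ends the run
pizza_agent = graph.compile()
```

With this wiring, calling `pizza_agent.invoke({"input": "best pizza in Rome", "chat_history": [], "intermediate_steps": []})` runs the loop, and the last step’s `log` field holds the recommendation.
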
🗣️ Communicating with Your AI Friend: Prompting and Context

To get the best pizza recommendations, it’s crucial to provide clear instructions and context to your AI friend:

  • System Prompt: Give the agent a clear objective, such as “You are a helpful AI assistant that recommends the best pizza places based on user preferences.”
  • User Query: Be specific with your request, mentioning your location and any dietary restrictions (e.g., “Where is the best gluten-free pizza in Rome?”). Both pieces appear in the short sketch below.
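
In code, those two pieces typically become the system and human messages passed to the model; the exact wording below is only an example:

```python
# Illustrative prompt construction; the exact wording is up to you.
from langchain_core.messages import HumanMessage, SystemMessage

messages = [
    SystemMessage(content=(
        "You are a helpful AI assistant that recommends the best pizza places "
        "based on user preferences. Search for local knowledge before answering."
    )),
    HumanMessage(content="Where is the best gluten-free pizza in Rome?"),
]

# `llm` is the ChatOllama model from the earlier sketch
print(llm.invoke(messages).content)
```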

🚀 Testing Your Pizza Agent and Enjoying the Results!

Once everything is set up, you can start asking your AI friend for pizza recommendations! While the agent might occasionally hallucinate (suggesting non-existent pizzerias), with some fine-tuning and clear prompting, it can become a valuable tool for navigating the world of pizza. 🍕

Remember: Building an AI agent is an iterative process. Don’t be afraid to experiment, adjust your prompts, and improve your agent’s logic over time.
