A complete AI-powered recipe search application that understands both text and images using LanceDB, PydanticAI, and Streamlit.
Perfect for learning! This Colab notebook provides a step-by-step tutorial with sample data. No setup required - just click and start learning about multimodal agents.
```bash
# Download the tutorial files from GitHub
# Extract all files to a folder named 'multimodal-recipe-agent'
# Navigate to the folder
cd multimodal-recipe-agent
```

Install the dependencies:

```bash
uv sync
```

First, download the dataset:
Place the dataset in the multimodal-recipe-agent folder so that the recipes.csv file is in the data/ directory. Then run the import script:
```bash
uv run python import.py
```

This will:

- Process the recipe data
- Generate embeddings
- Store everything in LanceDB
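The import step above can be sketched as follows. This is a toy, self-contained stand-in, not the actual import.py: the real script uses LanceDB and a multimodal embedding model, while the hash-based `embed` function and the `SAMPLE_CSV` columns here are illustrative placeholders.

```python
import csv
import math
from io import StringIO

# Toy stand-in for a multimodal embedding model. The real import.py embeds
# recipe text and images with an actual model; this hash-based vector only
# makes the shape of the pipeline visible.
def embed(text: str, dims: int = 8) -> list[float]:
    vec = [0.0] * dims
    for i, ch in enumerate(text.lower()):
        vec[i % dims] += ord(ch)
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

# A two-row sample in the spirit of data/recipes.csv (columns assumed).
SAMPLE_CSV = """title,ingredients
Tomato Soup,"tomato, basil, cream"
Pancakes,"flour, milk, eggs"
"""

def import_recipes(csv_text: str) -> list[dict]:
    """Read the CSV and attach an embedding to each row. LanceDB would
    then store these rows as a table with a vector column."""
    rows = []
    for row in csv.DictReader(StringIO(csv_text)):
        row["vector"] = embed(row["title"] + " " + row["ingredients"])
        rows.append(row)
    return rows

recipes = import_recipes(SAMPLE_CSV)
print(len(recipes), len(recipes[0]["vector"]))  # → 2 8
```

In the real pipeline the vector column is what makes similarity search possible later: queries are embedded with the same model and compared against these stored vectors.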
Streamlit Chat App:

```bash
uv run streamlit run app.py
```

Jupyter Notebook Tutorial:
```bash
uv run jupyter notebook multimodal-recipe-agent.ipynb
```

```
multimodal-recipe-agent/
├── multimodal-recipe-agent.ipynb   # Interactive tutorial
├── agent.py                        # PydanticAI agent implementation
├── app.py                          # Streamlit chat interface
├── import.py                       # Data import and processing
├── pyproject.toml                  # Modern Python project configuration
├── uv.lock                         # Locked dependency versions
├── README.md                       # This file
└── data/                           # Generated data directory (created after import)
    ├── recipes.csv                 # Recipe dataset
    ├── images/                     # Recipe images
    └── recipes.lance               # LanceDB database
```

How it works:

- import.py processes recipe data, generates embeddings, and stores everything in LanceDB
- agent.py creates a PydanticAI agent with tools for searching recipes
- app.py provides a Streamlit chat interface for interacting with the agent
- multimodal-recipe-agent.ipynb walks through the implementation step by step

This project demonstrates:

- Multimodal search over both recipe text and images
- Vector storage and retrieval with LanceDB
- Building an agent with tools using PydanticAI
- Serving the agent through a Streamlit chat interface
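To make the search side concrete, here is a minimal in-memory sketch of what a recipe-search tool does. In the real project, LanceDB performs the nearest-neighbour lookup over the recipes.lance table and PydanticAI registers the function as an agent tool; the tiny dataset, the `search_recipes` name, and the hand-rolled cosine similarity below are illustrative assumptions, not the project's actual code.

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors — the kind of distance a
    vector database computes internally during search."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Stand-in for the recipes.lance table: each row carries a vector column.
RECIPES = [
    {"title": "Tomato Soup", "vector": [1.0, 0.0, 0.2]},
    {"title": "Pancakes",    "vector": [0.0, 1.0, 0.1]},
]

def search_recipes(query_vector: list[float], k: int = 1) -> list[str]:
    """Rank recipes by similarity to the query vector. In the real app,
    a tool like this would delegate to a LanceDB table search instead of
    sorting an in-memory list."""
    ranked = sorted(RECIPES,
                    key=lambda r: cosine(query_vector, r["vector"]),
                    reverse=True)
    return [r["title"] for r in ranked[:k]]

print(search_recipes([0.9, 0.1, 0.0]))  # → ['Tomato Soup']
```

Because text and image queries are embedded into the same vector space, the agent can answer both "find me a soup recipe" and "find recipes that look like this photo" with the same search call.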
This project is part of the LanceDB tutorials and follows the same license terms.