PySpur - Graph UI for AI Agents
What is the project about?
PySpur is a visual, graph-based user interface (UI) designed for building, testing, and deploying AI agents. It provides a drag-and-drop environment for creating complex workflows involving Large Language Models (LLMs), data processing, and external tools.
What problem does it solve?
PySpur simplifies the development and deployment of AI agents by:
- Providing a visual, no-code/low-code interface, making agent creation accessible to a wider audience.
- Streamlining the iterative process of building, testing, and refining agent behavior.
- Offering built-in support for common AI agent tasks like iterative tool calling, file processing, structured output generation, and RAG (Retrieval-Augmented Generation).
- Facilitating the evaluation and deployment of agents.
What are the features of the project?
- Drag-and-Drop Interface: Build workflows visually.
- Loops: Enable iterative tool calling with memory.
- File Upload: Process documents (PDFs, videos, audio, images, text).
- Structured Outputs: Define JSON schemas for structured data extraction.
- RAG: Integrate Retrieval-Augmented Generation with document parsing, chunking, embedding, and vector database upsert.
- Multimodal Support: Handle various data types (video, images, audio, text, code).
- Tool Integration: Connect with external services like Slack, Firecrawl.dev, Google Sheets, GitHub, etc.
- Evals: Evaluate agent performance on datasets.
- One-Click Deploy: Publish agents as APIs.
- Python-Based: Extend functionality by adding custom nodes with Python.
- Vendor Agnostic: Works with more than 100 LLM providers, embedders, and vector databases.
- Node-Level Debugging: Inspect the input and output of each node in the workflow.
- Support for Local Models: Integration with Ollama for using local LLMs.
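"Structured Outputs" above means constraining an LLM to emit JSON that conforms to a schema. The sketch below illustrates the idea only: the schema, the simulated response, and the `validate` helper are invented for this example and are not PySpur's API; it validates a tiny subset of JSON Schema using only the standard library.

```python
import json

# Hypothetical schema for extracting invoice fields (illustrative only;
# not PySpur's schema format, though the concept is the same JSON Schema idea).
SCHEMA = {
    "type": "object",
    "required": ["vendor", "total", "currency"],
    "properties": {
        "vendor": {"type": "string"},
        "total": {"type": "number"},
        "currency": {"type": "string"},
    },
}

TYPE_MAP = {"string": str, "number": (int, float), "object": dict}

def validate(instance: dict, schema: dict) -> list[str]:
    """Check an LLM's JSON output against a tiny subset of JSON Schema."""
    errors = []
    for key in schema.get("required", []):
        if key not in instance:
            errors.append(f"missing required field: {key}")
    for key, spec in schema.get("properties", {}).items():
        if key in instance and not isinstance(instance[key], TYPE_MAP[spec["type"]]):
            errors.append(f"wrong type for {key}: expected {spec['type']}")
    return errors

# Simulated LLM response; in a real workflow this would come from an LLM node.
raw = '{"vendor": "Acme Corp", "total": 1234.5, "currency": "USD"}'
print(validate(json.loads(raw), SCHEMA))  # -> []
```

An empty error list means the extraction can flow safely to downstream nodes; a non-empty list is the signal to retry or repair the model's output.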
What are the technologies used in the project?
- Python: The core language for extending the platform and creating custom nodes.
- Docker: Used for containerization and deployment.
- PostgreSQL: Used as the database to store spurs (workflows) and other application state.
- LLM Providers: Supports various LLM providers (OpenAI, Anthropic, etc.) and local models via Ollama.
- Vector Databases: Integrates with vector databases for RAG functionality.
- Web Technologies (implied): The UI is web-based, so it is built on HTML, CSS, and JavaScript, likely via a framework on top of them.
What are the benefits of the project?
- Faster Development: Accelerates the creation and iteration of AI agents.
- Increased Accessibility: Lowers the barrier to entry for building AI agents.
- Improved Debugging: Node-level debugging simplifies troubleshooting.
- Flexibility: Supports a wide range of use cases and integrations.
- Scalability: Docker-based deployment enables scaling.
- Extensibility: Python-based node creation allows for customization.
- Easy Deployment: One-click deployment to API simplifies integration.
What are the use cases of the project?
- Building AI assistants for various tasks.
- Creating automated workflows involving document processing and data extraction.
- Developing RAG-based applications.
- Prototyping and testing AI agent concepts.
- Building custom AI solutions that integrate with external services.
- Any application that requires complex interactions with LLMs and other tools.
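The RAG pipeline mentioned above (parse, chunk, embed, upsert, retrieve) can be sketched end-to-end with a toy bag-of-words "embedder". A real deployment would use an embedding model and a vector database, but the data flow is the same; everything here is illustrative and uses only the standard library.

```python
import math
from collections import Counter

def chunk(text: str, size: int = 8) -> list[str]:
    """Chunking: split a document into fixed-size word windows."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def embed(text: str) -> Counter:
    """Toy bag-of-words 'embedding'; stands in for a real embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# "Upsert": store (chunk, vector) pairs; a real system writes to a vector DB.
doc = ("PySpur agents can call tools in loops. "
       "Retrieval augmented generation grounds answers in documents. "
       "Structured outputs follow a JSON schema.")
index = [(c, embed(c)) for c in chunk(doc)]

# Retrieval: rank stored chunks by similarity to the query vector,
# then feed the best chunk to the LLM as grounding context.
query = embed("how does retrieval augmented generation work")
best = max(index, key=lambda item: cosine(query, item[1]))
print(best[0])
```

Swapping the toy pieces for production ones (a document parser for `chunk`, an embedding model for `embed`, a vector database for `index`) yields the standard RAG architecture without changing the overall flow.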
