FM-LLM Solver is a cutting-edge system for generating and verifying barrier certificates for dynamical systems using Large Language Models (LLMs) enhanced with Retrieval-Augmented Generation (RAG) and fine-tuning capabilities.
- LLM-Powered Generation: Generate barrier certificates using state-of-the-art language models
- Multi-System Support: Handle continuous, discrete, and stochastic dynamical systems
- RAG Integration: Leverage academic papers and examples for improved generation
- Comprehensive Verification: Multiple verification methods (numerical, symbolic, SOS)
- Web Interface: User-friendly interface for system input and visualization
- API Access: RESTful API for programmatic access
- Extensible Architecture: Modular design for easy extension and customization
# Install from PyPI (coming soon)
pip install fm-llm-solver
# Install from source
git clone https://github.com/yourusername/FM-LLM-Solver.git
cd FM-LLM-Solver
pip install -e ".[all]"
from fm_llm_solver import CertificateGenerator, SystemDescription
# Define your dynamical system
system = SystemDescription(
    dynamics={"x": "-x + y", "y": "x - y"},
    initial_set="x**2 + y**2 <= 0.5",
    unsafe_set="x**2 + y**2 >= 2.0"
)

# Generate a barrier certificate
generator = CertificateGenerator.from_config()
result = generator.generate(system)

if result.success:
    print(f"Certificate: {result.certificate}")
    print(f"Confidence: {result.confidence:.2%}")
else:
    print(f"Generation failed: {result.error}")
# Start the web interface
fm-llm-solver web
# Or with custom configuration
fm-llm-solver web --config config/production.yaml --host 0.0.0.0 --port 8080
# Start the inference API
fm-llm-solver api
# Run both web interface and API
fm-llm-solver both
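Once the inference API is running, it can be called over HTTP from any language. The sketch below assumes the API listens on port 8000 and exposes a JSON generate endpoint protected by an API-key header; the exact route, payload schema, and header name are assumptions to verify against the API documentation.

```python
import requests

# Hypothetical sketch: the route, payload fields, and header name are assumptions.
payload = {
    "dynamics": {"x": "-x + y", "y": "x - y"},
    "initial_set": "x**2 + y**2 <= 0.5",
    "unsafe_set": "x**2 + y**2 >= 2.0",
}
response = requests.post(
    "http://localhost:8000/api/generate",   # assumed endpoint and port
    json=payload,
    headers={"X-API-Key": "your-api-key"},  # assumed API-key header
    timeout=60,
)
response.raise_for_status()
print(response.json())
```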
FM-LLM-Solver/
├── fm_llm_solver/       # Main package
│   ├── core/            # Core components (config, logging, types)
│   ├── services/        # Business logic services
│   ├── web/             # Flask web interface
│   ├── api/             # FastAPI inference API
│   └── utils/           # Utility functions
├── tests/               # Test suite
│   ├── unit/            # Unit tests
│   ├── integration/     # Integration tests
│   └── benchmarks/      # Performance benchmarks
├── config/              # Configuration files
├── docs/                # Documentation
├── scripts/             # Utility scripts
└── data/                # Data files
Create a config/config.yaml file:
model:
  provider: qwen
  name: Qwen/Qwen2.5-14B-Instruct
  temperature: 0.7
  device: cuda

rag:
  enabled: true
  k_retrieved: 3
  chunk_size: 1000

verification:
  methods: [numerical, symbolic]
  numerical:
    num_samples: 1000

security:
  rate_limit:
    requests_per_day: 50
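A quick way to sanity-check this file before running the solver is to load it with PyYAML. Whether `CertificateGenerator.from_config()` accepts an explicit path is an assumption; by default it presumably reads config/config.yaml.

```python
import yaml

# Load and inspect the configuration shown above
with open("config/config.yaml") as f:
    config = yaml.safe_load(f)

print(config["model"]["name"])            # Qwen/Qwen2.5-14B-Instruct
print(config["verification"]["methods"])  # ['numerical', 'symbolic']

# Assumption: passing the path explicitly may or may not be supported.
# generator = CertificateGenerator.from_config("config/config.yaml")
```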
# Run all tests
fm-llm-solver test
# Run with coverage
fm-llm-solver test --coverage
# Run specific tests
fm-llm-solver test tests/unit/test_generator.py
We welcome contributions! Please see our Contributing Guide for details.
# Set up development environment
pip install -e ".[dev]"
pre-commit install
# Run code quality checks
black fm_llm_solver tests
isort fm_llm_solver tests
flake8 fm_llm_solver tests
mypy fm_llm_solver
FM-LLM Solver follows a modular architecture:
- Core Layer: Configuration, logging, exceptions, and type definitions
- Service Layer: Certificate generation, verification, knowledge base management
- Interface Layer: Web UI and REST API
- Infrastructure Layer: Caching, monitoring, and deployment utilities
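To illustrate how the layers compose, the sketch below wires the service layer from core-layer configuration and logging. The module paths and helper names (`load_config`, `get_logger`) are assumptions inferred from the directory layout above, not the actual API.

```python
# Hypothetical sketch of the layering; module paths and names are assumptions.
from fm_llm_solver.core.config import load_config                  # core layer (assumed helper)
from fm_llm_solver.core.logging import get_logger                  # core layer (assumed helper)
from fm_llm_solver.services.generator import CertificateGenerator  # service layer (assumed path)

logger = get_logger(__name__)
config = load_config("config/config.yaml")

# Services consume core-layer configuration; the web (Flask) and api (FastAPI)
# interface packages then sit on top of the same services.
generator = CertificateGenerator(config)
logger.info("Generator ready with model %s", config["model"]["name"])
```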
# Build and run with Docker
docker build -t fm-llm-solver .
docker run -p 5000:5000 -p 8000:8000 fm-llm-solver
See Deployment Guide for cloud deployment options (AWS, GCP, Azure).
- Certificate generation: ~2-5 seconds per system
- Verification: <1 second for numerical, 2-10 seconds for symbolic
- Supports batch processing for multiple systems (see the sketch after this list)
- GPU acceleration available for LLM inference
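For batch processing, the simplest approach is to loop over the single-system API from the Quick Start; whether a dedicated batch method exists is not documented here, so this sketch assumes only `CertificateGenerator.generate`.

```python
from fm_llm_solver import CertificateGenerator, SystemDescription

# Batch-processing sketch: reuse one generator across several systems.
systems = [
    SystemDescription(
        dynamics={"x": "-x + y", "y": "x - y"},
        initial_set="x**2 + y**2 <= 0.5",
        unsafe_set="x**2 + y**2 >= 2.0",
    ),
    SystemDescription(
        dynamics={"x": "-x**3", "y": "-y"},
        initial_set="x**2 + y**2 <= 0.1",
        unsafe_set="x**2 + y**2 >= 4.0",
    ),
]

generator = CertificateGenerator.from_config()
for i, system in enumerate(systems):
    result = generator.generate(system)
    print(f"System {i}: {result.certificate if result.success else result.error}")
```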
- Authentication and authorization for web interface
- Rate limiting to prevent abuse
- API key management for programmatic access
- Secure session handling
- Input validation and sanitization
This project is licensed under the MIT License - see the LICENSE file for details.
- Thanks to the Qwen team for the excellent language models
- Inspired by research in formal methods and neural certificate generation
- Built with support from the University of Colorado
- Author: Patrick Allen Cooper
- Email: [email protected]
- Website: fm-llm-solver.ai
Made with ❤️ by researchers, for researchers