diff --git a/projects/unit3/build-mcp-server/solution/README.md b/projects/unit3/build-mcp-server/solution/README.md deleted file mode 100644 index 6c90cb0..0000000 --- a/projects/unit3/build-mcp-server/solution/README.md +++ /dev/null @@ -1,73 +0,0 @@ -# Module 1: Basic MCP Server with PR Template Tools - -This module implements a basic MCP server that provides tools for analyzing git changes and suggesting appropriate PR templates. - -## Setup - -### 1. Install uv - -Follow the official installation instructions at: https://docs.astral.sh/uv/getting-started/installation/ - -### 2. Install dependencies - -```bash -# Install all dependencies -uv sync - -# Or install with dev dependencies for testing -uv sync --all-extras -``` - -### 3. Configure the MCP Server - -Add the server to Claude Code: - -```bash -# Add the MCP server -claude mcp add pr-agent -- uv --directory /absolute/path/to/module1/solution run server.py - -# Verify it's configured -claude mcp list -``` - -## Tools Available - -1. **analyze_file_changes** - Get the full diff and list of changed files -2. **get_pr_templates** - List available PR templates with their content -3. **suggest_template** - Let Claude analyze changes and suggest a template - -## Usage Example - -1. Make some changes in a git repository -2. Ask Claude: "Can you analyze my changes and suggest a PR template?" -3. Claude will: - - Use `analyze_file_changes` to see what changed - - Analyze the diff to understand the nature of changes - - Use `suggest_template` to recommend the most appropriate template - - Help you fill out the template based on the specific changes - -## How It Works - -Unlike traditional template systems that rely on file extensions or simple patterns, this MCP server provides Claude with raw git data and lets Claude's intelligence determine: -- What type of change is being made (bug fix, feature, refactor, etc.) 
-- Which template is most appropriate -- How to fill out the template based on the actual code changes - -This approach leverages Claude's understanding of code and context rather than rigid rules. - -## Running Tests - -```bash -# Run the validation script -uv run python validate_solution.py - -# Run unit tests -uv run pytest test_server.py -v -``` - -## Running the Server Directly - -```bash -# Start the MCP server -uv run server.py -``` \ No newline at end of file diff --git a/projects/unit3/build-mcp-server/solution/manual_test.md b/projects/unit3/build-mcp-server/solution/manual_test.md deleted file mode 100644 index 0dcfa4c..0000000 --- a/projects/unit3/build-mcp-server/solution/manual_test.md +++ /dev/null @@ -1,115 +0,0 @@ -# Manual Testing Guide for Module 1 Solution - -## Prerequisites - -1. Ensure you're in a git repository with some changes -2. Install uv following instructions at: https://docs.astral.sh/uv/getting-started/installation/ -3. Install dependencies: - ```bash - uv sync --all-extras - ``` - -## Test 1: Validate the Solution - -Run the automated validation script: -```bash -uv run python validate_solution.py -``` - -This will check: -- Git environment -- Python imports -- Server creation -- Tool registration -- Tool execution -- Template creation - -## Test 2: Run Unit Tests - -```bash -uv run pytest test_server.py -v -``` - -## Test 3: Test with Claude Code - -1. **Configure MCP Server** - - Add the server to Claude Code: - ```bash - # Add the MCP server - claude mcp add pr-agent -- uv --directory /absolute/path/to/module1/solution run server.py - - # Verify it's configured - claude mcp list - ``` - -2. **Restart Claude Code** to pick up the new server - -3. **Make Some Git Changes** - - In any git repository: - ```bash - echo "test change" >> README.md - git add README.md - ``` - -4. **Test with Claude** - - Ask Claude: - - "Can you analyze my git changes?" - - "What PR templates are available?" 
- - "Based on my changes, which PR template should I use?" - -## Test 4: Direct Server Testing - -You can also test the server directly: - -```python -import asyncio -from server import analyze_file_changes, get_pr_templates, suggest_template - -async def test(): - # Test analyze_file_changes - changes = await analyze_file_changes("main", True) - print("Changes:", changes[:200] + "...") - - # Test get_pr_templates - templates = await get_pr_templates() - print("Templates available:", len(json.loads(templates))) - - # Test suggest_template - suggestion = await suggest_template( - "Fixed authentication bug", - "bug" - ) - print("Suggestion:", json.loads(suggestion)["recommended_template"]["type"]) - -asyncio.run(test()) -``` - -## Expected Behavior - -1. **analyze_file_changes** should return JSON with: - - base_branch - - files_changed - - statistics - - commits - - diff (if include_diff=True) - -2. **get_pr_templates** should return JSON array of templates with: - - filename - - type - - content - -3. 
**suggest_template** should return JSON with: - - recommended_template - - reasoning - - template_content - - usage_hint - -## Troubleshooting - -- **Git errors**: Ensure git is installed and you're inside a git repository -- **Import errors**: Make sure dependencies are installed with `uv sync` -- **MCP connection failed**: Verify the path in the claude mcp add command is absolute -- **No tools showing**: Restart Claude Code after adding the server \ No newline at end of file diff --git a/projects/unit3/build-mcp-server/solution/pyproject.toml b/projects/unit3/build-mcp-server/solution/pyproject.toml deleted file mode 100644 index c2c9f7e..0000000 --- a/projects/unit3/build-mcp-server/solution/pyproject.toml +++ /dev/null @@ -1,28 +0,0 @@ -[project] -name = "pr-agent" -version = "1.0.0" -description = "MCP server for PR template suggestions" -readme = "README.md" -requires-python = ">=3.10" -dependencies = [ - "mcp[cli]>=1.0.0", -] - -[project.optional-dependencies] -dev = [ - "pytest>=8.3.0", - "pytest-asyncio>=0.21.0", -] - -[build-system] -requires = ["hatchling"] -build-backend = "hatchling.build" - -[tool.hatch.build.targets.wheel] -packages = ["."] - -[tool.uv] -dev-dependencies = [ - "pytest>=8.3.0", - "pytest-asyncio>=0.21.0", -] \ No newline at end of file diff --git a/projects/unit3/build-mcp-server/solution/server.py b/projects/unit3/build-mcp-server/solution/server.py deleted file mode 100644 index a87211e..0000000 --- a/projects/unit3/build-mcp-server/solution/server.py +++ /dev/null @@ -1,217 +0,0 @@ -#!/usr/bin/env python3 -""" -Module 1: Basic MCP Server with PR Template Tools -A minimal MCP server that provides tools for analyzing file changes and suggesting PR templates.
-""" - -import json -import os -import subprocess -from typing import Optional -from pathlib import Path - -from mcp.server.fastmcp import FastMCP - -# Initialize the FastMCP server -mcp = FastMCP("pr-agent") - -# PR template directory (shared between starter and solution) -TEMPLATES_DIR = Path(__file__).parent.parent.parent / "templates" - -# Default PR templates -DEFAULT_TEMPLATES = { - "bug.md": "Bug Fix", - "feature.md": "Feature", - "docs.md": "Documentation", - "refactor.md": "Refactor", - "test.md": "Test", - "performance.md": "Performance", - "security.md": "Security" -} - -# Type mapping for PR templates -TYPE_MAPPING = { - "bug": "bug.md", - "fix": "bug.md", - "feature": "feature.md", - "enhancement": "feature.md", - "docs": "docs.md", - "documentation": "docs.md", - "refactor": "refactor.md", - "cleanup": "refactor.md", - "test": "test.md", - "testing": "test.md", - "performance": "performance.md", - "optimization": "performance.md", - "security": "security.md" -} - - -@mcp.tool() -async def analyze_file_changes( - base_branch: str = "main", - include_diff: bool = True, - max_diff_lines: int = 500, - working_directory: Optional[str] = None -) -> str: - """Get the full diff and list of changed files in the current git repository. 
- - Args: - base_branch: Base branch to compare against (default: main) - include_diff: Include the full diff content (default: true) - max_diff_lines: Maximum number of diff lines to include (default: 500) - working_directory: Directory to run git commands in (default: current directory) - """ - try: - # Try to get working directory from roots first - if working_directory is None: - try: - context = mcp.get_context() - roots_result = await context.session.list_roots() - # Get the first root - Claude Code sets this to the CWD - root = roots_result.roots[0] - # FileUrl object has a .path property that gives us the path directly - working_directory = root.uri.path - except Exception: - # If we can't get roots, fall back to current directory - pass - - # Use provided working directory or current directory - cwd = working_directory if working_directory else os.getcwd() - - # Debug output - debug_info = { - "provided_working_directory": working_directory, - "actual_cwd": cwd, - "server_process_cwd": os.getcwd(), - "server_file_location": str(Path(__file__).parent), - "roots_check": None - } - - # Add roots debug info - try: - context = mcp.get_context() - roots_result = await context.session.list_roots() - debug_info["roots_check"] = { - "found": True, - "count": len(roots_result.roots), - "roots": [str(root.uri) for root in roots_result.roots] - } - except Exception as e: - debug_info["roots_check"] = { - "found": False, - "error": str(e) - } - - # Get list of changed files - files_result = subprocess.run( - ["git", "diff", "--name-status", f"{base_branch}...HEAD"], - capture_output=True, - text=True, - check=True, - cwd=cwd - ) - - # Get diff statistics - stat_result = subprocess.run( - ["git", "diff", "--stat", f"{base_branch}...HEAD"], - capture_output=True, - text=True, - cwd=cwd - ) - - # Get the actual diff if requested - diff_content = "" - truncated = False - if include_diff: - diff_result = subprocess.run( - ["git", "diff", f"{base_branch}...HEAD"], - 
capture_output=True, - text=True, - cwd=cwd - ) - diff_lines = diff_result.stdout.split('\n') - - # Check if we need to truncate - if len(diff_lines) > max_diff_lines: - diff_content = '\n'.join(diff_lines[:max_diff_lines]) - diff_content += f"\n\n... Output truncated. Showing {max_diff_lines} of {len(diff_lines)} lines ..." - diff_content += "\n... Use max_diff_lines parameter to see more ..." - truncated = True - else: - diff_content = diff_result.stdout - - # Get commit messages for context - commits_result = subprocess.run( - ["git", "log", "--oneline", f"{base_branch}..HEAD"], - capture_output=True, - text=True, - cwd=cwd - ) - - analysis = { - "base_branch": base_branch, - "files_changed": files_result.stdout, - "statistics": stat_result.stdout, - "commits": commits_result.stdout, - "diff": diff_content if include_diff else "Diff not included (set include_diff=true to see full diff)", - "truncated": truncated, - "total_diff_lines": len(diff_lines) if include_diff else 0, - "_debug": debug_info - } - - return json.dumps(analysis, indent=2) - - except subprocess.CalledProcessError as e: - return json.dumps({"error": f"Git error: {e.stderr}"}) - except Exception as e: - return json.dumps({"error": str(e)}) - - -@mcp.tool() -async def get_pr_templates() -> str: - """List available PR templates with their content.""" - templates = [ - { - "filename": filename, - "type": template_type, - "content": (TEMPLATES_DIR / filename).read_text() - } - for filename, template_type in DEFAULT_TEMPLATES.items() - ] - - return json.dumps(templates, indent=2) - - -@mcp.tool() -async def suggest_template(changes_summary: str, change_type: str) -> str: - """Let Claude analyze the changes and suggest the most appropriate PR template. - - Args: - changes_summary: Your analysis of what the changes do - change_type: The type of change you've identified (bug, feature, docs, refactor, test, etc.) 
- """ - - # Get available templates - templates_response = await get_pr_templates() - templates = json.loads(templates_response) - - # Find matching template - template_file = TYPE_MAPPING.get(change_type.lower(), "feature.md") - selected_template = next( - (t for t in templates if t["filename"] == template_file), - templates[0] # Default to first template if no match - ) - - suggestion = { - "recommended_template": selected_template, - "reasoning": f"Based on your analysis: '{changes_summary}', this appears to be a {change_type} change.", - "template_content": selected_template["content"], - "usage_hint": "Claude can help you fill out this template based on the specific changes in your PR." - } - - return json.dumps(suggestion, indent=2) - - -if __name__ == "__main__": - mcp.run() \ No newline at end of file diff --git a/projects/unit3/build-mcp-server/solution/test_server.py b/projects/unit3/build-mcp-server/solution/test_server.py deleted file mode 100644 index 280167b..0000000 --- a/projects/unit3/build-mcp-server/solution/test_server.py +++ /dev/null @@ -1,216 +0,0 @@ -#!/usr/bin/env python3 -""" -Unit tests for Module 1: Basic MCP Server -Run these tests to validate your implementation -""" - -import json -import pytest -import asyncio -from pathlib import Path -from unittest.mock import patch, MagicMock - -# Import your implemented functions -try: - from server import ( - mcp, - analyze_file_changes, - get_pr_templates, - suggest_template - ) - IMPORTS_SUCCESSFUL = True -except ImportError as e: - IMPORTS_SUCCESSFUL = False - IMPORT_ERROR = str(e) - - -class TestImplementation: - """Test that the required functions are implemented.""" - - def test_imports(self): - """Test that all required functions can be imported.""" - assert IMPORTS_SUCCESSFUL, f"Failed to import required functions: {IMPORT_ERROR if not IMPORTS_SUCCESSFUL else ''}" - assert mcp is not None, "FastMCP server instance not found" - assert callable(analyze_file_changes), "analyze_file_changes 
should be a callable function" - assert callable(get_pr_templates), "get_pr_templates should be a callable function" - assert callable(suggest_template), "suggest_template should be a callable function" - - -@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") -class TestAnalyzeFileChanges: - """Test the analyze_file_changes tool.""" - - @pytest.mark.asyncio - async def test_returns_json_string(self): - """Test that analyze_file_changes returns a JSON string.""" - with patch('subprocess.run') as mock_run: - mock_run.return_value = MagicMock(stdout="", stderr="") - - result = await analyze_file_changes() - - assert isinstance(result, str), "Should return a string" - # Should be valid JSON - data = json.loads(result) - assert isinstance(data, dict), "Should return a JSON object" - - @pytest.mark.asyncio - async def test_includes_required_fields(self): - """Test that the result includes expected fields.""" - with patch('subprocess.run') as mock_run: - mock_run.return_value = MagicMock(stdout="M\tfile1.py\n", stderr="") - - result = await analyze_file_changes() - data = json.loads(result) - - # For starter code, accept error messages; for full implementation, expect data - is_implemented = not ("error" in data and "Not implemented" in str(data.get("error", ""))) - if is_implemented: - # Check for some expected fields (flexible to allow different implementations) - assert any(key in data for key in ["files_changed", "files", "changes", "diff"]), \ - "Result should include file change information" - else: - # Starter code - just verify it returns something structured - assert isinstance(data, dict), "Should return a JSON object even if not implemented" - - @pytest.mark.asyncio - async def test_output_limiting(self): - """Test that large diffs are properly truncated.""" - with patch('subprocess.run') as mock_run: - # Create a mock diff with many lines - large_diff = "\n".join([f"+ line {i}" for i in range(1000)]) - - # Set up mock responses - 
mock_run.side_effect = [ - MagicMock(stdout="M\tfile1.py\n", stderr=""), # files changed - MagicMock(stdout="1 file changed, 1000 insertions(+)", stderr=""), # stats - MagicMock(stdout=large_diff, stderr=""), # diff - MagicMock(stdout="abc123 Initial commit", stderr="") # commits - ] - - # Test with default limit (500 lines) - result = await analyze_file_changes(include_diff=True) - data = json.loads(result) - - # Check if it's implemented - if "error" not in data or "Not implemented" not in str(data.get("error", "")): - if "diff" in data and data["diff"] != "Diff not included (set include_diff=true to see full diff)": - diff_lines = data["diff"].split('\n') - # Should be truncated to around 500 lines plus truncation message - assert len(diff_lines) < 600, "Large diffs should be truncated" - - # Check for truncation indicator - if "truncated" in data: - assert data["truncated"] == True, "Should indicate truncation" - - # Should have truncation message - assert "truncated" in data["diff"].lower() or "..." 
in data["diff"], \ - "Should indicate diff was truncated" - - -@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") -class TestGetPRTemplates: - """Test the get_pr_templates tool.""" - - @pytest.mark.asyncio - async def test_returns_json_string(self): - """Test that get_pr_templates returns a JSON string.""" - result = await get_pr_templates() - - assert isinstance(result, str), "Should return a string" - # Should be valid JSON - data = json.loads(result) - - # For starter code, accept error messages; for full implementation, expect list - is_implemented = not ("error" in data and isinstance(data, dict)) - if is_implemented: - assert isinstance(data, list), "Should return a JSON array of templates" - else: - # Starter code - just verify it returns something structured - assert isinstance(data, dict), "Should return a JSON object even if not implemented" - - @pytest.mark.asyncio - async def test_returns_templates(self): - """Test that templates are returned.""" - result = await get_pr_templates() - templates = json.loads(result) - - # For starter code, accept error messages; for full implementation, expect templates - is_implemented = not ("error" in templates and isinstance(templates, dict)) - if is_implemented: - assert len(templates) > 0, "Should return at least one template" - - # Check that templates have expected structure - for template in templates: - assert isinstance(template, dict), "Each template should be a dictionary" - # Should have some identifying information - assert any(key in template for key in ["filename", "name", "type", "id"]), \ - "Templates should have an identifier" - else: - # Starter code - just verify it's structured correctly - assert isinstance(templates, dict), "Should return structured error for starter code" - - -@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") -class TestSuggestTemplate: - """Test the suggest_template tool.""" - - @pytest.mark.asyncio - async def test_returns_json_string(self): 
- """Test that suggest_template returns a JSON string.""" - result = await suggest_template( - "Fixed a bug in the authentication system", - "bug" - ) - - assert isinstance(result, str), "Should return a string" - # Should be valid JSON - data = json.loads(result) - assert isinstance(data, dict), "Should return a JSON object" - - @pytest.mark.asyncio - async def test_suggestion_structure(self): - """Test that the suggestion has expected structure.""" - result = await suggest_template( - "Added new feature for user management", - "feature" - ) - suggestion = json.loads(result) - - # For starter code, accept error messages; for full implementation, expect suggestion - is_implemented = not ("error" in suggestion and "Not implemented" in str(suggestion.get("error", ""))) - if is_implemented: - # Check for some expected fields (flexible to allow different implementations) - assert any(key in suggestion for key in ["template", "recommended_template", "suggestion"]), \ - "Should include a template recommendation" - else: - # Starter code - just verify it's structured correctly - assert isinstance(suggestion, dict), "Should return structured error for starter code" - - -@pytest.mark.skipif(not IMPORTS_SUCCESSFUL, reason="Imports failed") -class TestToolRegistration: - """Test that tools are properly registered with FastMCP.""" - - def test_tools_have_decorators(self): - """Test that tool functions are decorated with @mcp.tool().""" - # In FastMCP, decorated functions should have certain attributes - # This is a basic check that functions exist and are callable - assert hasattr(analyze_file_changes, '__name__'), \ - "analyze_file_changes should be a proper function" - assert hasattr(get_pr_templates, '__name__'), \ - "get_pr_templates should be a proper function" - assert hasattr(suggest_template, '__name__'), \ - "suggest_template should be a proper function" - - -if __name__ == "__main__": - if not IMPORTS_SUCCESSFUL: - print(f"❌ Cannot run tests - imports failed: 
{IMPORT_ERROR}") - print("\nMake sure you've:") - print("1. Implemented all three tool functions") - print("2. Decorated them with @mcp.tool()") - print("3. Installed dependencies with: uv sync") - exit(1) - - # Run tests - pytest.main([__file__, "-v"]) \ No newline at end of file diff --git a/projects/unit3/build-mcp-server/solution/uv.lock b/projects/unit3/build-mcp-server/solution/uv.lock deleted file mode 100644 index 73f7a33..0000000 --- a/projects/unit3/build-mcp-server/solution/uv.lock +++ /dev/null @@ -1,550 +0,0 @@ -version = 1 -revision = 2 -requires-python = ">=3.10" - -[[package]] -name = "annotated-types" -version = "0.7.0" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/annotated-types/0.7.0/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/annotated-types/0.7.0/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53" }, -] - -[[package]] -name = "anyio" -version = "4.9.0" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, - { name = "idna" }, - { name = "sniffio" }, - { name = "typing-extensions", marker = "python_full_version < '3.13'" }, -] -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/anyio/4.9.0/anyio-4.9.0.tar.gz", hash = "sha256:673c0c244e15788651a4ff38710fea9675823028a6f08a5eda409e0c9840a028" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/anyio/4.9.0/anyio-4.9.0-py3-none-any.whl", hash = "sha256:9f76d541cad6e36af7beb62e978876f3b41e3e04f2c1fbf0884604c0a9c4d93c" }, -] - -[[package]] -name = 
"certifi" -version = "2025.4.26" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/certifi/2025.4.26/certifi-2025.4.26.tar.gz", hash = "sha256:0a816057ea3cdefcef70270d2c515e4506bbc954f417fa5ade2021213bb8f0c6" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/certifi/2025.4.26/certifi-2025.4.26-py3-none-any.whl", hash = "sha256:30350364dfe371162649852c63336a15c70c6510c2ad5015b21c2345311805f3" }, -] - -[[package]] -name = "click" -version = "8.1.8" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "colorama", marker = "sys_platform == 'win32'" }, -] -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/click/8.1.8/click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/click/8.1.8/click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2" }, -] - -[[package]] -name = "colorama" -version = "0.4.6" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/colorama/0.4.6/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/colorama/0.4.6/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6" }, -] - -[[package]] -name = "exceptiongroup" -version = "1.3.0" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = 
"typing-extensions", marker = "python_full_version < '3.13'" }, -] -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/exceptiongroup/1.3.0/exceptiongroup-1.3.0.tar.gz", hash = "sha256:b241f5885f560bc56a59ee63ca4c6a8bfa46ae4ad651af316d4e81817bb9fd88" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/exceptiongroup/1.3.0/exceptiongroup-1.3.0-py3-none-any.whl", hash = "sha256:4d111e6e0c13d0644cad6ddaa7ed0261a0b36971f6d23e7ec9b4b9097da78a10" }, -] - -[[package]] -name = "h11" -version = "0.16.0" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/h11/0.16.0/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/h11/0.16.0/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86" }, -] - -[[package]] -name = "httpcore" -version = "1.0.9" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "certifi" }, - { name = "h11" }, -] -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpcore/1.0.9/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpcore/1.0.9/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55" }, -] - -[[package]] -name = "httpx" -version = "0.28.1" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "anyio" }, - { name = "certifi" }, - { name = "httpcore" }, - { name = "idna" }, -] -sdist = { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx/0.28.1/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx/0.28.1/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad" }, -] - -[[package]] -name = "httpx-sse" -version = "0.4.0" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx-sse/0.4.0/httpx-sse-0.4.0.tar.gz", hash = "sha256:1e81a3a3070ce322add1d3529ed42eb5f70817f45ed6ec915ab753f961139721" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/httpx-sse/0.4.0/httpx_sse-0.4.0-py3-none-any.whl", hash = "sha256:f329af6eae57eaa2bdfd962b42524764af68075ea87370a2de920af5341e318f" }, -] - -[[package]] -name = "idna" -version = "3.10" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/idna/3.10/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/idna/3.10/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3" }, -] - -[[package]] -name = "iniconfig" -version = "2.1.0" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/iniconfig/2.1.0/iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7" } -wheels = [ - { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/iniconfig/2.1.0/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760" }, -] - -[[package]] -name = "markdown-it-py" -version = "3.0.0" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "mdurl" }, -] -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/markdown-it-py/3.0.0/markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/markdown-it-py/3.0.0/markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1" }, -] - -[[package]] -name = "mcp" -version = "1.9.0" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "anyio" }, - { name = "httpx" }, - { name = "httpx-sse" }, - { name = "pydantic" }, - { name = "pydantic-settings" }, - { name = "python-multipart" }, - { name = "sse-starlette" }, - { name = "starlette" }, - { name = "uvicorn", marker = "sys_platform != 'emscripten'" }, -] -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mcp/1.9.0/mcp-1.9.0.tar.gz", hash = "sha256:905d8d208baf7e3e71d70c82803b89112e321581bcd2530f9de0fe4103d28749" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mcp/1.9.0/mcp-1.9.0-py3-none-any.whl", hash = "sha256:9dfb89c8c56f742da10a5910a1f64b0d2ac2c3ed2bd572ddb1cfab7f35957178" }, -] - -[package.optional-dependencies] -cli = [ - { name = "python-dotenv" }, - { name = "typer" }, -] - -[[package]] -name = "mdurl" -version = "0.1.2" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mdurl/0.1.2/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/mdurl/0.1.2/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8" }, -] - -[[package]] -name = "packaging" -version = "25.0" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/packaging/25.0/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/packaging/25.0/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484" }, -] - -[[package]] -name = "pluggy" -version = "1.6.0" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pluggy/1.6.0/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pluggy/1.6.0/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746" }, -] - -[[package]] -name = "pr-agent" -version = "1.0.0" -source = { editable = "." 
} -dependencies = [ - { name = "mcp", extra = ["cli"] }, -] - -[package.optional-dependencies] -dev = [ - { name = "pytest" }, - { name = "pytest-asyncio" }, -] - -[package.dev-dependencies] -dev = [ - { name = "pytest" }, - { name = "pytest-asyncio" }, -] - -[package.metadata] -requires-dist = [ - { name = "mcp", extras = ["cli"], specifier = ">=1.0.0" }, - { name = "pytest", marker = "extra == 'dev'", specifier = ">=8.3.0" }, - { name = "pytest-asyncio", marker = "extra == 'dev'", specifier = ">=0.21.0" }, -] -provides-extras = ["dev"] - -[package.metadata.requires-dev] -dev = [ - { name = "pytest", specifier = ">=8.3.0" }, - { name = "pytest-asyncio", specifier = ">=0.21.0" }, -] - -[[package]] -name = "pydantic" -version = "2.11.4" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "annotated-types" }, - { name = "pydantic-core" }, - { name = "typing-extensions" }, - { name = "typing-inspection" }, -] -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic/2.11.4/pydantic-2.11.4.tar.gz", hash = "sha256:32738d19d63a226a52eed76645a98ee07c1f410ee41d93b4afbfa85ed8111c2d" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic/2.11.4/pydantic-2.11.4-py3-none-any.whl", hash = "sha256:d9615eaa9ac5a063471da949c8fc16376a84afb5024688b3ff885693506764eb" }, -] - -[[package]] -name = "pydantic-core" -version = "2.33.2" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "typing-extensions" }, -] -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2.tar.gz", hash = "sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc" } -wheels = [ - { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2b3d326aaef0c0399d9afffeb6367d5e26ddc24d351dbc9c636840ac355dc5d8" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:0e5b2671f05ba48b94cb90ce55d8bdcaaedb8ba00cc5359f6810fc918713983d" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0069c9acc3f3981b9ff4cdfaf088e98d83440a4c7ea1bc07460af3d4dc22e72d" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:d53b22f2032c42eaaf025f7c40c2e3b94568ae077a606f006d206a463bc69572" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0405262705a123b7ce9f0b92f123334d67b70fd1f20a9372b907ce1080c7ba02" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4b25d91e288e2c4e0662b8038a28c6a07eaac3e196cfc4ff69de4ea3db992a1b" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6bdfe4b3789761f3bcb4b1ddf33355a71079858958e3a552f16d5af19768fef2" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:efec8db3266b76ef9607c2c4c419bdb06bf335ae433b80816089ea7585816f6a" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:031c57d67ca86902726e0fae2214ce6770bbe2f710dc33063187a68744a5ecac" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_armv7l.whl", hash = "sha256:f8de619080e944347f5f20de29a975c2d815d9ddd8be9b9b7268e2e3ef68605a" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:73662edf539e72a9440129f231ed3757faab89630d291b784ca99237fb94db2b" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-win32.whl", hash = "sha256:0a39979dcbb70998b0e505fb1556a1d550a0781463ce84ebf915ba293ccb7e22" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp310-cp310-win_amd64.whl", hash = "sha256:b0379a2b24882fef529ec3b4987cb5d003b9cda32256024e6fe1586ac45fc640" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:4c5b0a576fb381edd6d27f0a85915c6daf2f8138dc5c267a57c08a62900758c7" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:e799c050df38a639db758c617ec771fd8fb7a5f8eaaa4b27b101f266b216a246" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dc46a01bf8d62f227d5ecee74178ffc448ff4e5197c756331f71efcc66dc980f" }, - { url 
= "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:a144d4f717285c6d9234a66778059f33a89096dfb9b39117663fd8413d582dcc" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:73cf6373c21bc80b2e0dc88444f41ae60b2f070ed02095754eb5a01df12256de" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3dc625f4aa79713512d1976fe9f0bc99f706a9dee21dfd1810b4bbbf228d0e8a" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:881b21b5549499972441da4758d662aeea93f1923f953e9cbaff14b8b9565aef" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:bdc25f3681f7b78572699569514036afe3c243bc3059d3942624e936ec93450e" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:fe5b32187cbc0c862ee201ad66c30cf218e5ed468ec8dc1cf49dec66e160cc4d" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_armv7l.whl", hash = "sha256:bc7aee6f634a6f4a95676fcb5d6559a2c2a390330098dba5e5a5f28a2e4ada30" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = 
"sha256:235f45e5dbcccf6bd99f9f472858849f73d11120d76ea8707115415f8e5ebebf" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-win32.whl", hash = "sha256:6368900c2d3ef09b69cb0b913f9f8263b03786e5b2a387706c5afb66800efd51" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-win_amd64.whl", hash = "sha256:1e063337ef9e9820c77acc768546325ebe04ee38b08703244c1309cccc4f1bab" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp311-cp311-win_arm64.whl", hash = "sha256:6b99022f1d19bc32a4c2a0d544fc9a76e3be90f0b3f4af413f87d38749300e65" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = 
"sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = 
"sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a" }, - { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9" }, - { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:5c4aa4e82353f65e548c476b37e64189783aa5384903bfea4f41580f255fddfa" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:d946c8bf0d5c24bf4fe333af284c59a19358aa3ec18cb3dc4370080da1e8ad29" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:87b31b6846e361ef83fedb187bb5b4372d0da3f7e28d85415efa92d6125d6e6d" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:aa9d91b338f2df0508606f7009fde642391425189bba6d8c653afd80fd6bb64e" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = 
"sha256:2058a32994f1fde4ca0480ab9d1e75a0e8c87c22b53a3ae66554f9af78f2fe8c" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:0e03262ab796d986f978f79c943fc5f620381be7287148b8010b4097f79a39ec" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:1a8695a8d00c73e50bff9dfda4d540b7dee29ff9b8053e38380426a85ef10052" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:fa754d1850735a0b0e03bcffd9d4b4343eb417e47196e4485d9cca326073a42c" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:a11c8d26a50bfab49002947d3d237abe4d9e4b5bdc8846a63537b6488e197808" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_10_12_x86_64.whl", hash = "sha256:dd14041875d09cc0f9308e37a6f8b65f5585cf2598a53aa0123df8b129d481f8" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-macosx_11_0_arm64.whl", hash = "sha256:d87c561733f66531dced0da6e864f44ebf89a8fba55f31407b00c2f7f9449593" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f82865531efd18d6e07a04a17331af02cb7a651583c418df8266f17a63c6612" }, - { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bfb5112df54209d820d7bf9317c7a6c9025ea52e49f46b6a2060104bba37de7" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:64632ff9d614e5eecfb495796ad51b0ed98c453e447a76bcbeeb69615079fc7e" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_aarch64.whl", hash = "sha256:f889f7a40498cc077332c7ab6b4608d296d852182211787d4f3ee377aaae66e8" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_armv7l.whl", hash = "sha256:de4b83bb311557e439b9e186f733f6c645b9417c84e2eb8203f3f820a4b988bf" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-musllinux_1_1_x86_64.whl", hash = "sha256:82f68293f055f51b51ea42fafc74b6aad03e70e191799430b90c13d643059ebb" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-core/2.33.2/pydantic_core-2.33.2-pp311-pypy311_pp73-win_amd64.whl", hash = "sha256:329467cecfb529c925cf2bbd4d60d2c509bc2fb52a20c1045bf09bb70971a9c1" }, -] - -[[package]] -name = "pydantic-settings" -version = "2.9.1" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "pydantic" }, - { name = "python-dotenv" }, - { name = "typing-inspection" }, -] -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-settings/2.9.1/pydantic_settings-2.9.1.tar.gz", hash = "sha256:c509bf79d27563add44e8446233359004ed85066cd096d8b510f715e6ef5d268" } 
-wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pydantic-settings/2.9.1/pydantic_settings-2.9.1-py3-none-any.whl", hash = "sha256:59b4f431b1defb26fe620c71a7d3968a710d719f5f4cdbbdb7926edeb770f6ef" }, -] - -[[package]] -name = "pygments" -version = "2.19.1" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pygments/2.19.1/pygments-2.19.1.tar.gz", hash = "sha256:61c16d2a8576dc0649d9f39e089b5f02bcd27fba10d8fb4dcc28173f7a45151f" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pygments/2.19.1/pygments-2.19.1-py3-none-any.whl", hash = "sha256:9ea1544ad55cecf4b8242fab6dd35a93bbce657034b0611ee383099054ab6d8c" }, -] - -[[package]] -name = "pytest" -version = "8.3.5" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "colorama", marker = "sys_platform == 'win32'" }, - { name = "exceptiongroup", marker = "python_full_version < '3.11'" }, - { name = "iniconfig" }, - { name = "packaging" }, - { name = "pluggy" }, - { name = "tomli", marker = "python_full_version < '3.11'" }, -] -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest/8.3.5/pytest-8.3.5.tar.gz", hash = "sha256:f4efe70cc14e511565ac476b57c279e12a855b11f48f212af1080ef2263d3845" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest/8.3.5/pytest-8.3.5-py3-none-any.whl", hash = "sha256:c69214aa47deac29fad6c2a4f590b9c4a9fdb16a403176fe154b79c0b4d4d820" }, -] - -[[package]] -name = "pytest-asyncio" -version = "0.26.0" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "pytest" }, -] -sdist = { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest-asyncio/0.26.0/pytest_asyncio-0.26.0.tar.gz", hash = "sha256:c4df2a697648241ff39e7f0e4a73050b03f123f760673956cf0d72a4990e312f" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/pytest-asyncio/0.26.0/pytest_asyncio-0.26.0-py3-none-any.whl", hash = "sha256:7b51ed894f4fbea1340262bdae5135797ebbe21d8638978e35d31c6d19f72fb0" }, -] - -[[package]] -name = "python-dotenv" -version = "1.1.0" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-dotenv/1.1.0/python_dotenv-1.1.0.tar.gz", hash = "sha256:41f90bc6f5f177fb41f53e87666db362025010eb28f60a01c9143bfa33a2b2d5" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-dotenv/1.1.0/python_dotenv-1.1.0-py3-none-any.whl", hash = "sha256:d7c01d9e2293916c18baf562d95698754b0dbbb5e74d457c45d4f6561fb9d55d" }, -] - -[[package]] -name = "python-multipart" -version = "0.0.20" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-multipart/0.0.20/python_multipart-0.0.20.tar.gz", hash = "sha256:8dd0cab45b8e23064ae09147625994d090fa46f5b0d1e13af944c331a7fa9d13" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/python-multipart/0.0.20/python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104" }, -] - -[[package]] -name = "rich" -version = "14.0.0" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "markdown-it-py" }, - { name = "pygments" }, - { name = "typing-extensions", marker = "python_full_version < '3.11'" }, -] -sdist = { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/rich/14.0.0/rich-14.0.0.tar.gz", hash = "sha256:82f1bc23a6a21ebca4ae0c45af9bdbc492ed20231dcb63f297d6d1021a9d5725" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/rich/14.0.0/rich-14.0.0-py3-none-any.whl", hash = "sha256:1c9491e1951aac09caffd42f448ee3d04e58923ffe14993f6e83068dc395d7e0" }, -] - -[[package]] -name = "shellingham" -version = "1.5.4" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/shellingham/1.5.4/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/shellingham/1.5.4/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686" }, -] - -[[package]] -name = "sniffio" -version = "1.3.1" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sniffio/1.3.1/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sniffio/1.3.1/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2" }, -] - -[[package]] -name = "sse-starlette" -version = "2.3.4" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "anyio" }, - { name = "starlette" }, -] -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sse-starlette/2.3.4/sse_starlette-2.3.4.tar.gz", hash = 
"sha256:0ffd6bed217cdbb74a84816437c609278003998b4991cd2e6872d0b35130e4d5" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/sse-starlette/2.3.4/sse_starlette-2.3.4-py3-none-any.whl", hash = "sha256:b8100694f3f892b133d0f7483acb7aacfcf6ed60f863b31947664b6dc74e529f" }, -] - -[[package]] -name = "starlette" -version = "0.46.2" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "anyio" }, -] -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/starlette/0.46.2/starlette-0.46.2.tar.gz", hash = "sha256:7f7361f34eed179294600af672f565727419830b54b7b084efe44bb82d2fccd5" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/starlette/0.46.2/starlette-0.46.2-py3-none-any.whl", hash = "sha256:595633ce89f8ffa71a015caed34a5b2dc1c0cdb3f0f1fbd1e69339cf2abeec35" }, -] - -[[package]] -name = "tomli" -version = "2.2.1" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1.tar.gz", hash = "sha256:cd45e1dc79c835ce60f7404ec8119f2eb06d38b1deba146f07ced3bbc44505ff" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:678e4fa69e4575eb77d103de3df8a895e1591b48e740211bd1067378c69e8249" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:023aa114dd824ade0100497eb2318602af309e5a55595f76b626d6d9f3b7b0a6" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece47d672db52ac607a3d9599a9d48dcb2f2f735c6c2d1f34130085bb12b112a" }, - { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6972ca9c9cc9f0acaa56a8ca1ff51e7af152a9f87fb64623e31d5c83700080ee" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c954d2250168d28797dd4e3ac5cf812a406cd5a92674ee4c8f123c889786aa8e" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:8dd28b3e155b80f4d54beb40a441d366adcfe740969820caf156c019fb5c7ec4" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e59e304978767a54663af13c07b3d1af22ddee3bb2fb0618ca1593e4f593a106" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:33580bccab0338d00994d7f16f4c4ec25b776af3ffaac1ed74e0b3fc95e885a8" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-win32.whl", hash = "sha256:465af0e0875402f1d226519c9904f37254b3045fc5084697cefb9bdde1ff99ff" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:2d0f2fdd22b02c6d81637a3c95f8cd77f995846af7414c5c4b8d0545afa1bc4b" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4a8f6e44de52d5e6c657c9fe83b562f5f4256d8ebbfe4ff922c495620a7f6cea" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = 
"sha256:8d57ca8095a641b8237d5b079147646153d22552f1c637fd3ba7f4b0b29167a8" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e340144ad7ae1533cb897d406382b4b6fede8890a03738ff1683af800d54192" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:db2b95f9de79181805df90bedc5a5ab4c165e6ec3fe99f970d0e302f384ad222" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:40741994320b232529c802f8bc86da4e1aa9f413db394617b9a256ae0f9a7f77" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:400e720fe168c0f8521520190686ef8ef033fb19fc493da09779e592861b78c6" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:02abe224de6ae62c19f090f68da4e27b10af2b93213d36cf44e6e1c5abd19fdd" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b82ebccc8c8a36f2094e969560a1b836758481f3dc360ce9a3277c65f374285e" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-win32.whl", hash = "sha256:889f80ef92701b9dbb224e49ec87c645ce5df3fa2cc548664eb8a25e03127a98" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:7fc04e92e1d624a4a63c76474610238576942d6b8950a2d7f908a340494e67e4" }, - { url = 
"https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f4039b9cbc3048b2416cc57ab3bda989a6fcf9b36cf8937f01a6e731b64f80d7" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:286f0ca2ffeeb5b9bd4fcc8d6c330534323ec51b2f52da063b11c502da16f30c" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a92ef1a44547e894e2a17d24e7557a5e85a9e1d0048b0b5e7541f76c5032cb13" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9316dc65bed1684c9a98ee68759ceaed29d229e985297003e494aa825ebb0281" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e85e99945e688e32d5a35c1ff38ed0b3f41f43fad8df0bdf79f72b2ba7bc5272" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ac065718db92ca818f8d6141b5f66369833d4a80a9d74435a268c52bdfa73140" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:d920f33822747519673ee656a4b6ac33e382eca9d331c87770faa3eef562aeb2" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:a198f10c4d1b1375d7687bc25294306e551bf1abfa4eace6650070a5c1ae2744" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-win32.whl", hash = 
"sha256:d3f5614314d758649ab2ab3a62d4f2004c825922f9e370b29416484086b264ec" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:a38aa0308e754b0e3c67e344754dff64999ff9b513e691d0e786265c93583c69" }, - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/tomli/2.2.1/tomli-2.2.1-py3-none-any.whl", hash = "sha256:cb55c73c5f4408779d0cf3eef9f762b9c9f147a77de7b258bef0a5628adc85cc" }, -] - -[[package]] -name = "typer" -version = "0.15.4" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "click" }, - { name = "rich" }, - { name = "shellingham" }, - { name = "typing-extensions" }, -] -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typer/0.15.4/typer-0.15.4.tar.gz", hash = "sha256:89507b104f9b6a0730354f27c39fae5b63ccd0c95b1ce1f1a6ba0cfd329997c3" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typer/0.15.4/typer-0.15.4-py3-none-any.whl", hash = "sha256:eb0651654dcdea706780c466cf06d8f174405a659ffff8f163cfbfee98c0e173" }, -] - -[[package]] -name = "typing-extensions" -version = "4.13.2" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-extensions/4.13.2/typing_extensions-4.13.2.tar.gz", hash = "sha256:e6c81219bd689f51865d9e372991c540bda33a0379d5573cddb9a3a23f7caaef" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-extensions/4.13.2/typing_extensions-4.13.2-py3-none-any.whl", hash = "sha256:a439e7c04b49fec3e5d3e2beaa21755cadbbdc391694e28ccdd36ca4a1408f8c" }, -] - -[[package]] -name = "typing-inspection" -version = "0.4.0" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { 
name = "typing-extensions" }, -] -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-inspection/0.4.0/typing_inspection-0.4.0.tar.gz", hash = "sha256:9765c87de36671694a67904bf2c96e395be9c6439bb6c87b5142569dcdd65122" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/typing-inspection/0.4.0/typing_inspection-0.4.0-py3-none-any.whl", hash = "sha256:50e72559fcd2a6367a19f7a7e610e6afcb9fac940c650290eed893d61386832f" }, -] - -[[package]] -name = "uvicorn" -version = "0.34.2" -source = { registry = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/simple" } -dependencies = [ - { name = "click" }, - { name = "h11" }, - { name = "typing-extensions", marker = "python_full_version < '3.11'" }, -] -sdist = { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/uvicorn/0.34.2/uvicorn-0.34.2.tar.gz", hash = "sha256:0e929828f6186353a80b58ea719861d2629d766293b6d19baf086ba31d4f3328" } -wheels = [ - { url = "https://artifactory.infra.ant.dev/artifactory/api/pypi/pypi-all/uvicorn/0.34.2/uvicorn-0.34.2-py3-none-any.whl", hash = "sha256:deb49af569084536d269fe0a6d67e3754f104cf03aba7c11c40f01aadf33c403" }, -] diff --git a/projects/unit3/build-mcp-server/starter/server.py b/projects/unit3/build-mcp-server/starter/server.py index 15e1b13..68a20b6 100644 --- a/projects/unit3/build-mcp-server/starter/server.py +++ b/projects/unit3/build-mcp-server/starter/server.py @@ -1,11 +1,13 @@ #!/usr/bin/env python3 """ -Module 1: Basic MCP Server - Starter Code -TODO: Implement tools for analyzing git changes and suggesting PR templates +Module 1: Basic MCP Server with PR Template Tools +A minimal MCP server that provides tools for analyzing file changes and suggesting PR templates. 
""" import json +import os import subprocess +from typing import Optional from pathlib import Path from mcp.server.fastmcp import FastMCP @@ -13,68 +15,202 @@ # Initialize the FastMCP server mcp = FastMCP("pr-agent") -# PR template directory (shared across all modules) +# PR template directory (shared between starter and solution) TEMPLATES_DIR = Path(__file__).parent.parent.parent / "templates" +# Default PR templates +DEFAULT_TEMPLATES = { + "bug.md": "Bug Fix", + "feature.md": "Feature", + "docs.md": "Documentation", + "refactor.md": "Refactor", + "test.md": "Test", + "performance.md": "Performance", + "security.md": "Security" +} -# TODO: Implement tool functions here -# Example structure for a tool: -# @mcp.tool() -# async def analyze_file_changes(base_branch: str = "main", include_diff: bool = True) -> str: -# """Get the full diff and list of changed files in the current git repository. -# -# Args: -# base_branch: Base branch to compare against (default: main) -# include_diff: Include the full diff content (default: true) -# """ -# # Your implementation here -# pass +# Type mapping for PR templates +TYPE_MAPPING = { + "bug": "bug.md", + "fix": "bug.md", + "feature": "feature.md", + "enhancement": "feature.md", + "docs": "docs.md", + "documentation": "docs.md", + "refactor": "refactor.md", + "cleanup": "refactor.md", + "test": "test.md", + "testing": "test.md", + "performance": "performance.md", + "optimization": "performance.md", + "security": "security.md" +} -# Minimal stub implementations so the server runs -# TODO: Replace these with your actual implementations @mcp.tool() -async def analyze_file_changes(base_branch: str = "main", include_diff: bool = True) -> str: +async def analyze_file_changes( + base_branch: str = "main", + include_diff: bool = True, + max_diff_lines: int = 500, + working_directory: Optional[str] = None +) -> str: """Get the full diff and list of changed files in the current git repository. 
- + Args: base_branch: Base branch to compare against (default: main) include_diff: Include the full diff content (default: true) + max_diff_lines: Maximum number of diff lines to include (default: 500) + working_directory: Directory to run git commands in (default: current directory) """ - # TODO: Implement this tool - # IMPORTANT: MCP tools have a 25,000 token response limit! - # Large diffs can easily exceed this. Consider: - # - Adding a max_diff_lines parameter (e.g., 500 lines) - # - Truncating large outputs with a message - # - Returning summary statistics alongside limited diffs - - # NOTE: Git commands run in the server's directory by default! - # To run in Claude's working directory, use MCP roots: - # context = mcp.get_context() - # roots_result = await context.session.list_roots() - # working_dir = roots_result.roots[0].uri.path - # subprocess.run(["git", "diff"], cwd=working_dir) - - return json.dumps({"error": "Not implemented yet", "hint": "Use subprocess to run git commands"}) + try: + # Try to get working directory from roots first + if working_directory is None: + try: + context = mcp.get_context() + roots_result = await context.session.list_roots() + # Get the first root - Claude Code sets this to the CWD + root = roots_result.roots[0] + # FileUrl object has a .path property that gives us the path directly + working_directory = root.uri.path + except Exception: + # If we can't get roots, fall back to current directory + pass + + # Use provided working directory or current directory + cwd = working_directory if working_directory else os.getcwd() + + # Debug output + debug_info = { + "provided_working_directory": working_directory, + "actual_cwd": cwd, + "server_process_cwd": os.getcwd(), + "server_file_location": str(Path(__file__).parent), + "roots_check": None + } + + # Add roots debug info + try: + context = mcp.get_context() + roots_result = await context.session.list_roots() + debug_info["roots_check"] = { + "found": True, + "count": 
len(roots_result.roots), + "roots": [str(root.uri) for root in roots_result.roots] + } + except Exception as e: + debug_info["roots_check"] = { + "found": False, + "error": str(e) + } + + # Get list of changed files + files_result = subprocess.run( + ["git", "diff", "--name-status", f"{base_branch}...HEAD"], + capture_output=True, + text=True, + check=True, + cwd=cwd + ) + + # Get diff statistics + stat_result = subprocess.run( + ["git", "diff", "--stat", f"{base_branch}...HEAD"], + capture_output=True, + text=True, + cwd=cwd + ) + + # Get the actual diff if requested + diff_content = "" + truncated = False + if include_diff: + diff_result = subprocess.run( + ["git", "diff", f"{base_branch}...HEAD"], + capture_output=True, + text=True, + cwd=cwd + ) + diff_lines = diff_result.stdout.split('\n') + + # Check if we need to truncate + if len(diff_lines) > max_diff_lines: + diff_content = '\n'.join(diff_lines[:max_diff_lines]) + diff_content += f"\n\n... Output truncated. Showing {max_diff_lines} of {len(diff_lines)} lines ..." + diff_content += "\n... Use max_diff_lines parameter to see more ..." 
+ truncated = True + else: + diff_content = diff_result.stdout + + # Get commit messages for context + commits_result = subprocess.run( + ["git", "log", "--oneline", f"{base_branch}..HEAD"], + capture_output=True, + text=True, + cwd=cwd + ) + + analysis = { + "base_branch": base_branch, + "files_changed": files_result.stdout, + "statistics": stat_result.stdout, + "commits": commits_result.stdout, + "diff": diff_content if include_diff else "Diff not included (set include_diff=true to see full diff)", + "truncated": truncated, + "total_diff_lines": len(diff_lines) if include_diff else 0, + "_debug": debug_info + } + + return json.dumps(analysis, indent=2) + + except subprocess.CalledProcessError as e: + return json.dumps({"error": f"Git error: {e.stderr}"}) + except Exception as e: + return json.dumps({"error": str(e)}) @mcp.tool() async def get_pr_templates() -> str: """List available PR templates with their content.""" - # TODO: Implement this tool - return json.dumps({"error": "Not implemented yet", "hint": "Read templates from TEMPLATES_DIR"}) + templates = [ + { + "filename": filename, + "type": template_type, + "content": (TEMPLATES_DIR / filename).read_text() + } + for filename, template_type in DEFAULT_TEMPLATES.items() + ] + + return json.dumps(templates, indent=2) @mcp.tool() async def suggest_template(changes_summary: str, change_type: str) -> str: """Let Claude analyze the changes and suggest the most appropriate PR template. - + Args: changes_summary: Your analysis of what the changes do change_type: The type of change you've identified (bug, feature, docs, refactor, test, etc.) 
""" - # TODO: Implement this tool - return json.dumps({"error": "Not implemented yet", "hint": "Map change_type to templates"}) + + # Get available templates + templates_response = await get_pr_templates() + templates = json.loads(templates_response) + + # Find matching template + template_file = TYPE_MAPPING.get(change_type.lower(), "feature.md") + selected_template = next( + (t for t in templates if t["filename"] == template_file), + templates[0] # Default to first template if no match + ) + + suggestion = { + "recommended_template": selected_template, + "reasoning": f"Based on your analysis: '{changes_summary}', this appears to be a {change_type} change.", + "template_content": selected_template["content"], + "usage_hint": "Claude can help you fill out this template based on the specific changes in your PR." + } + + return json.dumps(suggestion, indent=2) if __name__ == "__main__": diff --git a/projects/unit3/github-actions-integration/starter/server.py b/projects/unit3/github-actions-integration/starter/server.py index 56610a1..4b6895d 100644 --- a/projects/unit3/github-actions-integration/starter/server.py +++ b/projects/unit3/github-actions-integration/starter/server.py @@ -1,7 +1,7 @@ #!/usr/bin/env python3 """ -Module 2: GitHub Actions Integration - STARTER CODE -Extend your PR Agent with webhook handling and MCP Prompts for CI/CD workflows. +Module 2: GitHub Actions Integration with MCP Prompts +Extends the PR agent with webhook handling and standardized CI/CD workflows using Prompts. 
""" import json @@ -9,7 +9,6 @@ import subprocess from typing import Optional from pathlib import Path -from datetime import datetime from mcp.server.fastmcp import FastMCP @@ -30,8 +29,8 @@ "security.md": "Security" } -# TODO: Add path to events file where webhook_server.py stores events -# Hint: EVENTS_FILE = Path(__file__).parent / "github_events.json" +# File where webhook server stores events +EVENTS_FILE = Path(__file__).parent / "github_events.json" # Type mapping for PR templates TYPE_MAPPING = { @@ -51,37 +50,56 @@ } -# ===== Module 1 Tools (Already includes output limiting fix from Module 1) ===== +# ===== Original Tools from Module 1 (with output limiting) ===== @mcp.tool() async def analyze_file_changes( - base_branch: str = "main", - include_diff: bool = True, - max_diff_lines: int = 500 + base_branch: str = "main", + include_diff: bool = True, + max_diff_lines: int = 500, + working_directory: Optional[str] = None ) -> str: """Get the full diff and list of changed files in the current git repository. 
- + Args: base_branch: Base branch to compare against (default: main) include_diff: Include the full diff content (default: true) max_diff_lines: Maximum number of diff lines to include (default: 500) + working_directory: Directory to run git commands in (default: current directory) """ try: + # Try to get working directory from roots first + if working_directory is None: + try: + context = mcp.get_context() + roots_result = await context.session.list_roots() + # Get the first root - Claude Code sets this to the CWD + root = roots_result.roots[0] + # FileUrl object has a .path property that gives us the path directly + working_directory = root.uri.path + except Exception: + # If we can't get roots, fall back to current directory + pass + + # Use provided working directory or current directory + cwd = working_directory if working_directory else os.getcwd() # Get list of changed files files_result = subprocess.run( ["git", "diff", "--name-status", f"{base_branch}...HEAD"], capture_output=True, text=True, - check=True + check=True, + cwd=cwd ) - + # Get diff statistics stat_result = subprocess.run( ["git", "diff", "--stat", f"{base_branch}...HEAD"], capture_output=True, - text=True + text=True, + cwd=cwd ) - + # Get the actual diff if requested diff_content = "" truncated = False @@ -89,11 +107,12 @@ async def analyze_file_changes( diff_result = subprocess.run( ["git", "diff", f"{base_branch}...HEAD"], capture_output=True, - text=True + text=True, + cwd=cwd ) diff_lines = diff_result.stdout.split('\n') - - # Check if we need to truncate (learned from Module 1) + + # Check if we need to truncate if len(diff_lines) > max_diff_lines: diff_content = '\n'.join(diff_lines[:max_diff_lines]) diff_content += f"\n\n... Output truncated. Showing {max_diff_lines} of {len(diff_lines)} lines ..." 
@@ -101,14 +120,15 @@ async def analyze_file_changes( truncated = True else: diff_content = diff_result.stdout - + # Get commit messages for context commits_result = subprocess.run( ["git", "log", "--oneline", f"{base_branch}..HEAD"], capture_output=True, - text=True + text=True, + cwd=cwd ) - + analysis = { "base_branch": base_branch, "files_changed": files_result.stdout, @@ -118,9 +138,9 @@ async def analyze_file_changes( "truncated": truncated, "total_diff_lines": len(diff_lines) if include_diff else 0 } - + return json.dumps(analysis, indent=2) - + except subprocess.CalledProcessError as e: return json.dumps({"error": f"Git error: {e.stderr}"}) except Exception as e: @@ -138,117 +158,240 @@ async def get_pr_templates() -> str: } for filename, template_type in DEFAULT_TEMPLATES.items() ] - + return json.dumps(templates, indent=2) @mcp.tool() async def suggest_template(changes_summary: str, change_type: str) -> str: """Let Claude analyze the changes and suggest the most appropriate PR template. - + Args: changes_summary: Your analysis of what the changes do change_type: The type of change you've identified (bug, feature, docs, refactor, test, etc.) """ - + # Get available templates templates_response = await get_pr_templates() templates = json.loads(templates_response) - + # Find matching template template_file = TYPE_MAPPING.get(change_type.lower(), "feature.md") selected_template = next( (t for t in templates if t["filename"] == template_file), templates[0] # Default to first template if no match ) - + suggestion = { "recommended_template": selected_template, "reasoning": f"Based on your analysis: '{changes_summary}', this appears to be a {change_type} change.", "template_content": selected_template["content"], "usage_hint": "Claude can help you fill out this template based on the specific changes in your PR." 
     }
-    
+
     return json.dumps(suggestion, indent=2)
 
 
-# ===== Module 2: New GitHub Actions Tools =====
+# ===== New Module 2: GitHub Actions Tools =====
 
 @mcp.tool()
 async def get_recent_actions_events(limit: int = 10) -> str:
     """Get recent GitHub Actions events received via webhook.
-    
+
     Args:
         limit: Maximum number of events to return (default: 10)
     """
-    # TODO: Implement this function
-    # 1. Check if EVENTS_FILE exists
-    # 2. Read the JSON file
-    # 3. Return the most recent events (up to limit)
-    # 4. Return empty list if file doesn't exist
-    
-    return json.dumps({"message": "TODO: Implement get_recent_actions_events"})
+    # Read events from file
+    if not EVENTS_FILE.exists():
+        return json.dumps([])
+
+    with open(EVENTS_FILE, 'r') as f:
+        events = json.load(f)
+
+    # Return most recent events
+    recent = events[-limit:]
+    return json.dumps(recent, indent=2)
 
 
 @mcp.tool()
 async def get_workflow_status(workflow_name: Optional[str] = None) -> str:
     """Get the current status of GitHub Actions workflows.
-    
+
     Args:
         workflow_name: Optional specific workflow name to filter by
     """
-    # TODO: Implement this function
-    # 1. Read events from EVENTS_FILE
-    # 2. Filter events for workflow_run events
-    # 3. If workflow_name provided, filter by that name
-    # 4. Group by workflow and show latest status
-    # 5. Return formatted workflow status information
-    
-    return json.dumps({"message": "TODO: Implement get_workflow_status"})
+    # Read events from file
+    if not EVENTS_FILE.exists():
+        return json.dumps({"message": "No GitHub Actions events received yet"})
+
+    with open(EVENTS_FILE, 'r') as f:
+        events = json.load(f)
+
+    if not events:
+        return json.dumps({"message": "No GitHub Actions events received yet"})
+
+    # Filter for workflow events
+    workflow_events = [
+        e for e in events
+        if e.get("workflow_run") is not None
+    ]
+
+    if workflow_name:
+        workflow_events = [
+            e for e in workflow_events
+            if e["workflow_run"].get("name") == workflow_name
+        ]
+    # Group by workflow and get latest status
+    workflows = {}
+    for event in workflow_events:
+        run = event["workflow_run"]
+        name = run["name"]
+        if name not in workflows or run["updated_at"] > workflows[name]["updated_at"]:
+            workflows[name] = {
+                "name": name,
+                "status": run["status"],
+                "conclusion": run.get("conclusion"),
+                "run_number": run["run_number"],
+                "updated_at": run["updated_at"],
+                "html_url": run["html_url"]
+            }
 
-# ===== Module 2: MCP Prompts =====
+    return json.dumps(list(workflows.values()), indent=2)
+
+
+# ===== New Module 2: MCP Prompts =====
 
 @mcp.prompt()
 async def analyze_ci_results():
     """Analyze recent CI/CD results and provide insights."""
-    # TODO: Implement this prompt
-    # Return a string with instructions for Claude to:
-    # 1. Use get_recent_actions_events()
-    # 2. Use get_workflow_status()
-    # 3. Analyze results and provide insights
-    
-    return "TODO: Implement analyze_ci_results prompt"
+    return """Please analyze the recent CI/CD results from GitHub Actions:
+
+1. First, call get_recent_actions_events() to fetch the latest CI/CD events
+2. Then call get_workflow_status() to check current workflow states
+3. Identify any failures or issues that need attention
+4. Provide actionable next steps based on the results
+
+Format your response as:
+## CI/CD Status Summary
+- **Overall Health**: [Good/Warning/Critical]
+- **Failed Workflows**: [List any failures with links]
+- **Successful Workflows**: [List recent successes]
+- **Recommendations**: [Specific actions to take]
+- **Trends**: [Any patterns you notice]"""
 
 
 @mcp.prompt()
 async def create_deployment_summary():
     """Generate a deployment summary for team communication."""
-    # TODO: Implement this prompt
-    # Return a string that guides Claude to create a deployment summary
-    
-    return "TODO: Implement create_deployment_summary prompt"
+    return """Create a deployment summary for team communication:
+
+1. Check workflow status with get_workflow_status()
+2. Look specifically for deployment-related workflows
+3. Note the deployment outcome, timing, and any issues
+
+Format as a concise message suitable for Slack:
+
+🚀 **Deployment Update**
+- **Status**: [✅ Success / ❌ Failed / ⏳ In Progress]
+- **Environment**: [Production/Staging/Dev]
+- **Version/Commit**: [If available from workflow data]
+- **Duration**: [If available]
+- **Key Changes**: [Brief summary if available]
+- **Issues**: [Any problems encountered]
+- **Next Steps**: [Required actions if failed]
+
+Keep it brief but informative for team awareness."""
 
 
 @mcp.prompt()
 async def generate_pr_status_report():
     """Generate a comprehensive PR status report including CI/CD results."""
-    # TODO: Implement this prompt
-    # Return a string that guides Claude to combine code changes with CI/CD status
-    
-    return "TODO: Implement generate_pr_status_report prompt"
+    return """Generate a comprehensive PR status report:
+
+1. Use analyze_file_changes() to understand what changed
+2. Use get_workflow_status() to check CI/CD status
+3. Use suggest_template() to recommend the appropriate PR template
+4. Combine all information into a cohesive report
+
+Create a detailed report with:
+
+## 📋 PR Status Report
+
+### 📝 Code Changes
+- **Files Modified**: [Count by type - .py, .js, etc.]
+- **Change Type**: [Feature/Bug/Refactor/etc.]
+- **Impact Assessment**: [High/Medium/Low with reasoning]
+- **Key Changes**: [Bullet points of main modifications]
+
+### 🔄 CI/CD Status
+- **All Checks**: [✅ Passing / ❌ Failing / ⏳ Running]
+- **Test Results**: [Pass rate, failed tests if any]
+- **Build Status**: [Success/Failed with details]
+- **Code Quality**: [Linting, coverage if available]
+
+### 📌 Recommendations
+- **PR Template**: [Suggested template and why]
+- **Next Steps**: [What needs to happen before merge]
+- **Reviewers**: [Suggested reviewers based on files changed]
+
+### ⚠️ Risks & Considerations
+- [Any deployment risks]
+- [Breaking changes]
+- [Dependencies affected]"""
 
 
 @mcp.prompt()
 async def troubleshoot_workflow_failure():
     """Help troubleshoot a failing GitHub Actions workflow."""
-    # TODO: Implement this prompt
-    # Return a string that guides Claude through troubleshooting steps
-    
-    return "TODO: Implement troubleshoot_workflow_failure prompt"
+    return """Help troubleshoot failing GitHub Actions workflows:
+
+1. Use get_recent_actions_events() to find recent failures
+2. Use get_workflow_status() to see which workflows are failing
+3. Analyze the failure patterns and timing
+4. Provide systematic troubleshooting steps
+
+Structure your response as:
+
+## 🔧 Workflow Troubleshooting Guide
+
+### ❌ Failed Workflow Details
+- **Workflow Name**: [Name of failing workflow]
+- **Failure Type**: [Test/Build/Deploy/Lint]
+- **First Failed**: [When did it start failing]
+- **Failure Rate**: [Intermittent or consistent]
+
+### 🔍 Diagnostic Information
+- **Error Patterns**: [Common error messages or symptoms]
+- **Recent Changes**: [What changed before failures started]
+- **Dependencies**: [External services or resources involved]
+
+### 💡 Possible Causes (ordered by likelihood)
+1. **[Most Likely]**: [Description and why]
+2. **[Likely]**: [Description and why]
+3. **[Possible]**: [Description and why]
+
+### ✅ Suggested Fixes
+**Immediate Actions:**
+- [ ] [Quick fix to try first]
+- [ ] [Second quick fix]
+
+**Investigation Steps:**
+- [ ] [How to gather more info]
+- [ ] [Logs or data to check]
+
+**Long-term Solutions:**
+- [ ] [Preventive measure]
+- [ ] [Process improvement]
+
+### 📚 Resources
+- [Relevant documentation links]
+- [Similar issues or solutions]"""
 
 
 if __name__ == "__main__":
+    # Run MCP server normally
     print("Starting PR Agent MCP server...")
-    print("NOTE: Run webhook_server.py in a separate terminal to receive GitHub events")
+    print("To receive GitHub webhooks, run the webhook server separately:")
+    print("  python webhook_server.py")
     mcp.run()
\ No newline at end of file
diff --git a/projects/unit3/github-actions-integration/starter/validate_starter.py b/projects/unit3/github-actions-integration/starter/validate_starter.py
index 9885831..f5b0f13 100644
--- a/projects/unit3/github-actions-integration/starter/validate_starter.py
+++ b/projects/unit3/github-actions-integration/starter/validate_starter.py
@@ -17,7 +17,6 @@ def test_project_structure():
         "pyproject.toml",
         "README.md"
     ]
-    all_exist = True
     for file in required_files:
         if Path(file).exists():