Initial commit: Penpot MCP Server - Complete AI-powered design workflow automation with MCP protocol, Penpot API integration, Claude AI support, CLI tools, and comprehensive documentation
18
.editorconfig
Normal file
@@ -0,0 +1,18 @@
root = true

[*]
charset = utf-8
end_of_line = lf
insert_final_newline = true
trim_trailing_whitespace = true
indent_style = space
indent_size = 4

[*.{json,yml,yaml}]
indent_size = 2

[*.md]
trim_trailing_whitespace = false

[Makefile]
indent_style = tab
45
.github/ISSUE_TEMPLATE/bug_report.md
vendored
Normal file
@@ -0,0 +1,45 @@
---
name: Bug report
about: Create a report to help us improve Penpot MCP
title: '[BUG] '
labels: 'bug'
assignees: ''

---

**Describe the bug**
A clear and concise description of what the bug is.

**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error

**Expected behavior**
A clear and concise description of what you expected to happen.

**Screenshots**
If applicable, add screenshots to help explain your problem.

**Environment (please complete the following information):**
- OS: [e.g. Ubuntu 22.04, macOS 14.0, Windows 11]
- Python version: [e.g. 3.12.0]
- Penpot MCP version: [e.g. 0.1.0]
- Penpot version: [e.g. 2.0.0]
- AI Assistant: [e.g. Claude Desktop, Custom MCP client]

**Configuration**
- Are you using environment variables or .env file?
- What's your PENPOT_API_URL?
- Any custom configuration?

**Logs**
If applicable, add relevant log output:
```
Paste logs here
```

**Additional context**
Add any other context about the problem here.
38
.github/ISSUE_TEMPLATE/feature_request.md
vendored
Normal file
@@ -0,0 +1,38 @@
---
name: Feature request
about: Suggest an idea for Penpot MCP
title: '[FEATURE] '
labels: 'enhancement'
assignees: ''

---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.

**Use case**
Describe how this feature would be used:
- Who would benefit from this feature?
- In what scenarios would it be useful?
- How would it improve the Penpot MCP workflow?

**Implementation ideas**
If you have ideas about how this could be implemented, please share them:
- API changes needed
- New MCP tools or resources
- Integration points with Penpot or AI assistants

**Additional context**
Add any other context, screenshots, mockups, or examples about the feature request here.

**Priority**
How important is this feature to you?
- [ ] Nice to have
- [ ] Important for my workflow
- [ ] Critical for adoption
51
.github/PULL_REQUEST_TEMPLATE.md
vendored
Normal file
@@ -0,0 +1,51 @@
## Description

Brief description of the changes in this PR.

## Type of Change

- [ ] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to not work as expected)
- [ ] Documentation update
- [ ] Performance improvement
- [ ] Code refactoring

## Related Issues

Fixes #(issue number)

## Changes Made

- [ ] Added/modified MCP tools or resources
- [ ] Updated Penpot API integration
- [ ] Enhanced AI assistant compatibility
- [ ] Improved error handling
- [ ] Added tests
- [ ] Updated documentation

## Testing

- [ ] Tests pass locally
- [ ] Added tests for new functionality
- [ ] Tested with Claude Desktop integration
- [ ] Tested with Penpot API
- [ ] Manual testing completed

## Checklist

- [ ] My code follows the project's style guidelines
- [ ] I have performed a self-review of my code
- [ ] I have commented my code, particularly in hard-to-understand areas
- [ ] I have made corresponding changes to the documentation
- [ ] My changes generate no new warnings
- [ ] I have added tests that prove my fix is effective or that my feature works
- [ ] New and existing unit tests pass locally with my changes

## Screenshots (if applicable)

Add screenshots to help explain your changes.

## Additional Notes

Any additional information that reviewers should know.
66
.gitignore
vendored
Normal file
@@ -0,0 +1,66 @@
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
env/
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
*.egg-info/
.installed.cfg
*.egg

# Virtual Environment
venv/
ENV/
env/
.venv/

# uv
.python-version

# Environment variables
.env

# IDE files
.idea/
.vscode/
*.swp
*.swo

# OS specific
.DS_Store
Thumbs.db

# Logs
logs/
*.log
*.json
!penpot-schema.json
!penpot-tree-schema.json
.coverage

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
pytestdebug.log
41
.pre-commit-config.yaml
Normal file
@@ -0,0 +1,41 @@
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
      - id: check-added-large-files

  - repo: https://github.com/pycqa/flake8
    rev: 6.1.0
    hooks:
      - id: flake8
        additional_dependencies: [flake8-docstrings]
        types: [python]
        files: ^(penpot_mcp|tests)/.*\.py$

  - repo: https://github.com/pycqa/isort
    rev: 5.12.0
    hooks:
      - id: isort
        args: ["--profile", "black", "--filter-files"]
        types: [python]
        files: ^(penpot_mcp|tests)/.*\.py$

  - repo: https://github.com/asottile/pyupgrade
    rev: v3.13.0
    hooks:
      - id: pyupgrade
        args: [--py312-plus]
        types: [python]
        files: ^(penpot_mcp|tests)/.*\.py$

  - repo: https://github.com/pre-commit/mirrors-autopep8
    rev: v2.0.4
    hooks:
      - id: autopep8
        args: [--aggressive, --aggressive, --select=E,W]
        types: [python]
        files: ^(penpot_mcp|tests)/.*\.py$
        additional_dependencies: [setuptools>=65.5.0]
20
.vscode/launch.json
vendored
Normal file
@@ -0,0 +1,20 @@
{
  // Use IntelliSense to learn about possible attributes.
  // Hover to view descriptions of existing attributes.
  // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Debug Penpot MCP Server",
      "type": "debugpy",
      "request": "launch",
      "program": "${workspaceFolder}/penpot_mcp/server/mcp_server.py",
      "justMyCode": false,
      "console": "integratedTerminal",
      "args": [
        "--mode",
        "sse"
      ]
    }
  ]
}
95
CLAUDE_INTEGRATION.md
Normal file
@@ -0,0 +1,95 @@
# Using Penpot MCP with Claude

This guide explains how to integrate the Penpot MCP server with Claude AI using the Model Context Protocol (MCP).

## Prerequisites

1. Claude Desktop application installed
2. Penpot MCP server set up and configured

## Installing the Penpot MCP Server in Claude Desktop

The easiest way to use the Penpot MCP server with Claude is to install it directly in Claude Desktop:

1. Make sure you have installed the required dependencies:

   ```bash
   pip install -r requirements.txt
   ```

2. Install the MCP server in Claude Desktop:

   ```bash
   mcp install mcp_server.py
   ```

3. Claude will ask for your permission to install the server. Click "Allow".

4. The Penpot MCP server will now appear in Claude's tool menu.

## Using Penpot in Claude

Once installed, you can interact with Penpot through Claude as follows:

1. Open Claude Desktop
2. Click on the "+" button in the message input area
3. Select "Penpot MCP Server" from the list
4. Claude now has access to your Penpot projects and can:
   - List your projects
   - Get project details
   - Access file information
   - View components

## Example Prompts for Claude

Here are some example prompts you can use with Claude to interact with your Penpot data:

### Listing Projects

```
Can you show me a list of my Penpot projects?
```

### Getting Project Details

```
Please show me the details of my most recent Penpot project.
```

### Working with Files

```
Can you list the files in my "Website Redesign" project?
```

### Exploring Components

```
Please show me the available UI components in Penpot.
```

## Troubleshooting

If you encounter issues:

1. Check that your Penpot access token is correctly set in the environment variables
2. Verify that the Penpot API URL is correct
3. Try reinstalling the MCP server in Claude Desktop:

   ```bash
   mcp uninstall "Penpot MCP Server"
   mcp install mcp_server.py
   ```

## Advanced: Using with Other MCP-compatible Tools

The Penpot MCP server can be used with any MCP-compatible client, not just Claude Desktop. Other integrations include:

- OpenAI Agents SDK
- PydanticAI
- Python MCP clients (see `example_client.py`)

Refer to the specific documentation for these tools for integration instructions.

## Resources

- [Model Context Protocol Documentation](https://modelcontextprotocol.io)
- [Claude Developer Documentation](https://docs.anthropic.com)
- [MCP Python SDK Documentation](https://github.com/modelcontextprotocol/python-sdk)
217
CONTRIBUTING.md
Normal file
@@ -0,0 +1,217 @@
# Contributing to Penpot MCP 🤝

Thank you for your interest in contributing to Penpot MCP! This project aims to bridge AI assistants with Penpot design tools, and we welcome contributions from developers, designers, and AI enthusiasts.

## 🌟 Ways to Contribute

### For Developers
- **Bug Fixes**: Help us squash bugs and improve stability
- **New Features**: Add new MCP tools, resources, or AI integrations
- **Performance**: Optimize API calls, caching, and response times
- **Documentation**: Improve code documentation and examples
- **Testing**: Add unit tests, integration tests, and edge case coverage

### For Designers
- **Use Case Documentation**: Share how you use Penpot MCP in your workflow
- **Feature Requests**: Suggest new AI-powered design features
- **UI/UX Feedback**: Help improve the developer and user experience
- **Design Examples**: Contribute example Penpot files for testing

### For AI Enthusiasts
- **Prompt Engineering**: Improve AI interaction patterns
- **Model Integration**: Add support for new AI models and assistants
- **Workflow Automation**: Create AI-powered design automation scripts
- **Research**: Explore new applications of AI in design workflows

## 🚀 Getting Started

### 1. Fork and Clone

```bash
# Fork the repository on GitHub, then clone your fork
git clone https://github.com/YOUR_USERNAME/penpot-mcp.git
cd penpot-mcp
```

### 2. Set Up Development Environment

```bash
# Install uv (recommended Python package manager)
curl -LsSf https://astral.sh/uv/install.sh | sh

# Install dependencies and set up development environment
uv sync --extra dev

# Install pre-commit hooks
uv run pre-commit install
```

### 3. Configure Environment

```bash
# Copy environment template
cp env.example .env

# Edit .env with your Penpot credentials
# PENPOT_API_URL=https://design.penpot.app/api
# PENPOT_USERNAME=your_username
# PENPOT_PASSWORD=your_password
```

### 4. Run Tests

```bash
# Run the full test suite
uv run pytest

# Run with coverage
uv run pytest --cov=penpot_mcp

# Run specific test categories
uv run pytest -m "not slow"    # Skip slow tests
uv run pytest tests/test_api/  # Test specific module
```

## 🔧 Development Workflow

### Code Style

We use automated code formatting and linting:

```bash
# Run all linting and formatting
uv run python lint.py

# Auto-fix issues where possible
uv run python lint.py --autofix

# Check specific files
uv run flake8 penpot_mcp/
uv run isort penpot_mcp/
```

### Testing Guidelines

- **Unit Tests**: Test individual functions and classes
- **Integration Tests**: Test MCP protocol interactions
- **API Tests**: Test Penpot API integration (use mocks for CI)
- **End-to-End Tests**: Test complete workflows with real data

```bash
# Test structure
tests/
├── unit/         # Fast, isolated tests
├── integration/  # MCP protocol tests
├── api/          # Penpot API tests
└── e2e/          # End-to-end workflow tests
```

### Adding New Features

1. **Create an Issue**: Discuss your idea before implementing
2. **Branch Naming**: Use descriptive names like `feature/ai-design-analysis`
3. **Small PRs**: Keep changes focused and reviewable
4. **Documentation**: Update README, docstrings, and examples
5. **Tests**: Add comprehensive tests for new functionality

### MCP Protocol Guidelines

When adding new MCP tools or resources:

```python
# Follow this pattern for new tools
@mcp_tool("tool_name")
async def new_tool(param1: str, param2: int = 10) -> dict:
    """
    Brief description of what this tool does.

    Args:
        param1: Description of parameter
        param2: Optional parameter with default

    Returns:
        Dictionary with tool results
    """
    # Implementation here
    pass
```

## 📝 Commit Guidelines

We follow [Conventional Commits](https://www.conventionalcommits.org/):

```bash
# Format: type(scope): description
git commit -m "feat(api): add design component analysis tool"
git commit -m "fix(mcp): handle connection timeout errors"
git commit -m "docs(readme): add Claude Desktop setup guide"
git commit -m "test(api): add unit tests for file export"
```
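The `type(scope): description` format is simple enough to validate mechanically. A minimal sketch of such a check (the helper name and the hard-coded type list are illustrative assumptions, not part of this repo):

```python
import re

# Conventional Commits first line: type, optional (scope), ": ", description.
# Types mirror the ones used in this guide (assumed, not enforced by the repo).
COMMIT_RE = re.compile(
    r"^(feat|fix|docs|test|refactor|perf|chore)"
    r"(\([a-z0-9_-]+\))?: .+"
)


def is_conventional(message: str) -> bool:
    """Return True if the first line of a commit message follows the format."""
    return bool(COMMIT_RE.match(message.splitlines()[0]))


print(is_conventional("feat(api): add design component analysis tool"))  # True
print(is_conventional("update stuff"))  # False
```

A script like this could run as a local `commit-msg` hook to reject malformed messages early.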
### Commit Types
- `feat`: New features
- `fix`: Bug fixes
- `docs`: Documentation changes
- `test`: Adding or updating tests
- `refactor`: Code refactoring
- `perf`: Performance improvements
- `chore`: Maintenance tasks

## 🐛 Reporting Issues

### Bug Reports
Use our [bug report template](.github/ISSUE_TEMPLATE/bug_report.md) and include:
- Clear reproduction steps
- Environment details (OS, Python version, etc.)
- Error messages and logs
- Expected vs actual behavior

### Feature Requests
Use our [feature request template](.github/ISSUE_TEMPLATE/feature_request.md) and include:
- Use case description
- Proposed solution
- Implementation ideas
- Priority level

## 🔍 Code Review Process

1. **Automated Checks**: All PRs must pass CI/CD checks
2. **Peer Review**: At least one maintainer review required
3. **Testing**: New features must include tests
4. **Documentation**: Update relevant documentation
5. **Backwards Compatibility**: Avoid breaking changes when possible

## 🏆 Recognition

Contributors are recognized in:
- GitHub contributors list
- Release notes for significant contributions
- Special mentions for innovative features
- Community showcase for creative use cases

## 📚 Resources

### Documentation
- [MCP Protocol Specification](https://modelcontextprotocol.io)
- [Penpot API Documentation](https://help.penpot.app/technical-guide/developer-resources/)
- [Claude AI Integration Guide](CLAUDE_INTEGRATION.md)

### Community
- [GitHub Discussions](https://github.com/montevive/penpot-mcp/discussions)
- [Issues](https://github.com/montevive/penpot-mcp/issues)
- [Penpot Community](https://community.penpot.app/)

## 📄 License

By contributing to Penpot MCP, you agree that your contributions will be licensed under the [MIT License](LICENSE).

## ❓ Questions?

- **General Questions**: Use [GitHub Discussions](https://github.com/montevive/penpot-mcp/discussions)
- **Bug Reports**: Create an [issue](https://github.com/montevive/penpot-mcp/issues)
- **Feature Ideas**: Use our [feature request template](.github/ISSUE_TEMPLATE/feature_request.md)
- **Security Issues**: Email us at security@montevive.ai

---

Thank you for helping make Penpot MCP better! 🎨🤖
202
LINTING.md
Normal file
@@ -0,0 +1,202 @@
# Linting Guide

This document provides guidelines on how to work with the linting tools configured in this project.

## Overview

The project uses the following linting tools:

- **flake8**: Code style and quality checker
- **isort**: Import sorting
- **autopep8**: PEP 8 code formatting with auto-fix capability
- **pyupgrade**: Upgrades Python syntax for newer versions
- **pre-commit**: Framework for managing pre-commit hooks

## Quick Start

1. Use the setup script to install all dependencies and set up pre-commit hooks:

   ```bash
   ./fix-lint-deps.sh
   ```

   Or install dependencies manually:

   ```bash
   pip install -r requirements-dev.txt
   pre-commit install
   ```

2. Run the linting script:

   ```bash
   # Check for issues
   ./lint.py

   # Fix issues automatically where possible
   ./lint.py --autofix
   ```

## Dependencies

The linting tools require specific dependencies:

- **flake8** and **flake8-docstrings**: For code style and documentation checking
- **isort**: For import sorting
- **autopep8**: For automatic PEP 8 compliance
- **pyupgrade**: For Python syntax upgrading
- **setuptools**: Required for lib2to3, which is used by autopep8

If you encounter a `ModuleNotFoundError: No module named 'lib2to3'` error, make sure you have setuptools installed (quote the requirement so the shell does not treat `>=` as a redirect):

```bash
pip install "setuptools>=65.5.0"
```

Or simply run the fix script:

```bash
./fix-lint-deps.sh
```

## Configuration

The linting tools are configured in the following files:

- **setup.cfg**: Contains settings for flake8, autopep8, etc.
- **.pre-commit-config.yaml**: Configuration for pre-commit hooks
- **.editorconfig**: Editor settings for consistent code formatting

## Linting Rules

### Code Style Rules

We follow PEP 8 with some exceptions:

- **Line Length**: Max line length is 100 characters
- **Ignored Rules**:
  - E203: Whitespace before ':' (conflicts with Black)
  - W503: Line break before binary operator (conflicts with Black)
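For context on those two exceptions, both patterns are valid Python; they are only stylistic, which is why disabling the rules is safe. An illustrative sketch (not project code):

```python
items = [10, 20, 30, 40]

# E203 would flag the space before ':' in this slice; Black can emit
# this spacing for complex slice expressions, so the rule is ignored.
tail = items[1 :]

# W503 flags a line break *before* a binary operator; Black prefers
# exactly this layout, so the rule is ignored as well.
total = (items[0]
         + items[1])

print(tail)   # [20, 30, 40]
print(total)  # 30
```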
### Documentation Rules

All public modules, functions, classes, and methods should have docstrings. We use the Google style for docstrings.

Example:

```python
def function(param1, param2):
    """Summary of function purpose.

    More detailed explanation if needed.

    Args:
        param1: Description of param1.
        param2: Description of param2.

    Returns:
        Description of return value.

    Raises:
        ExceptionType: When and why this exception is raised.
    """
    # function implementation
```
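A complete, runnable function in this style (the names are illustrative, not project code). Note that the docstring sections are plain text and remain available at runtime via `__doc__`, which is what docstring-checking tools inspect:

```python
def divide(numerator, denominator):
    """Divide one number by another.

    Args:
        numerator: Value to be divided.
        denominator: Value to divide by; must be non-zero.

    Returns:
        The quotient as a float.

    Raises:
        ZeroDivisionError: If denominator is zero.
    """
    return numerator / denominator


# The Google-style sections are visible in the function's __doc__.
print("Args:" in divide.__doc__)  # True
print(divide(9, 2))  # 4.5
```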
### Import Sorting

Imports should be sorted using isort with the black profile. Imports are grouped in the following order:

1. Standard library imports
2. Related third-party imports
3. Local application/library specific imports

Each group is sorted alphabetically.

## Auto-Fixing Issues

Many issues can be fixed automatically:

- **Import Sorting**: `isort` can sort imports automatically
- **PEP 8 Formatting**: `autopep8` can fix many style issues
- **Python Syntax**: `pyupgrade` can update syntax to newer Python versions

Run the auto-fix command:

```bash
./lint.py --autofix
```

## Troubleshooting

If you encounter issues with the linting tools:

1. **Missing dependencies**: Run `./fix-lint-deps.sh` to install all required dependencies
2. **Autopep8 errors**: Make sure setuptools is installed for lib2to3 support
3. **Pre-commit hook failures**: Run `pre-commit run --all-files` to see which files are causing issues

## Pre-commit Hooks

Pre-commit hooks run automatically when you commit changes. They ensure that linting issues are caught before code is committed.

If hooks fail during a commit:

1. The commit will be aborted
2. Review the error messages
3. Fix the issues manually or using auto-fix
4. Stage the fixed files
5. Retry your commit

## Common Issues and Solutions

### Disabling Linting for Specific Lines

Sometimes it's necessary to disable linting for specific lines:

```python
# For flake8
some_code = "example"  # noqa: E501

# For multiple rules
some_code = "example"  # noqa: E501, F401
```

### Handling Third-Party Code

For third-party code that doesn't follow our style, consider isolating it in a separate file or directory and excluding it from linting.

## IDE Integration

### VSCode

Install the Python, Flake8, and EditorConfig extensions. Add to settings.json:

```json
{
  "python.linting.enabled": true,
  "python.linting.flake8Enabled": true,
  "editor.formatOnSave": true,
  "python.formatting.provider": "autopep8",
  "python.sortImports.args": ["--profile", "black"]
}
```

### PyCharm

Enable Flake8 in:
Settings → Editor → Inspections → Python → Flake8

Configure isort:
Settings → Editor → Code Style → Python → Imports

## Customizing Linting Rules

To modify linting rules:

1. Edit `setup.cfg` for flake8 and autopep8 settings
2. Edit `.pre-commit-config.yaml` for pre-commit hook settings
3. Run `pre-commit autoupdate` to update hook versions

## Continuous Integration

Linting checks are part of the CI pipeline. Pull requests that fail linting will not be merged until issues are fixed.
24
Makefile
Normal file
@@ -0,0 +1,24 @@
# Makefile for Penpot MCP
.PHONY: mcp-server mcp-inspector mcp-server-sse all

# Default port for MCP server
PORT ?= 5000
# Default mode is stdio (can be overridden by environment variable MODE)
MODE ?= stdio

# Launch MCP server with configurable mode (stdio or sse)
mcp-server:
	python -m penpot_mcp.server.mcp_server --mode $(MODE)

# Launch MCP server specifically in SSE mode
mcp-server-sse:
	MODE=sse python -m penpot_mcp.server.mcp_server

# Launch MCP inspector - requires the server to be running in sse mode
mcp-inspector:
	npx @modelcontextprotocol/inspector

# Run both server (in sse mode) and inspector (server in background)
all:
	MODE=sse python -m penpot_mcp.server.mcp_server & \
	npx @modelcontextprotocol/inspector
342
README.md
Normal file
@@ -0,0 +1,342 @@
# Penpot MCP Server 🎨🤖

<p align="center">
  <img src="images/penpot-mcp.png" alt="Penpot MCP Logo" width="400"/>
</p>

<p align="center">
  <strong>AI-Powered Design Workflow Automation</strong><br>
  Connect Claude AI and other LLMs to Penpot designs via the Model Context Protocol
</p>

<p align="center">
  <a href="https://github.com/montevive/penpot-mcp/blob/main/LICENSE">
    <img src="https://img.shields.io/badge/License-MIT-blue.svg" alt="License: MIT">
  </a>
  <a href="https://www.python.org/downloads/">
    <img src="https://img.shields.io/badge/python-3.12%2B-blue" alt="Python Version">
  </a>
  <a href="https://pypi.org/project/penpot-mcp/">
    <img src="https://img.shields.io/pypi/v/penpot-mcp" alt="PyPI version">
  </a>
  <a href="https://github.com/montevive/penpot-mcp/actions">
    <img src="https://img.shields.io/github/workflow/status/montevive/penpot-mcp/CI" alt="Build Status">
  </a>
</p>

---

## 🚀 What is Penpot MCP?

**Penpot MCP** is a Model Context Protocol (MCP) server that bridges the gap between AI language models and [Penpot](https://penpot.app/), the open-source design and prototyping platform. The integration enables AI assistants such as Claude to understand, analyze, and interact with your design files programmatically.

### 🎯 Key Benefits

- **🤖 AI-Native Design Analysis**: Let Claude AI analyze your UI/UX designs, provide feedback, and suggest improvements
- **⚡ Automated Design Workflows**: Streamline repetitive design tasks with AI-powered automation
- **🔍 Intelligent Design Search**: Find design components and patterns across your projects using natural language
- **📊 Design System Management**: Automatically document and maintain design systems with AI assistance
- **🎨 Cross-Platform Integration**: Works with any MCP-compatible AI assistant (Claude, ChatGPT, etc.)

## 🎥 Demo Video

Check out our demo video to see Penpot MCP in action:

[Watch the demo on YouTube](https://www.youtube.com/watch?v=vOMEh-ONN1k)

## ✨ Features

### 🔌 Core Capabilities

- **MCP Protocol Implementation**: Full compliance with Model Context Protocol standards
- **Real-time Design Access**: Direct integration with Penpot's API for live design data
- **Component Analysis**: AI-powered analysis of design components and layouts
- **Export Automation**: Programmatic export of design assets in multiple formats
- **Design Validation**: Automated design system compliance checking

### 🛠️ Developer Tools

- **Command-line Utilities**: Powerful CLI tools for design file analysis and validation
- **Python SDK**: Comprehensive Python library for custom integrations
- **REST API**: HTTP endpoints for web application integration
- **Extensible Architecture**: Plugin system for custom AI workflows

### 🎨 AI Integration Features

- **Claude Desktop Integration**: Native support for the Claude AI assistant
- **Design Context Sharing**: Provide design context to AI models for better responses
- **Visual Component Recognition**: AI can "see" and understand design components
- **Natural Language Queries**: Ask questions about your designs in plain English

## 💡 Use Cases

### For Designers

- **Design Review Automation**: Get instant AI feedback on accessibility, usability, and design principles
- **Component Documentation**: Automatically generate documentation for design systems
- **Design Consistency Checks**: Ensure brand guidelines compliance across projects
- **Asset Organization**: AI-powered tagging and categorization of design components

### For Developers

- **Design-to-Code Workflows**: Bridge the gap between design and development with AI assistance
- **API Integration**: Programmatic access to design data for custom tools and workflows
- **Automated Testing**: Generate visual regression tests from design specifications
- **Design System Sync**: Keep design tokens and code components in sync

### For Product Teams

- **Design Analytics**: Track design system adoption and component usage
- **Collaboration Enhancement**: AI-powered design reviews and feedback collection
- **Workflow Optimization**: Automate repetitive design operations and approvals
- **Cross-tool Integration**: Connect Penpot with other tools in your design workflow
## 🚀 Quick Start

### Prerequisites

- **Python 3.12+** (latest Python recommended for optimal performance)
- **Penpot account** and credentials ([sign up free](https://penpot.app/))
- **Claude Desktop** (optional, for AI integration)

## Installation

### Option 1: Install from PyPI

```bash
pip install penpot-mcp
```

### Option 2: Using uv (recommended for modern Python development)

```bash
# Install directly with uvx (when published to PyPI)
uvx penpot-mcp

# For local development, use uvx with the local path
uvx --from . penpot-mcp

# Or add it to a project with uv
uv add penpot-mcp
```

### Option 3: Install from source

```bash
# Clone the repository
git clone https://github.com/montevive/penpot-mcp.git
cd penpot-mcp

# Using uv (recommended)
uv sync
uv run penpot-mcp

# Or using traditional pip
python -m venv .venv
source .venv/bin/activate  # On Windows: .venv\Scripts\activate
pip install -e .
```
### Configuration

Create a `.env` file based on `env.example` with your Penpot credentials:

```
PENPOT_API_URL=https://design.penpot.app/api
PENPOT_USERNAME=your_penpot_username
PENPOT_PASSWORD=your_penpot_password
PORT=5000
DEBUG=true
```
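The server reads these settings from the environment. As a minimal sketch of how such a `.env` file can be loaded with the standard library alone (the `load_env` helper is illustrative, not part of the package; the variable names match `env.example`):

```python
import os
import tempfile
from pathlib import Path


def load_env(path: Path) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments."""
    values = {}
    for line in path.read_text().splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        values[key.strip()] = value.strip()
    return values


# Write a sample file so the sketch is self-contained
sample = Path(tempfile.mkdtemp()) / ".env"
sample.write_text(
    "# Penpot authentication\n"
    "PENPOT_API_URL=https://design.penpot.app/api\n"
    "PENPOT_USERNAME=your_penpot_username\n"
    "PORT=5000\n"
)

config = load_env(sample)
os.environ.update(config)  # make the values visible to the server process
```

In practice a library such as `python-dotenv` does the same job; the point is simply that each line becomes an environment variable the server can read.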
## Usage

### Running the MCP Server

```bash
# Using uvx (when published to PyPI)
uvx penpot-mcp

# Using uvx for local development
uvx --from . penpot-mcp

# Using uv in a project (recommended for local development)
uv run penpot-mcp

# Using the entry point (if installed)
penpot-mcp

# Or using the module directly
python -m penpot_mcp.server.mcp_server
```

### Debugging the MCP Server

To debug the MCP server, you can:

1. Enable debug mode in your `.env` file by setting `DEBUG=true`
2. Use the Penpot API CLI for testing API operations:

```bash
# Test the API connection with debug output
python -m penpot_mcp.api.penpot_api --debug list-projects

# Get details for a specific project
python -m penpot_mcp.api.penpot_api --debug get-project --id YOUR_PROJECT_ID

# List files in a project
python -m penpot_mcp.api.penpot_api --debug list-files --project-id YOUR_PROJECT_ID

# Get file details
python -m penpot_mcp.api.penpot_api --debug get-file --file-id YOUR_FILE_ID
```
### Command-line Tools

The package includes utility command-line tools:

```bash
# Generate a tree visualization of a Penpot file
penpot-tree path/to/penpot_file.json

# Validate a Penpot file against the schema
penpot-validate path/to/penpot_file.json
```
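To illustrate the idea behind `penpot-tree`: a Penpot file is a nested structure of named objects, and the tool renders that hierarchy as an indented tree. A minimal sketch of the same approach (the sample data and the `render_tree` helper are illustrative; the real tool reads the full Penpot file format):

```python
def render_tree(obj: dict, indent: int = 0) -> list[str]:
    """Render a nested {name, children} structure as indented lines."""
    lines = [" " * indent + obj.get("name", "<unnamed>")]
    for child in obj.get("children", []):
        lines.extend(render_tree(child, indent + 2))
    return lines


# Simplified stand-in for a parsed Penpot page
page = {
    "name": "Page 1",
    "children": [
        {"name": "Frame: Login", "children": [
            {"name": "Button: Submit"},
            {"name": "Input: Email"},
        ]},
    ],
}

print("\n".join(render_tree(page)))
```

Running this prints `Page 1` with `Frame: Login` and its two children indented beneath it, which is the shape of output `penpot-tree` produces for real files.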
### MCP Monitoring & Testing

#### MCP CLI Monitor

```bash
# Start your MCP server in one terminal
python -m penpot_mcp.server.mcp_server

# In another terminal, use mcp-cli to monitor and interact with your server
python -m mcp.cli monitor python -m penpot_mcp.server.mcp_server

# Or connect to an already running server on a specific port
python -m mcp.cli monitor --port 5000
```

#### MCP Inspector

```bash
# Start your MCP server in one terminal
python -m penpot_mcp.server.mcp_server

# In another terminal, run the MCP Inspector (requires Node.js)
npx @modelcontextprotocol/inspector
```

### Using the Client

```bash
# Run the example client
penpot-client
```
## MCP Resources & Tools

### Resources

- `server://info` - Server status and information
- `penpot://schema` - Penpot API schema as JSON
- `penpot://tree-schema` - Penpot object tree schema as JSON
- `rendered-component://{component_id}` - Rendered component images
- `penpot://cached-files` - List of cached Penpot files

### Tools

- `list_projects` - List all Penpot projects
- `get_project_files` - Get the files for a specific project
- `get_file` - Retrieve a Penpot file by its ID and cache it
- `export_object` - Export a Penpot object as an image
- `get_object_tree` - Get the object tree structure for a Penpot object
- `search_object` - Search for objects within a Penpot file by name
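As an illustration of what `search_object` does conceptually, the sketch below walks a cached file's object map and matches names case-insensitively (the data shape and the `search_objects` helper are simplified stand-ins; the real tool operates on the full Penpot file structure):

```python
def search_objects(objects: dict, query: str) -> list[dict]:
    """Return id/name pairs for objects whose name contains the query."""
    query = query.lower()
    return [
        {"id": obj_id, "name": obj["name"]}
        for obj_id, obj in objects.items()
        if query in obj.get("name", "").lower()
    ]


# Simplified stand-in for the object map of a cached Penpot file
cached_file = {
    "a1": {"name": "Primary Button"},
    "b2": {"name": "Secondary Button"},
    "c3": {"name": "Nav Bar"},
}

matches = search_objects(cached_file, "button")
```

A query for `"button"` here returns both button objects, which is the kind of result an AI assistant gets back when it calls the tool.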
## Claude AI Integration

The Penpot MCP server can be integrated with Claude AI using the Model Context Protocol. For detailed instructions, see [CLAUDE_INTEGRATION.md](CLAUDE_INTEGRATION.md).

Key features of the Claude integration:

- Direct integration with Claude Desktop
- Access to Penpot projects and files
- Ability to view and analyze design components
- Export of Penpot objects as images
## Package Structure

```
penpot_mcp/
├── api/                   # Penpot API client
├── server/                # MCP server implementation
│   ├── mcp_server.py      # Main MCP server
│   └── client.py          # Client implementation
├── tools/                 # Utility tools
│   ├── cli/               # Command-line interfaces
│   └── penpot_tree.py     # Penpot object tree visualization
├── resources/             # Resource files and schemas
└── utils/                 # Helper utilities
```
## Development

### Testing

The project uses pytest for testing:

```bash
# Using uv (recommended)
uv sync --extra dev
uv run pytest

# Run with coverage
uv run pytest --cov=penpot_mcp tests/

# Using traditional pip
pip install -e ".[dev]"
pytest
pytest --cov=penpot_mcp tests/
```
### Linting

```bash
# Using uv (recommended)
uv sync --extra dev

# Set up pre-commit hooks
uv run pre-commit install

# Run linting
uv run python lint.py

# Auto-fix linting issues
uv run python lint.py --autofix

# Using traditional pip
pip install -r requirements-dev.txt
pre-commit install
./lint.py
./lint.py --autofix
```
## Contributing

Contributions are welcome! Please feel free to submit a Pull Request.

1. Fork the repository
2. Create your feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add some amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

Please make sure your code follows the project's coding standards and includes appropriate tests.

## License

This project is licensed under the MIT License - see the [LICENSE](LICENSE) file for details.

## Acknowledgments

- [Penpot](https://penpot.app/) - The open-source design and prototyping platform
- [Model Context Protocol](https://modelcontextprotocol.io) - The standardized protocol for AI model context
173 SECURITY.md Normal file
@@ -0,0 +1,173 @@
# Security Policy

## Supported Versions

We actively support the following versions of Penpot MCP with security updates:

| Version | Supported          |
| ------- | ------------------ |
| 0.1.x   | :white_check_mark: |
| < 0.1   | :x:                |

## Reporting a Vulnerability

The Penpot MCP team takes security seriously. If you discover a security vulnerability, please follow these steps:

### 🔒 Private Disclosure

**DO NOT** create a public GitHub issue for security vulnerabilities.

Instead, please email us at: **security@montevive.ai**

### 📧 What to Include

Please include the following information in your report:

- **Description**: A clear description of the vulnerability
- **Impact**: What could an attacker accomplish?
- **Reproduction**: Step-by-step instructions to reproduce the issue
- **Environment**: Affected versions, operating systems, configurations
- **Proof of Concept**: Code, screenshots, or other evidence (if applicable)
- **Suggested Fix**: If you have ideas for how to fix the issue

### 🕐 Response Timeline

- **Initial Response**: Within 48 hours
- **Triage**: Within 1 week
- **Fix Development**: Depends on severity and complexity
- **Public Disclosure**: After the fix is released and users have had time to update

### 🏆 Recognition

We believe in recognizing security researchers who help keep our users safe:

- **Security Hall of Fame**: Public recognition (with your permission)
- **CVE Assignment**: For qualifying vulnerabilities
- **Coordinated Disclosure**: We'll work with you on timing and attribution

## Security Considerations

### 🔐 Authentication & Credentials

- **Penpot Credentials**: Store them securely using environment variables or a secure credential manager
- **API Keys**: Never commit API keys or passwords to version control
- **Environment Files**: Add `.env` files to `.gitignore`

### 🌐 Network Security

- **HTTPS Only**: Always use HTTPS for Penpot API connections
- **Certificate Validation**: Don't disable SSL certificate verification
- **Rate Limiting**: Respect API rate limits to avoid service disruption
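The certificate-validation point can be checked directly in Python: the standard library's default SSL context already enforces certificate and hostname verification, so the safe pattern is simply to use it as-is rather than loosening it (a standard-library-only sketch; nothing here is Penpot-specific):

```python
import ssl

# The secure default: verifies the server certificate chain and hostname
context = ssl.create_default_context()

# What you should NEVER do for production API calls:
#   context.check_hostname = False
#   context.verify_mode = ssl.CERT_NONE

cert_checks_on = context.verify_mode == ssl.CERT_REQUIRED
hostname_checks_on = context.check_hostname
```

Any HTTPS client built on this context (e.g. `http.client.HTTPSConnection(..., context=context)`) inherits these checks; disabling them silently enables man-in-the-middle attacks on your credentials.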
### 🛡️ Input Validation

- **User Input**: All user inputs are validated and sanitized
- **File Uploads**: Penpot file parsing includes safety checks
- **API Responses**: External API responses are validated before processing

### 🔍 Data Privacy

- **Minimal Data**: We only access the necessary Penpot data
- **No Storage**: Design data is not permanently stored by default
- **User Control**: Users control what data is shared with AI assistants

### 🚀 Deployment Security

- **Dependencies**: Regularly update dependencies for security patches
- **Permissions**: Run with the minimal required permissions
- **Isolation**: Use virtual environments or containers

## Security Best Practices for Users
### 🔧 Configuration

```bash
# Use environment variables for sensitive data
export PENPOT_USERNAME="your_username"
export PENPOT_PASSWORD="your_secure_password"
export PENPOT_API_URL="https://design.penpot.app/api"

# Or use a .env file (never commit this!)
echo "PENPOT_USERNAME=your_username" > .env
echo "PENPOT_PASSWORD=your_secure_password" >> .env
echo "PENPOT_API_URL=https://design.penpot.app/api" >> .env
```
### 🔒 Access Control

- **Principle of Least Privilege**: Only grant the necessary Penpot permissions
- **Regular Audits**: Review and rotate credentials regularly
- **Team Access**: Use team accounts rather than personal credentials for shared projects

### 🖥️ Local Development

```bash
# Keep your development environment secure
chmod 600 .env  # Restrict file permissions to the owner
git add .env    # Should be refused if .gitignore is configured correctly
```
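The permission check above can also be verified from Python. This sketch creates a sample file, applies the same `0600` mode, and reads the mode bits back with the standard library (POSIX semantics; on Windows the mode bits are only approximated, and the file path here is illustrative):

```python
import os
import stat
import tempfile
from pathlib import Path

# Stand-in for a real .env file
env_file = Path(tempfile.mkdtemp()) / ".env"
env_file.write_text("PENPOT_PASSWORD=secret\n")

# Restrict the file to its owner, as `chmod 600 .env` would
env_file.chmod(0o600)

# Read back just the permission bits
mode = stat.S_IMODE(os.stat(env_file).st_mode)
owner_only = mode == 0o600
```

A check like this is handy in a pre-flight script that refuses to start the server when credential files are group- or world-readable.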
### 🤖 AI Integration

- **Data Sensitivity**: Be mindful of what design data you share with AI assistants
- **Public vs Private**: Consider using private AI instances for sensitive designs
- **Audit Logs**: Monitor what data is being accessed and shared

## Vulnerability Disclosure Policy

### 🎯 Scope

This security policy applies to:

- **Penpot MCP Server**: Core MCP protocol implementation
- **API Client**: Penpot API integration code
- **CLI Tools**: Command-line utilities
- **Documentation**: Security-related documentation

### ⚠️ Out of Scope

The following are outside our direct control, but we'll help coordinate:

- **Penpot Platform**: Report to the Penpot team directly
- **Third-party Dependencies**: We'll help coordinate with upstream maintainers
- **AI Assistant Platforms**: Report to the respective platform security teams

### 🚫 Testing Guidelines

When testing for vulnerabilities:

- **DO NOT** test against production Penpot instances without permission
- **DO NOT** access data you don't own
- **DO NOT** perform destructive actions
- **DO** use test accounts and data
- **DO** respect rate limits and terms of service

## Security Updates

### 📢 Notifications

Security updates will be announced through:

- **GitHub Security Advisories**: Primary notification method
- **Release Notes**: Detailed in version release notes
- **Email**: For critical vulnerabilities (if you've subscribed)

### 🔄 Update Process

```bash
# Always update to the latest version for security fixes
pip install --upgrade penpot-mcp

# Or with uv
uv add penpot-mcp@latest
```

## Contact

- **Security Issues**: security@montevive.ai
- **General Questions**: Use [GitHub Discussions](https://github.com/montevive/penpot-mcp/discussions)
- **Bug Reports**: [GitHub Issues](https://github.com/montevive/penpot-mcp/issues)

---

Thank you for helping keep Penpot MCP and our community safe! 🛡️
10 env.example Normal file
@@ -0,0 +1,10 @@
# Penpot authentication
PENPOT_USERNAME=your_penpot_username_here
PENPOT_PASSWORD=your_penpot_password_here

# Server configuration
PORT=5000
DEBUG=true

# Penpot API base URL (change if using self-hosted Penpot)
PENPOT_API_URL=https://design.penpot.app/api
121 fix-lint-deps.sh Executable file
@@ -0,0 +1,121 @@
#!/bin/bash
# Helper script to install missing linting dependencies

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[0;33m'
NC='\033[0m' # No Color

# Function to create (or reuse) and activate a virtual environment
create_venv() {
    echo -e "${YELLOW}Creating virtual environment in '$1'...${NC}"
    python3 -m venv "$1"

    if [ $? -ne 0 ]; then
        echo -e "${RED}Failed to create virtual environment.${NC}"
        echo "Make sure python3-venv is installed."
        echo "On Ubuntu/Debian: sudo apt install python3-venv"
        exit 1
    fi

    echo -e "${GREEN}Virtual environment created successfully.${NC}"

    # Activate the virtual environment
    if [[ "$OSTYPE" == "msys" || "$OSTYPE" == "win32" ]]; then
        # Windows
        source "$1/Scripts/activate"
    else
        # Unix/Linux/macOS
        source "$1/bin/activate"
    fi

    if [ $? -ne 0 ]; then
        echo -e "${RED}Failed to activate virtual environment.${NC}"
        exit 1
    fi

    echo -e "${GREEN}Virtual environment activated.${NC}"

    # Upgrade pip to avoid issues
    pip install --upgrade pip

    if [ $? -ne 0 ]; then
        echo -e "${YELLOW}Warning: Could not upgrade pip, but continuing anyway.${NC}"
    fi
}

# Check if we're in a virtual environment
if [[ -z "$VIRTUAL_ENV" ]]; then
    echo -e "${YELLOW}You are not in a virtual environment.${NC}"

    # Check if a virtual environment already exists
    if [ -d ".venv" ]; then
        echo "Found existing virtual environment in .venv directory."
        read -p "Would you like to use it? (y/n): " use_existing

        if [[ $use_existing == "y" || $use_existing == "Y" ]]; then
            # python3 -m venv is a no-op on an existing environment,
            # so this reuses .venv and activates it
            create_venv ".venv"
        else
            read -p "Create a new virtual environment? (y/n): " create_new

            if [[ $create_new == "y" || $create_new == "Y" ]]; then
                read -p "Enter path for new virtual environment [.venv]: " venv_path
                venv_path=${venv_path:-.venv}
                create_venv "$venv_path"
            else
                echo -e "${RED}Cannot continue without a virtual environment.${NC}"
                echo "Using the system Python is not recommended and may cause permission issues."
                echo "Please run this script again and choose to create a virtual environment."
                exit 1
            fi
        fi
    else
        read -p "Would you like to create a virtual environment? (y/n): " create_new

        if [[ $create_new == "y" || $create_new == "Y" ]]; then
            read -p "Enter path for new virtual environment [.venv]: " venv_path
            venv_path=${venv_path:-.venv}
            create_venv "$venv_path"
        else
            echo -e "${RED}Cannot continue without a virtual environment.${NC}"
            echo "Using the system Python is not recommended and may cause permission issues."
            echo "Please run this script again and choose to create a virtual environment."
            exit 1
        fi
    fi
else
    echo -e "${GREEN}Using existing virtual environment: $VIRTUAL_ENV${NC}"
fi

# Install development dependencies
echo -e "${YELLOW}Installing linting dependencies...${NC}"
pip install -r requirements-dev.txt

if [ $? -ne 0 ]; then
    echo -e "${RED}Failed to install dependencies.${NC}"
    exit 1
fi

echo -e "${GREEN}Dependencies installed successfully.${NC}"

# Install pre-commit hooks
echo -e "${YELLOW}Setting up pre-commit hooks...${NC}"
pre-commit install

if [ $? -ne 0 ]; then
    echo -e "${RED}Failed to install pre-commit hooks.${NC}"
    exit 1
fi

echo -e "${GREEN}Pre-commit hooks installed successfully.${NC}"

echo -e "\n${GREEN}Setup completed!${NC}"
echo "You can now run the linting script with:"
echo "  ./lint.py"
echo "Or with auto-fix:"
echo "  ./lint.py --autofix"
echo ""
echo "Remember to activate your virtual environment whenever you open a new terminal:"
echo "  source .venv/bin/activate  # On Linux/macOS"
echo "  .venv\\Scripts\\activate   # On Windows"
BIN images/penpot-mcp.png Normal file
Binary file not shown. (Size: 944 KiB)
246 lint.py Executable file
@@ -0,0 +1,246 @@
|
|||||||
|
#!/usr/bin/env python3
|
||||||
|
"""Script to run linters with auto-fix capabilities.
|
||||||
|
|
||||||
|
Run with: python lint.py [--autofix]
|
||||||
|
"""
|
||||||
|
|
||||||
|
import argparse
|
||||||
|
import importlib.util
|
||||||
|
import os
|
||||||
|
import site
|
||||||
|
import subprocess
|
||||||
|
import sys
|
||||||
|
from pathlib import Path
|
||||||
|
|
||||||
|
|
||||||
|
def is_venv():
|
||||||
|
"""Check if running in a virtual environment."""
|
||||||
|
return (hasattr(sys, 'real_prefix') or
|
||||||
|
(hasattr(sys, 'base_prefix') and sys.base_prefix != sys.prefix))
|
||||||
|
|
||||||
|
|
||||||
|
def check_dependencies():
|
||||||
|
"""Check if all required dependencies are installed."""
|
||||||
|
missing_deps = []
|
||||||
|
|
||||||
|
# Check for required modules
|
||||||
|
required_modules = ["flake8", "isort", "autopep8", "pyflakes"]
|
||||||
|
|
||||||
|
# In Python 3.12+, also check for pycodestyle as a fallback
|
||||||
|
if sys.version_info >= (3, 12):
|
||||||
|
required_modules.append("pycodestyle")
|
||||||
|
|
||||||
|
for module in required_modules:
|
||||||
|
if importlib.util.find_spec(module) is None:
|
||||||
|
missing_deps.append(module)
|
||||||
|
|
||||||
|
# Special check for autopep8 compatibility with Python 3.12+
|
||||||
|
if sys.version_info >= (3, 12) and importlib.util.find_spec("autopep8") is not None:
|
||||||
|
try:
|
||||||
|
import autopep8
|
||||||
|
|
||||||
|
# Try to access a function that would use lib2to3
|
||||||
|
# Will throw an error if lib2to3 is missing and not handled properly
|
||||||
|
autopep8_version = autopep8.__version__
|
||||||
|
print(f"Using autopep8 version: {autopep8_version}")
|
||||||
|
except ImportError as e:
|
||||||
|
if "lib2to3" in str(e):
|
||||||
|
print("WARNING: You're using Python 3.12+ where lib2to3 is no longer included.")
|
||||||
|
print("Your installed version of autopep8 may not work correctly.")
|
||||||
|
print("Consider using a version of autopep8 compatible with Python 3.12+")
|
||||||
|
print("or run this script with Python 3.11 or earlier.")
|
||||||
|
|
||||||
|
if missing_deps:
|
||||||
|
print("ERROR: Missing required dependencies:")
|
||||||
|
for dep in missing_deps:
|
||||||
|
print(f" - {dep}")
|
||||||
|
|
||||||
|
if not is_venv():
|
||||||
|
print("\nYou are using the system Python environment.")
|
||||||
|
print("It's recommended to use a virtual environment:")
|
||||||
|
print("\n1. Create a virtual environment:")
|
||||||
|
print(" python3 -m venv .venv")
|
||||||
|
print("\n2. Activate the virtual environment:")
|
||||||
|
print(" source .venv/bin/activate # On Linux/macOS")
|
||||||
|
print(" .venv\\Scripts\\activate # On Windows")
|
||||||
|
print("\n3. Install dependencies:")
|
||||||
|
print(" pip install -r requirements-dev.txt")
|
||||||
|
else:
|
||||||
|
print("\nPlease install these dependencies with:")
|
||||||
|
print(" pip install -r requirements-dev.txt")
|
||||||
|
|
||||||
|
return False
|
||||||
|
|
||||||
|
return True
|
||||||
|
|
||||||
|
|
||||||
|
def run_command(cmd, cwd=None):
|
||||||
|
"""Run a shell command and return the exit code."""
|
||||||
|
try:
|
||||||
|
process = subprocess.run(cmd, shell=True, cwd=cwd)
|
||||||
|
return process.returncode
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Error executing command '{cmd}': {e}")
|
||||||
|
return 1
|
||||||
|
|
||||||
|
|
||||||
|
def fix_unused_imports(root_dir):
|
||||||
|
"""Fix unused imports using pyflakes and autoflake."""
|
||||||
|
try:
|
||||||
|
if importlib.util.find_spec("autoflake") is not None:
|
||||||
|
print("Running autoflake to remove unused imports...")
|
||||||
|
cmd = "autoflake --remove-all-unused-imports --recursive --in-place penpot_mcp/ tests/"
|
||||||
|
return run_command(cmd, cwd=root_dir)
|
||||||
|
else:
|
||||||
|
print("autoflake not found. To automatically remove unused imports, install:")
|
||||||
|
print(" pip install autoflake")
|
||||||
|
return 0
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Error with autoflake: {e}")
|
||||||
|
return 0
|
||||||
|
|
||||||
|
|
||||||
|
def fix_whitespace_and_docstring_issues(root_dir):
|
||||||
|
"""Attempt to fix whitespace and simple docstring issues."""
|
||||||
|
# Find Python files that need fixing
|
||||||
|
try:
|
||||||
|
filelist_cmd = "find penpot_mcp tests setup.py -name '*.py' -type f"
|
||||||
|
process = subprocess.run(
|
||||||
|
filelist_cmd, shell=True, cwd=root_dir,
|
||||||
|
capture_output=True, text=True
|
||||||
|
)
|
||||||
|
|
||||||
|
if process.returncode != 0:
|
||||||
|
print("Error finding Python files")
|
||||||
|
return 1
|
||||||
|
|
||||||
|
files = process.stdout.strip().split('\n')
|
||||||
|
fixed_count = 0
|
||||||
|
|
||||||
|
for file_path in files:
|
||||||
|
if not file_path:
|
||||||
|
continue
|
||||||
|
|
||||||
|
full_path = Path(root_dir) / file_path
|
||||||
|
|
||||||
|
try:
|
||||||
|
with open(full_path, 'r', encoding='utf-8') as f:
|
||||||
|
content = f.read()
|
||||||
|
|
||||||
|
# Fix trailing whitespace
|
||||||
|
fixed_content = '\n'.join(line.rstrip() for line in content.split('\n'))
|
||||||
|
|
||||||
|
# Ensure final newline
|
||||||
|
if not fixed_content.endswith('\n'):
|
||||||
|
fixed_content += '\n'
|
||||||
|
|
||||||
|
# Add basic docstrings to empty modules, classes, functions
|
||||||
|
if '__init__.py' in file_path and '"""' not in fixed_content:
|
||||||
|
package_name = file_path.split('/')[-2]
|
||||||
|
fixed_content = f'"""Package {package_name}."""\n' + fixed_content
|
||||||
|
|
||||||
|
# Write back if changes were made
|
||||||
|
if fixed_content != content:
|
||||||
|
with open(full_path, 'w', encoding='utf-8') as f:
|
||||||
|
f.write(fixed_content)
|
||||||
|
fixed_count += 1
|
||||||
|
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Error processing {file_path}: {e}")
|
||||||
|
|
||||||
|
if fixed_count > 0:
|
||||||
|
print(f"Fixed whitespace and newlines in {fixed_count} files")
|
||||||
|
|
||||||
|
return 0
|
||||||
|
except Exception as e:
|
||||||
|
print(f"Error in whitespace fixing: {e}")
|
||||||
|
return 0
|


def main():
    """Main entry point for the linter script."""
    parser = argparse.ArgumentParser(description="Run linters with optional auto-fix")
    parser.add_argument(
        "--autofix", "-a", action="store_true", help="Automatically fix linting issues"
    )
    args = parser.parse_args()

    # Verify dependencies before proceeding
    if not check_dependencies():
        return 1

    root_dir = Path(__file__).parent.absolute()

    print("Running linters...")

    # Run isort
    isort_cmd = "isort --profile black ."
    if args.autofix:
        print("Running isort with auto-fix...")
        exit_code = run_command(isort_cmd, cwd=root_dir)
    else:
        print("Checking imports with isort...")
        exit_code = run_command(f"{isort_cmd} --check", cwd=root_dir)

    if exit_code != 0 and not args.autofix:
        print("isort found issues. Run with --autofix to fix automatically.")

    # Run additional fixers when in autofix mode
    if args.autofix:
        # Fix unused imports
        fix_unused_imports(root_dir)

        # Fix whitespace and newline issues
        fix_whitespace_and_docstring_issues(root_dir)

        # Run autopep8
        print("Running autopep8 with auto-fix...")

        if sys.version_info >= (3, 12):
            print("Detected Python 3.12+. Using compatible code formatting approach...")
            # Use a more compatible approach for Python 3.12+
            # First try autopep8 (newer versions may have fixed the lib2to3 dependency)
            autopep8_cmd = "autopep8 --recursive --aggressive --aggressive --in-place --select E,W penpot_mcp/ tests/ setup.py"
            try:
                exit_code = run_command(autopep8_cmd, cwd=root_dir)
                if exit_code != 0:
                    print("Warning: autopep8 encountered issues. Some files may not have been fixed.")
            except Exception as e:
                if "lib2to3" in str(e):
                    print("Error with autopep8 due to missing lib2to3 module in Python 3.12+")
                    print("Using pycodestyle for checking only (no auto-fix is possible)")
                    exit_code = run_command("pycodestyle penpot_mcp/ tests/", cwd=root_dir)
                else:
                    raise
        else:
            # Normal execution for Python < 3.12
            autopep8_cmd = "autopep8 --recursive --aggressive --aggressive --in-place --select E,W penpot_mcp/ tests/ setup.py"
            exit_code = run_command(autopep8_cmd, cwd=root_dir)
            if exit_code != 0:
                print("Warning: autopep8 encountered issues. Some files may not have been fixed.")

    # Run flake8 (check only, no auto-fix)
    print("Running flake8...")
    flake8_result = run_command("flake8", cwd=root_dir)

    if flake8_result != 0:
        print("flake8 found issues that need to be fixed manually.")
        print("Common issues and how to fix them:")
        print("- F401 (unused import): Remove the import or use it")
        print("- D1XX (missing docstring): Add a docstring to the module/class/function")
        print("- E501 (line too long): Break the line or use line continuation")
        print("- F841 (unused variable): Remove or use the variable")

    if args.autofix:
        print("Auto-fix completed! Run flake8 again to see if there are any remaining issues.")
    elif exit_code != 0 or flake8_result != 0:
        print("Linting issues found. Run with --autofix to fix automatically where possible.")
        return 1
    else:
        print("All linting checks passed!")

    return 0


if __name__ == "__main__":
    sys.exit(main())
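For reference, the whitespace pass in the script above reduces to two string transforms: stripping trailing whitespace per line and guaranteeing exactly one final newline. A minimal standalone sketch (the function name is illustrative):

```python
def normalize(content: str) -> str:
    """Strip trailing whitespace from each line and ensure a final newline."""
    # Per-line rstrip removes trailing spaces/tabs without touching indentation
    fixed = '\n'.join(line.rstrip() for line in content.split('\n'))
    # Append a newline only if the file doesn't already end with one
    if not fixed.endswith('\n'):
        fixed += '\n'
    return fixed


print(repr(normalize("x = 1   \ny = 2")))
# -> 'x = 1\ny = 2\n'
```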
1
penpot_mcp/__init__.py
Normal file
@@ -0,0 +1 @@
"""Package penpot_mcp."""
1
penpot_mcp/api/__init__.py
Normal file
@@ -0,0 +1 @@
"""PenpotAPI module for interacting with the Penpot design platform."""
852
penpot_mcp/api/penpot_api.py
Normal file
@@ -0,0 +1,852 @@
import argparse
import json
import os
from typing import Any, Dict, List, Optional, Union

import requests
from dotenv import load_dotenv


class PenpotAPI:
    """Client for the Penpot API, using cookie- and token-based authentication."""

    def __init__(
            self,
            base_url: Optional[str] = None,
            debug: bool = False,
            email: Optional[str] = None,
            password: Optional[str] = None):
        # Load environment variables if not already loaded
        load_dotenv()

        # Use base_url from parameters if provided, otherwise from the environment,
        # with a fallback to the default URL
        self.base_url = base_url or os.getenv("PENPOT_API_URL", "https://design.penpot.app/api")
        self.session = requests.Session()
        self.access_token = None
        self.debug = debug
        self.email = email or os.getenv("PENPOT_USERNAME")
        self.password = password or os.getenv("PENPOT_PASSWORD")
        self.profile_id = None

        # Set default headers - we'll use different headers at request time
        # based on the required content type (JSON vs Transit+JSON)
        self.session.headers.update({
            "Accept": "application/json, application/transit+json",
            "Content-Type": "application/json"
        })

    def set_access_token(self, token: str):
        """Set the auth token for authentication."""
        self.access_token = token
        # For cookie-based auth, set the auth-token cookie
        self.session.cookies.set("auth-token", token)
        # Also set the Authorization header for APIs that use it
        self.session.headers.update({
            "Authorization": f"Token {token}"
        })

    def login_with_password(
            self,
            email: Optional[str] = None,
            password: Optional[str] = None) -> str:
        """
        Log in with email and password to get an auth token.

        This method uses the same cookie-based auth approach as the export methods.

        Args:
            email: Email for the Penpot account (if None, uses the stored email or the PENPOT_USERNAME env var)
            password: Password for the Penpot account (if None, uses the stored password or the PENPOT_PASSWORD env var)

        Returns:
            Auth token for API calls
        """
        # Use the export authentication flow, as it is more reliable
        token = self.login_for_export(email, password)
        self.set_access_token(token)
        # Get the profile ID after login
        self.get_profile()
        return token

    def get_profile(self) -> Dict[str, Any]:
        """
        Get profile information for the current authenticated user.

        Returns:
            Dictionary containing profile information, including the profile ID
        """
        url = f"{self.base_url}/rpc/command/get-profile"

        payload = {}  # No parameters needed

        response = self._make_authenticated_request('post', url, json=payload, use_transit=False)

        # Parse and normalize the response
        data = response.json()
        normalized_data = self._normalize_transit_response(data)

        if self.debug:
            print("\nProfile data retrieved:")
            print(json.dumps(normalized_data, indent=2)[:200] + "...")

        # Store the profile ID for later use
        if 'id' in normalized_data:
            self.profile_id = normalized_data['id']
            if self.debug:
                print(f"\nStored profile ID: {self.profile_id}")

        return normalized_data

    def login_for_export(self, email: Optional[str] = None, password: Optional[str] = None) -> str:
        """
        Log in with email and password to get an auth token for export operations.

        This is required for export operations, which use a different authentication
        mechanism than the standard API access token.

        Args:
            email: Email for the Penpot account (if None, uses the stored email or the PENPOT_USERNAME env var)
            password: Password for the Penpot account (if None, uses the stored password or the PENPOT_PASSWORD env var)

        Returns:
            Auth token extracted from cookies
        """
        # Use parameters if provided, else instance variables, else environment variables
        email = email or self.email or os.getenv("PENPOT_USERNAME")
        password = password or self.password or os.getenv("PENPOT_PASSWORD")

        if not email or not password:
            raise ValueError(
                "Email and password are required for export authentication. "
                "Please provide them as parameters or set the PENPOT_USERNAME and "
                "PENPOT_PASSWORD environment variables."
            )

        url = f"{self.base_url}/rpc/command/login-with-password"

        # Use the Transit+JSON format
        payload = {
            "~:email": email,
            "~:password": password
        }

        if self.debug:
            print("\nLogin request payload (Transit+JSON format):")
            print(json.dumps(payload, indent=2).replace(password, "********"))

        # Create a new session just for this request
        login_session = requests.Session()

        # Set headers
        headers = {
            "Content-Type": "application/transit+json"
        }

        response = login_session.post(url, json=payload, headers=headers)
        if self.debug and response.status_code != 200:
            print(f"\nError response: {response.status_code}")
            print(f"Response text: {response.text}")
        response.raise_for_status()

        # Extract the auth token from cookies
        if 'Set-Cookie' in response.headers:
            if self.debug:
                print("\nSet-Cookie header found")

            for cookie in login_session.cookies:
                if cookie.name == "auth-token":
                    if self.debug:
                        print(f"\nAuth token extracted from cookies: {cookie.value[:10]}...")
                    return cookie.value

            raise ValueError("Auth token not found in response cookies")
        else:
            # Try to extract the token from the response JSON if available
            try:
                data = response.json()
                if 'auth-token' in data:
                    return data['auth-token']
            except Exception:
                pass

            # If we reached here, we couldn't find the token
            raise ValueError("Auth token not found in response cookies or JSON body")

    def _make_authenticated_request(self, method: str, url: str, **kwargs) -> requests.Response:
        """
        Make an authenticated request, handling re-auth if needed.

        This internal method handles lazy authentication when a request
        fails due to authentication issues, using the same cookie-based
        approach as the export methods.

        Args:
            method: HTTP method (post, get, etc.)
            url: URL to make the request to
            **kwargs: Additional arguments to pass to requests

        Returns:
            The response object
        """
        # If we don't have a token yet but have credentials, log in first
        if not self.access_token and self.email and self.password:
            if self.debug:
                print("\nNo access token set, logging in with credentials...")
            self.login_with_password()

        # Set up headers
        headers = kwargs.get('headers', {})
        if 'headers' in kwargs:
            del kwargs['headers']

        # Use the Transit+JSON format for API calls (required by Penpot)
        use_transit = kwargs.pop('use_transit', True)

        if use_transit:
            headers['Content-Type'] = 'application/transit+json'
            headers['Accept'] = 'application/transit+json'

            # Convert the payload to Transit+JSON format if present
            if 'json' in kwargs and kwargs['json']:
                payload = kwargs['json']

                # Only transform if not already in Transit format
                if not any(isinstance(k, str) and k.startswith('~:') for k in payload.keys()):
                    transit_payload = {}

                    # Add cmd if not present
                    if 'cmd' not in payload and '~:cmd' not in payload:
                        # Extract the command from the URL
                        cmd = url.split('/')[-1]
                        transit_payload['~:cmd'] = f"~:{cmd}"

                    # Convert standard JSON to Transit+JSON format
                    for key, value in payload.items():
                        # Skip the command if already added
                        if key == 'cmd':
                            continue

                        transit_key = f"~:{key}" if not key.startswith('~:') else key

                        # Handle the special UUID conversion for IDs
                        if isinstance(value, str) and ('-' in value) and len(value) > 30:
                            transit_value = f"~u{value}"
                        else:
                            transit_value = value

                        transit_payload[transit_key] = transit_value

                    if self.debug:
                        print("\nConverted payload to Transit+JSON format:")
                        print(f"Original: {payload}")
                        print(f"Transit: {transit_payload}")

                    kwargs['json'] = transit_payload
        else:
            headers['Content-Type'] = 'application/json'
            headers['Accept'] = 'application/json'

        # Ensure the Authorization header is set if we have a token
        if self.access_token:
            headers['Authorization'] = f"Token {self.access_token}"

        # Combine with session headers
        combined_headers = {**self.session.headers, **headers}

        # Make the request
        try:
            response = getattr(self.session, method)(url, headers=combined_headers, **kwargs)

            if self.debug:
                print(f"\nRequest to: {url}")
                print(f"Method: {method}")
                print(f"Headers: {combined_headers}")
                if 'json' in kwargs:
                    print(f"Payload: {json.dumps(kwargs['json'], indent=2)}")
                print(f"Response status: {response.status_code}")

            response.raise_for_status()
            return response

        except requests.HTTPError as e:
            # Handle authentication errors
            if e.response.status_code in (401, 403) and self.email and self.password:
                if self.debug:
                    print("\nAuthentication failed. Trying to re-login...")

                # Re-login and update the token
                self.login_with_password()

                # Update headers with the new token
                headers['Authorization'] = f"Token {self.access_token}"
                combined_headers = {**self.session.headers, **headers}

                # Retry the request with the new token
                response = getattr(self.session, method)(url, headers=combined_headers, **kwargs)
                response.raise_for_status()
                return response
            else:
                # Re-raise other errors
                raise
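The plain-JSON-to-Transit conversion performed in `_make_authenticated_request` above (keys get a `~:` keyword prefix, long hyphenated strings are treated as UUIDs and get a `~u` prefix) can be sketched standalone. The function name and sample UUID below are illustrative, not part of the library:

```python
def to_transit(payload: dict, cmd: str) -> dict:
    """Convert a plain JSON payload to the Transit+JSON shape used above."""
    transit = {"~:cmd": f"~:{cmd}"}
    for key, value in payload.items():
        if key == "cmd":
            continue  # command is already added
        transit_key = key if key.startswith("~:") else f"~:{key}"
        # Heuristic from the client: long hyphenated strings are treated as UUIDs
        if isinstance(value, str) and "-" in value and len(value) > 30:
            value = f"~u{value}"
        transit[transit_key] = value
    return transit


print(to_transit({"id": "6bd7c17d-8af5-8b3b-8003-21f55aa31ba1"}, "get-file"))
# -> {'~:cmd': '~:get-file', '~:id': '~u6bd7c17d-8af5-8b3b-8003-21f55aa31ba1'}
```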

    def _normalize_transit_response(self, data: Union[Dict, List, Any]) -> Union[Dict, List, Any]:
        """
        Normalize a Transit+JSON response to a more usable format.

        This recursively processes the response data, handling special Transit types
        like UUIDs, keywords, and nested structures.

        Args:
            data: The data to normalize; can be a dict, list, or other value

        Returns:
            Normalized data
        """
        if isinstance(data, dict):
            # Normalize dictionary keys and values
            result = {}
            for key, value in data.items():
                # Convert Transit keywords in keys (~:key -> key)
                if isinstance(key, str) and key.startswith('~:'):
                    norm_key = key[2:]
                else:
                    norm_key = key
                # Recursively normalize values
                result[norm_key] = self._normalize_transit_response(value)
            return result
        elif isinstance(data, list):
            # Normalize list items
            return [self._normalize_transit_response(item) for item in data]
        elif isinstance(data, str) and data.startswith('~u'):
            # Convert Transit UUIDs (~u123-456 -> 123-456)
            return data[2:]
        else:
            # Return other types as-is
            return data
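For reference, the normalization above can be exercised standalone; this sketch mirrors the dict/list/UUID handling of `_normalize_transit_response` (the function name and sample values are illustrative):

```python
from typing import Any


def normalize_transit(data: Any) -> Any:
    """Recursively strip Transit markers: '~:key' -> 'key', '~u<uuid>' -> '<uuid>'."""
    if isinstance(data, dict):
        return {
            (k[2:] if isinstance(k, str) and k.startswith('~:') else k): normalize_transit(v)
            for k, v in data.items()
        }
    if isinstance(data, list):
        return [normalize_transit(item) for item in data]
    if isinstance(data, str) and data.startswith('~u'):
        return data[2:]
    return data


raw = {"~:id": "~u1234-5678", "~:pages": [{"~:name": "Page 1"}]}
print(normalize_transit(raw))
# -> {'id': '1234-5678', 'pages': [{'name': 'Page 1'}]}
```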

    def list_projects(self) -> List[Dict[str, Any]]:
        """
        List all available projects for the authenticated user.

        Returns:
            List of project information dictionaries
        """
        url = f"{self.base_url}/rpc/command/get-all-projects"

        payload = {}  # No parameters required

        response = self._make_authenticated_request('post', url, json=payload, use_transit=False)

        if self.debug:
            content_type = response.headers.get('Content-Type', '')
            print(f"\nResponse content type: {content_type}")
            print(f"Response preview: {response.text[:100]}...")

        # Parse JSON
        data = response.json()

        if self.debug:
            print("\nData preview:")
            print(json.dumps(data, indent=2)[:200] + "...")

        return data

    def get_project(self, project_id: str) -> Optional[Dict[str, Any]]:
        """
        Get details for a specific project.

        Args:
            project_id: The ID of the project to retrieve

        Returns:
            Dictionary containing project information, or None if not found
        """
        # First get all projects
        projects = self.list_projects()

        # Find the specific project by ID
        for project in projects:
            if project.get('id') == project_id:
                return project

        return None

    def get_project_files(self, project_id: str) -> List[Dict[str, Any]]:
        """
        Get all files for a specific project.

        Args:
            project_id: The ID of the project

        Returns:
            List of file information dictionaries
        """
        url = f"{self.base_url}/rpc/command/get-project-files"

        payload = {
            "project-id": project_id
        }

        response = self._make_authenticated_request('post', url, json=payload, use_transit=False)

        # Parse JSON
        files = response.json()
        return files

    def get_file(self, file_id: str, save_data: bool = False,
                 save_raw_response: bool = False) -> Dict[str, Any]:
        """
        Get details for a specific file.

        Args:
            file_id: The ID of the file to retrieve
            save_data: Whether to save the parsed data to a JSON file
            save_raw_response: Whether to save the raw response

        Returns:
            Dictionary containing file information
        """
        url = f"{self.base_url}/rpc/command/get-file"

        payload = {
            "id": file_id,
        }

        response = self._make_authenticated_request('post', url, json=payload, use_transit=False)

        # Save the raw response if requested
        if save_raw_response:
            raw_filename = f"{file_id}_raw_response.json"
            with open(raw_filename, 'w') as f:
                f.write(response.text)
            if self.debug:
                print(f"\nSaved raw response to {raw_filename}")

        # Parse JSON
        data = response.json()

        # Save normalized data if requested
        if save_data:
            filename = f"{file_id}.json"
            with open(filename, 'w') as f:
                json.dump(data, f, indent=2)
            if self.debug:
                print(f"\nSaved file data to {filename}")

        return data

    def create_export(self, file_id: str, page_id: str, object_id: str,
                      export_type: str = "png", scale: int = 1,
                      email: Optional[str] = None, password: Optional[str] = None,
                      profile_id: Optional[str] = None) -> str:
        """
        Create an export job for a Penpot object.

        Args:
            file_id: The file ID
            page_id: The page ID
            object_id: The object ID to export
            export_type: Type of export (png, svg, pdf)
            scale: Scale factor for the export
            email: Email for authentication (if different from instance)
            password: Password for authentication (if different from instance)
            profile_id: Optional profile ID (if not provided, it will be fetched automatically)

        Returns:
            Export resource ID
        """
        # This uses the cookie auth approach, which requires login
        token = self.login_for_export(email, password)

        # If profile_id is not provided, get it from the instance variable or fetch it
        if not profile_id:
            if not self.profile_id:
                # We need to set the token first for the get_profile call to work
                self.set_access_token(token)
                self.get_profile()
            profile_id = self.profile_id

        if not profile_id:
            raise ValueError("Profile ID not available and couldn't be retrieved automatically")

        # Build the URL for export creation
        url = f"{self.base_url}/export"

        # Set up the data for the export
        payload = {
            "~:wait": True,
            "~:exports": [
                {"~:type": f"~:{export_type}",
                 "~:suffix": "",
                 "~:scale": scale,
                 "~:page-id": f"~u{page_id}",
                 "~:file-id": f"~u{file_id}",
                 "~:name": "",
                 "~:object-id": f"~u{object_id}"}
            ],
            "~:profile-id": f"~u{profile_id}",
            "~:cmd": "~:export-shapes"
        }

        if self.debug:
            print("\nCreating export with parameters:")
            print(json.dumps(payload, indent=2))

        # Create a session with the auth token
        export_session = requests.Session()
        export_session.cookies.set("auth-token", token)

        headers = {
            "Content-Type": "application/transit+json",
            "Accept": "application/transit+json"
        }

        # Make the request
        response = export_session.post(url, json=payload, headers=headers)

        if self.debug and response.status_code != 200:
            print(f"\nError response: {response.status_code}")
            print(f"Response text: {response.text}")

        response.raise_for_status()

        # Parse the response
        data = response.json()

        if self.debug:
            print("\nExport created successfully")
            print(f"Response: {json.dumps(data, indent=2)}")

        # Extract and return the resource ID
        resource_id = data.get("~:id")
        if not resource_id:
            raise ValueError("Resource ID not found in response")

        return resource_id

    def get_export_resource(self,
                            resource_id: str,
                            save_to_file: Optional[str] = None,
                            email: Optional[str] = None,
                            password: Optional[str] = None) -> Union[bytes, str]:
        """
        Download an export resource by ID.

        Args:
            resource_id: The resource ID from create_export
            save_to_file: Path to save the file (if None, returns the content)
            email: Email for authentication (if different from instance)
            password: Password for authentication (if different from instance)

        Returns:
            Either the file content as bytes, or the path to the saved file
        """
        # This uses the cookie auth approach, which requires login
        token = self.login_for_export(email, password)

        # Build the URL for the resource
        url = f"{self.base_url}/export"

        payload = {
            "~:wait": False,
            "~:cmd": "~:get-resource",
            "~:id": resource_id
        }
        headers = {
            "Content-Type": "application/transit+json",
            "Accept": "*/*"
        }
        if self.debug:
            print(f"\nFetching export resource: {url}")

        # Create a session with the auth token
        export_session = requests.Session()
        export_session.cookies.set("auth-token", token)

        # Make the request
        response = export_session.post(url, json=payload, headers=headers)

        if self.debug and response.status_code != 200:
            print(f"\nError response: {response.status_code}")
            print(f"Response headers: {response.headers}")

        response.raise_for_status()

        # Get the content type
        content_type = response.headers.get('Content-Type', '')

        if self.debug:
            print("\nResource fetched successfully")
            print(f"Content-Type: {content_type}")
            print(f"Content length: {len(response.content)} bytes")

        # Determine the filename if saving to a file
        if save_to_file:
            if os.path.isdir(save_to_file):
                # If save_to_file is a directory, we need to figure out the filename
                filename = None

                # Try to get the filename from the Content-Disposition header
                content_disp = response.headers.get('Content-Disposition', '')
                if 'filename=' in content_disp:
                    filename = content_disp.split('filename=')[1].strip('"\'')

                # If we couldn't get a filename, use the resource_id with an extension
                if not filename:
                    ext = content_type.split('/')[-1].split(';')[0]
                    if ext in ('jpeg', 'png', 'pdf', 'svg+xml'):
                        if ext == 'svg+xml':
                            ext = 'svg'
                        filename = f"{resource_id}.{ext}"
                    else:
                        filename = f"{resource_id}"

                save_path = os.path.join(save_to_file, filename)
            else:
                # Use the provided path directly
                save_path = save_to_file

            # Ensure the directory exists
            os.makedirs(os.path.dirname(os.path.abspath(save_path)), exist_ok=True)

            # Save the content to the file
            with open(save_path, 'wb') as f:
                f.write(response.content)

            if self.debug:
                print(f"\nSaved resource to {save_path}")

            return save_path
        else:
            # Return the content
            return response.content

    def export_and_download(self, file_id: str, page_id: str, object_id: str,
                            save_to_file: Optional[str] = None, export_type: str = "png",
                            scale: int = 1, name: str = "Board", suffix: str = "",
                            email: Optional[str] = None, password: Optional[str] = None,
                            profile_id: Optional[str] = None) -> Union[bytes, str]:
        """
        Create and download an export in one step.

        This is a convenience method that combines create_export and get_export_resource.

        Args:
            file_id: The file ID
            page_id: The page ID
            object_id: The object ID to export
            save_to_file: Path to save the file (if None, returns the content)
            export_type: Type of export (png, svg, pdf)
            scale: Scale factor for the export
            name: Name for the export (currently not forwarded to create_export)
            suffix: Suffix to add to the export name (currently not forwarded to create_export)
            email: Email for authentication (if different from instance)
            password: Password for authentication (if different from instance)
            profile_id: Optional profile ID (if not provided, it will be fetched automatically)

        Returns:
            Either the file content as bytes, or the path to the saved file
        """
        # Create the export
        resource_id = self.create_export(
            file_id=file_id,
            page_id=page_id,
            object_id=object_id,
            export_type=export_type,
            scale=scale,
            email=email,
            password=password,
            profile_id=profile_id
        )

        # Download the resource
        return self.get_export_resource(
            resource_id=resource_id,
            save_to_file=save_to_file,
            email=email,
            password=password
        )

    def extract_components(self, file_data: Dict[str, Any]) -> Dict[str, Any]:
        """
        Extract components from file data.

        This processes a file's data to extract and normalize component information.

        Args:
            file_data: The file data from get_file

        Returns:
            Dictionary containing components information
        """
        components = {}
        components_index = file_data.get('data', {}).get('componentsIndex', {})

        for component_id, component_data in components_index.items():
            # Extract basic component info
            component = {
                'id': component_id,
                'name': component_data.get('name', 'Unnamed'),
                'path': component_data.get('path', []),
                'shape': component_data.get('shape', ''),
                'fileId': component_data.get('fileId', file_data.get('id')),
                'created': component_data.get('created'),
                'modified': component_data.get('modified')
            }

            # Add the component to our collection
            components[component_id] = component

        return {'components': components}

    def analyze_file_structure(self, file_data: Dict[str, Any]) -> Dict[str, Any]:
        """
        Analyze file structure and return summary information.

        Args:
            file_data: The file data from get_file

        Returns:
            Dictionary containing analysis information
        """
        data = file_data.get('data', {})

        # Count pages
        pages = data.get('pagesIndex', {})
        page_count = len(pages)

        # Count objects by type
        object_types = {}
        total_objects = 0

        for page_id, page_data in pages.items():
            objects = page_data.get('objects', {})
            total_objects += len(objects)

            for obj_id, obj_data in objects.items():
                obj_type = obj_data.get('type', 'unknown')
                object_types[obj_type] = object_types.get(obj_type, 0) + 1

        # Count components
        components = data.get('componentsIndex', {})
        component_count = len(components)

        # Count colors, typographies, etc.
        colors = data.get('colorsIndex', {})
        color_count = len(colors)

        typographies = data.get('typographiesIndex', {})
        typography_count = len(typographies)

        return {
            'pageCount': page_count,
            'objectCount': total_objects,
            'objectTypes': object_types,
            'componentCount': component_count,
            'colorCount': color_count,
            'typographyCount': typography_count,
            'fileName': file_data.get('name', 'Unknown'),
            'fileId': file_data.get('id')
        }
|
||||||
|
def main():
|
||||||
|
# Set up argument parser
|
||||||
|
parser = argparse.ArgumentParser(description='Penpot API Tool')
|
||||||
|
parser.add_argument('--debug', action='store_true', help='Enable debug output')
|
||||||
|
|
||||||
|
# Create subparsers for different commands
|
||||||
|
subparsers = parser.add_subparsers(dest='command', help='Command to run')
|
||||||
|
|
||||||
|
# List projects command
|
||||||
|
list_parser = subparsers.add_parser('list-projects', help='List all projects')
|
||||||
|
|
||||||
|
# Get project command
|
||||||
|
project_parser = subparsers.add_parser('get-project', help='Get project details')
|
||||||
|
project_parser.add_argument('--id', required=True, help='Project ID')
|
||||||
|
|
||||||
|
# List files command
|
||||||
|
files_parser = subparsers.add_parser('list-files', help='List files in a project')
|
||||||
|
files_parser.add_argument('--project-id', required=True, help='Project ID')
|
||||||
|
|
||||||
|
# Get file command
|
||||||
|
file_parser = subparsers.add_parser('get-file', help='Get file details')
|
||||||
|
file_parser.add_argument('--file-id', required=True, help='File ID')
|
||||||
|
file_parser.add_argument('--save', action='store_true', help='Save file data to JSON')
|
||||||
|
|
||||||
|
# Export command
|
||||||
|
export_parser = subparsers.add_parser('export', help='Export an object')
|
||||||
|
export_parser.add_argument(
|
||||||
|
'--profile-id',
|
||||||
|
required=False,
|
||||||
|
help='Profile ID (optional, will be fetched automatically if not provided)')
|
||||||
|
export_parser.add_argument('--file-id', required=True, help='File ID')
|
||||||
|
export_parser.add_argument('--page-id', required=True, help='Page ID')
|
||||||
|
export_parser.add_argument('--object-id', required=True, help='Object ID')
|
||||||
|
export_parser.add_argument(
|
||||||
|
'--type',
|
||||||
|
default='png',
|
||||||
|
choices=[
|
||||||
|
'png',
|
||||||
|
'svg',
|
||||||
|
'pdf'],
|
||||||
|
help='Export type')
|
||||||
|
export_parser.add_argument('--scale', type=int, default=1, help='Scale factor')
|
||||||
|
export_parser.add_argument('--output', required=True, help='Output file path')
|
||||||
|
|
||||||
|
# Parse arguments
|
||||||
|
args = parser.parse_args()
|
||||||
|
|
||||||
|
# Create API client
|
||||||
|
api = PenpotAPI(debug=args.debug)
|
||||||
|
|
||||||
|
# Handle different commands
|
||||||
|
if args.command == 'list-projects':
|
||||||
|
projects = api.list_projects()
|
||||||
|
print(f"Found {len(projects)} projects:")
|
||||||
|
for project in projects:
|
||||||
|
print(f"- {project.get('name')} - {project.get('teamName')} (ID: {project.get('id')})")
|
||||||
|
|
||||||
|
elif args.command == 'get-project':
|
||||||
|
project = api.get_project(args.id)
|
||||||
|
if project:
|
||||||
|
print(f"Project: {project.get('name')}")
|
||||||
|
print(json.dumps(project, indent=2))
|
||||||
|
else:
|
||||||
|
print(f"Project not found: {args.id}")
|
||||||
|
|
||||||
|
elif args.command == 'list-files':
|
||||||
|
files = api.get_project_files(args.project_id)
|
||||||
|
print(f"Found {len(files)} files:")
|
||||||
|
for file in files:
|
||||||
|
print(f"- {file.get('name')} (ID: {file.get('id')})")
|
||||||
|
|
||||||
|
elif args.command == 'get-file':
|
||||||
|
file_data = api.get_file(args.file_id, save_data=args.save)
|
||||||
|
print(f"File: {file_data.get('name')}")
|
||||||
|
if args.save:
|
||||||
|
print(f"Data saved to {args.file_id}.json")
|
||||||
|
else:
|
||||||
|
print("File metadata:")
|
||||||
|
print(json.dumps({k: v for k, v in file_data.items() if k != 'data'}, indent=2))
|
||||||
|
|
||||||
|
elif args.command == 'export':
|
||||||
|
output_path = api.export_and_download(
|
||||||
|
file_id=args.file_id,
|
||||||
|
page_id=args.page_id,
|
||||||
|
object_id=args.object_id,
|
||||||
|
export_type=args.type,
|
||||||
|
scale=args.scale,
|
||||||
|
save_to_file=args.output,
|
||||||
|
profile_id=args.profile_id
|
||||||
|
)
|
||||||
|
print(f"Exported to: {output_path}")
|
||||||
|
else:
|
||||||
|
parser.print_help()
|
||||||
|
|
||||||
|
|
||||||
|
if __name__ == '__main__':
|
||||||
|
main()
|
||||||
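The summary returned by `analyze_file_structure` above is plain dictionary counting over the `pagesIndex` tree. The following standalone sketch replays the same counting logic on a minimal, hand-built payload shaped like a `get_file` response; `summarize` and `sample_file` are illustrative names, not part of the tool itself:

```python
from typing import Any, Dict


def summarize(file_data: Dict[str, Any]) -> Dict[str, Any]:
    """Standalone version of the analyze_file_structure counting logic."""
    data = file_data.get('data', {})
    pages = data.get('pagesIndex', {})
    object_types: Dict[str, int] = {}
    total_objects = 0
    for page_data in pages.values():
        objects = page_data.get('objects', {})
        total_objects += len(objects)
        for obj_data in objects.values():
            obj_type = obj_data.get('type', 'unknown')
            object_types[obj_type] = object_types.get(obj_type, 0) + 1
    return {
        'pageCount': len(pages),
        'objectCount': total_objects,
        'objectTypes': object_types,
        'fileName': file_data.get('name', 'Unknown'),
    }


# Hypothetical payload mimicking the shape of a get_file response
sample_file = {
    'name': 'Demo',
    'data': {
        'pagesIndex': {
            'page-1': {
                'objects': {
                    'a': {'type': 'frame'},
                    'b': {'type': 'rect'},
                    'c': {'type': 'rect'},
                }
            },
        },
    },
}

print(summarize(sample_file))
# → {'pageCount': 1, 'objectCount': 3, 'objectTypes': {'frame': 1, 'rect': 2}, 'fileName': 'Demo'}
```

Because the walk uses `.get(...)` with defaults at every level, a payload missing `data` or `objects` degrades to zero counts instead of raising.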
299
penpot_mcp/resources/penpot-schema.json
Normal file
@@ -0,0 +1,299 @@
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "required": ["colors", "typographies", "pages", "components", "id", "tokensLib", "pagesIndex"],
  "properties": {
    "colors": {
      "type": "object",
      "patternProperties": {
        "^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$": {
          "type": "object",
          "required": ["path", "color", "name", "modifiedAt", "opacity", "id"],
          "properties": {
            "path": {"type": "string"},
            "color": {"type": "string", "pattern": "^#[0-9A-Fa-f]{6}$"},
            "name": {"type": "string"},
            "modifiedAt": {"type": "string", "format": "date-time"},
            "opacity": {"type": "number", "minimum": 0, "maximum": 1},
            "id": {"type": "string", "format": "uuid"}
          }
        }
      }
    },
    "typographies": {
      "type": "object",
      "patternProperties": {
        "^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$": {
          "type": "object",
          "required": ["lineHeight", "path", "fontStyle", "textTransform", "fontId", "fontSize", "fontWeight", "name", "modifiedAt", "fontVariantId", "id", "letterSpacing", "fontFamily"],
          "properties": {
            "lineHeight": {"type": "string"},
            "path": {"type": "string"},
            "fontStyle": {"type": "string", "enum": ["normal"]},
            "textTransform": {"type": "string", "enum": ["uppercase", "none"]},
            "fontId": {"type": "string"},
            "fontSize": {"type": "string"},
            "fontWeight": {"type": "string"},
            "name": {"type": "string"},
            "modifiedAt": {"type": "string", "format": "date-time"},
            "fontVariantId": {"type": "string"},
            "id": {"type": "string", "format": "uuid"},
            "letterSpacing": {"type": "string"},
            "fontFamily": {"type": "string"}
          }
        }
      }
    },
    "pages": {
      "type": "array",
      "items": {"type": "string", "format": "uuid"}
    },
    "components": {
      "type": "object",
      "patternProperties": {
        "^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$": {
          "type": "object",
          "required": ["id", "name", "path", "modifiedAt", "mainInstanceId", "mainInstancePage"],
          "properties": {
            "id": {"type": "string", "format": "uuid"},
            "name": {"type": "string"},
            "path": {"type": "string"},
            "modifiedAt": {"type": "string", "format": "date-time"},
            "mainInstanceId": {"type": "string", "format": "uuid"},
            "mainInstancePage": {"type": "string", "format": "uuid"},
            "annotation": {"type": "string"}
          }
        }
      }
    },
    "id": {"type": "string", "format": "uuid"},
    "tokensLib": {
      "type": "object",
      "required": ["sets", "themes", "activeThemes"],
      "properties": {
        "sets": {
          "type": "object",
          "patternProperties": {
            "^S-[a-z]+$": {
              "type": "object",
              "required": ["name", "description", "modifiedAt", "tokens"],
              "properties": {
                "name": {"type": "string"},
                "description": {"type": "string"},
                "modifiedAt": {"type": "string", "format": "date-time"},
                "tokens": {
                  "type": "object",
                  "patternProperties": {
                    "^[a-z][a-z0-9.-]*$": {
                      "type": "object",
                      "required": ["name", "type", "value", "description", "modifiedAt"],
                      "properties": {
                        "name": {"type": "string"},
                        "type": {"type": "string", "enum": ["dimensions", "sizing", "color", "border-radius", "spacing", "stroke-width", "rotation", "opacity"]},
                        "value": {"type": "string"},
                        "description": {"type": "string"},
                        "modifiedAt": {"type": "string", "format": "date-time"}
                      }
                    }
                  }
                }
              }
            }
          }
        },
        "themes": {
          "type": "object",
          "patternProperties": {
            ".*": {
              "type": "object",
              "patternProperties": {
                ".*": {
                  "type": "object",
                  "required": ["name", "group", "description", "isSource", "id", "modifiedAt", "sets"],
                  "properties": {
                    "name": {"type": "string"},
                    "group": {"type": "string"},
                    "description": {"type": "string"},
                    "isSource": {"type": "boolean"},
                    "id": {"type": "string", "format": "uuid"},
                    "modifiedAt": {"type": "string", "format": "date-time"},
                    "sets": {"type": "array", "items": {"type": "string"}}
                  }
                }
              }
            }
          }
        },
        "activeThemes": {
          "type": "array",
          "items": {"type": "string"}
        }
      }
    },
    "options": {
      "type": "object",
      "properties": {
        "componentsV2": {"type": "boolean"}
      }
    },
    "pagesIndex": {
      "type": "object",
      "patternProperties": {
        "^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$": {
          "type": "object",
          "required": ["options", "objects", "id", "name"],
          "properties": {
            "options": {"type": "object"},
            "objects": {
              "type": "object",
              "patternProperties": {
                "^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$": {
                  "type": "object",
                  "required": ["id", "name", "type"],
                  "properties": {
                    "id": {"type": "string", "format": "uuid"},
                    "name": {"type": "string"},
                    "type": {"type": "string", "enum": ["frame", "rect", "text"]},
                    "x": {"type": "number"},
                    "y": {"type": "number"},
                    "width": {"type": "number"},
                    "height": {"type": "number"},
                    "rotation": {"type": "number"},
                    "selrect": {
                      "type": "object",
                      "properties": {
                        "x": {"type": "number"},
                        "y": {"type": "number"},
                        "width": {"type": "number"},
                        "height": {"type": "number"},
                        "x1": {"type": "number"},
                        "y1": {"type": "number"},
                        "x2": {"type": "number"},
                        "y2": {"type": "number"}
                      }
                    },
                    "points": {
                      "type": "array",
                      "items": {
                        "type": "object",
                        "properties": {
                          "x": {"type": "number"},
                          "y": {"type": "number"}
                        }
                      }
                    },
                    "transform": {
                      "type": "object",
                      "properties": {
                        "a": {"type": "number"},
                        "b": {"type": "number"},
                        "c": {"type": "number"},
                        "d": {"type": "number"},
                        "e": {"type": "number"},
                        "f": {"type": "number"}
                      }
                    },
                    "transformInverse": {
                      "type": "object",
                      "properties": {
                        "a": {"type": "number"},
                        "b": {"type": "number"},
                        "c": {"type": "number"},
                        "d": {"type": "number"},
                        "e": {"type": "number"},
                        "f": {"type": "number"}
                      }
                    },
                    "parentId": {"type": "string", "format": "uuid"},
                    "frameId": {"type": "string", "format": "uuid"},
                    "flipX": {"type": ["null", "boolean"]},
                    "flipY": {"type": ["null", "boolean"]},
                    "hideFillOnExport": {"type": "boolean"},
                    "growType": {"type": "string", "enum": ["fixed", "auto-height"]},
                    "hideInViewer": {"type": "boolean"},
                    "r1": {"type": "number"},
                    "r2": {"type": "number"},
                    "r3": {"type": "number"},
                    "r4": {"type": "number"},
                    "proportion": {"type": "number"},
                    "proportionLock": {"type": "boolean"},
                    "componentRoot": {"type": "boolean"},
                    "componentId": {"type": "string", "format": "uuid"},
                    "mainInstance": {"type": "boolean"},
                    "componentFile": {"type": "string", "format": "uuid"},
                    "strokes": {
                      "type": "array",
                      "items": {
                        "type": "object",
                        "properties": {
                          "strokeStyle": {"type": "string"},
                          "strokeAlignment": {"type": "string"},
                          "strokeWidth": {"type": "number"},
                          "strokeColor": {"type": "string"},
                          "strokeOpacity": {"type": "number"}
                        }
                      }
                    },
                    "fills": {
                      "type": "array",
                      "items": {
                        "type": "object",
                        "properties": {
                          "fillColor": {"type": "string"},
                          "fillOpacity": {"type": "number"},
                          "fillImage": {
                            "type": "object",
                            "properties": {
                              "name": {"type": "string"},
                              "width": {"type": "number"},
                              "height": {"type": "number"},
                              "mtype": {"type": "string"},
                              "id": {"type": "string", "format": "uuid"},
                              "keepAspectRatio": {"type": "boolean"}
                            }
                          }
                        }
                      }
                    },
                    "shapes": {
                      "type": "array",
                      "items": {"type": "string", "format": "uuid"}
                    },
                    "content": {
                      "type": "object",
                      "properties": {
                        "type": {"type": "string"},
                        "children": {"type": "array"}
                      }
                    },
                    "appliedTokens": {"type": "object"},
                    "positionData": {"type": "array"},
                    "layoutItemMarginType": {"type": "string"},
                    "constraintsV": {"type": "string"},
                    "constraintsH": {"type": "string"},
                    "layoutItemMargin": {"type": "object"},
                    "layoutGapType": {"type": "string"},
                    "layoutPadding": {"type": "object"},
                    "layoutWrapType": {"type": "string"},
                    "layout": {"type": "string"},
                    "layoutAlignItems": {"type": "string"},
                    "layoutPaddingType": {"type": "string"},
                    "layoutItemHSizing": {"type": "string"},
                    "layoutGap": {"type": "object"},
                    "layoutItemVSizing": {"type": "string"},
                    "layoutJustifyContent": {"type": "string"},
                    "layoutFlexDir": {"type": "string"},
                    "layoutAlignContent": {"type": "string"},
                    "shapeRef": {"type": "string", "format": "uuid"}
                  }
                }
              }
            },
            "id": {"type": "string", "format": "uuid"},
            "name": {"type": "string"}
          }
        }
      }
    }
  }
}
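The schema above keys `colors`, `typographies`, `components`, and `pagesIndex` entries by UUID using `patternProperties`, so any non-UUID key in those maps would fail validation. A quick stdlib-only sketch of that key check, reusing the exact regular expression from the schema (the helper name `valid_index_keys` is illustrative):

```python
import re

# Identical to the key pattern used in the schema's "patternProperties"
UUID_KEY = re.compile(r"^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$")


def valid_index_keys(index: dict) -> bool:
    """Check that every key in a UUID-keyed index matches the schema's pattern."""
    return all(UUID_KEY.match(key) for key in index)


print(valid_index_keys({"3f2504e0-4f89-11d3-9a0c-0305e82c3301": {"color": "#ff0000"}}))  # → True
print(valid_index_keys({"not-a-uuid": {}}))  # → False
```

Note the pattern only accepts lowercase hex digits, matching how Penpot serializes UUIDs; a full structural check would use a JSON Schema validator against the whole document.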
295
penpot_mcp/resources/penpot-tree-schema.json
Normal file
@@ -0,0 +1,295 @@
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "required": ["colors", "typographies", "pages", "components", "id", "tokensLib", "pagesIndex"],
  "properties": {
    "colors": {
      "type": "object",
      "patternProperties": {
        "^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$": {
          "type": "object",
          "required": ["path", "color", "name", "modifiedAt", "opacity", "id"],
          "properties": {
            "path": {"type": "string"},
            "color": {"type": "string", "pattern": "^#[0-9A-Fa-f]{6}$"},
            "name": {"type": "string"},
            "modifiedAt": {"type": "string", "format": "date-time"},
            "opacity": {"type": "number", "minimum": 0, "maximum": 1},
            "id": {"type": "string", "format": "uuid"}
          }
        }
      }
    },
    "typographies": {
      "type": "object",
      "patternProperties": {
        "^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$": {
          "type": "object",
          "required": ["lineHeight", "path", "fontStyle", "textTransform", "fontId", "fontSize", "fontWeight", "name", "modifiedAt", "fontVariantId", "id", "letterSpacing", "fontFamily"],
          "properties": {
            "lineHeight": {"type": "string"},
            "path": {"type": "string"},
            "fontStyle": {"type": "string", "enum": ["normal"]},
            "textTransform": {"type": "string", "enum": ["uppercase", "none"]},
            "fontId": {"type": "string"},
            "fontSize": {"type": "string"},
            "fontWeight": {"type": "string"},
            "name": {"type": "string"},
            "modifiedAt": {"type": "string", "format": "date-time"},
            "fontVariantId": {"type": "string"},
            "id": {"type": "string", "format": "uuid"},
            "letterSpacing": {"type": "string"},
            "fontFamily": {"type": "string"}
          }
        }
      }
    },
    "components": {
      "type": "object",
      "patternProperties": {
        "^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$": {
          "type": "object",
          "required": ["id", "name", "path", "modifiedAt", "mainInstanceId", "mainInstancePage"],
          "properties": {
            "id": {"type": "string", "format": "uuid"},
            "name": {"type": "string"},
            "path": {"type": "string"},
            "modifiedAt": {"type": "string", "format": "date-time"},
            "mainInstanceId": {"type": "string", "format": "uuid"},
            "mainInstancePage": {"type": "string", "format": "uuid"},
            "annotation": {"type": "string"}
          }
        }
      }
    },
    "id": {"type": "string", "format": "uuid"},
    "tokensLib": {
      "type": "object",
      "required": ["sets", "themes", "activeThemes"],
      "properties": {
        "sets": {
          "type": "object",
          "patternProperties": {
            "^S-[a-z]+$": {
              "type": "object",
              "required": ["name", "description", "modifiedAt", "tokens"],
              "properties": {
                "name": {"type": "string"},
                "description": {"type": "string"},
                "modifiedAt": {"type": "string", "format": "date-time"},
                "tokens": {
                  "type": "object",
                  "patternProperties": {
                    "^[a-z][a-z0-9.-]*$": {
                      "type": "object",
                      "required": ["name", "type", "value", "description", "modifiedAt"],
                      "properties": {
                        "name": {"type": "string"},
                        "type": {"type": "string", "enum": ["dimensions", "sizing", "color", "border-radius", "spacing", "stroke-width", "rotation", "opacity"]},
                        "value": {"type": "string"},
                        "description": {"type": "string"},
                        "modifiedAt": {"type": "string", "format": "date-time"}
                      }
                    }
                  }
                }
              }
            }
          }
        },
        "themes": {
          "type": "object",
          "patternProperties": {
            ".*": {
              "type": "object",
              "patternProperties": {
                ".*": {
                  "type": "object",
                  "required": ["name", "group", "description", "isSource", "id", "modifiedAt", "sets"],
                  "properties": {
                    "name": {"type": "string"},
                    "group": {"type": "string"},
                    "description": {"type": "string"},
                    "isSource": {"type": "boolean"},
                    "id": {"type": "string", "format": "uuid"},
                    "modifiedAt": {"type": "string", "format": "date-time"},
                    "sets": {"type": "array", "items": {"type": "string"}}
                  }
                }
              }
            }
          }
        },
        "activeThemes": {
          "type": "array",
          "items": {"type": "string"}
        }
      }
    },
    "options": {
      "type": "object",
      "properties": {
        "componentsV2": {"type": "boolean"}
      }
    },
    "objects": {
      "type": "object",
      "patternProperties": {
        "^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$": {
          "type": "object",
          "required": ["options", "objects", "id", "name"],
          "properties": {
            "options": {"type": "object"},
            "objects": {
              "type": "object",
              "patternProperties": {
                "^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$": {
                  "type": "object",
                  "required": ["id", "name", "type"],
                  "properties": {
                    "id": {"type": "string", "format": "uuid"},
                    "name": {"type": "string"},
                    "type": {"type": "string", "enum": ["frame", "rect", "text"]},
                    "x": {"type": "number"},
                    "y": {"type": "number"},
                    "width": {"type": "number"},
                    "height": {"type": "number"},
                    "rotation": {"type": "number"},
                    "selrect": {
                      "type": "object",
                      "properties": {
                        "x": {"type": "number"},
                        "y": {"type": "number"},
                        "width": {"type": "number"},
                        "height": {"type": "number"},
                        "x1": {"type": "number"},
                        "y1": {"type": "number"},
                        "x2": {"type": "number"},
                        "y2": {"type": "number"}
                      }
                    },
                    "points": {
                      "type": "array",
                      "items": {
                        "type": "object",
                        "properties": {
                          "x": {"type": "number"},
                          "y": {"type": "number"}
                        }
                      }
                    },
                    "transform": {
                      "type": "object",
                      "properties": {
                        "a": {"type": "number"},
                        "b": {"type": "number"},
                        "c": {"type": "number"},
                        "d": {"type": "number"},
                        "e": {"type": "number"},
                        "f": {"type": "number"}
                      }
                    },
                    "transformInverse": {
                      "type": "object",
                      "properties": {
                        "a": {"type": "number"},
                        "b": {"type": "number"},
                        "c": {"type": "number"},
                        "d": {"type": "number"},
                        "e": {"type": "number"},
                        "f": {"type": "number"}
                      }
                    },
                    "parentId": {"type": "string", "format": "uuid"},
                    "frameId": {"type": "string", "format": "uuid"},
                    "flipX": {"type": ["null", "boolean"]},
                    "flipY": {"type": ["null", "boolean"]},
                    "hideFillOnExport": {"type": "boolean"},
                    "growType": {"type": "string", "enum": ["fixed", "auto-height"]},
                    "hideInViewer": {"type": "boolean"},
                    "r1": {"type": "number"},
                    "r2": {"type": "number"},
                    "r3": {"type": "number"},
                    "r4": {"type": "number"},
                    "proportion": {"type": "number"},
                    "proportionLock": {"type": "boolean"},
                    "componentRoot": {"type": "boolean"},
                    "componentId": {"type": "string", "format": "uuid"},
                    "mainInstance": {"type": "boolean"},
                    "componentFile": {"type": "string", "format": "uuid"},
                    "strokes": {
                      "type": "array",
                      "items": {
                        "type": "object",
                        "properties": {
                          "strokeStyle": {"type": "string"},
                          "strokeAlignment": {"type": "string"},
                          "strokeWidth": {"type": "number"},
                          "strokeColor": {"type": "string"},
                          "strokeOpacity": {"type": "number"}
                        }
                      }
                    },
                    "fills": {
                      "type": "array",
                      "items": {
                        "type": "object",
                        "properties": {
                          "fillColor": {"type": "string"},
                          "fillOpacity": {"type": "number"},
                          "fillImage": {
                            "type": "object",
                            "properties": {
                              "name": {"type": "string"},
                              "width": {"type": "number"},
                              "height": {"type": "number"},
                              "mtype": {"type": "string"},
                              "id": {"type": "string", "format": "uuid"},
                              "keepAspectRatio": {"type": "boolean"}
                            }
                          }
                        }
                      }
                    },
                    "shapes": {
                      "type": "array",
                      "items": {"type": "string", "format": "uuid"}
                    },
                    "content": {
                      "type": "object",
                      "properties": {
                        "type": {"type": "string"},
                        "children": {"type": "array"}
                      }
                    },
                    "appliedTokens": {"type": "object"},
                    "positionData": {"type": "array"},
                    "layoutItemMarginType": {"type": "string"},
                    "constraintsV": {"type": "string"},
                    "constraintsH": {"type": "string"},
                    "layoutItemMargin": {"type": "object"},
                    "layoutGapType": {"type": "string"},
                    "layoutPadding": {"type": "object"},
                    "layoutWrapType": {"type": "string"},
                    "layout": {"type": "string"},
                    "layoutAlignItems": {"type": "string"},
                    "layoutPaddingType": {"type": "string"},
                    "layoutItemHSizing": {"type": "string"},
                    "layoutGap": {"type": "object"},
                    "layoutItemVSizing": {"type": "string"},
                    "layoutJustifyContent": {"type": "string"},
                    "layoutFlexDir": {"type": "string"},
                    "layoutAlignContent": {"type": "string"},
                    "shapeRef": {"type": "string", "format": "uuid"}
                  }
                }
              }
            },
            "id": {"type": "string", "format": "uuid"},
            "name": {"type": "string"}
          }
        }
      }
    }
  }
}
1
penpot_mcp/server/__init__.py
Normal file
@@ -0,0 +1 @@
"""Server implementation for the Penpot MCP server."""
279
penpot_mcp/server/client.py
Normal file
@@ -0,0 +1,279 @@
"""Client for connecting to the Penpot MCP server."""

import asyncio
from typing import Any, Dict, List, Optional

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client


class PenpotMCPClient:
    """Client for interacting with the Penpot MCP server."""

    def __init__(self, server_command="python", server_args=None, env=None):
        """
        Initialize the Penpot MCP client.

        Args:
            server_command: The command to run the server
            server_args: Arguments to pass to the server command
            env: Environment variables for the server process
        """
        self.server_command = server_command
        self.server_args = server_args or ["-m", "penpot_mcp.server.mcp_server"]
        self.env = env
        self.session = None

    async def connect(self):
        """
        Connect to the MCP server.

        Returns:
            The client session
        """
        # Create server parameters for stdio connection
        server_params = StdioServerParameters(
            command=self.server_command,
            args=self.server_args,
            env=self.env,
        )

        # Connect to the server. The context managers are entered manually
        # here so the connection can outlive this call; they are exited in
        # disconnect().
        read, write = await stdio_client(server_params).__aenter__()
        self.session = await ClientSession(read, write).__aenter__()

        # Initialize the connection
        await self.session.initialize()

        return self.session

    async def disconnect(self):
        """Disconnect from the server."""
        if self.session:
            await self.session.__aexit__(None, None, None)
            self.session = None

    async def list_resources(self) -> List[Dict[str, Any]]:
        """
        List available resources from the server.

        Returns:
            List of resource information
        """
        if not self.session:
            raise RuntimeError("Not connected to server")

        return await self.session.list_resources()

    async def list_tools(self) -> List[Dict[str, Any]]:
        """
        List available tools from the server.

        Returns:
            List of tool information
        """
        if not self.session:
            raise RuntimeError("Not connected to server")

        return await self.session.list_tools()

    async def get_server_info(self) -> Dict[str, Any]:
        """
        Get server information.

        Returns:
            Server information
        """
        if not self.session:
            raise RuntimeError("Not connected to server")

        info, _ = await self.session.read_resource("server://info")
        return info

    async def list_projects(self) -> Dict[str, Any]:
        """
        List Penpot projects.

        Returns:
            Project information
        """
        if not self.session:
            raise RuntimeError("Not connected to server")

        return await self.session.call_tool("list_projects")

    async def get_project(self, project_id: str) -> Dict[str, Any]:
        """
        Get details for a specific project.

        Args:
            project_id: The project ID

        Returns:
            Project information
        """
        if not self.session:
            raise RuntimeError("Not connected to server")

        return await self.session.call_tool("get_project", {"project_id": project_id})

    async def get_project_files(self, project_id: str) -> Dict[str, Any]:
        """
        Get files for a specific project.

        Args:
            project_id: The project ID

        Returns:
            File information
        """
        if not self.session:
            raise RuntimeError("Not connected to server")

        return await self.session.call_tool("get_project_files", {"project_id": project_id})

    async def get_file(self, file_id: str, features: Optional[List[str]] = None,
                       project_id: Optional[str] = None) -> Dict[str, Any]:
        """
        Get details for a specific file.

        Args:
            file_id: The file ID
            features: List of features to include
            project_id: Optional project ID

        Returns:
            File information
        """
        if not self.session:
            raise RuntimeError("Not connected to server")

        params = {"file_id": file_id}
        if features:
            params["features"] = features
        if project_id:
            params["project_id"] = project_id

        return await self.session.call_tool("get_file", params)

    async def get_components(self) -> Dict[str, Any]:
        """
        Get components from the server.

        Returns:
            Component information
        """
        if not self.session:
            raise RuntimeError("Not connected to server")

        components, _ = await self.session.read_resource("content://components")
        return components

    async def export_object(self, file_id: str, page_id: str, object_id: str,
                            export_type: str = "png", scale: int = 1,
                            save_to_file: Optional[str] = None) -> Dict[str, Any]:
        """
        Export an object from a Penpot file.

        Args:
            file_id: The ID of the file containing the object
            page_id: The ID of the page containing the object
            object_id: The ID of the object to export
            export_type: Export format (png, svg, pdf)
            scale: Scale factor for the export
            save_to_file: Optional path to save the exported file

        Returns:
            If save_to_file is None: Dictionary with the exported image data
            If save_to_file is provided: Dictionary with the saved file path
        """
        if not self.session:
            raise RuntimeError("Not connected to server")

        params = {
            "file_id": file_id,
            "page_id": page_id,
            "object_id": object_id,
            "export_type": export_type,
            "scale": scale
        }

        result = await self.session.call_tool("export_object", params)

        # The result is now directly an Image object which has 'data' and 'format' fields

        # If the client wants to save the file
        if save_to_file:
import os
|
||||||
|
|
||||||
|
# Create directory if it doesn't exist
|
||||||
|
os.makedirs(os.path.dirname(os.path.abspath(save_to_file)), exist_ok=True)
|
||||||
|
|
||||||
|
# Save to file
|
||||||
|
with open(save_to_file, "wb") as f:
|
||||||
|
f.write(result["data"])
|
||||||
|
|
||||||
|
return {"file_path": save_to_file, "format": result.get("format")}
|
||||||
|
|
||||||
|
# Otherwise return the result as is
|
||||||
|
return result
|
||||||
|
|
||||||
|
|
||||||
|
async def run_client_example():
|
||||||
|
"""Run a simple example using the client."""
|
||||||
|
# Create and connect the client
|
||||||
|
client = PenpotMCPClient()
|
||||||
|
await client.connect()
|
||||||
|
|
||||||
|
try:
|
||||||
|
# Get server info
|
||||||
|
print("Getting server info...")
|
||||||
|
server_info = await client.get_server_info()
|
||||||
|
print(f"Server info: {server_info}")
|
||||||
|
|
||||||
|
# List projects
|
||||||
|
print("\nListing projects...")
|
||||||
|
projects_result = await client.list_projects()
|
||||||
|
if "error" in projects_result:
|
||||||
|
print(f"Error: {projects_result['error']}")
|
||||||
|
else:
|
||||||
|
projects = projects_result.get("projects", [])
|
||||||
|
print(f"Found {len(projects)} projects:")
|
||||||
|
for project in projects[:5]: # Show first 5 projects
|
||||||
|
print(f"- {project.get('name', 'Unknown')} (ID: {project.get('id', 'N/A')})")
|
||||||
|
|
||||||
|
# Example of exporting an object (uncomment and update with actual IDs to test)
|
||||||
|
"""
|
||||||
|
print("\nExporting object...")
|
||||||
|
# Replace with actual IDs from your Penpot account
|
||||||
|
export_result = await client.export_object(
|
||||||
|
file_id="your-file-id",
|
||||||
|
page_id="your-page-id",
|
||||||
|
object_id="your-object-id",
|
||||||
|
export_type="png",
|
||||||
|
scale=2,
|
||||||
|
save_to_file="exported_object.png"
|
||||||
|
)
|
||||||
|
print(f"Export saved to: {export_result.get('file_path')}")
|
||||||
|
|
||||||
|
# Or get the image data directly without saving
|
||||||
|
image_data = await client.export_object(
|
||||||
|
file_id="your-file-id",
|
||||||
|
page_id="your-page-id",
|
||||||
|
object_id="your-object-id"
|
||||||
|
)
|
||||||
|
print(f"Received image in format: {image_data.get('format')}")
|
||||||
|
print(f"Image size: {len(image_data.get('data'))} bytes")
|
||||||
|
"""
|
||||||
|
finally:
|
||||||
|
# Disconnect from the server
|
||||||
|
await client.disconnect()
|
||||||
|
|
||||||
|
|
||||||
|
def main():
|
||||||
|
"""Run the client example."""
|
||||||
|
asyncio.run(run_client_example())
|
||||||
|
|
||||||
|
|
||||||
|
if __name__ == "__main__":
|
||||||
|
main()
|
||||||
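The save-to-file branch of `export_object` above (create the parent directory, then write the raw bytes) can be sketched in isolation. This is a minimal, self-contained illustration; the `save_export` helper name and the stand-in PNG header bytes are hypothetical, not part of the client API:

```python
import os
import tempfile


def save_export(data: bytes, save_to_file: str) -> str:
    """Persist exported image bytes, creating parent directories first."""
    # Mirrors the client's pattern: ensure the target directory exists,
    # then write the bytes in binary mode.
    os.makedirs(os.path.dirname(os.path.abspath(save_to_file)), exist_ok=True)
    with open(save_to_file, "wb") as f:
        f.write(data)
    return save_to_file


# Hypothetical usage with stand-in bytes (a PNG magic header):
target = os.path.join(tempfile.mkdtemp(), "exports", "object.png")
path = save_export(b"\x89PNG\r\n\x1a\n", target)
print(os.path.exists(path))  # True
```

Using `exist_ok=True` makes repeated exports to the same directory idempotent rather than raising on the second call.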
431
penpot_mcp/server/mcp_server.py
Normal file
@@ -0,0 +1,431 @@
"""
Main MCP server implementation for Penpot.

This module defines the MCP server with resources and tools for interacting with
the Penpot design platform.
"""

import argparse
import hashlib
import json
import os
import re
import sys
from typing import Dict, List, Optional

from mcp.server.fastmcp import FastMCP, Image

from penpot_mcp.api.penpot_api import PenpotAPI
from penpot_mcp.tools.penpot_tree import get_object_subtree_with_fields
from penpot_mcp.utils import config
from penpot_mcp.utils.cache import MemoryCache
from penpot_mcp.utils.http_server import ImageServer


class PenpotMCPServer:
    """Penpot MCP Server implementation."""

    def __init__(self, name="Penpot MCP Server", test_mode=False):
        """
        Initialize the Penpot MCP Server.

        Args:
            name: Server name
            test_mode: If True, certain features such as the HTTP server are disabled for testing
        """
        # Initialize the MCP server
        self.mcp = FastMCP(name, instructions="""
I can help you generate code from your Penpot UI designs. My primary aim is to convert Penpot design components into functional code.

The typical workflow for code generation from Penpot designs is:

1. List your projects using 'list_projects' to find the project containing your designs
2. List files within the project using 'get_project_files' to locate the specific design file
3. Search for the target component within the file using 'search_object' to find the component you want to convert
4. Retrieve the Penpot tree schema using 'penpot_tree_schema' to understand which fields are available in the object tree
5. Get a cropped version of the object tree with a screenshot using 'get_object_tree' to see the component structure and visual representation
6. Get the full screenshot of the object using 'get_rendered_component' for detailed visual reference

For complex designs, you may need multiple iterations of 'get_object_tree' and 'get_rendered_component' due to LLM context limits.

Use the resources to access schemas, cached files, and rendered objects (screenshots) as needed.

Let me know which Penpot design you'd like to convert to code, and I'll guide you through the process!
""")

        # Initialize the Penpot API client
        self.api = PenpotAPI(
            base_url=config.PENPOT_API_URL,
            debug=config.DEBUG
        )

        # Initialize the in-memory file cache
        self.file_cache = MemoryCache(ttl_seconds=600)  # 10 minutes

        # Storage for rendered component images
        self.rendered_components: Dict[str, Image] = {}

        # Initialize the HTTP server for images if enabled and not in test mode
        self.image_server = None
        self.image_server_url = None

        # Detect if running in a test environment
        is_test_env = test_mode or 'pytest' in sys.modules

        if config.ENABLE_HTTP_SERVER and not is_test_env:
            try:
                self.image_server = ImageServer(
                    host=config.HTTP_SERVER_HOST,
                    port=config.HTTP_SERVER_PORT
                )
                # Start the server and get the URL with the actual port assigned
                self.image_server_url = self.image_server.start()
                print(f"Image server started at {self.image_server_url}")
            except Exception as e:
                print(f"Warning: Failed to start image server: {str(e)}")

        # Register resources and tools
        if config.RESOURCES_AS_TOOLS:
            self._register_resources(resources_only=True)
            self._register_tools(include_resource_tools=True)
        else:
            self._register_resources(resources_only=False)
            self._register_tools(include_resource_tools=False)

    def _register_resources(self, resources_only=False):
        """Register all MCP resources. If resources_only is True, only register server://info as a resource."""
        @self.mcp.resource("server://info")
        def server_info() -> dict:
            """Provide information about the server."""
            info = {
                "status": "online",
                "name": "Penpot MCP Server",
                "description": "Model Context Provider for Penpot",
                "api_url": config.PENPOT_API_URL
            }

            if self.image_server and self.image_server.is_running:
                info["image_server"] = self.image_server_url

            return info

        if resources_only:
            return

        @self.mcp.resource("penpot://schema", mime_type="application/schema+json")
        def penpot_schema() -> dict:
            """Provide the Penpot API schema as JSON."""
            schema_path = os.path.join(config.RESOURCES_PATH, 'penpot-schema.json')
            try:
                with open(schema_path, 'r') as f:
                    return json.load(f)
            except Exception as e:
                return {"error": f"Failed to load schema: {str(e)}"}

        @self.mcp.resource("penpot://tree-schema", mime_type="application/schema+json")
        def penpot_tree_schema() -> dict:
            """Provide the Penpot object tree schema as JSON."""
            schema_path = os.path.join(config.RESOURCES_PATH, 'penpot-tree-schema.json')
            try:
                with open(schema_path, 'r') as f:
                    return json.load(f)
            except Exception as e:
                return {"error": f"Failed to load tree schema: {str(e)}"}

        @self.mcp.resource("rendered-component://{component_id}", mime_type="image/png")
        def get_rendered_component(component_id: str) -> Image:
            """Return a rendered component image by its ID."""
            if component_id in self.rendered_components:
                return self.rendered_components[component_id]
            raise Exception(f"Component with ID {component_id} not found")

        @self.mcp.resource("penpot://cached-files")
        def get_cached_files() -> dict:
            """List all files currently stored in the cache."""
            return self.file_cache.get_all_cached_files()

    def _register_tools(self, include_resource_tools=False):
        """Register all MCP tools. If include_resource_tools is True, also register resource logic as tools."""
        @self.mcp.tool()
        def list_projects() -> dict:
            """Retrieve a list of all available Penpot projects."""
            try:
                projects = self.api.list_projects()
                return {"projects": projects}
            except Exception as e:
                return {"error": str(e)}

        @self.mcp.tool()
        def get_project_files(project_id: str) -> dict:
            """Get all files contained within a specific Penpot project.

            Args:
                project_id: The ID of the Penpot project
            """
            try:
                files = self.api.get_project_files(project_id)
                return {"files": files}
            except Exception as e:
                return {"error": str(e)}

        def get_cached_file(file_id: str) -> dict:
            """Internal helper to retrieve a file, using the cache if available.

            Args:
                file_id: The ID of the Penpot file
            """
            cached_data = self.file_cache.get(file_id)
            if cached_data is not None:
                return cached_data
            try:
                file_data = self.api.get_file(file_id=file_id)
                self.file_cache.set(file_id, file_data)
                return file_data
            except Exception as e:
                return {"error": str(e)}

        @self.mcp.tool()
        def get_file(file_id: str) -> dict:
            """Retrieve a Penpot file by its ID and cache it. Don't use this tool for code generation; use 'get_object_tree' instead.

            Args:
                file_id: The ID of the Penpot file
            """
            try:
                file_data = self.api.get_file(file_id=file_id)
                self.file_cache.set(file_id, file_data)
                return file_data
            except Exception as e:
                return {"error": str(e)}

        @self.mcp.tool()
        def export_object(
                file_id: str,
                page_id: str,
                object_id: str,
                export_type: str = "png",
                scale: int = 1) -> Image:
            """Export a Penpot design object as an image.

            Args:
                file_id: The ID of the Penpot file
                page_id: The ID of the page containing the object
                object_id: The ID of the object to export
                export_type: Image format (png, svg, etc.)
                scale: Scale factor for the exported image
            """
            temp_filename = None
            try:
                import tempfile
                temp_dir = tempfile.gettempdir()
                temp_filename = os.path.join(temp_dir, f"{object_id}.{export_type}")
                output_path = self.api.export_and_download(
                    file_id=file_id,
                    page_id=page_id,
                    object_id=object_id,
                    export_type=export_type,
                    scale=scale,
                    save_to_file=temp_filename
                )
                with open(output_path, "rb") as f:
                    file_content = f.read()

                image = Image(data=file_content, format=export_type)

                # If the HTTP server is enabled, add the image to the server
                if self.image_server and self.image_server.is_running:
                    image_id = hashlib.md5(f"{file_id}:{page_id}:{object_id}".encode()).hexdigest()
                    # Use the current image_server_url to ensure the correct port
                    image_url = self.image_server.add_image(image_id, file_content, export_type)
                    # Add the HTTP URL to the image metadata
                    image.http_url = image_url

                return image
            except Exception as e:
                raise Exception(f"Export failed: {str(e)}")
            finally:
                if temp_filename and os.path.exists(temp_filename):
                    try:
                        os.remove(temp_filename)
                    except Exception as e:
                        print(f"Warning: Failed to delete temporary file {temp_filename}: {str(e)}")

        @self.mcp.tool()
        def get_object_tree(
                file_id: str,
                object_id: str,
                fields: List[str],
                depth: int = -1,
                format: str = "json"
        ) -> dict:
            """Get the object tree structure for a Penpot object ("tree" field) together with a rendered screenshot of the object ("image.mcp_uri" field).

            Args:
                file_id: The ID of the Penpot file
                object_id: The ID of the object to retrieve
                fields: Specific fields to include in the tree (call the "penpot_tree_schema" resource/tool for available fields)
                depth: How deep to traverse the object tree (-1 for full depth)
                format: Output format ('json' or 'yaml')
            """
            try:
                file_data = get_cached_file(file_id)
                if "error" in file_data:
                    return file_data
                result = get_object_subtree_with_fields(
                    file_data,
                    object_id,
                    include_fields=fields,
                    depth=depth
                )
                if "error" in result:
                    return result
                simplified_tree = result["tree"]
                page_id = result["page_id"]
                final_result = {"tree": simplified_tree}

                try:
                    image = export_object(
                        file_id=file_id,
                        page_id=page_id,
                        object_id=object_id
                    )
                    image_id = hashlib.md5(f"{file_id}:{object_id}".encode()).hexdigest()
                    self.rendered_components[image_id] = image

                    # Image URI preferences:
                    # 1. HTTP server URL if available
                    # 2. Fallback to the MCP resource URI
                    image_uri = f"rendered-component://{image_id}"
                    if hasattr(image, 'http_url'):
                        final_result["image"] = {
                            "uri": image.http_url,
                            "mcp_uri": image_uri,
                            "format": image.format if hasattr(image, 'format') else "png"
                        }
                    else:
                        final_result["image"] = {
                            "uri": image_uri,
                            "format": image.format if hasattr(image, 'format') else "png"
                        }
                except Exception as e:
                    final_result["image_error"] = str(e)
                if format.lower() == "yaml":
                    try:
                        import yaml
                        yaml_result = yaml.dump(final_result, default_flow_style=False, sort_keys=False)
                        return {"yaml_result": yaml_result}
                    except ImportError:
                        return {"format_error": "YAML format requested but the PyYAML package is not installed"}
                    except Exception as e:
                        return {"format_error": f"Error formatting as YAML: {str(e)}"}
                return final_result
            except Exception as e:
                return {"error": str(e)}

        @self.mcp.tool()
        def search_object(file_id: str, query: str) -> dict:
            """Search for objects within a Penpot file by name.

            Args:
                file_id: The ID of the Penpot file to search in
                query: Search string (supports regex patterns)
            """
            try:
                file_data = get_cached_file(file_id)
                if "error" in file_data:
                    return file_data
                pattern = re.compile(query, re.IGNORECASE)
                matches = []
                data = file_data.get('data', {})
                for page_id, page_data in data.get('pagesIndex', {}).items():
                    page_name = page_data.get('name', 'Unnamed')
                    for obj_id, obj_data in page_data.get('objects', {}).items():
                        obj_name = obj_data.get('name', '')
                        if pattern.search(obj_name):
                            matches.append({
                                'id': obj_id,
                                'name': obj_name,
                                'page_id': page_id,
                                'page_name': page_name,
                                'object_type': obj_data.get('type', 'unknown')
                            })
                return {'objects': matches}
            except Exception as e:
                return {"error": str(e)}

        if include_resource_tools:
            @self.mcp.tool()
            def penpot_schema() -> dict:
                """Provide the Penpot API schema as JSON."""
                schema_path = os.path.join(config.RESOURCES_PATH, 'penpot-schema.json')
                try:
                    with open(schema_path, 'r') as f:
                        return json.load(f)
                except Exception as e:
                    return {"error": f"Failed to load schema: {str(e)}"}

            @self.mcp.tool()
            def penpot_tree_schema() -> dict:
                """Provide the Penpot object tree schema as JSON."""
                schema_path = os.path.join(config.RESOURCES_PATH, 'penpot-tree-schema.json')
                try:
                    with open(schema_path, 'r') as f:
                        return json.load(f)
                except Exception as e:
                    return {"error": f"Failed to load tree schema: {str(e)}"}

            @self.mcp.tool()
            def get_rendered_component(component_id: str) -> Image:
                """Return a rendered component image by its ID."""
                if component_id in self.rendered_components:
                    return self.rendered_components[component_id]
                raise Exception(f"Component with ID {component_id} not found")

            @self.mcp.tool()
            def get_cached_files() -> dict:
                """List all files currently stored in the cache."""
                return self.file_cache.get_all_cached_files()

    def run(self, port=None, debug=None, mode=None):
        """
        Run the MCP server.

        Args:
            port: Port to run on (overrides config); only used in 'sse' mode
            debug: Debug mode (overrides config)
            mode: MCP mode ('stdio' or 'sse', overrides config)
        """
        # Use provided values or fall back to config
        debug = debug if debug is not None else config.DEBUG

        # Get the mode from the parameter, the environment variable, or default to stdio
        mode = mode or os.environ.get('MODE', 'stdio')

        # Validate the mode
        if mode not in ['stdio', 'sse']:
            print(f"Invalid mode: {mode}. Using stdio mode.")
            mode = 'stdio'

        if mode == 'sse':
            print(f"Starting Penpot MCP Server on port {port} (debug={debug}, mode={mode})")
        else:
            print(f"Starting Penpot MCP Server (debug={debug}, mode={mode})")

        # Start the HTTP server if enabled and not already running
        if config.ENABLE_HTTP_SERVER and self.image_server and not self.image_server.is_running:
            try:
                self.image_server_url = self.image_server.start()
            except Exception as e:
                print(f"Warning: Failed to start image server: {str(e)}")

        self.mcp.run(mode)


def create_server():
    """Create and configure a new server instance."""
    # Detect if running in a test environment
    is_test_env = 'pytest' in sys.modules
    return PenpotMCPServer(test_mode=is_test_env)


# Create a global server instance with a standard name for the MCP tool
server = create_server()


def main():
    """Entry point for the console script."""
    parser = argparse.ArgumentParser(description='Run the Penpot MCP Server')
    parser.add_argument('--port', type=int, help='Port to run on')
    parser.add_argument('--debug', action='store_true', help='Enable debug mode')
    parser.add_argument('--mode', choices=['stdio', 'sse'], default=os.environ.get('MODE', 'stdio'),
                        help='MCP mode (stdio or sse)')

    args = parser.parse_args()
    server.run(port=args.port, debug=args.debug, mode=args.mode)


if __name__ == "__main__":
    main()
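The name-matching logic inside the server's `search_object` tool (a case-insensitive regex scan over every object in `data.pagesIndex`) can be exercised without a running server. This is a self-contained sketch; the `search_objects` helper name and the sample file data are illustrative, not part of the package:

```python
import re


def search_objects(file_data: dict, query: str) -> list:
    """Mimic search_object: case-insensitive regex match over object names."""
    pattern = re.compile(query, re.IGNORECASE)
    matches = []
    for page_id, page_data in file_data.get('data', {}).get('pagesIndex', {}).items():
        page_name = page_data.get('name', 'Unnamed')
        for obj_id, obj_data in page_data.get('objects', {}).items():
            obj_name = obj_data.get('name', '')
            if pattern.search(obj_name):
                matches.append({
                    'id': obj_id,
                    'name': obj_name,
                    'page_id': page_id,
                    'page_name': page_name,
                    'object_type': obj_data.get('type', 'unknown')
                })
    return matches


# Hypothetical minimal file data in the pagesIndex shape the server expects:
sample = {'data': {'pagesIndex': {'p1': {'name': 'Home', 'objects': {
    'o1': {'name': 'Login Button', 'type': 'frame'},
    'o2': {'name': 'Footer', 'type': 'group'}}}}}}
print(search_objects(sample, 'button'))
```

Because the query is compiled as a regex, patterns such as `'button|cta'` or anchored searches like `'^Login'` work as well as plain substrings.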
1
penpot_mcp/tools/__init__.py
Normal file
@@ -0,0 +1 @@
"""Tool implementations for the Penpot MCP server."""
1
penpot_mcp/tools/cli/__init__.py
Normal file
@@ -0,0 +1 @@
"""Command-line interface tools for Penpot MCP."""
62
penpot_mcp/tools/cli/tree_cmd.py
Normal file
@@ -0,0 +1,62 @@
"""Command-line interface for the Penpot tree visualization tool."""

import argparse
import json
import sys
from typing import Any, Dict

from penpot_mcp.tools.penpot_tree import build_tree, export_tree_to_dot, print_tree


def parse_args() -> argparse.Namespace:
    """Parse command line arguments."""
    parser = argparse.ArgumentParser(description='Generate a tree from a Penpot JSON file')
    parser.add_argument('input_file', help='Path to the Penpot JSON file')
    parser.add_argument('--filter', '-f', help='Filter nodes by regex pattern')
    parser.add_argument('--export', '-e', help='Export tree to a file (supports PNG, SVG, etc.)')
    return parser.parse_args()


def load_penpot_file(file_path: str) -> Dict[str, Any]:
    """
    Load a Penpot JSON file.

    Args:
        file_path: Path to the JSON file

    Returns:
        The loaded JSON data

    Raises:
        FileNotFoundError: If the file doesn't exist
        json.JSONDecodeError: If the file isn't valid JSON
    """
    try:
        with open(file_path, 'r') as f:
            return json.load(f)
    except FileNotFoundError:
        sys.exit(f"Error: File not found: {file_path}")
    except json.JSONDecodeError:
        sys.exit(f"Error: Invalid JSON file: {file_path}")


def main() -> None:
    """Main entry point for the command."""
    args = parse_args()

    # Load the Penpot file
    data = load_penpot_file(args.input_file)

    # Build the tree
    root = build_tree(data)

    # Export the tree if requested
    if args.export:
        export_tree_to_dot(root, args.export, args.filter)

    # Print the tree
    print_tree(root, args.filter)


if __name__ == '__main__':
    main()
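The CLI surface defined by `parse_args` above can be checked in isolation by building the parser separately from the parse call. A minimal sketch (the `make_parser` factory name is hypothetical; the real command calls `parser.parse_args()` on `sys.argv` directly):

```python
import argparse


def make_parser() -> argparse.ArgumentParser:
    """Build a parser mirroring the tree_cmd CLI surface."""
    parser = argparse.ArgumentParser(description='Generate a tree from a Penpot JSON file')
    parser.add_argument('input_file', help='Path to the Penpot JSON file')
    parser.add_argument('--filter', '-f', help='Filter nodes by regex pattern')
    parser.add_argument('--export', '-e', help='Export tree to a file (supports PNG, SVG, etc.)')
    return parser


# Parsing an explicit argv list instead of sys.argv makes the CLI testable:
args = make_parser().parse_args(['design.json', '-f', 'button'])
print(args.input_file, args.filter)
```

Separating parser construction from parsing is a common argparse pattern that lets unit tests feed argument lists without spawning a subprocess.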
100
penpot_mcp/tools/cli/validate_cmd.py
Normal file
@@ -0,0 +1,100 @@
"""Command-line interface for validating Penpot files against a schema."""

import argparse
import json
import os
import sys
from typing import Any, Dict, Optional, Tuple

from jsonschema import SchemaError, ValidationError, validate

from penpot_mcp.utils import config


def parse_args() -> argparse.Namespace:
    """Parse command line arguments."""
    parser = argparse.ArgumentParser(description='Validate a Penpot JSON file against a schema')
    parser.add_argument('input_file', help='Path to the Penpot JSON file to validate')
    parser.add_argument(
        '--schema', '-s',
        default=os.path.join(config.RESOURCES_PATH, 'penpot-schema.json'),
        help='Path to the JSON schema file (default: resources/penpot-schema.json)')
    parser.add_argument('--verbose', '-v', action='store_true',
                        help='Enable verbose output with detailed validation errors')
    return parser.parse_args()


def load_json_file(file_path: str) -> Dict[str, Any]:
    """
    Load a JSON file.

    Args:
        file_path: Path to the JSON file

    Returns:
        The loaded JSON data

    Raises:
        FileNotFoundError: If the file doesn't exist
        json.JSONDecodeError: If the file isn't valid JSON
    """
    try:
        with open(file_path, 'r') as f:
            return json.load(f)
    except FileNotFoundError:
        sys.exit(f"Error: File not found: {file_path}")
    except json.JSONDecodeError:
        sys.exit(f"Error: Invalid JSON file: {file_path}")


def validate_penpot_file(data: Dict[str, Any], schema: Dict[str, Any]) -> Tuple[bool, Optional[str]]:
    """
    Validate a Penpot file against a schema.

    Args:
        data: The Penpot file data
        schema: The JSON schema

    Returns:
        Tuple of (is_valid, error_message)
    """
    try:
        validate(instance=data, schema=schema)
        return True, None
    except ValidationError as e:
        return False, str(e)
    except SchemaError as e:
        return False, f"Schema error: {str(e)}"


def main() -> None:
    """Main entry point for the command."""
    args = parse_args()

    # Load the files
    print(f"Loading Penpot file: {args.input_file}")
    data = load_json_file(args.input_file)

    print(f"Loading schema file: {args.schema}")
    schema = load_json_file(args.schema)

    # Validate the file
    print("Validating file...")
    is_valid, error = validate_penpot_file(data, schema)

    if is_valid:
        print("✅ Validation successful! The file conforms to the schema.")
    else:
        print("❌ Validation failed!")
        if args.verbose and error:
            print("\nError details:")
            print(error)
        sys.exit(1)


if __name__ == '__main__':
    main()
||||||
472
penpot_mcp/tools/penpot_tree.py
Normal file
472
penpot_mcp/tools/penpot_tree.py
Normal file
@@ -0,0 +1,472 @@
|
|||||||
|
"""
|
||||||
|
Tool for building and visualizing the structure of Penpot files as a tree.
|
||||||
|
|
||||||
|
This module provides functionality to parse Penpot file data and generate
|
||||||
|
a tree representation, which can be displayed or exported.
|
||||||
|
"""
|
||||||
|
|
||||||
|
import re
|
||||||
|
from typing import Any, Dict, Optional, Union, List
|
||||||
|
|
||||||
|
from anytree import Node, RenderTree
|
||||||
|
from anytree.exporter import DotExporter
|
||||||
|
|
||||||
|
|
||||||
|
def build_tree(data: Dict[str, Any]) -> Node:
    """
    Build a tree representation of Penpot file data.

    Args:
        data: The Penpot file data

    Returns:
        The root node of the tree
    """
    # Nodes dictionary keyed by ID
    nodes = {}

    # Create a synthetic root node with a special ID that won't conflict
    synthetic_root_id = "SYNTHETIC-ROOT"
    root = Node(f"{synthetic_root_id} (root) - Root")
    nodes[synthetic_root_id] = root

    # Add the components section
    components_node = Node("components (section) - Components", parent=root)

    # Store component annotations for later reference
    component_annotations = {}

    # Process components
    for comp_id, comp_data in data.get('components', {}).items():
        comp_name = comp_data.get('name', 'Unnamed')
        comp_node = Node(f"{comp_id} (component) - {comp_name}", parent=components_node)
        nodes[comp_id] = comp_node

        # Store the annotation if present
        if comp_data.get('annotation'):
            component_annotations[comp_id] = comp_data['annotation']

    # First pass: create all page nodes
    for page_id, page_data in data.get('pagesIndex', {}).items():
        page_name = page_data.get('name', 'Unnamed')
        page_node = Node(f"{page_id} (page) - {page_name}", parent=root)
        nodes[page_id] = page_node

    # Second pass: process each page and its objects
    for page_id, page_data in data.get('pagesIndex', {}).items():
        # Page-specific dictionary of object nodes, to avoid cross-page ID collisions
        page_nodes = {}

        # First, create all object nodes for this page
        for obj_id, obj_data in page_data.get('objects', {}).items():
            obj_type = obj_data.get('type', 'unknown')
            obj_name = obj_data.get('name', 'Unnamed')

            node = Node(f"{obj_id} ({obj_type}) - {obj_name}")
            page_nodes[obj_id] = node  # Keyed by original ID for this page's lookup

            # Store additional properties for filtering
            node.obj_type = obj_type
            node.obj_name = obj_name
            node.obj_id = obj_id

            # Add a component reference if this is a component instance
            if 'componentId' in obj_data and obj_data['componentId'] in nodes:
                comp_ref = obj_data['componentId']
                node.componentRef = comp_ref

                # If the referenced component has an annotation, store it too
                if comp_ref in component_annotations:
                    node.componentAnnotation = component_annotations[comp_ref]

        # Identify the all-zeros root frame for this page
        all_zeros_id = "00000000-0000-0000-0000-000000000000"
        page_root_frame = None

        # Find and connect the all-zeros root frame if it exists
        if all_zeros_id in page_data.get('objects', {}):
            page_root_frame = page_nodes[all_zeros_id]
            page_root_frame.parent = nodes[page_id]

        # Then build parent-child relationships for this page
        for obj_id, obj_data in page_data.get('objects', {}).items():
            # Skip the all-zeros root frame as we already processed it
            if obj_id == all_zeros_id:
                continue

            parent_id = obj_data.get('parentId')

            # An object naming itself as parent would create a cycle
            if parent_id and parent_id == obj_id:
                print(f"Warning: Object {obj_id} references itself as parent. Attaching to page instead.")
                page_nodes[obj_id].parent = nodes[page_id]
            elif parent_id and parent_id in page_nodes:
                # Check for circular references further up the node hierarchy
                is_circular = False
                check_node = page_nodes[parent_id]
                while check_node.parent is not None:
                    if hasattr(check_node.parent, 'obj_id') and check_node.parent.obj_id == obj_id:
                        is_circular = True
                        break
                    check_node = check_node.parent

                if is_circular:
                    print(f"Warning: Circular reference detected for {obj_id}. Attaching to page instead.")
                    page_nodes[obj_id].parent = nodes[page_id]
                else:
                    page_nodes[obj_id].parent = page_nodes[parent_id]
            else:
                # With no parent (or an unknown one), attach to the all-zeros
                # root frame if present, otherwise to the page itself
                if page_root_frame:
                    page_nodes[obj_id].parent = page_root_frame
                else:
                    page_nodes[obj_id].parent = nodes[page_id]

    return root


def print_tree(root: Node, filter_pattern: Optional[str] = None) -> None:
    """
    Print a tree representation to the console, with optional filtering.

    Args:
        root: The root node of the tree
        filter_pattern: Optional regex pattern to filter nodes
    """
    matched_nodes = []

    # Apply filtering
    if filter_pattern:
        pattern = re.compile(filter_pattern, re.IGNORECASE)

        # Helper to check whether a node matches the filter
        def matches_filter(node):
            if not hasattr(node, 'obj_type') and not hasattr(node, 'obj_name'):
                return False  # Root node or section nodes
            return bool(
                pattern.search(node.obj_type)
                or pattern.search(node.obj_name)
                or pattern.search(node.obj_id)
            )

        # Collect all matching nodes
        for _, _, node in RenderTree(root):
            if matches_filter(node):
                matched_nodes.append(node)

        # If we found matches, print only these nodes and their ancestors
        if matched_nodes:
            print(f"Filtered results matching '{filter_pattern}':")

            # Build the set of nodes to show: matches plus all their ancestors
            nodes_to_show = set()
            for node in matched_nodes:
                current = node
                while current is not None:
                    nodes_to_show.add(current)
                    current = current.parent

            # Print the filtered tree
            for pre, _, node in RenderTree(root):
                if node in nodes_to_show:
                    node_name = node.name
                    if hasattr(node, 'componentRef'):
                        comp_ref_str = f" (refs component: {node.componentRef}"
                        if hasattr(node, 'componentAnnotation'):
                            comp_ref_str += f" - Note: {node.componentAnnotation}"
                        comp_ref_str += ")"
                        node_name += comp_ref_str

                    # Highlight matched nodes
                    if node in matched_nodes:
                        print(f"{pre}{node_name} <-- MATCH")
                    else:
                        print(f"{pre}{node_name}")

            print(f"\nFound {len(matched_nodes)} matching objects.")
            return

    # If no filter or no matches, print the entire tree
    for pre, _, node in RenderTree(root):
        node_name = node.name
        if hasattr(node, 'componentRef'):
            comp_ref_str = f" (refs component: {node.componentRef}"
            if hasattr(node, 'componentAnnotation'):
                comp_ref_str += f" - Note: {node.componentAnnotation}"
            comp_ref_str += ")"
            node_name += comp_ref_str
        print(f"{pre}{node_name}")


def export_tree_to_dot(root: Node, output_file: str, filter_pattern: Optional[str] = None) -> bool:
    """
    Export the tree to a DOT file (Graphviz format).

    Args:
        root: The root node of the tree
        output_file: Path to save the exported file
        filter_pattern: Optional regex pattern to filter nodes

    Returns:
        True if successful, False otherwise
    """
    try:
        # When filtering, we may want to export only the filtered tree
        if filter_pattern:
            # TODO: Implement filtered export
            pass

        DotExporter(root).to_picture(output_file)
        print(f"Tree exported to {output_file}")
        return True
    except Exception as e:
        print(f"Warning: Could not export to {output_file}: {e}")
        print("Make sure Graphviz is installed: https://graphviz.org/download/")
        return False


def find_page_containing_object(content: Dict[str, Any], object_id: str) -> Optional[str]:
    """
    Find which page contains the specified object.

    Args:
        content: The Penpot file content
        object_id: The ID of the object to find

    Returns:
        The page ID containing the object, or None if not found
    """
    # Recursively search for an object in the hierarchy
    def find_object_in_hierarchy(objects_dict, target_id):
        # Check if the object is directly in the dictionary
        if target_id in objects_dict:
            return True

        # Check if the object is a child of any object in the dictionary
        for obj_data in objects_dict.values():
            # Look for objects that list it among their shapes (children)
            if "shapes" in obj_data and target_id in obj_data["shapes"]:
                return True

            # Descend into child elements if any
            if "children" in obj_data:
                child_objects = {child["id"]: child for child in obj_data["children"]}
                if find_object_in_hierarchy(child_objects, target_id):
                    return True

        return False

    # Check each page
    for page_id, page_data in content.get('pagesIndex', {}).items():
        objects_dict = page_data.get('objects', {})
        if find_object_in_hierarchy(objects_dict, object_id):
            return page_id

    return None


def find_object_in_tree(tree: Node, target_id: str) -> Optional[Dict[str, Any]]:
    """
    Find an object in the tree by its ID and return its subtree as a dictionary.

    Args:
        tree: The root node of the tree
        target_id: The ID of the object to find

    Returns:
        Dictionary representation of the object's subtree, or None if not found
    """
    # Depth-first search through a node's children
    def find_object_in_children(node, target_id):
        for child in node.children:
            if hasattr(child, 'obj_id') and child.obj_id == target_id:
                return convert_node_to_dict(child)

            result = find_object_in_children(child, target_id)
            if result:
                return result
        return None

    # Iterate through the tree's top-level children
    for child in tree.children:
        # Only descend into page nodes (their names contain "(page)")
        if "(page)" in child.name:
            # Check all objects under this page
            for obj in child.children:
                if hasattr(obj, 'obj_id') and obj.obj_id == target_id:
                    return convert_node_to_dict(obj)

                # Check children recursively
                result = find_object_in_children(obj, target_id)
                if result:
                    return result
    return None


def convert_node_to_dict(node: Node) -> Dict[str, Any]:
    """
    Convert an anytree.Node to a dictionary format for API responses.

    Args:
        node: The node to convert

    Returns:
        Dictionary representation of the node and its subtree
    """
    result = {
        'id': getattr(node, 'obj_id', None),
        'type': getattr(node, 'obj_type', None),
        'name': getattr(node, 'obj_name', None),
        'children': []
    }

    # Add the component reference if available
    if hasattr(node, 'componentRef'):
        result['componentRef'] = node.componentRef

    # Add the component annotation if available
    if hasattr(node, 'componentAnnotation'):
        result['componentAnnotation'] = node.componentAnnotation

    # Recursively add children
    for child in node.children:
        result['children'].append(convert_node_to_dict(child))

    return result


def get_object_subtree(file_data: Dict[str, Any], object_id: str) -> Dict[str, Union[Dict, str]]:
    """
    Get a simplified tree representation of an object and its children.

    Args:
        file_data: The Penpot file data
        object_id: The ID of the object to get the tree for

    Returns:
        Dictionary containing the simplified tree or an error message
    """
    try:
        # Get the content from the file data
        content = file_data.get('data')

        # Find which page contains the object
        page_id = find_page_containing_object(content, object_id)
        if not page_id:
            return {"error": f"Object {object_id} not found in file"}

        # Build the full tree, then extract the object's subtree from it
        full_tree = build_tree(content)
        simplified_tree = find_object_in_tree(full_tree, object_id)
        if not simplified_tree:
            return {"error": f"Object {object_id} not found in tree structure"}

        return {
            "tree": simplified_tree,
            "page_id": page_id
        }
    except Exception as e:
        return {"error": str(e)}


def get_object_subtree_with_fields(file_data: Dict[str, Any], object_id: str,
                                   include_fields: Optional[List[str]] = None,
                                   depth: int = -1) -> Dict[str, Any]:
    """
    Get a filtered tree representation of an object with only specified fields.

    This function finds an object in the Penpot file data and returns a subtree
    with the object as the root, including only the specified fields and limiting
    the depth of the tree if requested.

    Args:
        file_data: The Penpot file data
        object_id: The ID of the object to get the tree for
        include_fields: List of field names to include in the output (None means include all)
        depth: Maximum depth of the tree (-1 means no limit)

    Returns:
        Dictionary containing the filtered tree or an error message
    """
    try:
        # Get the content from the file data
        content = file_data.get('data', file_data)

        # Find which page contains the object
        page_id = find_page_containing_object(content, object_id)
        if not page_id:
            return {"error": f"Object {object_id} not found in file"}

        # Get the page data
        page_data = content.get('pagesIndex', {}).get(page_id, {})
        objects_dict = page_data.get('objects', {})

        # Check that the object exists in this page
        if object_id not in objects_dict:
            return {"error": f"Object {object_id} not found in page {page_id}"}

        # Recursively build the filtered object tree
        def build_filtered_object_tree(obj_id: str, current_depth: int = 0):
            if obj_id not in objects_dict:
                return None

            obj_data = objects_dict[obj_id]

            # Keep only the requested fields, or all fields if include_fields is None
            if include_fields is None:
                filtered_obj = obj_data.copy()
            else:
                filtered_obj = {field: obj_data[field] for field in include_fields if field in obj_data}

            # Always include the id field
            filtered_obj['id'] = obj_id

            # If the depth limit is reached, don't process children
            if depth != -1 and current_depth >= depth:
                return filtered_obj

            # Find all children of this object
            children = []
            for child_id, child_data in objects_dict.items():
                if child_data.get('parentId') == obj_id:
                    child_tree = build_filtered_object_tree(child_id, current_depth + 1)
                    if child_tree:
                        children.append(child_tree)

            # Add the children field only if there are children
            if children:
                filtered_obj['children'] = children

            return filtered_obj

        # Build the filtered tree starting from the requested object
        object_tree = build_filtered_object_tree(object_id)
        if not object_tree:
            return {"error": f"Failed to build object tree for {object_id}"}

        return {
            "tree": object_tree,
            "page_id": page_id
        }

    except Exception as e:
        return {"error": str(e)}
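The parent-child wiring above hinges on one idea: each object carries a `parentId`, and the tree is rebuilt by grouping IDs under that field while skipping self-references. A minimal stdlib sketch of that grouping step, with hypothetical object IDs (not taken from any real Penpot file):

```python
from typing import Any, Dict, List


def attach_children(objects: Dict[str, Dict[str, Any]]) -> Dict[str, List[str]]:
    """Group object IDs under their parentId, mirroring build_tree's second pass."""
    children: Dict[str, List[str]] = {}
    for obj_id, obj in objects.items():
        parent = obj.get('parentId')
        # A self-referencing parentId would create a cycle; skip it, as build_tree does
        if parent and parent != obj_id:
            children.setdefault(parent, []).append(obj_id)
    return children


objects = {
    "frame-1": {"name": "Frame", "type": "frame"},
    "text-1": {"name": "Title", "type": "text", "parentId": "frame-1"},
    "rect-1": {"name": "Box", "type": "rect", "parentId": "frame-1"},
}
print(attach_children(objects))  # {'frame-1': ['text-1', 'rect-1']}
```

The real `build_tree` additionally walks each candidate parent chain to reject deeper cycles and falls back to the page's all-zeros root frame when a parent is unknown.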

1
penpot_mcp/utils/__init__.py
Normal file
@@ -0,0 +1 @@
"""Utility functions and helper modules for the Penpot MCP server."""
83
penpot_mcp/utils/cache.py
Normal file
@@ -0,0 +1,83 @@
"""Cache utilities for the Penpot MCP server."""

import time
from typing import Any, Dict, Optional


class MemoryCache:
    """In-memory cache implementation with TTL support."""

    def __init__(self, ttl_seconds: int = 600):
        """
        Initialize the memory cache.

        Args:
            ttl_seconds: Time to live in seconds (default 10 minutes)
        """
        self.ttl_seconds = ttl_seconds
        self._cache: Dict[str, Dict[str, Any]] = {}

    def get(self, file_id: str) -> Optional[Dict[str, Any]]:
        """
        Get a file from cache if it exists and is not expired.

        Args:
            file_id: The ID of the file to retrieve

        Returns:
            The cached file data, or None if not found or expired
        """
        if file_id not in self._cache:
            return None

        cache_data = self._cache[file_id]

        # Evict and miss if the entry has outlived its TTL
        if time.time() - cache_data['timestamp'] > self.ttl_seconds:
            del self._cache[file_id]
            return None

        return cache_data['data']

    def set(self, file_id: str, data: Dict[str, Any]) -> None:
        """
        Store a file in cache.

        Args:
            file_id: The ID of the file to cache
            data: The file data to cache
        """
        self._cache[file_id] = {
            'timestamp': time.time(),
            'data': data
        }

    def clear(self) -> None:
        """Clear all cached files."""
        self._cache.clear()

    def get_all_cached_files(self) -> Dict[str, Dict[str, Any]]:
        """
        Get all valid cached files, evicting expired entries along the way.

        Returns:
            Dictionary mapping file IDs to their cached data
        """
        result = {}
        current_time = time.time()

        # Collect expired keys so we don't mutate the dict while iterating
        expired_keys = []

        for file_id, cache_data in self._cache.items():
            if current_time - cache_data['timestamp'] <= self.ttl_seconds:
                result[file_id] = cache_data['data']
            else:
                expired_keys.append(file_id)

        # Remove expired entries
        for key in expired_keys:
            del self._cache[key]

        return result
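`MemoryCache` uses lazy, read-time expiry: every entry records its insertion timestamp, and `get` evicts the entry when the elapsed time exceeds the TTL. A self-contained sketch of that same pattern (not importable from the package, just the technique; the sub-second TTL is only for demonstration):

```python
import time


class TTLCache:
    """Minimal sketch of the timestamp-based TTL pattern used by MemoryCache."""

    def __init__(self, ttl_seconds: float):
        self.ttl_seconds = ttl_seconds
        self._cache = {}

    def set(self, key, value):
        # Record when the value was stored alongside the value itself
        self._cache[key] = {'timestamp': time.time(), 'data': value}

    def get(self, key):
        entry = self._cache.get(key)
        if entry is None:
            return None
        # Expired entries are evicted lazily on read, as in MemoryCache.get
        if time.time() - entry['timestamp'] > self.ttl_seconds:
            del self._cache[key]
            return None
        return entry['data']


cache = TTLCache(ttl_seconds=0.05)
cache.set("file1", {"name": "Design"})
assert cache.get("file1") == {"name": "Design"}
time.sleep(0.1)
assert cache.get("file1") is None  # expired and evicted on read
```

Lazy expiry keeps `set`/`get` O(1) with no background thread; the trade-off is that stale entries linger until the next read, which `get_all_cached_files` compensates for by sweeping expired keys.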
25
penpot_mcp/utils/config.py
Normal file
@@ -0,0 +1,25 @@
"""Configuration module for the Penpot MCP server."""

import os

from dotenv import find_dotenv, load_dotenv

# Load environment variables from a .env file if present
load_dotenv(find_dotenv())

# Server configuration
PORT = int(os.environ.get('PORT', 5000))
DEBUG = os.environ.get('DEBUG', 'true').lower() == 'true'
RESOURCES_AS_TOOLS = os.environ.get('RESOURCES_AS_TOOLS', 'true').lower() == 'true'

# HTTP server for exported images
ENABLE_HTTP_SERVER = os.environ.get('ENABLE_HTTP_SERVER', 'true').lower() == 'true'
HTTP_SERVER_HOST = os.environ.get('HTTP_SERVER_HOST', 'localhost')
HTTP_SERVER_PORT = int(os.environ.get('HTTP_SERVER_PORT', 0))

# Penpot API configuration
PENPOT_API_URL = os.environ.get('PENPOT_API_URL', 'https://design.penpot.app/api')
PENPOT_USERNAME = os.environ.get('PENPOT_USERNAME')
PENPOT_PASSWORD = os.environ.get('PENPOT_PASSWORD')

RESOURCES_PATH = os.path.join(os.path.dirname(os.path.dirname(__file__)), 'resources')
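The boolean flags above all follow one convention: read the variable with a string default, lowercase it, and compare against `'true'`. A small sketch of that parsing rule in isolation (the `DEMO_*` variable names are hypothetical, used only for illustration):

```python
import os


def env_flag(name: str, default: str = 'true') -> bool:
    """Parse a boolean flag from the environment the way config.py does."""
    return os.environ.get(name, default).lower() == 'true'


os.environ['DEMO_DEBUG'] = 'False'   # hypothetical variable for illustration
print(env_flag('DEMO_DEBUG'))        # False: comparison is case-insensitive via .lower()
print(env_flag('DEMO_MISSING'))      # True: unset variables fall back to the default
```

One consequence of this rule: any value other than a case variant of `true` (including `1` or `yes`) parses as `False`, so deployment docs should standardize on `true`/`false`.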
128
penpot_mcp/utils/http_server.py
Normal file
@@ -0,0 +1,128 @@
"""HTTP server module for serving exported images from memory."""

import json
import socketserver
import threading
from http.server import BaseHTTPRequestHandler


class InMemoryImageHandler(BaseHTTPRequestHandler):
    """HTTP request handler for serving images stored in memory."""

    # Class variable holding the in-memory images
    images = {}

    def do_GET(self):
        """Handle GET requests."""
        # Strip query parameters and fragments, if any
        path = self.path.split('?', 1)[0]
        path = path.split('#', 1)[0]

        # Extract the image ID from the path
        # Expected path format: /images/{image_id}.{format}
        parts = path.split('/')
        if len(parts) == 3 and parts[1] == 'images':
            # Extract image_id by removing the file extension if present
            image_id_with_ext = parts[2]
            image_id = image_id_with_ext.split('.')[0]

            if image_id in self.images:
                img_data = self.images[image_id]['data']
                img_format = self.images[image_id]['format']

                # Set the content type based on the format
                content_type = f"image/{img_format}"
                if img_format == 'svg':
                    content_type = 'image/svg+xml'

                self.send_response(200)
                self.send_header('Content-type', content_type)
                self.send_header('Content-length', str(len(img_data)))
                self.end_headers()
                self.wfile.write(img_data)
                return

        # Return 404 if the image was not found
        self.send_response(404)
        self.send_header('Content-type', 'application/json')
        self.end_headers()
        response = {'error': 'Image not found'}
        self.wfile.write(json.dumps(response).encode())


class ImageServer:
    """Server for in-memory images."""

    def __init__(self, host='localhost', port=0):
        """Initialize the HTTP server.

        Args:
            host: Host address to listen on
            port: Port to listen on (0 means use a random available port)
        """
        self.host = host
        self.port = port
        self.server = None
        self.server_thread = None
        self.is_running = False
        self.base_url = None

    def start(self):
        """Start the HTTP server in a background thread.

        Returns:
            Base URL of the server with the actual port used
        """
        if self.is_running:
            return self.base_url

        # TCP server with address reuse enabled
        class ReuseAddressTCPServer(socketserver.TCPServer):
            allow_reuse_address = True

        self.server = ReuseAddressTCPServer((self.host, self.port), InMemoryImageHandler)

        # Get the actual port that was assigned
        self.port = self.server.socket.getsockname()[1]
        self.base_url = f"http://{self.host}:{self.port}"

        # Start the server in a separate thread
        self.server_thread = threading.Thread(target=self.server.serve_forever)
        self.server_thread.daemon = True  # Don't keep the process alive if the main thread exits
        self.server_thread.start()
        self.is_running = True

        print(f"Image server started at {self.base_url}")
        return self.base_url

    def stop(self):
        """Stop the HTTP server."""
        if not self.is_running:
            return

        self.server.shutdown()
        self.server.server_close()
        self.is_running = False
        print("Image server stopped")

    def add_image(self, image_id, image_data, image_format='png'):
        """Add an image to in-memory storage.

        Args:
            image_id: Unique identifier for the image
            image_data: Binary image data
            image_format: Image format (png, jpg, etc.)

        Returns:
            URL to access the image
        """
        InMemoryImageHandler.images[image_id] = {
            'data': image_data,
            'format': image_format
        }
        return f"{self.base_url}/images/{image_id}.{image_format}"

    def remove_image(self, image_id):
        """Remove an image from in-memory storage."""
        if image_id in InMemoryImageHandler.images:
            del InMemoryImageHandler.images[image_id]
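The pattern above — bind to port 0 for an ephemeral port, serve from a daemon thread, look up payloads by an ID parsed out of the URL path — can be exercised end to end with only the standard library. This sketch uses `ThreadingHTTPServer` instead of the module's `ReuseAddressTCPServer`, and the image ID and payload are made up for the demo:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer


class Handler(BaseHTTPRequestHandler):
    # Hypothetical in-memory payload keyed by image ID
    images = {"logo": b"\x89PNG-stub"}

    def do_GET(self):
        # /images/logo.png -> "logo"
        image_id = self.path.split('/')[-1].split('.')[0]
        data = self.images.get(image_id)
        if data is None:
            self.send_response(404)
            self.end_headers()
            return
        self.send_response(200)
        self.send_header('Content-type', 'image/png')
        self.send_header('Content-length', str(len(data)))
        self.end_headers()
        self.wfile.write(data)

    def log_message(self, *args):
        pass  # keep the demo quiet


server = ThreadingHTTPServer(('localhost', 0), Handler)  # port 0 = ephemeral
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.socket.getsockname()[1]

body = urllib.request.urlopen(f"http://localhost:{port}/images/logo.png").read()
assert body == b"\x89PNG-stub"

server.shutdown()
server.server_close()
```

Binding to port 0 and reading the assigned port back from the socket is what lets `HTTP_SERVER_PORT=0` in `config.py` avoid collisions between concurrent server instances.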
138
pyproject.toml
Normal file
@@ -0,0 +1,138 @@
[build-system]
requires = ["setuptools>=61.0", "wheel"]
build-backend = "setuptools.build_meta"

[project]
name = "penpot-mcp"
version = "0.1.0"
description = "Model Context Protocol server for Penpot"
readme = "README.md"
license = {text = "MIT"}
authors = [
    {name = "Montevive AI Team", email = "info@montevive.ai"}
]
keywords = ["penpot", "mcp", "llm", "ai", "design", "prototyping"]
classifiers = [
    "Development Status :: 3 - Alpha",
    "Intended Audience :: Developers",
    "License :: OSI Approved :: MIT License",
    "Programming Language :: Python :: 3",
    "Programming Language :: Python :: 3.12",
    "Topic :: Software Development :: Libraries :: Python Modules",
    "Topic :: Multimedia :: Graphics :: Graphics Conversion",
    "Topic :: Scientific/Engineering :: Artificial Intelligence",
]
requires-python = ">=3.12"
dependencies = [
    "mcp>=1.7.0",
    "python-dotenv>=1.0.0",
    "requests>=2.26.0",
    "gunicorn>=20.1.0",
    "anytree>=2.8.0",
    "jsonschema>=4.0.0",
    "PyYAML>=6.0.0",
]

[project.optional-dependencies]
dev = [
    "pytest>=7.4.0",
    "pytest-mock>=3.11.1",
    "pytest-cov>=4.1.0",
    "flake8>=6.1.0",
    "flake8-docstrings>=1.7.0",
    "pre-commit>=3.5.0",
    "isort>=5.12.0",
    "autopep8>=2.0.4",
    "pyupgrade>=3.13.0",
    "setuptools>=65.5.0",
]
cli = [
    "mcp[cli]>=1.7.0",
]

[project.urls]
Homepage = "https://github.com/montevive/penpot-mcp"
Repository = "https://github.com/montevive/penpot-mcp.git"
Issues = "https://github.com/montevive/penpot-mcp/issues"
Documentation = "https://github.com/montevive/penpot-mcp#readme"

[project.scripts]
penpot-mcp = "penpot_mcp.server.mcp_server:main"
penpot-client = "penpot_mcp.server.client:main"
penpot-tree = "penpot_mcp.tools.cli.tree_cmd:main"
penpot-validate = "penpot_mcp.tools.cli.validate_cmd:main"

[tool.setuptools.packages.find]
where = ["."]
include = ["penpot_mcp*"]

[tool.setuptools.package-data]
penpot_mcp = ["resources/*.json"]

# pytest configuration
[tool.pytest.ini_options]
testpaths = ["tests"]
python_files = ["test_*.py", "*_test.py"]
python_classes = ["Test*"]
python_functions = ["test_*"]
addopts = [
    "--strict-markers",
    "--strict-config",
    "--verbose",
]
markers = [
    "slow: marks tests as slow (deselect with '-m \"not slow\"')",
    "integration: marks tests as integration tests",
]

# Coverage configuration
[tool.coverage.run]
source = ["penpot_mcp"]
omit = [
    "*/tests/*",
    "*/test_*",
    "*/__pycache__/*",
    "*/venv/*",
    "*/.venv/*",
]

[tool.coverage.report]
exclude_lines = [
    "pragma: no cover",
    "def __repr__",
    "if self.debug:",
    "if settings.DEBUG",
    "raise AssertionError",
    "raise NotImplementedError",
    "if 0:",
    "if __name__ == .__main__.:",
    "class .*\\bProtocol\\):",
    "@(abc\\.)?abstractmethod",
]

# isort configuration
[tool.isort]
profile = "black"
multi_line_output = 3
line_length = 88
known_first_party = ["penpot_mcp"]
skip = [".venv", "venv", "__pycache__"]

# Black configuration (if adopted)
[tool.black]
line-length = 88
target-version = ['py312']
include = '\.pyi?$'
extend-exclude = '''
/(
  # directories
    \.eggs
  | \.git
  | \.hg
  | \.mypy_cache
  | \.tox
  | \.venv
  | build
  | dist
)/
'''
1
tests/__init__.py
Normal file
@@ -0,0 +1 @@
"""Package tests."""
52 tests/conftest.py Normal file
@@ -0,0 +1,52 @@
"""Test configuration for Penpot MCP tests."""

import os
import sys
from unittest.mock import MagicMock

import pytest

from penpot_mcp.api.penpot_api import PenpotAPI
from penpot_mcp.server.mcp_server import PenpotMCPServer

# Add the project root directory to the Python path
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))


@pytest.fixture
def mock_penpot_api(monkeypatch):
    """Create a mock PenpotAPI object."""
    mock_api = MagicMock(spec=PenpotAPI)
    # Add default behavior to the mock
    mock_api.list_projects.return_value = [
        {"id": "project1", "name": "Test Project 1"},
        {"id": "project2", "name": "Test Project 2"}
    ]
    mock_api.get_project_files.return_value = [
        {"id": "file1", "name": "Test File 1"},
        {"id": "file2", "name": "Test File 2"}
    ]
    mock_api.get_file.return_value = {
        "id": "file1",
        "name": "Test File",
        "data": {
            "pages": [
                {
                    "id": "page1",
                    "name": "Page 1",
                    "objects": {
                        "obj1": {"id": "obj1", "name": "Object 1", "type": "frame"},
                        "obj2": {"id": "obj2", "name": "Object 2", "type": "text"}
                    }
                }
            ]
        }
    }
    return mock_api


@pytest.fixture
def mock_server(mock_penpot_api):
    """Create a mock PenpotMCPServer with a mock API."""
    server = PenpotMCPServer(name="Test Server")
    server.api = mock_penpot_api
    return server
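The fixtures above lean on `MagicMock(spec=...)` with canned `return_value`s; the same pattern can be shown self-contained, without the Penpot imports (a sketch only — the real `PenpotAPI` surface may differ):

```python
from unittest.mock import MagicMock

# Stand-in for the PenpotAPI mock built in conftest.py: attribute access
# creates child mocks, and return_value fixes what a call hands back.
mock_api = MagicMock()
mock_api.list_projects.return_value = [
    {"id": "project1", "name": "Test Project 1"},
    {"id": "project2", "name": "Test Project 2"},
]

projects = mock_api.list_projects()
print([p["id"] for p in projects])  # ['project1', 'project2']
mock_api.list_projects.assert_called_once()
```

Passing `spec=PenpotAPI`, as conftest.py does, additionally makes attribute access fail for names the real class does not have, which catches typos in tests.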
86 tests/test_cache.py Normal file
@@ -0,0 +1,86 @@
"""
Tests for the memory caching functionality.
"""

import time

import pytest

from penpot_mcp.utils.cache import MemoryCache


@pytest.fixture
def memory_cache():
    """Create a MemoryCache instance with a short TTL for testing."""
    return MemoryCache(ttl_seconds=2)


def test_cache_set_get(memory_cache):
    """Test setting and getting a file from cache."""
    test_data = {"test": "data"}
    file_id = "test123"

    # Set data in cache
    memory_cache.set(file_id, test_data)

    # Get data from cache
    cached_data = memory_cache.get(file_id)
    assert cached_data == test_data


def test_cache_expiration(memory_cache):
    """Test that cached files expire after TTL."""
    test_data = {"test": "data"}
    file_id = "test123"

    # Set data in cache
    memory_cache.set(file_id, test_data)

    # Data should be available immediately
    assert memory_cache.get(file_id) == test_data

    # Wait for cache to expire
    time.sleep(3)

    # Data should be expired
    assert memory_cache.get(file_id) is None


def test_cache_clear(memory_cache):
    """Test clearing the cache."""
    test_data = {"test": "data"}
    file_id = "test123"

    # Set data in cache
    memory_cache.set(file_id, test_data)

    # Verify data is cached
    assert memory_cache.get(file_id) == test_data

    # Clear cache
    memory_cache.clear()

    # Verify data is gone
    assert memory_cache.get(file_id) is None


def test_get_all_cached_files(memory_cache):
    """Test getting all cached files."""
    test_data1 = {"test": "data1"}
    test_data2 = {"test": "data2"}

    # Set multiple files in cache
    memory_cache.set("file1", test_data1)
    memory_cache.set("file2", test_data2)

    # Get all cached files
    all_files = memory_cache.get_all_cached_files()

    # Verify all files are present
    assert len(all_files) == 2
    assert all_files["file1"] == test_data1
    assert all_files["file2"] == test_data2

    # Wait for cache to expire
    time.sleep(3)

    # Verify expired files are removed
    all_files = memory_cache.get_all_cached_files()
    assert len(all_files) == 0


def test_cache_nonexistent_file(memory_cache):
    """Test getting a nonexistent file from cache."""
    assert memory_cache.get("nonexistent") is None
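These tests pin down the `MemoryCache` contract: TTL-bounded `set`/`get`, `clear`, and a `get_all_cached_files` that drops expired entries. A minimal in-memory sketch satisfying that contract (illustrative only — the real `penpot_mcp.utils.cache.MemoryCache` may be implemented differently):

```python
import time


class MemoryCache:
    """Minimal TTL cache matching the interface exercised by the tests."""

    def __init__(self, ttl_seconds=600):
        self.ttl_seconds = ttl_seconds
        self._store = {}  # file_id -> (insert_timestamp, data)

    def set(self, file_id, data):
        self._store[file_id] = (time.time(), data)

    def get(self, file_id):
        entry = self._store.get(file_id)
        if entry is None:
            return None
        ts, data = entry
        if time.time() - ts > self.ttl_seconds:
            # Lazily evict on read once the TTL has elapsed
            del self._store[file_id]
            return None
        return data

    def clear(self):
        self._store.clear()

    def get_all_cached_files(self):
        # Return only entries that are still within their TTL
        now = time.time()
        return {fid: data for fid, (ts, data) in list(self._store.items())
                if now - ts <= self.ttl_seconds}


cache = MemoryCache(ttl_seconds=2)
cache.set("file1", {"test": "data"})
print(cache.get("file1"))  # {'test': 'data'}
```

Lazy eviction on read keeps the sketch lock-free and simple; a production cache might instead sweep expired entries on a timer.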
38 tests/test_config.py Normal file
@@ -0,0 +1,38 @@
"""Tests for config module."""

from penpot_mcp.utils import config


def test_config_values():
    """Test that config has the expected values and types."""
    assert isinstance(config.PORT, int)
    assert isinstance(config.DEBUG, bool)
    assert isinstance(config.PENPOT_API_URL, str)
    assert config.RESOURCES_PATH is not None


def test_environment_variable_override(monkeypatch):
    """Test that environment variables override default config values."""
    # Save original values
    original_port = config.PORT
    original_debug = config.DEBUG
    original_api_url = config.PENPOT_API_URL

    # Override with environment variables
    monkeypatch.setenv("PORT", "8080")
    monkeypatch.setenv("DEBUG", "false")
    monkeypatch.setenv("PENPOT_API_URL", "https://test.example.com/api")

    # Reload the config module to apply the environment variables
    import importlib
    importlib.reload(config)

    # Check the new values
    assert config.PORT == 8080
    assert config.DEBUG is False
    assert config.PENPOT_API_URL == "https://test.example.com/api"

    # Restore original values
    monkeypatch.setattr(config, "PORT", original_port)
    monkeypatch.setattr(config, "DEBUG", original_debug)
    monkeypatch.setattr(config, "PENPOT_API_URL", original_api_url)
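`test_environment_variable_override` has to `importlib.reload()` the config module because module-level constants are computed from the environment once, at import time. The pattern it assumes looks roughly like this (names taken from the tests; the defaults and parsing rules here are guesses, not the actual `penpot_mcp.utils.config`):

```python
import os

# Hypothetical config-module body: each constant is read from the
# environment when the module is imported, which is why a test that
# changes the environment must reload the module to see new values.
PORT = int(os.environ.get("PORT", "5000"))
DEBUG = os.environ.get("DEBUG", "true").lower() == "true"
PENPOT_API_URL = os.environ.get("PENPOT_API_URL", "https://design.penpot.app/api")
```

Reloading re-executes this module body, so `monkeypatch.setenv(...)` followed by `importlib.reload(config)` yields freshly parsed constants.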
1074 tests/test_mcp_server.py Normal file
File diff suppressed because it is too large
1090 tests/test_penpot_tree.py Normal file
File diff suppressed because it is too large
939 uv.lock generated Normal file
@@ -0,0 +1,939 @@
version = 1
revision = 2
requires-python = ">=3.12"

[[package]]
name = "annotated-types"
version = "0.7.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ee/67/531ea369ba64dcff5ec9c3402f9f51bf748cec26dde048a2f973a4eea7f5/annotated_types-0.7.0.tar.gz", hash = "sha256:aff07c09a53a08bc8cfccb9c85b05f1aa9a2a6f23728d790723543408344ce89", size = 16081, upload-time = "2024-05-20T21:33:25.928Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/78/b6/6307fbef88d9b5ee7421e68d78a9f162e0da4900bc5f5793f6d3d0e34fb8/annotated_types-0.7.0-py3-none-any.whl", hash = "sha256:1f02e8b43a8fbbc3f3e0d4f0f4bfc8131bcb4eebe8849b8e5c773f3a1c582a53", size = 13643, upload-time = "2024-05-20T21:33:24.1Z" },
]

[[package]]
name = "anyio"
version = "4.9.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "idna" },
    { name = "sniffio" },
    { name = "typing-extensions", marker = "python_full_version < '3.13'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/95/7d/4c1bd541d4dffa1b52bd83fb8527089e097a106fc90b467a7313b105f840/anyio-4.9.0.tar.gz", hash = "sha256:673c0c244e15788651a4ff38710fea9675823028a6f08a5eda409e0c9840a028", size = 190949, upload-time = "2025-03-17T00:02:54.77Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/a1/ee/48ca1a7c89ffec8b6a0c5d02b89c305671d5ffd8d3c94acf8b8c408575bb/anyio-4.9.0-py3-none-any.whl", hash = "sha256:9f76d541cad6e36af7beb62e978876f3b41e3e04f2c1fbf0884604c0a9c4d93c", size = 100916, upload-time = "2025-03-17T00:02:52.713Z" },
]

[[package]]
name = "anytree"
version = "2.13.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/bc/a8/eb55fab589c56f9b6be2b3fd6997aa04bb6f3da93b01154ce6fc8e799db2/anytree-2.13.0.tar.gz", hash = "sha256:c9d3aa6825fdd06af7ebb05b4ef291d2db63e62bb1f9b7d9b71354be9d362714", size = 48389, upload-time = "2025-04-08T21:06:30.662Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/7b/98/f6aa7fe0783e42be3093d8ef1b0ecdc22c34c0d69640dfb37f56925cb141/anytree-2.13.0-py3-none-any.whl", hash = "sha256:4cbcf10df36b1f1cba131b7e487ff3edafc9d6e932a3c70071b5b768bab901ff", size = 45077, upload-time = "2025-04-08T21:06:29.494Z" },
]

[[package]]
name = "attrs"
version = "25.3.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/5a/b0/1367933a8532ee6ff8d63537de4f1177af4bff9f3e829baf7331f595bb24/attrs-25.3.0.tar.gz", hash = "sha256:75d7cefc7fb576747b2c81b4442d4d4a1ce0900973527c011d1030fd3bf4af1b", size = 812032, upload-time = "2025-03-13T11:10:22.779Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/77/06/bb80f5f86020c4551da315d78b3ab75e8228f89f0162f2c3a819e407941a/attrs-25.3.0-py3-none-any.whl", hash = "sha256:427318ce031701fea540783410126f03899a97ffc6f61596ad581ac2e40e3bc3", size = 63815, upload-time = "2025-03-13T11:10:21.14Z" },
]

[[package]]
name = "autopep8"
version = "2.3.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "pycodestyle" },
]
sdist = { url = "https://files.pythonhosted.org/packages/50/d8/30873d2b7b57dee9263e53d142da044c4600a46f2d28374b3e38b023df16/autopep8-2.3.2.tar.gz", hash = "sha256:89440a4f969197b69a995e4ce0661b031f455a9f776d2c5ba3dbd83466931758", size = 92210, upload-time = "2025-01-14T14:46:18.454Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/9e/43/53afb8ba17218f19b77c7834128566c5bbb100a0ad9ba2e8e89d089d7079/autopep8-2.3.2-py2.py3-none-any.whl", hash = "sha256:ce8ad498672c845a0c3de2629c15b635ec2b05ef8177a6e7c91c74f3e9b51128", size = 45807, upload-time = "2025-01-14T14:46:15.466Z" },
]

[[package]]
name = "certifi"
version = "2025.4.26"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e8/9e/c05b3920a3b7d20d3d3310465f50348e5b3694f4f88c6daf736eef3024c4/certifi-2025.4.26.tar.gz", hash = "sha256:0a816057ea3cdefcef70270d2c515e4506bbc954f417fa5ade2021213bb8f0c6", size = 160705, upload-time = "2025-04-26T02:12:29.51Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/4a/7e/3db2bd1b1f9e95f7cddca6d6e75e2f2bd9f51b1246e546d88addca0106bd/certifi-2025.4.26-py3-none-any.whl", hash = "sha256:30350364dfe371162649852c63336a15c70c6510c2ad5015b21c2345311805f3", size = 159618, upload-time = "2025-04-26T02:12:27.662Z" },
]

[[package]]
name = "cfgv"
version = "3.4.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/11/74/539e56497d9bd1d484fd863dd69cbbfa653cd2aa27abfe35653494d85e94/cfgv-3.4.0.tar.gz", hash = "sha256:e52591d4c5f5dead8e0f673fb16db7949d2cfb3f7da4582893288f0ded8fe560", size = 7114, upload-time = "2023-08-12T20:38:17.776Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/c5/55/51844dd50c4fc7a33b653bfaba4c2456f06955289ca770a5dbd5fd267374/cfgv-3.4.0-py2.py3-none-any.whl", hash = "sha256:b7265b1f29fd3316bfcd2b330d63d024f2bfd8bcb8b0272f8e19a504856c48f9", size = 7249, upload-time = "2023-08-12T20:38:16.269Z" },
]
[[package]]
name = "charset-normalizer"
version = "3.4.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e4/33/89c2ced2b67d1c2a61c19c6751aa8902d46ce3dacb23600a283619f5a12d/charset_normalizer-3.4.2.tar.gz", hash = "sha256:5baececa9ecba31eff645232d59845c07aa030f0c81ee70184a90d35099a0e63", size = 126367, upload-time = "2025-05-02T08:34:42.01Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/d7/a4/37f4d6035c89cac7930395a35cc0f1b872e652eaafb76a6075943754f095/charset_normalizer-3.4.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:0c29de6a1a95f24b9a1aa7aefd27d2487263f00dfd55a77719b530788f75cff7", size = 199936, upload-time = "2025-05-02T08:32:33.712Z" },
    { url = "https://files.pythonhosted.org/packages/ee/8a/1a5e33b73e0d9287274f899d967907cd0bf9c343e651755d9307e0dbf2b3/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:cddf7bd982eaa998934a91f69d182aec997c6c468898efe6679af88283b498d3", size = 143790, upload-time = "2025-05-02T08:32:35.768Z" },
    { url = "https://files.pythonhosted.org/packages/66/52/59521f1d8e6ab1482164fa21409c5ef44da3e9f653c13ba71becdd98dec3/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:fcbe676a55d7445b22c10967bceaaf0ee69407fbe0ece4d032b6eb8d4565982a", size = 153924, upload-time = "2025-05-02T08:32:37.284Z" },
    { url = "https://files.pythonhosted.org/packages/86/2d/fb55fdf41964ec782febbf33cb64be480a6b8f16ded2dbe8db27a405c09f/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d41c4d287cfc69060fa91cae9683eacffad989f1a10811995fa309df656ec214", size = 146626, upload-time = "2025-05-02T08:32:38.803Z" },
    { url = "https://files.pythonhosted.org/packages/8c/73/6ede2ec59bce19b3edf4209d70004253ec5f4e319f9a2e3f2f15601ed5f7/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4e594135de17ab3866138f496755f302b72157d115086d100c3f19370839dd3a", size = 148567, upload-time = "2025-05-02T08:32:40.251Z" },
    { url = "https://files.pythonhosted.org/packages/09/14/957d03c6dc343c04904530b6bef4e5efae5ec7d7990a7cbb868e4595ee30/charset_normalizer-3.4.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cf713fe9a71ef6fd5adf7a79670135081cd4431c2943864757f0fa3a65b1fafd", size = 150957, upload-time = "2025-05-02T08:32:41.705Z" },
    { url = "https://files.pythonhosted.org/packages/0d/c8/8174d0e5c10ccebdcb1b53cc959591c4c722a3ad92461a273e86b9f5a302/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:a370b3e078e418187da8c3674eddb9d983ec09445c99a3a263c2011993522981", size = 145408, upload-time = "2025-05-02T08:32:43.709Z" },
    { url = "https://files.pythonhosted.org/packages/58/aa/8904b84bc8084ac19dc52feb4f5952c6df03ffb460a887b42615ee1382e8/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:a955b438e62efdf7e0b7b52a64dc5c3396e2634baa62471768a64bc2adb73d5c", size = 153399, upload-time = "2025-05-02T08:32:46.197Z" },
    { url = "https://files.pythonhosted.org/packages/c2/26/89ee1f0e264d201cb65cf054aca6038c03b1a0c6b4ae998070392a3ce605/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:7222ffd5e4de8e57e03ce2cef95a4c43c98fcb72ad86909abdfc2c17d227fc1b", size = 156815, upload-time = "2025-05-02T08:32:48.105Z" },
    { url = "https://files.pythonhosted.org/packages/fd/07/68e95b4b345bad3dbbd3a8681737b4338ff2c9df29856a6d6d23ac4c73cb/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:bee093bf902e1d8fc0ac143c88902c3dfc8941f7ea1d6a8dd2bcb786d33db03d", size = 154537, upload-time = "2025-05-02T08:32:49.719Z" },
    { url = "https://files.pythonhosted.org/packages/77/1a/5eefc0ce04affb98af07bc05f3bac9094513c0e23b0562d64af46a06aae4/charset_normalizer-3.4.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:dedb8adb91d11846ee08bec4c8236c8549ac721c245678282dcb06b221aab59f", size = 149565, upload-time = "2025-05-02T08:32:51.404Z" },
    { url = "https://files.pythonhosted.org/packages/37/a0/2410e5e6032a174c95e0806b1a6585eb21e12f445ebe239fac441995226a/charset_normalizer-3.4.2-cp312-cp312-win32.whl", hash = "sha256:db4c7bf0e07fc3b7d89ac2a5880a6a8062056801b83ff56d8464b70f65482b6c", size = 98357, upload-time = "2025-05-02T08:32:53.079Z" },
    { url = "https://files.pythonhosted.org/packages/6c/4f/c02d5c493967af3eda9c771ad4d2bbc8df6f99ddbeb37ceea6e8716a32bc/charset_normalizer-3.4.2-cp312-cp312-win_amd64.whl", hash = "sha256:5a9979887252a82fefd3d3ed2a8e3b937a7a809f65dcb1e068b090e165bbe99e", size = 105776, upload-time = "2025-05-02T08:32:54.573Z" },
    { url = "https://files.pythonhosted.org/packages/ea/12/a93df3366ed32db1d907d7593a94f1fe6293903e3e92967bebd6950ed12c/charset_normalizer-3.4.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:926ca93accd5d36ccdabd803392ddc3e03e6d4cd1cf17deff3b989ab8e9dbcf0", size = 199622, upload-time = "2025-05-02T08:32:56.363Z" },
    { url = "https://files.pythonhosted.org/packages/04/93/bf204e6f344c39d9937d3c13c8cd5bbfc266472e51fc8c07cb7f64fcd2de/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:eba9904b0f38a143592d9fc0e19e2df0fa2e41c3c3745554761c5f6447eedabf", size = 143435, upload-time = "2025-05-02T08:32:58.551Z" },
    { url = "https://files.pythonhosted.org/packages/22/2a/ea8a2095b0bafa6c5b5a55ffdc2f924455233ee7b91c69b7edfcc9e02284/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3fddb7e2c84ac87ac3a947cb4e66d143ca5863ef48e4a5ecb83bd48619e4634e", size = 153653, upload-time = "2025-05-02T08:33:00.342Z" },
    { url = "https://files.pythonhosted.org/packages/b6/57/1b090ff183d13cef485dfbe272e2fe57622a76694061353c59da52c9a659/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98f862da73774290f251b9df8d11161b6cf25b599a66baf087c1ffe340e9bfd1", size = 146231, upload-time = "2025-05-02T08:33:02.081Z" },
    { url = "https://files.pythonhosted.org/packages/e2/28/ffc026b26f441fc67bd21ab7f03b313ab3fe46714a14b516f931abe1a2d8/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c9379d65defcab82d07b2a9dfbfc2e95bc8fe0ebb1b176a3190230a3ef0e07c", size = 148243, upload-time = "2025-05-02T08:33:04.063Z" },
    { url = "https://files.pythonhosted.org/packages/c0/0f/9abe9bd191629c33e69e47c6ef45ef99773320e9ad8e9cb08b8ab4a8d4cb/charset_normalizer-3.4.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:e635b87f01ebc977342e2697d05b56632f5f879a4f15955dfe8cef2448b51691", size = 150442, upload-time = "2025-05-02T08:33:06.418Z" },
    { url = "https://files.pythonhosted.org/packages/67/7c/a123bbcedca91d5916c056407f89a7f5e8fdfce12ba825d7d6b9954a1a3c/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:1c95a1e2902a8b722868587c0e1184ad5c55631de5afc0eb96bc4b0d738092c0", size = 145147, upload-time = "2025-05-02T08:33:08.183Z" },
    { url = "https://files.pythonhosted.org/packages/ec/fe/1ac556fa4899d967b83e9893788e86b6af4d83e4726511eaaad035e36595/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:ef8de666d6179b009dce7bcb2ad4c4a779f113f12caf8dc77f0162c29d20490b", size = 153057, upload-time = "2025-05-02T08:33:09.986Z" },
    { url = "https://files.pythonhosted.org/packages/2b/ff/acfc0b0a70b19e3e54febdd5301a98b72fa07635e56f24f60502e954c461/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:32fc0341d72e0f73f80acb0a2c94216bd704f4f0bce10aedea38f30502b271ff", size = 156454, upload-time = "2025-05-02T08:33:11.814Z" },
    { url = "https://files.pythonhosted.org/packages/92/08/95b458ce9c740d0645feb0e96cea1f5ec946ea9c580a94adfe0b617f3573/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:289200a18fa698949d2b39c671c2cc7a24d44096784e76614899a7ccf2574b7b", size = 154174, upload-time = "2025-05-02T08:33:13.707Z" },
    { url = "https://files.pythonhosted.org/packages/78/be/8392efc43487ac051eee6c36d5fbd63032d78f7728cb37aebcc98191f1ff/charset_normalizer-3.4.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4a476b06fbcf359ad25d34a057b7219281286ae2477cc5ff5e3f70a246971148", size = 149166, upload-time = "2025-05-02T08:33:15.458Z" },
    { url = "https://files.pythonhosted.org/packages/44/96/392abd49b094d30b91d9fbda6a69519e95802250b777841cf3bda8fe136c/charset_normalizer-3.4.2-cp313-cp313-win32.whl", hash = "sha256:aaeeb6a479c7667fbe1099af9617c83aaca22182d6cf8c53966491a0f1b7ffb7", size = 98064, upload-time = "2025-05-02T08:33:17.06Z" },
    { url = "https://files.pythonhosted.org/packages/e9/b0/0200da600134e001d91851ddc797809e2fe0ea72de90e09bec5a2fbdaccb/charset_normalizer-3.4.2-cp313-cp313-win_amd64.whl", hash = "sha256:aa6af9e7d59f9c12b33ae4e9450619cf2488e2bbe9b44030905877f0b2324980", size = 105641, upload-time = "2025-05-02T08:33:18.753Z" },
    { url = "https://files.pythonhosted.org/packages/20/94/c5790835a017658cbfabd07f3bfb549140c3ac458cfc196323996b10095a/charset_normalizer-3.4.2-py3-none-any.whl", hash = "sha256:7f56930ab0abd1c45cd15be65cc741c28b1c9a34876ce8c17a2fa107810c0af0", size = 52626, upload-time = "2025-05-02T08:34:40.053Z" },
]
[[package]]
name = "click"
version = "8.2.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "colorama", marker = "sys_platform == 'win32'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/60/6c/8ca2efa64cf75a977a0d7fac081354553ebe483345c734fb6b6515d96bbc/click-8.2.1.tar.gz", hash = "sha256:27c491cc05d968d271d5a1db13e3b5a184636d9d930f148c50b038f0d0646202", size = 286342, upload-time = "2025-05-20T23:19:49.832Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/85/32/10bb5764d90a8eee674e9dc6f4db6a0ab47c8c4d0d83c27f7c39ac415a4d/click-8.2.1-py3-none-any.whl", hash = "sha256:61a3265b914e850b85317d0b3109c7f8cd35a670f963866005d6ef1d5175a12b", size = 102215, upload-time = "2025-05-20T23:19:47.796Z" },
]

[[package]]
name = "colorama"
version = "0.4.6"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d8/53/6f443c9a4a8358a93a6792e2acffb9d9d5cb0a5cfd8802644b7b1c9a02e4/colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44", size = 27697, upload-time = "2022-10-25T02:36:22.414Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/d1/d6/3965ed04c63042e047cb6a3e6ed1a63a35087b6a609aa3a15ed8ac56c221/colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6", size = 25335, upload-time = "2022-10-25T02:36:20.889Z" },
]
[[package]]
name = "coverage"
version = "7.8.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/ba/07/998afa4a0ecdf9b1981ae05415dad2d4e7716e1b1f00abbd91691ac09ac9/coverage-7.8.2.tar.gz", hash = "sha256:a886d531373a1f6ff9fad2a2ba4a045b68467b779ae729ee0b3b10ac20033b27", size = 812759, upload-time = "2025-05-23T11:39:57.856Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/8d/2a/1da1ada2e3044fcd4a3254fb3576e160b8fe5b36d705c8a31f793423f763/coverage-7.8.2-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:e2f6fe3654468d061942591aef56686131335b7a8325684eda85dacdf311356c", size = 211876, upload-time = "2025-05-23T11:38:29.01Z" },
    { url = "https://files.pythonhosted.org/packages/70/e9/3d715ffd5b6b17a8be80cd14a8917a002530a99943cc1939ad5bb2aa74b9/coverage-7.8.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:76090fab50610798cc05241bf83b603477c40ee87acd358b66196ab0ca44ffa1", size = 212130, upload-time = "2025-05-23T11:38:30.675Z" },
    { url = "https://files.pythonhosted.org/packages/a0/02/fdce62bb3c21649abfd91fbdcf041fb99be0d728ff00f3f9d54d97ed683e/coverage-7.8.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2bd0a0a5054be160777a7920b731a0570284db5142abaaf81bcbb282b8d99279", size = 246176, upload-time = "2025-05-23T11:38:32.395Z" },
    { url = "https://files.pythonhosted.org/packages/a7/52/decbbed61e03b6ffe85cd0fea360a5e04a5a98a7423f292aae62423b8557/coverage-7.8.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:da23ce9a3d356d0affe9c7036030b5c8f14556bd970c9b224f9c8205505e3b99", size = 243068, upload-time = "2025-05-23T11:38:33.989Z" },
    { url = "https://files.pythonhosted.org/packages/38/6c/d0e9c0cce18faef79a52778219a3c6ee8e336437da8eddd4ab3dbd8fadff/coverage-7.8.2-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c9392773cffeb8d7e042a7b15b82a414011e9d2b5fdbbd3f7e6a6b17d5e21b20", size = 245328, upload-time = "2025-05-23T11:38:35.568Z" },
    { url = "https://files.pythonhosted.org/packages/f0/70/f703b553a2f6b6c70568c7e398ed0789d47f953d67fbba36a327714a7bca/coverage-7.8.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:876cbfd0b09ce09d81585d266c07a32657beb3eaec896f39484b631555be0fe2", size = 245099, upload-time = "2025-05-23T11:38:37.627Z" },
    { url = "https://files.pythonhosted.org/packages/ec/fb/4cbb370dedae78460c3aacbdad9d249e853f3bc4ce5ff0e02b1983d03044/coverage-7.8.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:3da9b771c98977a13fbc3830f6caa85cae6c9c83911d24cb2d218e9394259c57", size = 243314, upload-time = "2025-05-23T11:38:39.238Z" },
    { url = "https://files.pythonhosted.org/packages/39/9f/1afbb2cb9c8699b8bc38afdce00a3b4644904e6a38c7bf9005386c9305ec/coverage-7.8.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:9a990f6510b3292686713bfef26d0049cd63b9c7bb17e0864f133cbfd2e6167f", size = 244489, upload-time = "2025-05-23T11:38:40.845Z" },
    { url = "https://files.pythonhosted.org/packages/79/fa/f3e7ec7d220bff14aba7a4786ae47043770cbdceeea1803083059c878837/coverage-7.8.2-cp312-cp312-win32.whl", hash = "sha256:bf8111cddd0f2b54d34e96613e7fbdd59a673f0cf5574b61134ae75b6f5a33b8", size = 214366, upload-time = "2025-05-23T11:38:43.551Z" },
    { url = "https://files.pythonhosted.org/packages/54/aa/9cbeade19b7e8e853e7ffc261df885d66bf3a782c71cba06c17df271f9e6/coverage-7.8.2-cp312-cp312-win_amd64.whl", hash = "sha256:86a323a275e9e44cdf228af9b71c5030861d4d2610886ab920d9945672a81223", size = 215165, upload-time = "2025-05-23T11:38:45.148Z" },
    { url = "https://files.pythonhosted.org/packages/c4/73/e2528bf1237d2448f882bbebaec5c3500ef07301816c5c63464b9da4d88a/coverage-7.8.2-cp312-cp312-win_arm64.whl", hash = "sha256:820157de3a589e992689ffcda8639fbabb313b323d26388d02e154164c57b07f", size = 213548, upload-time = "2025-05-23T11:38:46.74Z" },
    { url = "https://files.pythonhosted.org/packages/1a/93/eb6400a745ad3b265bac36e8077fdffcf0268bdbbb6c02b7220b624c9b31/coverage-7.8.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ea561010914ec1c26ab4188aef8b1567272ef6de096312716f90e5baa79ef8ca", size = 211898, upload-time = "2025-05-23T11:38:49.066Z" },
    { url = "https://files.pythonhosted.org/packages/1b/7c/bdbf113f92683024406a1cd226a199e4200a2001fc85d6a6e7e299e60253/coverage-7.8.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:cb86337a4fcdd0e598ff2caeb513ac604d2f3da6d53df2c8e368e07ee38e277d", size = 212171, upload-time = "2025-05-23T11:38:51.207Z" },
    { url = "https://files.pythonhosted.org/packages/91/22/594513f9541a6b88eb0dba4d5da7d71596dadef6b17a12dc2c0e859818a9/coverage-7.8.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:26a4636ddb666971345541b59899e969f3b301143dd86b0ddbb570bd591f1e85", size = 245564, upload-time = "2025-05-23T11:38:52.857Z" },
    { url = "https://files.pythonhosted.org/packages/1f/f4/2860fd6abeebd9f2efcfe0fd376226938f22afc80c1943f363cd3c28421f/coverage-7.8.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5040536cf9b13fb033f76bcb5e1e5cb3b57c4807fef37db9e0ed129c6a094257", size = 242719, upload-time = "2025-05-23T11:38:54.529Z" },
    { url = "https://files.pythonhosted.org/packages/89/60/f5f50f61b6332451520e6cdc2401700c48310c64bc2dd34027a47d6ab4ca/coverage-7.8.2-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dc67994df9bcd7e0150a47ef41278b9e0a0ea187caba72414b71dc590b99a108", size = 244634, upload-time = "2025-05-23T11:38:57.326Z" },
    { url = "https://files.pythonhosted.org/packages/3b/70/7f4e919039ab7d944276c446b603eea84da29ebcf20984fb1fdf6e602028/coverage-7.8.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:6e6c86888fd076d9e0fe848af0a2142bf606044dc5ceee0aa9eddb56e26895a0", size = 244824, upload-time = "2025-05-23T11:38:59.421Z" },
    { url = "https://files.pythonhosted.org/packages/26/45/36297a4c0cea4de2b2c442fe32f60c3991056c59cdc3cdd5346fbb995c97/coverage-7.8.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:684ca9f58119b8e26bef860db33524ae0365601492e86ba0b71d513f525e7050", size = 242872, upload-time = "2025-05-23T11:39:01.049Z" },
    { url = "https://files.pythonhosted.org/packages/a4/71/e041f1b9420f7b786b1367fa2a375703889ef376e0d48de9f5723fb35f11/coverage-7.8.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8165584ddedb49204c4e18da083913bdf6a982bfb558632a79bdaadcdafd0d48", size = 244179, upload-time = "2025-05-23T11:39:02.709Z" },
    { url = "https://files.pythonhosted.org/packages/bd/db/3c2bf49bdc9de76acf2491fc03130c4ffc51469ce2f6889d2640eb563d77/coverage-7.8.2-cp313-cp313-win32.whl", hash = "sha256:34759ee2c65362163699cc917bdb2a54114dd06d19bab860725f94ef45a3d9b7", size = 214393, upload-time = "2025-05-23T11:39:05.457Z" },
    { url = "https://files.pythonhosted.org/packages/c6/dc/947e75d47ebbb4b02d8babb1fad4ad381410d5bc9da7cfca80b7565ef401/coverage-7.8.2-cp313-cp313-win_amd64.whl", hash = "sha256:2f9bc608fbafaee40eb60a9a53dbfb90f53cc66d3d32c2849dc27cf5638a21e3", size = 215194, upload-time = "2025-05-23T11:39:07.171Z" },
    { url = "https://files.pythonhosted.org/packages/90/31/a980f7df8a37eaf0dc60f932507fda9656b3a03f0abf188474a0ea188d6d/coverage-7.8.2-cp313-cp313-win_arm64.whl", hash = "sha256:9fe449ee461a3b0c7105690419d0b0aba1232f4ff6d120a9e241e58a556733f7", size = 213580, upload-time = "2025-05-23T11:39:08.862Z" },
    { url = "https://files.pythonhosted.org/packages/8a/6a/25a37dd90f6c95f59355629417ebcb74e1c34e38bb1eddf6ca9b38b0fc53/coverage-7.8.2-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:8369a7c8ef66bded2b6484053749ff220dbf83cba84f3398c84c51a6f748a008", size = 212734, upload-time = "2025-05-23T11:39:11.109Z" },
    { url = "https://files.pythonhosted.org/packages/36/8b/3a728b3118988725f40950931abb09cd7f43b3c740f4640a59f1db60e372/coverage-7.8.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:159b81df53a5fcbc7d45dae3adad554fdbde9829a994e15227b3f9d816d00b36", size = 212959, upload-time = "2025-05-23T11:39:12.751Z" },
    { url = "https://files.pythonhosted.org/packages/53/3c/212d94e6add3a3c3f412d664aee452045ca17a066def8b9421673e9482c4/coverage-7.8.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e6fcbbd35a96192d042c691c9e0c49ef54bd7ed865846a3c9d624c30bb67ce46", size = 257024, upload-time = "2025-05-23T11:39:15.569Z" },
    { url = "https://files.pythonhosted.org/packages/a4/40/afc03f0883b1e51bbe804707aae62e29c4e8c8bbc365c75e3e4ddeee9ead/coverage-7.8.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:05364b9cc82f138cc86128dc4e2e1251c2981a2218bfcd556fe6b0fbaa3501be", size = 252867, upload-time = "2025-05-23T11:39:17.64Z" },
    { url = "https://files.pythonhosted.org/packages/18/a2/3699190e927b9439c6ded4998941a3c1d6fa99e14cb28d8536729537e307/coverage-7.8.2-cp313-cp313t-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:46d532db4e5ff3979ce47d18e2fe8ecad283eeb7367726da0e5ef88e4fe64740", size = 255096, upload-time = "2025-05-23T11:39:19.328Z" },
    { url = "https://files.pythonhosted.org/packages/b4/06/16e3598b9466456b718eb3e789457d1a5b8bfb22e23b6e8bbc307df5daf0/coverage-7.8.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:4000a31c34932e7e4fa0381a3d6deb43dc0c8f458e3e7ea6502e6238e10be625", size = 256276, upload-time = "2025-05-23T11:39:21.077Z" },
    { url = "https://files.pythonhosted.org/packages/a7/d5/4b5a120d5d0223050a53d2783c049c311eea1709fa9de12d1c358e18b707/coverage-7.8.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:43ff5033d657cd51f83015c3b7a443287250dc14e69910577c3e03bd2e06f27b", size = 254478, upload-time = "2025-05-23T11:39:22.838Z" },
    { url = "https://files.pythonhosted.org/packages/ba/85/f9ecdb910ecdb282b121bfcaa32fa8ee8cbd7699f83330ee13ff9bbf1a85/coverage-7.8.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:94316e13f0981cbbba132c1f9f365cac1d26716aaac130866ca812006f662199", size = 255255, upload-time = "2025-05-23T11:39:24.644Z" },
    { url = "https://files.pythonhosted.org/packages/50/63/2d624ac7d7ccd4ebbd3c6a9eba9d7fc4491a1226071360d59dd84928ccb2/coverage-7.8.2-cp313-cp313t-win32.whl", hash = "sha256:3f5673888d3676d0a745c3d0e16da338c5eea300cb1f4ada9c872981265e76d8", size = 215109, upload-time = "2025-05-23T11:39:26.722Z" },
    { url = "https://files.pythonhosted.org/packages/22/5e/7053b71462e970e869111c1853afd642212568a350eba796deefdfbd0770/coverage-7.8.2-cp313-cp313t-win_amd64.whl", hash = "sha256:2c08b05ee8d7861e45dc5a2cc4195c8c66dca5ac613144eb6ebeaff2d502e73d", size = 216268, upload-time = "2025-05-23T11:39:28.429Z" },
    { url = "https://files.pythonhosted.org/packages/07/69/afa41aa34147655543dbe96994f8a246daf94b361ccf5edfd5df62ce066a/coverage-7.8.2-cp313-cp313t-win_arm64.whl", hash = "sha256:1e1448bb72b387755e1ff3ef1268a06617afd94188164960dba8d0245a46004b", size = 214071, upload-time = "2025-05-23T11:39:30.55Z" },
    { url = "https://files.pythonhosted.org/packages/a0/1a/0b9c32220ad694d66062f571cc5cedfa9997b64a591e8a500bb63de1bd40/coverage-7.8.2-py3-none-any.whl", hash = "sha256:726f32ee3713f7359696331a18daf0c3b3a70bb0ae71141b9d3c52be7c595e32", size = 203623, upload-time = "2025-05-23T11:39:53.846Z" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
name = "distlib"
version = "0.3.9"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/0d/dd/1bec4c5ddb504ca60fc29472f3d27e8d4da1257a854e1d96742f15c1d02d/distlib-0.3.9.tar.gz", hash = "sha256:a60f20dea646b8a33f3e7772f74dc0b2d0772d2837ee1342a00645c81edf9403", size = 613923, upload-time = "2024-10-09T18:35:47.551Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/91/a1/cf2472db20f7ce4a6be1253a81cfdf85ad9c7885ffbed7047fb72c24cf87/distlib-0.3.9-py2.py3-none-any.whl", hash = "sha256:47f8c22fd27c27e25a65601af709b38e4f0a45ea4fc2e710f65755fa8caaaf87", size = 468973, upload-time = "2024-10-09T18:35:44.272Z" },
]

[[package]]
name = "filelock"
version = "3.18.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/0a/10/c23352565a6544bdc5353e0b15fc1c563352101f30e24bf500207a54df9a/filelock-3.18.0.tar.gz", hash = "sha256:adbc88eabb99d2fec8c9c1b229b171f18afa655400173ddc653d5d01501fb9f2", size = 18075, upload-time = "2025-03-14T07:11:40.47Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/4d/36/2a115987e2d8c300a974597416d9de88f2444426de9571f4b59b2cca3acc/filelock-3.18.0-py3-none-any.whl", hash = "sha256:c401f4f8377c4464e6db25fff06205fd89bdd83b65eb0488ed1b160f780e21de", size = 16215, upload-time = "2025-03-14T07:11:39.145Z" },
]

[[package]]
name = "flake8"
version = "7.2.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "mccabe" },
    { name = "pycodestyle" },
    { name = "pyflakes" },
]
sdist = { url = "https://files.pythonhosted.org/packages/e7/c4/5842fc9fc94584c455543540af62fd9900faade32511fab650e9891ec225/flake8-7.2.0.tar.gz", hash = "sha256:fa558ae3f6f7dbf2b4f22663e5343b6b6023620461f8d4ff2019ef4b5ee70426", size = 48177, upload-time = "2025-03-29T20:08:39.329Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/83/5c/0627be4c9976d56b1217cb5187b7504e7fd7d3503f8bfd312a04077bd4f7/flake8-7.2.0-py2.py3-none-any.whl", hash = "sha256:93b92ba5bdb60754a6da14fa3b93a9361fd00a59632ada61fd7b130436c40343", size = 57786, upload-time = "2025-03-29T20:08:37.902Z" },
]

[[package]]
name = "flake8-docstrings"
version = "1.7.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "flake8" },
    { name = "pydocstyle" },
]
sdist = { url = "https://files.pythonhosted.org/packages/93/24/f839e3a06e18f4643ccb81370909a497297909f15106e6af2fecdef46894/flake8_docstrings-1.7.0.tar.gz", hash = "sha256:4c8cc748dc16e6869728699e5d0d685da9a10b0ea718e090b1ba088e67a941af", size = 5995, upload-time = "2023-01-25T14:27:13.903Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/3f/7d/76a278fa43250441ed9300c344f889c7fb1817080c8fb8996b840bf421c2/flake8_docstrings-1.7.0-py2.py3-none-any.whl", hash = "sha256:51f2344026da083fc084166a9353f5082b01f72901df422f74b4d953ae88ac75", size = 4994, upload-time = "2023-01-25T14:27:12.32Z" },
]

[[package]]
name = "gunicorn"
version = "23.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "packaging" },
]
sdist = { url = "https://files.pythonhosted.org/packages/34/72/9614c465dc206155d93eff0ca20d42e1e35afc533971379482de953521a4/gunicorn-23.0.0.tar.gz", hash = "sha256:f014447a0101dc57e294f6c18ca6b40227a4c90e9bdb586042628030cba004ec", size = 375031, upload-time = "2024-08-10T20:25:27.378Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/cb/7d/6dac2a6e1eba33ee43f318edbed4ff29151a49b5d37f080aad1e6469bca4/gunicorn-23.0.0-py3-none-any.whl", hash = "sha256:ec400d38950de4dfd418cff8328b2c8faed0edb0d517d3394e457c317908ca4d", size = 85029, upload-time = "2024-08-10T20:25:24.996Z" },
]

[[package]]
name = "h11"
version = "0.16.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/01/ee/02a2c011bdab74c6fb3c75474d40b3052059d95df7e73351460c8588d963/h11-0.16.0.tar.gz", hash = "sha256:4e35b956cf45792e4caa5885e69fba00bdbc6ffafbfa020300e549b208ee5ff1", size = 101250, upload-time = "2025-04-24T03:35:25.427Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" },
]

[[package]]
name = "httpcore"
version = "1.0.9"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "certifi" },
    { name = "h11" },
]
sdist = { url = "https://files.pythonhosted.org/packages/06/94/82699a10bca87a5556c9c59b5963f2d039dbd239f25bc2a63907a05a14cb/httpcore-1.0.9.tar.gz", hash = "sha256:6e34463af53fd2ab5d807f399a9b45ea31c3dfa2276f15a2c3f00afff6e176e8", size = 85484, upload-time = "2025-04-24T22:06:22.219Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/7e/f5/f66802a942d491edb555dd61e3a9961140fd64c90bce1eafd741609d334d/httpcore-1.0.9-py3-none-any.whl", hash = "sha256:2d400746a40668fc9dec9810239072b40b4484b640a8c38fd654a024c7a1bf55", size = 78784, upload-time = "2025-04-24T22:06:20.566Z" },
]

[[package]]
name = "httpx"
version = "0.28.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "anyio" },
    { name = "certifi" },
    { name = "httpcore" },
    { name = "idna" },
]
sdist = { url = "https://files.pythonhosted.org/packages/b1/df/48c586a5fe32a0f01324ee087459e112ebb7224f646c0b5023f5e79e9956/httpx-0.28.1.tar.gz", hash = "sha256:75e98c5f16b0f35b567856f597f06ff2270a374470a5c2392242528e3e3e42fc", size = 141406, upload-time = "2024-12-06T15:37:23.222Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/2a/39/e50c7c3a983047577ee07d2a9e53faf5a69493943ec3f6a384bdc792deb2/httpx-0.28.1-py3-none-any.whl", hash = "sha256:d909fcccc110f8c7faf814ca82a9a4d816bc5a6dbfea25d6591d6985b8ba59ad", size = 73517, upload-time = "2024-12-06T15:37:21.509Z" },
]

[[package]]
name = "httpx-sse"
version = "0.4.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/4c/60/8f4281fa9bbf3c8034fd54c0e7412e66edbab6bc74c4996bd616f8d0406e/httpx-sse-0.4.0.tar.gz", hash = "sha256:1e81a3a3070ce322add1d3529ed42eb5f70817f45ed6ec915ab753f961139721", size = 12624, upload-time = "2023-12-22T08:01:21.083Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/e1/9b/a181f281f65d776426002f330c31849b86b31fc9d848db62e16f03ff739f/httpx_sse-0.4.0-py3-none-any.whl", hash = "sha256:f329af6eae57eaa2bdfd962b42524764af68075ea87370a2de920af5341e318f", size = 7819, upload-time = "2023-12-22T08:01:19.89Z" },
]

[[package]]
name = "identify"
version = "2.6.12"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a2/88/d193a27416618628a5eea64e3223acd800b40749a96ffb322a9b55a49ed1/identify-2.6.12.tar.gz", hash = "sha256:d8de45749f1efb108badef65ee8386f0f7bb19a7f26185f74de6367bffbaf0e6", size = 99254, upload-time = "2025-05-23T20:37:53.3Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/7a/cd/18f8da995b658420625f7ef13f037be53ae04ec5ad33f9b718240dcfd48c/identify-2.6.12-py2.py3-none-any.whl", hash = "sha256:ad9672d5a72e0d2ff7c5c8809b62dfa60458626352fb0eb7b55e69bdc45334a2", size = 99145, upload-time = "2025-05-23T20:37:51.495Z" },
]

[[package]]
name = "idna"
version = "3.10"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f1/70/7703c29685631f5a7590aa73f1f1d3fa9a380e654b86af429e0934a32f7d/idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9", size = 190490, upload-time = "2024-09-15T18:07:39.745Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/76/c6/c88e154df9c4e1a2a66ccf0005a88dfb2650c1dffb6f5ce603dfbd452ce3/idna-3.10-py3-none-any.whl", hash = "sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3", size = 70442, upload-time = "2024-09-15T18:07:37.964Z" },
]

[[package]]
name = "iniconfig"
version = "2.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f2/97/ebf4da567aa6827c909642694d71c9fcf53e5b504f2d96afea02718862f3/iniconfig-2.1.0.tar.gz", hash = "sha256:3abbd2e30b36733fee78f9c7f7308f2d0050e88f0087fd25c2645f63c773e1c7", size = 4793, upload-time = "2025-03-19T20:09:59.721Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/2c/e1/e6716421ea10d38022b952c159d5161ca1193197fb744506875fbb87ea7b/iniconfig-2.1.0-py3-none-any.whl", hash = "sha256:9deba5723312380e77435581c6bf4935c94cbfab9b1ed33ef8d238ea168eb760", size = 6050, upload-time = "2025-03-19T20:10:01.071Z" },
]

[[package]]
name = "isort"
version = "6.0.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/b8/21/1e2a441f74a653a144224d7d21afe8f4169e6c7c20bb13aec3a2dc3815e0/isort-6.0.1.tar.gz", hash = "sha256:1cb5df28dfbc742e490c5e41bad6da41b805b0a8be7bc93cd0fb2a8a890ac450", size = 821955, upload-time = "2025-02-26T21:13:16.955Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/c1/11/114d0a5f4dabbdcedc1125dee0888514c3c3b16d3e9facad87ed96fad97c/isort-6.0.1-py3-none-any.whl", hash = "sha256:2dc5d7f65c9678d94c88dfc29161a320eec67328bc97aad576874cb4be1e9615", size = 94186, upload-time = "2025-02-26T21:13:14.911Z" },
]

[[package]]
name = "jsonschema"
version = "4.23.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "attrs" },
    { name = "jsonschema-specifications" },
    { name = "referencing" },
    { name = "rpds-py" },
]
sdist = { url = "https://files.pythonhosted.org/packages/38/2e/03362ee4034a4c917f697890ccd4aec0800ccf9ded7f511971c75451deec/jsonschema-4.23.0.tar.gz", hash = "sha256:d71497fef26351a33265337fa77ffeb82423f3ea21283cd9467bb03999266bc4", size = 325778, upload-time = "2024-07-08T18:40:05.546Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/69/4a/4f9dbeb84e8850557c02365a0eee0649abe5eb1d84af92a25731c6c0f922/jsonschema-4.23.0-py3-none-any.whl", hash = "sha256:fbadb6f8b144a8f8cf9f0b89ba94501d143e50411a1278633f56a7acf7fd5566", size = 88462, upload-time = "2024-07-08T18:40:00.165Z" },
]

[[package]]
name = "jsonschema-specifications"
version = "2025.4.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "referencing" },
]
sdist = { url = "https://files.pythonhosted.org/packages/bf/ce/46fbd9c8119cfc3581ee5643ea49464d168028cfb5caff5fc0596d0cf914/jsonschema_specifications-2025.4.1.tar.gz", hash = "sha256:630159c9f4dbea161a6a2205c3011cc4f18ff381b189fff48bb39b9bf26ae608", size = 15513, upload-time = "2025-04-23T12:34:07.418Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/01/0e/b27cdbaccf30b890c40ed1da9fd4a3593a5cf94dae54fb34f8a4b74fcd3f/jsonschema_specifications-2025.4.1-py3-none-any.whl", hash = "sha256:4653bffbd6584f7de83a67e0d620ef16900b390ddc7939d56684d6c81e33f1af", size = 18437, upload-time = "2025-04-23T12:34:05.422Z" },
]

[[package]]
name = "markdown-it-py"
version = "3.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "mdurl" },
]
sdist = { url = "https://files.pythonhosted.org/packages/38/71/3b932df36c1a044d397a1f92d1cf91ee0a503d91e470cbd670aa66b07ed0/markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb", size = 74596, upload-time = "2023-06-03T06:41:14.443Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/42/d7/1ec15b46af6af88f19b8e5ffea08fa375d433c998b8a7639e76935c14f1f/markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1", size = 87528, upload-time = "2023-06-03T06:41:11.019Z" },
]

[[package]]
name = "mccabe"
version = "0.7.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/e7/ff/0ffefdcac38932a54d2b5eed4e0ba8a408f215002cd178ad1df0f2806ff8/mccabe-0.7.0.tar.gz", hash = "sha256:348e0240c33b60bbdf4e523192ef919f28cb2c3d7d5c7794f74009290f236325", size = 9658, upload-time = "2022-01-24T01:14:51.113Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/27/1a/1f68f9ba0c207934b35b86a8ca3aad8395a3d6dd7921c0686e23853ff5a9/mccabe-0.7.0-py2.py3-none-any.whl", hash = "sha256:6c2d30ab6be0e4a46919781807b4f0d834ebdd6c6e3dca0bda5a15f863427b6e", size = 7350, upload-time = "2022-01-24T01:14:49.62Z" },
]

[[package]]
name = "mcp"
version = "1.9.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "anyio" },
    { name = "httpx" },
    { name = "httpx-sse" },
    { name = "pydantic" },
    { name = "pydantic-settings" },
    { name = "python-multipart" },
    { name = "sse-starlette" },
    { name = "starlette" },
    { name = "uvicorn", marker = "sys_platform != 'emscripten'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/e7/bc/54aec2c334698cc575ca3b3481eed627125fb66544152fa1af927b1a495c/mcp-1.9.1.tar.gz", hash = "sha256:19879cd6dde3d763297617242888c2f695a95dfa854386a6a68676a646ce75e4", size = 316247, upload-time = "2025-05-22T15:52:21.26Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/a6/c0/4ac795585a22a0a2d09cd2b1187b0252d2afcdebd01e10a68bbac4d34890/mcp-1.9.1-py3-none-any.whl", hash = "sha256:2900ded8ffafc3c8a7bfcfe8bc5204037e988e753ec398f371663e6a06ecd9a9", size = 130261, upload-time = "2025-05-22T15:52:19.702Z" },
]

[package.optional-dependencies]
cli = [
    { name = "python-dotenv" },
    { name = "typer" },
]

[[package]]
name = "mdurl"
version = "0.1.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/d6/54/cfe61301667036ec958cb99bd3efefba235e65cdeb9c84d24a8293ba1d90/mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba", size = 8729, upload-time = "2022-08-14T12:40:10.846Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/b3/38/89ba8ad64ae25be8de66a6d463314cf1eb366222074cfda9ee839c56a4b4/mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8", size = 9979, upload-time = "2022-08-14T12:40:09.779Z" },
]

[[package]]
name = "nodeenv"
version = "1.9.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/43/16/fc88b08840de0e0a72a2f9d8c6bae36be573e475a6326ae854bcc549fc45/nodeenv-1.9.1.tar.gz", hash = "sha256:6ec12890a2dab7946721edbfbcd91f3319c6ccc9aec47be7c7e6b7011ee6645f", size = 47437, upload-time = "2024-06-04T18:44:11.171Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/d2/1d/1b658dbd2b9fa9c4c9f32accbfc0205d532c8c6194dc0f2a4c0428e7128a/nodeenv-1.9.1-py2.py3-none-any.whl", hash = "sha256:ba11c9782d29c27c70ffbdda2d7415098754709be8a7056d79a737cd901155c9", size = 22314, upload-time = "2024-06-04T18:44:08.352Z" },
]

[[package]]
name = "packaging"
version = "25.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a1/d4/1fc4078c65507b51b96ca8f8c3ba19e6a61c8253c72794544580a7b6c24d/packaging-25.0.tar.gz", hash = "sha256:d443872c98d677bf60f6a1f2f8c1cb748e8fe762d2bf9d3148b5599295b0fc4f", size = 165727, upload-time = "2025-04-19T11:48:59.673Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/20/12/38679034af332785aac8774540895e234f4d07f7545804097de4b666afd8/packaging-25.0-py3-none-any.whl", hash = "sha256:29572ef2b1f17581046b3a2227d5c611fb25ec70ca1ba8554b24b0e69331a484", size = 66469, upload-time = "2025-04-19T11:48:57.875Z" },
]

[[package]]
name = "penpot-mcp"
version = "0.1.0"
source = { editable = "." }
dependencies = [
    { name = "anytree" },
    { name = "gunicorn" },
    { name = "jsonschema" },
    { name = "mcp" },
    { name = "python-dotenv" },
    { name = "pyyaml" },
    { name = "requests" },
]

[package.optional-dependencies]
cli = [
    { name = "mcp", extra = ["cli"] },
]
dev = [
    { name = "autopep8" },
    { name = "flake8" },
    { name = "flake8-docstrings" },
    { name = "isort" },
    { name = "pre-commit" },
    { name = "pytest" },
    { name = "pytest-cov" },
    { name = "pytest-mock" },
    { name = "pyupgrade" },
    { name = "setuptools" },
]

[package.metadata]
requires-dist = [
    { name = "anytree", specifier = ">=2.8.0" },
    { name = "autopep8", marker = "extra == 'dev'", specifier = ">=2.0.4" },
    { name = "flake8", marker = "extra == 'dev'", specifier = ">=6.1.0" },
    { name = "flake8-docstrings", marker = "extra == 'dev'", specifier = ">=1.7.0" },
    { name = "gunicorn", specifier = ">=20.1.0" },
    { name = "isort", marker = "extra == 'dev'", specifier = ">=5.12.0" },
    { name = "jsonschema", specifier = ">=4.0.0" },
    { name = "mcp", specifier = ">=1.7.0" },
    { name = "mcp", extras = ["cli"], marker = "extra == 'cli'", specifier = ">=1.7.0" },
    { name = "pre-commit", marker = "extra == 'dev'", specifier = ">=3.5.0" },
    { name = "pytest", marker = "extra == 'dev'", specifier = ">=7.4.0" },
    { name = "pytest-cov", marker = "extra == 'dev'", specifier = ">=4.1.0" },
    { name = "pytest-mock", marker = "extra == 'dev'", specifier = ">=3.11.1" },
    { name = "python-dotenv", specifier = ">=1.0.0" },
    { name = "pyupgrade", marker = "extra == 'dev'", specifier = ">=3.13.0" },
    { name = "pyyaml", specifier = ">=6.0.0" },
    { name = "requests", specifier = ">=2.26.0" },
    { name = "setuptools", marker = "extra == 'dev'", specifier = ">=65.5.0" },
]
provides-extras = ["dev", "cli"]

[[package]]
name = "platformdirs"
version = "4.3.8"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/fe/8b/3c73abc9c759ecd3f1f7ceff6685840859e8070c4d947c93fae71f6a0bf2/platformdirs-4.3.8.tar.gz", hash = "sha256:3d512d96e16bcb959a814c9f348431070822a6496326a4be0911c40b5a74c2bc", size = 21362, upload-time = "2025-05-07T22:47:42.121Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/fe/39/979e8e21520d4e47a0bbe349e2713c0aac6f3d853d0e5b34d76206c439aa/platformdirs-4.3.8-py3-none-any.whl", hash = "sha256:ff7059bb7eb1179e2685604f4aaf157cfd9535242bd23742eadc3c13542139b4", size = 18567, upload-time = "2025-05-07T22:47:40.376Z" },
]

[[package]]
name = "pluggy"
version = "1.6.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f9/e2/3e91f31a7d2b083fe6ef3fa267035b518369d9511ffab804f839851d2779/pluggy-1.6.0.tar.gz", hash = "sha256:7dcc130b76258d33b90f61b658791dede3486c3e6bfb003ee5c9bfb396dd22f3", size = 69412, upload-time = "2025-05-15T12:30:07.975Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/54/20/4d324d65cc6d9205fabedc306948156824eb9f0ee1633355a8f7ec5c66bf/pluggy-1.6.0-py3-none-any.whl", hash = "sha256:e920276dd6813095e9377c0bc5566d94c932c33b27a3e3945d8389c374dd4746", size = 20538, upload-time = "2025-05-15T12:30:06.134Z" },
]

[[package]]
name = "pre-commit"
version = "4.2.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
    { name = "cfgv" },
    { name = "identify" },
    { name = "nodeenv" },
    { name = "pyyaml" },
    { name = "virtualenv" },
]
sdist = { url = "https://files.pythonhosted.org/packages/08/39/679ca9b26c7bb2999ff122d50faa301e49af82ca9c066ec061cfbc0c6784/pre_commit-4.2.0.tar.gz", hash = "sha256:601283b9757afd87d40c4c4a9b2b5de9637a8ea02eaff7adc2d0fb4e04841146", size = 193424, upload-time = "2025-03-18T21:35:20.987Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/88/74/a88bf1b1efeae488a0c0b7bdf71429c313722d1fc0f377537fbe554e6180/pre_commit-4.2.0-py2.py3-none-any.whl", hash = "sha256:a009ca7205f1eb497d10b845e52c838a98b6cdd2102a6c8e4540e94ee75c58bd", size = 220707, upload-time = "2025-03-18T21:35:19.343Z" },
]

[[package]]
name = "pycodestyle"
version = "2.13.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/04/6e/1f4a62078e4d95d82367f24e685aef3a672abfd27d1a868068fed4ed2254/pycodestyle-2.13.0.tar.gz", hash = "sha256:c8415bf09abe81d9c7f872502a6eee881fbe85d8763dd5b9924bb0a01d67efae", size = 39312, upload-time = "2025-03-29T17:33:30.669Z" }
wheels = [
    { url = "https://files.pythonhosted.org/packages/07/be/b00116df1bfb3e0bb5b45e29d604799f7b91dd861637e4d448b4e09e6a3e/pycodestyle-2.13.0-py2.py3-none-any.whl", hash = "sha256:35863c5974a271c7a726ed228a14a4f6daf49df369d8c50cd9a6f58a5e143ba9", size = 31424, upload-time = "2025-03-29T17:33:29.405Z" },
]

[[package]]
|
||||||
|
name = "pydantic"
|
||||||
|
version = "2.11.5"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
dependencies = [
|
||||||
|
{ name = "annotated-types" },
|
||||||
|
{ name = "pydantic-core" },
|
||||||
|
{ name = "typing-extensions" },
|
||||||
|
{ name = "typing-inspection" },
|
||||||
|
]
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/f0/86/8ce9040065e8f924d642c58e4a344e33163a07f6b57f836d0d734e0ad3fb/pydantic-2.11.5.tar.gz", hash = "sha256:7f853db3d0ce78ce8bbb148c401c2cdd6431b3473c0cdff2755c7690952a7b7a", size = 787102, upload-time = "2025-05-22T21:18:08.761Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/b5/69/831ed22b38ff9b4b64b66569f0e5b7b97cf3638346eb95a2147fdb49ad5f/pydantic-2.11.5-py3-none-any.whl", hash = "sha256:f9c26ba06f9747749ca1e5c94d6a85cb84254577553c8785576fd38fa64dc0f7", size = 444229, upload-time = "2025-05-22T21:18:06.329Z" },
|
||||||
|
]
|
||||||
|
|
||||||
|
[[package]]
|
||||||
|
name = "pydantic-core"
|
||||||
|
version = "2.33.2"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
dependencies = [
|
||||||
|
{ name = "typing-extensions" },
|
||||||
|
]
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/ad/88/5f2260bdfae97aabf98f1778d43f69574390ad787afb646292a638c923d4/pydantic_core-2.33.2.tar.gz", hash = "sha256:7cb8bc3605c29176e1b105350d2e6474142d7c1bd1d9327c4a9bdb46bf827acc", size = 435195, upload-time = "2025-04-23T18:33:52.104Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/18/8a/2b41c97f554ec8c71f2a8a5f85cb56a8b0956addfe8b0efb5b3d77e8bdc3/pydantic_core-2.33.2-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:a7ec89dc587667f22b6a0b6579c249fca9026ce7c333fc142ba42411fa243cdc", size = 2009000, upload-time = "2025-04-23T18:31:25.863Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a1/02/6224312aacb3c8ecbaa959897af57181fb6cf3a3d7917fd44d0f2917e6f2/pydantic_core-2.33.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3c6db6e52c6d70aa0d00d45cdb9b40f0433b96380071ea80b09277dba021ddf7", size = 1847996, upload-time = "2025-04-23T18:31:27.341Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d6/46/6dcdf084a523dbe0a0be59d054734b86a981726f221f4562aed313dbcb49/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4e61206137cbc65e6d5256e1166f88331d3b6238e082d9f74613b9b765fb9025", size = 1880957, upload-time = "2025-04-23T18:31:28.956Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ec/6b/1ec2c03837ac00886ba8160ce041ce4e325b41d06a034adbef11339ae422/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eb8c529b2819c37140eb51b914153063d27ed88e3bdc31b71198a198e921e011", size = 1964199, upload-time = "2025-04-23T18:31:31.025Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/2d/1d/6bf34d6adb9debd9136bd197ca72642203ce9aaaa85cfcbfcf20f9696e83/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c52b02ad8b4e2cf14ca7b3d918f3eb0ee91e63b3167c32591e57c4317e134f8f", size = 2120296, upload-time = "2025-04-23T18:31:32.514Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/e0/94/2bd0aaf5a591e974b32a9f7123f16637776c304471a0ab33cf263cf5591a/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:96081f1605125ba0855dfda83f6f3df5ec90c61195421ba72223de35ccfb2f88", size = 2676109, upload-time = "2025-04-23T18:31:33.958Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/f9/41/4b043778cf9c4285d59742281a769eac371b9e47e35f98ad321349cc5d61/pydantic_core-2.33.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8f57a69461af2a5fa6e6bbd7a5f60d3b7e6cebb687f55106933188e79ad155c1", size = 2002028, upload-time = "2025-04-23T18:31:39.095Z" },
{ url = "https://files.pythonhosted.org/packages/cb/d5/7bb781bf2748ce3d03af04d5c969fa1308880e1dca35a9bd94e1a96a922e/pydantic_core-2.33.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:572c7e6c8bb4774d2ac88929e3d1f12bc45714ae5ee6d9a788a9fb35e60bb04b", size = 2100044, upload-time = "2025-04-23T18:31:41.034Z" },
{ url = "https://files.pythonhosted.org/packages/fe/36/def5e53e1eb0ad896785702a5bbfd25eed546cdcf4087ad285021a90ed53/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:db4b41f9bd95fbe5acd76d89920336ba96f03e149097365afe1cb092fceb89a1", size = 2058881, upload-time = "2025-04-23T18:31:42.757Z" },
{ url = "https://files.pythonhosted.org/packages/01/6c/57f8d70b2ee57fc3dc8b9610315949837fa8c11d86927b9bb044f8705419/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:fa854f5cf7e33842a892e5c73f45327760bc7bc516339fda888c75ae60edaeb6", size = 2227034, upload-time = "2025-04-23T18:31:44.304Z" },
{ url = "https://files.pythonhosted.org/packages/27/b9/9c17f0396a82b3d5cbea4c24d742083422639e7bb1d5bf600e12cb176a13/pydantic_core-2.33.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:5f483cfb75ff703095c59e365360cb73e00185e01aaea067cd19acffd2ab20ea", size = 2234187, upload-time = "2025-04-23T18:31:45.891Z" },
{ url = "https://files.pythonhosted.org/packages/b0/6a/adf5734ffd52bf86d865093ad70b2ce543415e0e356f6cacabbc0d9ad910/pydantic_core-2.33.2-cp312-cp312-win32.whl", hash = "sha256:9cb1da0f5a471435a7bc7e439b8a728e8b61e59784b2af70d7c169f8dd8ae290", size = 1892628, upload-time = "2025-04-23T18:31:47.819Z" },
{ url = "https://files.pythonhosted.org/packages/43/e4/5479fecb3606c1368d496a825d8411e126133c41224c1e7238be58b87d7e/pydantic_core-2.33.2-cp312-cp312-win_amd64.whl", hash = "sha256:f941635f2a3d96b2973e867144fde513665c87f13fe0e193c158ac51bfaaa7b2", size = 1955866, upload-time = "2025-04-23T18:31:49.635Z" },
{ url = "https://files.pythonhosted.org/packages/0d/24/8b11e8b3e2be9dd82df4b11408a67c61bb4dc4f8e11b5b0fc888b38118b5/pydantic_core-2.33.2-cp312-cp312-win_arm64.whl", hash = "sha256:cca3868ddfaccfbc4bfb1d608e2ccaaebe0ae628e1416aeb9c4d88c001bb45ab", size = 1888894, upload-time = "2025-04-23T18:31:51.609Z" },
{ url = "https://files.pythonhosted.org/packages/46/8c/99040727b41f56616573a28771b1bfa08a3d3fe74d3d513f01251f79f172/pydantic_core-2.33.2-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:1082dd3e2d7109ad8b7da48e1d4710c8d06c253cbc4a27c1cff4fbcaa97a9e3f", size = 2015688, upload-time = "2025-04-23T18:31:53.175Z" },
{ url = "https://files.pythonhosted.org/packages/3a/cc/5999d1eb705a6cefc31f0b4a90e9f7fc400539b1a1030529700cc1b51838/pydantic_core-2.33.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f517ca031dfc037a9c07e748cefd8d96235088b83b4f4ba8939105d20fa1dcd6", size = 1844808, upload-time = "2025-04-23T18:31:54.79Z" },
{ url = "https://files.pythonhosted.org/packages/6f/5e/a0a7b8885c98889a18b6e376f344da1ef323d270b44edf8174d6bce4d622/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0a9f2c9dd19656823cb8250b0724ee9c60a82f3cdf68a080979d13092a3b0fef", size = 1885580, upload-time = "2025-04-23T18:31:57.393Z" },
{ url = "https://files.pythonhosted.org/packages/3b/2a/953581f343c7d11a304581156618c3f592435523dd9d79865903272c256a/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:2b0a451c263b01acebe51895bfb0e1cc842a5c666efe06cdf13846c7418caa9a", size = 1973859, upload-time = "2025-04-23T18:31:59.065Z" },
{ url = "https://files.pythonhosted.org/packages/e6/55/f1a813904771c03a3f97f676c62cca0c0a4138654107c1b61f19c644868b/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1ea40a64d23faa25e62a70ad163571c0b342b8bf66d5fa612ac0dec4f069d916", size = 2120810, upload-time = "2025-04-23T18:32:00.78Z" },
{ url = "https://files.pythonhosted.org/packages/aa/c3/053389835a996e18853ba107a63caae0b9deb4a276c6b472931ea9ae6e48/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:0fb2d542b4d66f9470e8065c5469ec676978d625a8b7a363f07d9a501a9cb36a", size = 2676498, upload-time = "2025-04-23T18:32:02.418Z" },
{ url = "https://files.pythonhosted.org/packages/eb/3c/f4abd740877a35abade05e437245b192f9d0ffb48bbbbd708df33d3cda37/pydantic_core-2.33.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9fdac5d6ffa1b5a83bca06ffe7583f5576555e6c8b3a91fbd25ea7780f825f7d", size = 2000611, upload-time = "2025-04-23T18:32:04.152Z" },
{ url = "https://files.pythonhosted.org/packages/59/a7/63ef2fed1837d1121a894d0ce88439fe3e3b3e48c7543b2a4479eb99c2bd/pydantic_core-2.33.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:04a1a413977ab517154eebb2d326da71638271477d6ad87a769102f7c2488c56", size = 2107924, upload-time = "2025-04-23T18:32:06.129Z" },
{ url = "https://files.pythonhosted.org/packages/04/8f/2551964ef045669801675f1cfc3b0d74147f4901c3ffa42be2ddb1f0efc4/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:c8e7af2f4e0194c22b5b37205bfb293d166a7344a5b0d0eaccebc376546d77d5", size = 2063196, upload-time = "2025-04-23T18:32:08.178Z" },
{ url = "https://files.pythonhosted.org/packages/26/bd/d9602777e77fc6dbb0c7db9ad356e9a985825547dce5ad1d30ee04903918/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:5c92edd15cd58b3c2d34873597a1e20f13094f59cf88068adb18947df5455b4e", size = 2236389, upload-time = "2025-04-23T18:32:10.242Z" },
{ url = "https://files.pythonhosted.org/packages/42/db/0e950daa7e2230423ab342ae918a794964b053bec24ba8af013fc7c94846/pydantic_core-2.33.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:65132b7b4a1c0beded5e057324b7e16e10910c106d43675d9bd87d4f38dde162", size = 2239223, upload-time = "2025-04-23T18:32:12.382Z" },
{ url = "https://files.pythonhosted.org/packages/58/4d/4f937099c545a8a17eb52cb67fe0447fd9a373b348ccfa9a87f141eeb00f/pydantic_core-2.33.2-cp313-cp313-win32.whl", hash = "sha256:52fb90784e0a242bb96ec53f42196a17278855b0f31ac7c3cc6f5c1ec4811849", size = 1900473, upload-time = "2025-04-23T18:32:14.034Z" },
{ url = "https://files.pythonhosted.org/packages/a0/75/4a0a9bac998d78d889def5e4ef2b065acba8cae8c93696906c3a91f310ca/pydantic_core-2.33.2-cp313-cp313-win_amd64.whl", hash = "sha256:c083a3bdd5a93dfe480f1125926afcdbf2917ae714bdb80b36d34318b2bec5d9", size = 1955269, upload-time = "2025-04-23T18:32:15.783Z" },
{ url = "https://files.pythonhosted.org/packages/f9/86/1beda0576969592f1497b4ce8e7bc8cbdf614c352426271b1b10d5f0aa64/pydantic_core-2.33.2-cp313-cp313-win_arm64.whl", hash = "sha256:e80b087132752f6b3d714f041ccf74403799d3b23a72722ea2e6ba2e892555b9", size = 1893921, upload-time = "2025-04-23T18:32:18.473Z" },
{ url = "https://files.pythonhosted.org/packages/a4/7d/e09391c2eebeab681df2b74bfe6c43422fffede8dc74187b2b0bf6fd7571/pydantic_core-2.33.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:61c18fba8e5e9db3ab908620af374db0ac1baa69f0f32df4f61ae23f15e586ac", size = 1806162, upload-time = "2025-04-23T18:32:20.188Z" },
{ url = "https://files.pythonhosted.org/packages/f1/3d/847b6b1fed9f8ed3bb95a9ad04fbd0b212e832d4f0f50ff4d9ee5a9f15cf/pydantic_core-2.33.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:95237e53bb015f67b63c91af7518a62a8660376a6a0db19b89acc77a4d6199f5", size = 1981560, upload-time = "2025-04-23T18:32:22.354Z" },
{ url = "https://files.pythonhosted.org/packages/6f/9a/e73262f6c6656262b5fdd723ad90f518f579b7bc8622e43a942eec53c938/pydantic_core-2.33.2-cp313-cp313t-win_amd64.whl", hash = "sha256:c2fc0a768ef76c15ab9238afa6da7f69895bb5d1ee83aeea2e3509af4472d0b9", size = 1935777, upload-time = "2025-04-23T18:32:25.088Z" },
]

[[package]]
name = "pydantic-settings"
version = "2.9.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pydantic" },
{ name = "python-dotenv" },
{ name = "typing-inspection" },
]
sdist = { url = "https://files.pythonhosted.org/packages/67/1d/42628a2c33e93f8e9acbde0d5d735fa0850f3e6a2f8cb1eb6c40b9a732ac/pydantic_settings-2.9.1.tar.gz", hash = "sha256:c509bf79d27563add44e8446233359004ed85066cd096d8b510f715e6ef5d268", size = 163234, upload-time = "2025-04-18T16:44:48.265Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b6/5f/d6d641b490fd3ec2c4c13b4244d68deea3a1b970a97be64f34fb5504ff72/pydantic_settings-2.9.1-py3-none-any.whl", hash = "sha256:59b4f431b1defb26fe620c71a7d3968a710d719f5f4cdbbdb7926edeb770f6ef", size = 44356, upload-time = "2025-04-18T16:44:46.617Z" },
]

[[package]]
name = "pydocstyle"
version = "6.3.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "snowballstemmer" },
]
sdist = { url = "https://files.pythonhosted.org/packages/e9/5c/d5385ca59fd065e3c6a5fe19f9bc9d5ea7f2509fa8c9c22fb6b2031dd953/pydocstyle-6.3.0.tar.gz", hash = "sha256:7ce43f0c0ac87b07494eb9c0b462c0b73e6ff276807f204d6b53edc72b7e44e1", size = 36796, upload-time = "2023-01-17T20:29:19.838Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/36/ea/99ddefac41971acad68f14114f38261c1f27dac0b3ec529824ebc739bdaa/pydocstyle-6.3.0-py3-none-any.whl", hash = "sha256:118762d452a49d6b05e194ef344a55822987a462831ade91ec5c06fd2169d019", size = 38038, upload-time = "2023-01-17T20:29:18.094Z" },
]

[[package]]
name = "pyflakes"
version = "3.3.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/af/cc/1df338bd7ed1fa7c317081dcf29bf2f01266603b301e6858856d346a12b3/pyflakes-3.3.2.tar.gz", hash = "sha256:6dfd61d87b97fba5dcfaaf781171ac16be16453be6d816147989e7f6e6a9576b", size = 64175, upload-time = "2025-03-31T13:21:20.34Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/15/40/b293a4fa769f3b02ab9e387c707c4cbdc34f073f945de0386107d4e669e6/pyflakes-3.3.2-py2.py3-none-any.whl", hash = "sha256:5039c8339cbb1944045f4ee5466908906180f13cc99cc9949348d10f82a5c32a", size = 63164, upload-time = "2025-03-31T13:21:18.503Z" },
]

[[package]]
name = "pygments"
version = "2.19.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/7c/2d/c3338d48ea6cc0feb8446d8e6937e1408088a72a39937982cc6111d17f84/pygments-2.19.1.tar.gz", hash = "sha256:61c16d2a8576dc0649d9f39e089b5f02bcd27fba10d8fb4dcc28173f7a45151f", size = 4968581, upload-time = "2025-01-06T17:26:30.443Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/8a/0b/9fcc47d19c48b59121088dd6da2488a49d5f72dacf8262e2790a1d2c7d15/pygments-2.19.1-py3-none-any.whl", hash = "sha256:9ea1544ad55cecf4b8242fab6dd35a93bbce657034b0611ee383099054ab6d8c", size = 1225293, upload-time = "2025-01-06T17:26:25.553Z" },
]

[[package]]
name = "pytest"
version = "8.3.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "colorama", marker = "sys_platform == 'win32'" },
{ name = "iniconfig" },
{ name = "packaging" },
{ name = "pluggy" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ae/3c/c9d525a414d506893f0cd8a8d0de7706446213181570cdbd766691164e40/pytest-8.3.5.tar.gz", hash = "sha256:f4efe70cc14e511565ac476b57c279e12a855b11f48f212af1080ef2263d3845", size = 1450891, upload-time = "2025-03-02T12:54:54.503Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/30/3d/64ad57c803f1fa1e963a7946b6e0fea4a70df53c1a7fed304586539c2bac/pytest-8.3.5-py3-none-any.whl", hash = "sha256:c69214aa47deac29fad6c2a4f590b9c4a9fdb16a403176fe154b79c0b4d4d820", size = 343634, upload-time = "2025-03-02T12:54:52.069Z" },
]

[[package]]
name = "pytest-cov"
version = "6.1.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "coverage" },
{ name = "pytest" },
]
sdist = { url = "https://files.pythonhosted.org/packages/25/69/5f1e57f6c5a39f81411b550027bf72842c4567ff5fd572bed1edc9e4b5d9/pytest_cov-6.1.1.tar.gz", hash = "sha256:46935f7aaefba760e716c2ebfbe1c216240b9592966e7da99ea8292d4d3e2a0a", size = 66857, upload-time = "2025-04-05T14:07:51.592Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/28/d0/def53b4a790cfb21483016430ed828f64830dd981ebe1089971cd10cab25/pytest_cov-6.1.1-py3-none-any.whl", hash = "sha256:bddf29ed2d0ab6f4df17b4c55b0a657287db8684af9c42ea546b21b1041b3dde", size = 23841, upload-time = "2025-04-05T14:07:49.641Z" },
]

[[package]]
name = "pytest-mock"
version = "3.14.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "pytest" },
]
sdist = { url = "https://files.pythonhosted.org/packages/71/28/67172c96ba684058a4d24ffe144d64783d2a270d0af0d9e792737bddc75c/pytest_mock-3.14.1.tar.gz", hash = "sha256:159e9edac4c451ce77a5cdb9fc5d1100708d2dd4ba3c3df572f14097351af80e", size = 33241, upload-time = "2025-05-26T13:58:45.167Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b2/05/77b60e520511c53d1c1ca75f1930c7dd8e971d0c4379b7f4b3f9644685ba/pytest_mock-3.14.1-py3-none-any.whl", hash = "sha256:178aefcd11307d874b4cd3100344e7e2d888d9791a6a1d9bfe90fbc1b74fd1d0", size = 9923, upload-time = "2025-05-26T13:58:43.487Z" },
]

[[package]]
name = "python-dotenv"
version = "1.1.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/88/2c/7bb1416c5620485aa793f2de31d3df393d3686aa8a8506d11e10e13c5baf/python_dotenv-1.1.0.tar.gz", hash = "sha256:41f90bc6f5f177fb41f53e87666db362025010eb28f60a01c9143bfa33a2b2d5", size = 39920, upload-time = "2025-03-25T10:14:56.835Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/1e/18/98a99ad95133c6a6e2005fe89faedf294a748bd5dc803008059409ac9b1e/python_dotenv-1.1.0-py3-none-any.whl", hash = "sha256:d7c01d9e2293916c18baf562d95698754b0dbbb5e74d457c45d4f6561fb9d55d", size = 20256, upload-time = "2025-03-25T10:14:55.034Z" },
]

[[package]]
name = "python-multipart"
version = "0.0.20"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f3/87/f44d7c9f274c7ee665a29b885ec97089ec5dc034c7f3fafa03da9e39a09e/python_multipart-0.0.20.tar.gz", hash = "sha256:8dd0cab45b8e23064ae09147625994d090fa46f5b0d1e13af944c331a7fa9d13", size = 37158, upload-time = "2024-12-16T19:45:46.972Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/45/58/38b5afbc1a800eeea951b9285d3912613f2603bdf897a4ab0f4bd7f405fc/python_multipart-0.0.20-py3-none-any.whl", hash = "sha256:8a62d3a8335e06589fe01f2a3e178cdcc632f3fbe0d492ad9ee0ec35aab1f104", size = 24546, upload-time = "2024-12-16T19:45:44.423Z" },
]

[[package]]
name = "pyupgrade"
version = "3.20.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "tokenize-rt" },
]
sdist = { url = "https://files.pythonhosted.org/packages/c0/75/3df66861bca41394f05c5b818943fd0535bc02d5c5c512f9d859dec921f3/pyupgrade-3.20.0.tar.gz", hash = "sha256:dd6a16c13fc1a7db45796008689a9a35420bd364d681430f640c5e54a3d351ea", size = 45007, upload-time = "2025-05-23T18:55:43.239Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/63/1c/8412744f89cbd251f159f790980492b38468530117f614108196665d3b1a/pyupgrade-3.20.0-py2.py3-none-any.whl", hash = "sha256:cd5bf842b863f50adad324a01c30aef60b9f698a9814848094818659c92cd1f4", size = 62452, upload-time = "2025-05-23T18:55:41.62Z" },
]

[[package]]
name = "pyyaml"
version = "6.0.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/54/ed/79a089b6be93607fa5cdaedf301d7dfb23af5f25c398d5ead2525b063e17/pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e", size = 130631, upload-time = "2024-08-06T20:33:50.674Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/86/0c/c581167fc46d6d6d7ddcfb8c843a4de25bdd27e4466938109ca68492292c/PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab", size = 183873, upload-time = "2024-08-06T20:32:25.131Z" },
{ url = "https://files.pythonhosted.org/packages/a8/0c/38374f5bb272c051e2a69281d71cba6fdb983413e6758b84482905e29a5d/PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725", size = 173302, upload-time = "2024-08-06T20:32:26.511Z" },
{ url = "https://files.pythonhosted.org/packages/c3/93/9916574aa8c00aa06bbac729972eb1071d002b8e158bd0e83a3b9a20a1f7/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5", size = 739154, upload-time = "2024-08-06T20:32:28.363Z" },
{ url = "https://files.pythonhosted.org/packages/95/0f/b8938f1cbd09739c6da569d172531567dbcc9789e0029aa070856f123984/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425", size = 766223, upload-time = "2024-08-06T20:32:30.058Z" },
{ url = "https://files.pythonhosted.org/packages/b9/2b/614b4752f2e127db5cc206abc23a8c19678e92b23c3db30fc86ab731d3bd/PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476", size = 767542, upload-time = "2024-08-06T20:32:31.881Z" },
{ url = "https://files.pythonhosted.org/packages/d4/00/dd137d5bcc7efea1836d6264f049359861cf548469d18da90cd8216cf05f/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48", size = 731164, upload-time = "2024-08-06T20:32:37.083Z" },
{ url = "https://files.pythonhosted.org/packages/c9/1f/4f998c900485e5c0ef43838363ba4a9723ac0ad73a9dc42068b12aaba4e4/PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b", size = 756611, upload-time = "2024-08-06T20:32:38.898Z" },
{ url = "https://files.pythonhosted.org/packages/df/d1/f5a275fdb252768b7a11ec63585bc38d0e87c9e05668a139fea92b80634c/PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4", size = 140591, upload-time = "2024-08-06T20:32:40.241Z" },
{ url = "https://files.pythonhosted.org/packages/0c/e8/4f648c598b17c3d06e8753d7d13d57542b30d56e6c2dedf9c331ae56312e/PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8", size = 156338, upload-time = "2024-08-06T20:32:41.93Z" },
{ url = "https://files.pythonhosted.org/packages/ef/e3/3af305b830494fa85d95f6d95ef7fa73f2ee1cc8ef5b495c7c3269fb835f/PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba", size = 181309, upload-time = "2024-08-06T20:32:43.4Z" },
{ url = "https://files.pythonhosted.org/packages/45/9f/3b1c20a0b7a3200524eb0076cc027a970d320bd3a6592873c85c92a08731/PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1", size = 171679, upload-time = "2024-08-06T20:32:44.801Z" },
{ url = "https://files.pythonhosted.org/packages/7c/9a/337322f27005c33bcb656c655fa78325b730324c78620e8328ae28b64d0c/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133", size = 733428, upload-time = "2024-08-06T20:32:46.432Z" },
{ url = "https://files.pythonhosted.org/packages/a3/69/864fbe19e6c18ea3cc196cbe5d392175b4cf3d5d0ac1403ec3f2d237ebb5/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484", size = 763361, upload-time = "2024-08-06T20:32:51.188Z" },
{ url = "https://files.pythonhosted.org/packages/04/24/b7721e4845c2f162d26f50521b825fb061bc0a5afcf9a386840f23ea19fa/PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5", size = 759523, upload-time = "2024-08-06T20:32:53.019Z" },
{ url = "https://files.pythonhosted.org/packages/2b/b2/e3234f59ba06559c6ff63c4e10baea10e5e7df868092bf9ab40e5b9c56b6/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc", size = 726660, upload-time = "2024-08-06T20:32:54.708Z" },
{ url = "https://files.pythonhosted.org/packages/fe/0f/25911a9f080464c59fab9027482f822b86bf0608957a5fcc6eaac85aa515/PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652", size = 751597, upload-time = "2024-08-06T20:32:56.985Z" },
{ url = "https://files.pythonhosted.org/packages/14/0d/e2c3b43bbce3cf6bd97c840b46088a3031085179e596d4929729d8d68270/PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183", size = 140527, upload-time = "2024-08-06T20:33:03.001Z" },
{ url = "https://files.pythonhosted.org/packages/fa/de/02b54f42487e3d3c6efb3f89428677074ca7bf43aae402517bc7cca949f3/PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563", size = 156446, upload-time = "2024-08-06T20:33:04.33Z" },
]

[[package]]
name = "referencing"
version = "0.36.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "attrs" },
{ name = "rpds-py" },
{ name = "typing-extensions", marker = "python_full_version < '3.13'" },
]
sdist = { url = "https://files.pythonhosted.org/packages/2f/db/98b5c277be99dd18bfd91dd04e1b759cad18d1a338188c936e92f921c7e2/referencing-0.36.2.tar.gz", hash = "sha256:df2e89862cd09deabbdba16944cc3f10feb6b3e6f18e902f7cc25609a34775aa", size = 74744, upload-time = "2025-01-25T08:48:16.138Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c1/b1/3baf80dc6d2b7bc27a95a67752d0208e410351e3feb4eb78de5f77454d8d/referencing-0.36.2-py3-none-any.whl", hash = "sha256:e8699adbbf8b5c7de96d8ffa0eb5c158b3beafce084968e2ea8bb08c6794dcd0", size = 26775, upload-time = "2025-01-25T08:48:14.241Z" },
]

[[package]]
name = "requests"
version = "2.32.3"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "certifi" },
{ name = "charset-normalizer" },
{ name = "idna" },
{ name = "urllib3" },
]
sdist = { url = "https://files.pythonhosted.org/packages/63/70/2bf7780ad2d390a8d301ad0b550f1581eadbd9a20f896afe06353c2a2913/requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760", size = 131218, upload-time = "2024-05-29T15:37:49.536Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/f9/9b/335f9764261e915ed497fcdeb11df5dfd6f7bf257d4a6a2a686d80da4d54/requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6", size = 64928, upload-time = "2024-05-29T15:37:47.027Z" },
]

[[package]]
name = "rich"
version = "14.0.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "markdown-it-py" },
{ name = "pygments" },
]
sdist = { url = "https://files.pythonhosted.org/packages/a1/53/830aa4c3066a8ab0ae9a9955976fb770fe9c6102117c8ec4ab3ea62d89e8/rich-14.0.0.tar.gz", hash = "sha256:82f1bc23a6a21ebca4ae0c45af9bdbc492ed20231dcb63f297d6d1021a9d5725", size = 224078, upload-time = "2025-03-30T14:15:14.23Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/0d/9b/63f4c7ebc259242c89b3acafdb37b41d1185c07ff0011164674e9076b491/rich-14.0.0-py3-none-any.whl", hash = "sha256:1c9491e1951aac09caffd42f448ee3d04e58923ffe14993f6e83068dc395d7e0", size = 243229, upload-time = "2025-03-30T14:15:12.283Z" },
]

[[package]]
|
||||||
|
name = "rpds-py"
|
||||||
|
version = "0.25.1"
|
||||||
|
source = { registry = "https://pypi.org/simple" }
|
||||||
|
sdist = { url = "https://files.pythonhosted.org/packages/8c/a6/60184b7fc00dd3ca80ac635dd5b8577d444c57e8e8742cecabfacb829921/rpds_py-0.25.1.tar.gz", hash = "sha256:8960b6dac09b62dac26e75d7e2c4a22efb835d827a7278c34f72b2b84fa160e3", size = 27304, upload-time = "2025-05-21T12:46:12.502Z" }
|
||||||
|
wheels = [
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/7f/81/28ab0408391b1dc57393653b6a0cf2014cc282cc2909e4615e63e58262be/rpds_py-0.25.1-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:b5ffe453cde61f73fea9430223c81d29e2fbf412a6073951102146c84e19e34c", size = 364647, upload-time = "2025-05-21T12:43:28.559Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/2c/9a/7797f04cad0d5e56310e1238434f71fc6939d0bc517192a18bb99a72a95f/rpds_py-0.25.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:115874ae5e2fdcfc16b2aedc95b5eef4aebe91b28e7e21951eda8a5dc0d3461b", size = 350454, upload-time = "2025-05-21T12:43:30.615Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/69/3c/93d2ef941b04898011e5d6eaa56a1acf46a3b4c9f4b3ad1bbcbafa0bee1f/rpds_py-0.25.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a714bf6e5e81b0e570d01f56e0c89c6375101b8463999ead3a93a5d2a4af91fa", size = 389665, upload-time = "2025-05-21T12:43:32.629Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/c1/57/ad0e31e928751dde8903a11102559628d24173428a0f85e25e187defb2c1/rpds_py-0.25.1-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:35634369325906bcd01577da4c19e3b9541a15e99f31e91a02d010816b49bfda", size = 403873, upload-time = "2025-05-21T12:43:34.576Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/16/ad/c0c652fa9bba778b4f54980a02962748479dc09632e1fd34e5282cf2556c/rpds_py-0.25.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d4cb2b3ddc16710548801c6fcc0cfcdeeff9dafbc983f77265877793f2660309", size = 525866, upload-time = "2025-05-21T12:43:36.123Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/2a/39/3e1839bc527e6fcf48d5fec4770070f872cdee6c6fbc9b259932f4e88a38/rpds_py-0.25.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9ceca1cf097ed77e1a51f1dbc8d174d10cb5931c188a4505ff9f3e119dfe519b", size = 416886, upload-time = "2025-05-21T12:43:38.034Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/7a/95/dd6b91cd4560da41df9d7030a038298a67d24f8ca38e150562644c829c48/rpds_py-0.25.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2c2cd1a4b0c2b8c5e31ffff50d09f39906fe351389ba143c195566056c13a7ea", size = 390666, upload-time = "2025-05-21T12:43:40.065Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/64/48/1be88a820e7494ce0a15c2d390ccb7c52212370badabf128e6a7bb4cb802/rpds_py-0.25.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:1de336a4b164c9188cb23f3703adb74a7623ab32d20090d0e9bf499a2203ad65", size = 425109, upload-time = "2025-05-21T12:43:42.263Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/cf/07/3e2a17927ef6d7720b9949ec1b37d1e963b829ad0387f7af18d923d5cfa5/rpds_py-0.25.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:9fca84a15333e925dd59ce01da0ffe2ffe0d6e5d29a9eeba2148916d1824948c", size = 567244, upload-time = "2025-05-21T12:43:43.846Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d2/e5/76cf010998deccc4f95305d827847e2eae9c568099c06b405cf96384762b/rpds_py-0.25.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:88ec04afe0c59fa64e2f6ea0dd9657e04fc83e38de90f6de201954b4d4eb59bd", size = 596023, upload-time = "2025-05-21T12:43:45.932Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/52/9a/df55efd84403736ba37a5a6377b70aad0fd1cb469a9109ee8a1e21299a1c/rpds_py-0.25.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:a8bd2f19e312ce3e1d2c635618e8a8d8132892bb746a7cf74780a489f0f6cdcb", size = 561634, upload-time = "2025-05-21T12:43:48.263Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/ab/aa/dc3620dd8db84454aaf9374bd318f1aa02578bba5e567f5bf6b79492aca4/rpds_py-0.25.1-cp312-cp312-win32.whl", hash = "sha256:e5e2f7280d8d0d3ef06f3ec1b4fd598d386cc6f0721e54f09109a8132182fbfe", size = 222713, upload-time = "2025-05-21T12:43:49.897Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/a3/7f/7cef485269a50ed5b4e9bae145f512d2a111ca638ae70cc101f661b4defd/rpds_py-0.25.1-cp312-cp312-win_amd64.whl", hash = "sha256:db58483f71c5db67d643857404da360dce3573031586034b7d59f245144cc192", size = 235280, upload-time = "2025-05-21T12:43:51.893Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/99/f2/c2d64f6564f32af913bf5f3f7ae41c7c263c5ae4c4e8f1a17af8af66cd46/rpds_py-0.25.1-cp312-cp312-win_arm64.whl", hash = "sha256:6d50841c425d16faf3206ddbba44c21aa3310a0cebc3c1cdfc3e3f4f9f6f5728", size = 225399, upload-time = "2025-05-21T12:43:53.351Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/2b/da/323848a2b62abe6a0fec16ebe199dc6889c5d0a332458da8985b2980dffe/rpds_py-0.25.1-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:659d87430a8c8c704d52d094f5ba6fa72ef13b4d385b7e542a08fc240cb4a559", size = 364498, upload-time = "2025-05-21T12:43:54.841Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/1f/b4/4d3820f731c80fd0cd823b3e95b9963fec681ae45ba35b5281a42382c67d/rpds_py-0.25.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:68f6f060f0bbdfb0245267da014d3a6da9be127fe3e8cc4a68c6f833f8a23bb1", size = 350083, upload-time = "2025-05-21T12:43:56.428Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d5/b1/3a8ee1c9d480e8493619a437dec685d005f706b69253286f50f498cbdbcf/rpds_py-0.25.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:083a9513a33e0b92cf6e7a6366036c6bb43ea595332c1ab5c8ae329e4bcc0a9c", size = 389023, upload-time = "2025-05-21T12:43:57.995Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/3b/31/17293edcfc934dc62c3bf74a0cb449ecd549531f956b72287203e6880b87/rpds_py-0.25.1-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:816568614ecb22b18a010c7a12559c19f6fe993526af88e95a76d5a60b8b75fb", size = 403283, upload-time = "2025-05-21T12:43:59.546Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/d1/ca/e0f0bc1a75a8925024f343258c8ecbd8828f8997ea2ac71e02f67b6f5299/rpds_py-0.25.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3c6564c0947a7f52e4792983f8e6cf9bac140438ebf81f527a21d944f2fd0a40", size = 524634, upload-time = "2025-05-21T12:44:01.087Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/3e/03/5d0be919037178fff33a6672ffc0afa04ea1cfcb61afd4119d1b5280ff0f/rpds_py-0.25.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5c4a128527fe415d73cf1f70a9a688d06130d5810be69f3b553bf7b45e8acf79", size = 416233, upload-time = "2025-05-21T12:44:02.604Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/05/7c/8abb70f9017a231c6c961a8941403ed6557664c0913e1bf413cbdc039e75/rpds_py-0.25.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a49e1d7a4978ed554f095430b89ecc23f42014a50ac385eb0c4d163ce213c325", size = 390375, upload-time = "2025-05-21T12:44:04.162Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/7a/ac/a87f339f0e066b9535074a9f403b9313fd3892d4a164d5d5f5875ac9f29f/rpds_py-0.25.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:d74ec9bc0e2feb81d3f16946b005748119c0f52a153f6db6a29e8cd68636f295", size = 424537, upload-time = "2025-05-21T12:44:06.175Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/1f/8f/8d5c1567eaf8c8afe98a838dd24de5013ce6e8f53a01bd47fe8bb06b5533/rpds_py-0.25.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:3af5b4cc10fa41e5bc64e5c198a1b2d2864337f8fcbb9a67e747e34002ce812b", size = 566425, upload-time = "2025-05-21T12:44:08.242Z" },
|
||||||
|
{ url = "https://files.pythonhosted.org/packages/95/33/03016a6be5663b389c8ab0bbbcca68d9e96af14faeff0a04affcb587e776/rpds_py-0.25.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:79dc317a5f1c51fd9c6a0c4f48209c6b8526d0524a6904fc1076476e79b00f98", size = 595197, upload-time = "2025-05-21T12:44:10.449Z" },
{ url = "https://files.pythonhosted.org/packages/33/8d/da9f4d3e208c82fda311bff0cf0a19579afceb77cf456e46c559a1c075ba/rpds_py-0.25.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:1521031351865e0181bc585147624d66b3b00a84109b57fcb7a779c3ec3772cd", size = 561244, upload-time = "2025-05-21T12:44:12.387Z" },
{ url = "https://files.pythonhosted.org/packages/e2/b3/39d5dcf7c5f742ecd6dbc88f6f84ae54184b92f5f387a4053be2107b17f1/rpds_py-0.25.1-cp313-cp313-win32.whl", hash = "sha256:5d473be2b13600b93a5675d78f59e63b51b1ba2d0476893415dfbb5477e65b31", size = 222254, upload-time = "2025-05-21T12:44:14.261Z" },
{ url = "https://files.pythonhosted.org/packages/5f/19/2d6772c8eeb8302c5f834e6d0dfd83935a884e7c5ce16340c7eaf89ce925/rpds_py-0.25.1-cp313-cp313-win_amd64.whl", hash = "sha256:a7b74e92a3b212390bdce1d93da9f6488c3878c1d434c5e751cbc202c5e09500", size = 234741, upload-time = "2025-05-21T12:44:16.236Z" },
{ url = "https://files.pythonhosted.org/packages/5b/5a/145ada26cfaf86018d0eb304fe55eafdd4f0b6b84530246bb4a7c4fb5c4b/rpds_py-0.25.1-cp313-cp313-win_arm64.whl", hash = "sha256:dd326a81afe332ede08eb39ab75b301d5676802cdffd3a8f287a5f0b694dc3f5", size = 224830, upload-time = "2025-05-21T12:44:17.749Z" },
{ url = "https://files.pythonhosted.org/packages/4b/ca/d435844829c384fd2c22754ff65889c5c556a675d2ed9eb0e148435c6690/rpds_py-0.25.1-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:a58d1ed49a94d4183483a3ce0af22f20318d4a1434acee255d683ad90bf78129", size = 359668, upload-time = "2025-05-21T12:44:19.322Z" },
{ url = "https://files.pythonhosted.org/packages/1f/01/b056f21db3a09f89410d493d2f6614d87bb162499f98b649d1dbd2a81988/rpds_py-0.25.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:f251bf23deb8332823aef1da169d5d89fa84c89f67bdfb566c49dea1fccfd50d", size = 345649, upload-time = "2025-05-21T12:44:20.962Z" },
{ url = "https://files.pythonhosted.org/packages/e0/0f/e0d00dc991e3d40e03ca36383b44995126c36b3eafa0ccbbd19664709c88/rpds_py-0.25.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8dbd586bfa270c1103ece2109314dd423df1fa3d9719928b5d09e4840cec0d72", size = 384776, upload-time = "2025-05-21T12:44:22.516Z" },
{ url = "https://files.pythonhosted.org/packages/9f/a2/59374837f105f2ca79bde3c3cd1065b2f8c01678900924949f6392eab66d/rpds_py-0.25.1-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:6d273f136e912aa101a9274c3145dcbddbe4bac560e77e6d5b3c9f6e0ed06d34", size = 395131, upload-time = "2025-05-21T12:44:24.147Z" },
{ url = "https://files.pythonhosted.org/packages/9c/dc/48e8d84887627a0fe0bac53f0b4631e90976fd5d35fff8be66b8e4f3916b/rpds_py-0.25.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:666fa7b1bd0a3810a7f18f6d3a25ccd8866291fbbc3c9b912b917a6715874bb9", size = 520942, upload-time = "2025-05-21T12:44:25.915Z" },
{ url = "https://files.pythonhosted.org/packages/7c/f5/ee056966aeae401913d37befeeab57a4a43a4f00099e0a20297f17b8f00c/rpds_py-0.25.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:921954d7fbf3fccc7de8f717799304b14b6d9a45bbeec5a8d7408ccbf531faf5", size = 411330, upload-time = "2025-05-21T12:44:27.638Z" },
{ url = "https://files.pythonhosted.org/packages/ab/74/b2cffb46a097cefe5d17f94ede7a174184b9d158a0aeb195f39f2c0361e8/rpds_py-0.25.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f3d86373ff19ca0441ebeb696ef64cb58b8b5cbacffcda5a0ec2f3911732a194", size = 387339, upload-time = "2025-05-21T12:44:29.292Z" },
{ url = "https://files.pythonhosted.org/packages/7f/9a/0ff0b375dcb5161c2b7054e7d0b7575f1680127505945f5cabaac890bc07/rpds_py-0.25.1-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:c8980cde3bb8575e7c956a530f2c217c1d6aac453474bf3ea0f9c89868b531b6", size = 418077, upload-time = "2025-05-21T12:44:30.877Z" },
{ url = "https://files.pythonhosted.org/packages/0d/a1/fda629bf20d6b698ae84c7c840cfb0e9e4200f664fc96e1f456f00e4ad6e/rpds_py-0.25.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:8eb8c84ecea987a2523e057c0d950bcb3f789696c0499290b8d7b3107a719d78", size = 562441, upload-time = "2025-05-21T12:44:32.541Z" },
{ url = "https://files.pythonhosted.org/packages/20/15/ce4b5257f654132f326f4acd87268e1006cc071e2c59794c5bdf4bebbb51/rpds_py-0.25.1-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:e43a005671a9ed5a650f3bc39e4dbccd6d4326b24fb5ea8be5f3a43a6f576c72", size = 590750, upload-time = "2025-05-21T12:44:34.557Z" },
{ url = "https://files.pythonhosted.org/packages/fb/ab/e04bf58a8d375aeedb5268edcc835c6a660ebf79d4384d8e0889439448b0/rpds_py-0.25.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:58f77c60956501a4a627749a6dcb78dac522f249dd96b5c9f1c6af29bfacfb66", size = 558891, upload-time = "2025-05-21T12:44:37.358Z" },
{ url = "https://files.pythonhosted.org/packages/90/82/cb8c6028a6ef6cd2b7991e2e4ced01c854b6236ecf51e81b64b569c43d73/rpds_py-0.25.1-cp313-cp313t-win32.whl", hash = "sha256:2cb9e5b5e26fc02c8a4345048cd9998c2aca7c2712bd1b36da0c72ee969a3523", size = 218718, upload-time = "2025-05-21T12:44:38.969Z" },
{ url = "https://files.pythonhosted.org/packages/b6/97/5a4b59697111c89477d20ba8a44df9ca16b41e737fa569d5ae8bff99e650/rpds_py-0.25.1-cp313-cp313t-win_amd64.whl", hash = "sha256:401ca1c4a20cc0510d3435d89c069fe0a9ae2ee6495135ac46bdd49ec0495763", size = 232218, upload-time = "2025-05-21T12:44:40.512Z" },
]
[[package]]
name = "setuptools"
version = "80.8.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/8d/d2/ec1acaaff45caed5c2dedb33b67055ba9d4e96b091094df90762e60135fe/setuptools-80.8.0.tar.gz", hash = "sha256:49f7af965996f26d43c8ae34539c8d99c5042fbff34302ea151eaa9c207cd257", size = 1319720, upload-time = "2025-05-20T14:02:53.503Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/58/29/93c53c098d301132196c3238c312825324740851d77a8500a2462c0fd888/setuptools-80.8.0-py3-none-any.whl", hash = "sha256:95a60484590d24103af13b686121328cc2736bee85de8936383111e421b9edc0", size = 1201470, upload-time = "2025-05-20T14:02:51.348Z" },
]
[[package]]
name = "shellingham"
version = "1.5.4"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/58/15/8b3609fd3830ef7b27b655beb4b4e9c62313a4e8da8c676e142cc210d58e/shellingham-1.5.4.tar.gz", hash = "sha256:8dbca0739d487e5bd35ab3ca4b36e11c4078f3a234bfce294b0a0291363404de", size = 10310, upload-time = "2023-10-24T04:13:40.426Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e0/f9/0595336914c5619e5f28a1fb793285925a8cd4b432c9da0a987836c7f822/shellingham-1.5.4-py2.py3-none-any.whl", hash = "sha256:7ecfff8f2fd72616f7481040475a65b2bf8af90a56c89140852d1120324e8686", size = 9755, upload-time = "2023-10-24T04:13:38.866Z" },
]
[[package]]
name = "sniffio"
version = "1.3.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/a2/87/a6771e1546d97e7e041b6ae58d80074f81b7d5121207425c964ddf5cfdbd/sniffio-1.3.1.tar.gz", hash = "sha256:f4324edc670a0f49750a81b895f35c3adb843cca46f0530f79fc1babb23789dc", size = 20372, upload-time = "2024-02-25T23:20:04.057Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/e9/44/75a9c9421471a6c4805dbf2356f7c181a29c1879239abab1ea2cc8f38b40/sniffio-1.3.1-py3-none-any.whl", hash = "sha256:2f6da418d1f1e0fddd844478f41680e794e6051915791a034ff65e5f100525a2", size = 10235, upload-time = "2024-02-25T23:20:01.196Z" },
]
[[package]]
name = "snowballstemmer"
version = "3.0.1"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/75/a7/9810d872919697c9d01295633f5d574fb416d47e535f258272ca1f01f447/snowballstemmer-3.0.1.tar.gz", hash = "sha256:6d5eeeec8e9f84d4d56b847692bacf79bc2c8e90c7f80ca4444ff8b6f2e52895", size = 105575, upload-time = "2025-05-09T16:34:51.843Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c8/78/3565d011c61f5a43488987ee32b6f3f656e7f107ac2782dd57bdd7d91d9a/snowballstemmer-3.0.1-py3-none-any.whl", hash = "sha256:6cd7b3897da8d6c9ffb968a6781fa6532dce9c3618a4b127d920dab764a19064", size = 103274, upload-time = "2025-05-09T16:34:50.371Z" },
]
[[package]]
name = "sse-starlette"
version = "2.3.5"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "anyio" },
{ name = "starlette" },
]
sdist = { url = "https://files.pythonhosted.org/packages/10/5f/28f45b1ff14bee871bacafd0a97213f7ec70e389939a80c60c0fb72a9fc9/sse_starlette-2.3.5.tar.gz", hash = "sha256:228357b6e42dcc73a427990e2b4a03c023e2495ecee82e14f07ba15077e334b2", size = 17511, upload-time = "2025-05-12T18:23:52.601Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/c8/48/3e49cf0f64961656402c0023edbc51844fe17afe53ab50e958a6dbbbd499/sse_starlette-2.3.5-py3-none-any.whl", hash = "sha256:251708539a335570f10eaaa21d1848a10c42ee6dc3a9cf37ef42266cdb1c52a8", size = 10233, upload-time = "2025-05-12T18:23:50.722Z" },
]
[[package]]
name = "starlette"
version = "0.46.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "anyio" },
]
sdist = { url = "https://files.pythonhosted.org/packages/ce/20/08dfcd9c983f6a6f4a1000d934b9e6d626cff8d2eeb77a89a68eef20a2b7/starlette-0.46.2.tar.gz", hash = "sha256:7f7361f34eed179294600af672f565727419830b54b7b084efe44bb82d2fccd5", size = 2580846, upload-time = "2025-04-13T13:56:17.942Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/8b/0c/9d30a4ebeb6db2b25a841afbb80f6ef9a854fc3b41be131d249a977b4959/starlette-0.46.2-py3-none-any.whl", hash = "sha256:595633ce89f8ffa71a015caed34a5b2dc1c0cdb3f0f1fbd1e69339cf2abeec35", size = 72037, upload-time = "2025-04-13T13:56:16.21Z" },
]
[[package]]
name = "tokenize-rt"
version = "6.2.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/69/ed/8f07e893132d5051d86a553e749d5c89b2a4776eb3a579b72ed61f8559ca/tokenize_rt-6.2.0.tar.gz", hash = "sha256:8439c042b330c553fdbe1758e4a05c0ed460dbbbb24a606f11f0dee75da4cad6", size = 5476, upload-time = "2025-05-23T23:48:00.035Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/33/f0/3fe8c6e69135a845f4106f2ff8b6805638d4e85c264e70114e8126689587/tokenize_rt-6.2.0-py2.py3-none-any.whl", hash = "sha256:a152bf4f249c847a66497a4a95f63376ed68ac6abf092a2f7cfb29d044ecff44", size = 6004, upload-time = "2025-05-23T23:47:58.812Z" },
]
[[package]]
name = "typer"
version = "0.16.0"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "click" },
{ name = "rich" },
{ name = "shellingham" },
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/c5/8c/7d682431efca5fd290017663ea4588bf6f2c6aad085c7f108c5dbc316e70/typer-0.16.0.tar.gz", hash = "sha256:af377ffaee1dbe37ae9440cb4e8f11686ea5ce4e9bae01b84ae7c63b87f1dd3b", size = 102625, upload-time = "2025-05-26T14:30:31.824Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/76/42/3efaf858001d2c2913de7f354563e3a3a2f0decae3efe98427125a8f441e/typer-0.16.0-py3-none-any.whl", hash = "sha256:1f79bed11d4d02d4310e3c1b7ba594183bcedb0ac73b27a9e5f28f6fb5b98855", size = 46317, upload-time = "2025-05-26T14:30:30.523Z" },
]
[[package]]
name = "typing-extensions"
version = "4.13.2"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/f6/37/23083fcd6e35492953e8d2aaaa68b860eb422b34627b13f2ce3eb6106061/typing_extensions-4.13.2.tar.gz", hash = "sha256:e6c81219bd689f51865d9e372991c540bda33a0379d5573cddb9a3a23f7caaef", size = 106967, upload-time = "2025-04-10T14:19:05.416Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/8b/54/b1ae86c0973cc6f0210b53d508ca3641fb6d0c56823f288d108bc7ab3cc8/typing_extensions-4.13.2-py3-none-any.whl", hash = "sha256:a439e7c04b49fec3e5d3e2beaa21755cadbbdc391694e28ccdd36ca4a1408f8c", size = 45806, upload-time = "2025-04-10T14:19:03.967Z" },
]
[[package]]
name = "typing-inspection"
version = "0.4.1"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "typing-extensions" },
]
sdist = { url = "https://files.pythonhosted.org/packages/f8/b1/0c11f5058406b3af7609f121aaa6b609744687f1d158b3c3a5bf4cc94238/typing_inspection-0.4.1.tar.gz", hash = "sha256:6ae134cc0203c33377d43188d4064e9b357dba58cff3185f22924610e70a9d28", size = 75726, upload-time = "2025-05-21T18:55:23.885Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/17/69/cd203477f944c353c31bade965f880aa1061fd6bf05ded0726ca845b6ff7/typing_inspection-0.4.1-py3-none-any.whl", hash = "sha256:389055682238f53b04f7badcb49b989835495a96700ced5dab2d8feae4b26f51", size = 14552, upload-time = "2025-05-21T18:55:22.152Z" },
]
[[package]]
name = "urllib3"
version = "2.4.0"
source = { registry = "https://pypi.org/simple" }
sdist = { url = "https://files.pythonhosted.org/packages/8a/78/16493d9c386d8e60e442a35feac5e00f0913c0f4b7c217c11e8ec2ff53e0/urllib3-2.4.0.tar.gz", hash = "sha256:414bc6535b787febd7567804cc015fee39daab8ad86268f1310a9250697de466", size = 390672, upload-time = "2025-04-10T15:23:39.232Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/6b/11/cc635220681e93a0183390e26485430ca2c7b5f9d33b15c74c2861cb8091/urllib3-2.4.0-py3-none-any.whl", hash = "sha256:4e16665048960a0900c702d4a66415956a584919c03361cac9f1df5c5dd7e813", size = 128680, upload-time = "2025-04-10T15:23:37.377Z" },
]
[[package]]
name = "uvicorn"
version = "0.34.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "click" },
{ name = "h11" },
]
sdist = { url = "https://files.pythonhosted.org/packages/a6/ae/9bbb19b9e1c450cf9ecaef06463e40234d98d95bf572fab11b4f19ae5ded/uvicorn-0.34.2.tar.gz", hash = "sha256:0e929828f6186353a80b58ea719861d2629d766293b6d19baf086ba31d4f3328", size = 76815, upload-time = "2025-04-19T06:02:50.101Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/b1/4b/4cef6ce21a2aaca9d852a6e84ef4f135d99fcd74fa75105e2fc0c8308acd/uvicorn-0.34.2-py3-none-any.whl", hash = "sha256:deb49af569084536d269fe0a6d67e3754f104cf03aba7c11c40f01aadf33c403", size = 62483, upload-time = "2025-04-19T06:02:48.42Z" },
]
[[package]]
name = "virtualenv"
version = "20.31.2"
source = { registry = "https://pypi.org/simple" }
dependencies = [
{ name = "distlib" },
{ name = "filelock" },
{ name = "platformdirs" },
]
sdist = { url = "https://files.pythonhosted.org/packages/56/2c/444f465fb2c65f40c3a104fd0c495184c4f2336d65baf398e3c75d72ea94/virtualenv-20.31.2.tar.gz", hash = "sha256:e10c0a9d02835e592521be48b332b6caee6887f332c111aa79a09b9e79efc2af", size = 6076316, upload-time = "2025-05-08T17:58:23.811Z" }
wheels = [
{ url = "https://files.pythonhosted.org/packages/f3/40/b1c265d4b2b62b58576588510fc4d1fe60a86319c8de99fd8e9fec617d2c/virtualenv-20.31.2-py3-none-any.whl", hash = "sha256:36efd0d9650ee985f0cad72065001e66d49a6f24eb44d98980f630686243cf11", size = 6057982, upload-time = "2025-05-08T17:58:21.15Z" },
]