# Active Scanner for Mitmproxy2Swagger
This module provides an automated way to crawl, inspect, and fuzz API endpoints to generate a high-quality OpenAPI specification with latency metrics.
## Components

- **Crawler** (`src/scanner/crawler.py`): Explores a website, renders JavaScript (via Playwright), and captures all network traffic.
- **Prober** (`src/scanner/prober.py`): Actively probes discovered endpoints with multiple requests to gather statistical performance data.
- **Core** (`mitmproxy2swagger`): Converts the captured traffic (HAR/flow) into an OpenAPI spec.
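The three components form a simple pipeline: crawl, probe, convert. The sketch below illustrates that flow with stand-in functions; every name here is a hypothetical placeholder, not the module's actual API.

```python
# Illustrative pipeline sketch. crawl/probe/convert_to_openapi are
# hypothetical stand-ins for the real Crawler, Prober, and Core.

def crawl(url: str) -> list[str]:
    # The real crawler renders JavaScript with Playwright and records
    # network traffic; here we return a fixed endpoint list.
    return [f"{url}/api/users", f"{url}/api/orders"]

def probe(endpoints: list[str]) -> list[dict]:
    # The real prober sends multiple requests per endpoint through the
    # mitmdump proxy; here we fabricate one observation per endpoint.
    return [{"url": e, "status": 200} for e in endpoints]

def convert_to_openapi(traffic: list[dict]) -> dict:
    # The real conversion is done by mitmproxy2swagger from HAR/flow data.
    paths = {t["url"].removeprefix("https://example.com"): {} for t in traffic}
    return {"openapi": "3.0.0", "paths": paths}

spec = convert_to_openapi(probe(crawl("https://example.com")))
print(sorted(spec["paths"]))  # ['/api/orders', '/api/users']
```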
## Usage Guide
Run the full active scan with a single command:

```sh
uv run scanner https://example.com
```
This will automatically:
- Crawl the website to discover endpoints.
- Start a proxy (mitmdump) in the background.
- Probe/Fuzz the endpoints through the proxy.
- Generate the final OpenAPI spec (`final_spec.yaml`).
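The "statistical performance data" gathered during probing boils down to aggregating latency samples per endpoint. A minimal sketch of such an aggregation (illustrative only; the prober's actual metrics and field names may differ):

```python
import statistics

def latency_summary(samples_ms: list[float]) -> dict:
    """Summarize probe latencies (hypothetical helper, not the prober's API)."""
    ordered = sorted(samples_ms)
    # Nearest-rank p95: index of the sample at the 95th percentile.
    p95_index = max(0, int(len(ordered) * 0.95) - 1)
    return {
        "min_ms": ordered[0],
        "median_ms": statistics.median(ordered),
        "p95_ms": ordered[p95_index],
        "max_ms": ordered[-1],
    }

print(latency_summary([12.0, 15.0, 11.0, 30.0, 14.0]))
# {'min_ms': 11.0, 'median_ms': 14.0, 'p95_ms': 15.0, 'max_ms': 30.0}
```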
### With Authentication

Using headers (e.g., a Bearer token):

```sh
uv run scanner https://api.example.com \
  --header "Authorization: Bearer YOUR_TOKEN"
```

Using cookies (e.g., a session ID):

```sh
uv run scanner https://dashboard.example.com \
  --cookie "session_id=xyz123"
```

### Advanced Usage
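Flags like these are typically folded into the headers sent with each probe request. A hedged sketch of how `--header` and `--cookie` values could be combined into a single header dict (illustrative; not the CLI's actual parsing code):

```python
def build_headers(header_args: list[str], cookie_args: list[str]) -> dict:
    # Hypothetical helper: "Name: value" strings become header entries,
    # and all cookie pairs are joined into one Cookie header.
    headers = {}
    for raw in header_args:
        name, _, value = raw.partition(":")
        headers[name.strip()] = value.strip()
    if cookie_args:
        headers["Cookie"] = "; ".join(cookie_args)
    return headers

print(build_headers(["Authorization: Bearer YOUR_TOKEN"], ["session_id=xyz123"]))
# {'Authorization': 'Bearer YOUR_TOKEN', 'Cookie': 'session_id=xyz123'}
```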
You can still customize the run:

```sh
uv run scanner https://example.com \
  --depth 3 \
  --proxy-port 8081 \
  --final-spec my_api.yaml
```
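Options like these map naturally onto an argument parser. The sketch below shows one way the flags above could be wired with `argparse`; the actual scanner CLI may use a different framework, and the default values here are assumptions:

```python
import argparse

# Illustrative argparse wiring for the flags shown above; defaults are
# placeholders, not the scanner's documented defaults.
parser = argparse.ArgumentParser(prog="scanner")
parser.add_argument("url")
parser.add_argument("--depth", type=int, default=2)
parser.add_argument("--proxy-port", type=int, default=8080)
parser.add_argument("--final-spec", default="final_spec.yaml")

args = parser.parse_args(
    ["https://example.com", "--depth", "3",
     "--proxy-port", "8081", "--final-spec", "my_api.yaml"]
)
print(args.depth, args.proxy_port, args.final_spec)  # 3 8081 my_api.yaml
```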
## Requirements

- Python 3.10+
- The `uv` package manager
- Playwright browsers (`uv run playwright install`)
## Setup

### 1. Installation

- Install `uv` (if not already installed):

  ```sh
  curl -LsSf https://astral.sh/uv/install.sh | sh
  ```

- Sync dependencies:

  ```sh
  uv sync
  ```

- Install Playwright browsers:

  ```sh
  uv run playwright install
  ```
### 2. Development Setup

To ensure code quality, we use `ruff` and `pre-commit`.

- Install the pre-commit hooks:

  ```sh
  uv run pre-commit install
  ```

- Run linting and formatting manually (optional):

  ```sh
  uv run ruff check .
  uv run ruff format .
  ```