Metadata-Version: 2.4
Name: a2a-pack
Version: 0.1.0
Summary: Developer SDK + CLI for building, packaging, and deploying A2A agents.
Project-URL: Homepage, https://a2acloud.io
Project-URL: Documentation, https://docs.a2acloud.io
Project-URL: Repository, https://gitea.a2acloud.io/gitea_admin/a2a-pack
Project-URL: Issues, https://gitea.a2acloud.io/gitea_admin/a2a-pack/issues
Author-email: a2a cloud <hello@a2acloud.io>
License: MIT License
        
        Copyright (c) 2026 a2a cloud
        
        Permission is hereby granted, free of charge, to any person obtaining a copy
        of this software and associated documentation files (the "Software"), to deal
        in the Software without restriction, including without limitation the rights
        to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
        copies of the Software, and to permit persons to whom the Software is
        furnished to do so, subject to the following conditions:
        
        The above copyright notice and this permission notice shall be included in all
        copies or substantial portions of the Software.
        
        THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
        IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
        FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
        AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
        LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
        OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
        SOFTWARE.
License-File: LICENSE
Keywords: a2a,agent,agents,ai,llm,marketplace,mcp,microvm,model-context-protocol,sandbox
Classifier: Development Status :: 4 - Beta
Classifier: Environment :: Web Environment
Classifier: Framework :: FastAPI
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: MIT License
Classifier: Operating System :: OS Independent
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3 :: Only
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Topic :: Software Development :: Libraries :: Application Frameworks
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Typing :: Typed
Requires-Python: >=3.11
Requires-Dist: fastapi>=0.110
Requires-Dist: httpx>=0.27
Requires-Dist: jinja2>=3
Requires-Dist: pydantic>=2.6
Requires-Dist: pyyaml>=6
Requires-Dist: rich>=13
Requires-Dist: typer>=0.12
Requires-Dist: uvicorn[standard]>=0.27
Provides-Extra: dev
Requires-Dist: build>=1.2; extra == 'dev'
Requires-Dist: httpx>=0.27; extra == 'dev'
Requires-Dist: pytest-asyncio>=0.23; extra == 'dev'
Requires-Dist: pytest>=8; extra == 'dev'
Requires-Dist: twine>=5; extra == 'dev'
Description-Content-Type: text/markdown

# a2a-pack

**Developer SDK + CLI for building, packaging, and deploying [A2A](https://a2acloud.io) agents.**

One Python class becomes a sandboxed, discoverable, MCP-compatible AI
agent on the [a2a cloud](https://a2acloud.io) platform. Other agents
reach yours via HMAC-signed grants. The platform owns deployment,
execution, permissions, and (when you're ready) billing.

```bash
pip install a2a-pack
a2a signup --email you@example.com --password ...
a2a init research-agent
cd research-agent
a2a deploy
# → https://research-agent.a2acloud.io   (TLS, MCP, OpenAPI, all wired)
```

## What an agent looks like

```python
from pydantic import BaseModel
from a2a_pack import (
    A2AAgent, LLMProvisioning, NoAuth, Pricing, RunContext, skill,
)


class GreeterConfig(BaseModel):
    suffix: str = "!"


class Greeter(A2AAgent[GreeterConfig, NoAuth]):
    name = "greeter"
    description = "Say hi."
    version = "0.1.0"
    config_model = GreeterConfig
    auth_model = NoAuth

    # Use the caller's own LLM key (forwarded by the platform) — the
    # author's price stays small; the LLM bill goes to the caller's
    # provider directly.
    llm_provisioning = LLMProvisioning.CALLER_PROVIDED
    pricing = Pricing(price_per_call_usd=0.01, caller_pays_llm=True)

    @skill(description="Greet someone.")
    async def greet(self, ctx: RunContext[NoAuth], who: str) -> str:
        await ctx.emit_progress(f"greeting {who}")
        return f"hello {who}{self.config.suffix}"
```
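The `@skill` decorator's job is to turn a typed method into a discoverable tool. Here is a stdlib-only sketch of that mechanism, deriving tool metadata from the function's type hints. This is illustrative only and not the actual `a2a_pack` implementation; the real decorator and schema shape will differ.

```python
# Illustrative sketch -- NOT the real a2a_pack implementation.
# Shows how a @skill-style decorator can derive tool metadata
# from a method's type hints using nothing but the stdlib.
from typing import get_type_hints

_PY_TO_JSON = {str: "string", int: "integer", float: "number", bool: "boolean"}


def skill(description: str):
    """Mark a function as a skill and attach a tool-style schema to it."""
    def wrap(fn):
        hints = get_type_hints(fn)
        params = {
            name: {"type": _PY_TO_JSON.get(hint, "object")}
            for name, hint in hints.items()
            if name not in ("return", "self", "ctx")  # skip non-parameter names
        }
        fn.__skill__ = {
            "name": fn.__name__,
            "description": description,
            "parameters": {"type": "object", "properties": params},
        }
        return fn
    return wrap


@skill(description="Greet someone.")
async def greet(who: str) -> str:
    return f"hello {who}!"


print(greet.__skill__["parameters"]["properties"])
# → {'who': {'type': 'string'}}
```

This is also roughly how "skills → tools" works for MCP: the schema attached to each skill is what a tool listing can serve to callers.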

That's it. `a2a deploy` packages the source, the control plane builds
the image, ArgoCD reconciles, and you get a public URL.

## Public surface

| Concept | Where |
|---|---|
| `A2AAgent` base class + `@skill` decorator | `a2a_pack.agent` |
| `RunContext`, `ctx.llm`, `ctx.ask`, `ctx.request_scope` | `a2a_pack.context` |
| Grant mint/verify (HMAC, audience-bound, glob-filtered, time-limited) | `a2a_pack.grants` |
| Workspace negotiation surface | `a2a_pack.workspace` |
| Sandbox client (microVM via libkrun) | `a2a_pack.sandbox` |
| Agent-to-agent client (HTTP, in-memory, custom) | `a2a_pack.a2a_client` |
| MCP server (skills → tools, mountable into your FastAPI app) | `a2a_pack.mcp` |
| Lifecycle / Resources / Pricing / LLMProvisioning declarations | `a2a_pack.runtime` |
| Card schema (auto-derived from your class) | `a2a_pack.card` |

Full reference + auto-generated docs at **https://docs.a2acloud.io**.
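To give a feel for the grant model in the table above (HMAC-signed, audience-bound, glob-filtered, time-limited), here is a stdlib-only sketch of the mechanism. The secret name, claim fields, and token format here are assumptions for illustration; the real `a2a_pack.grants` API and wire format differ.

```python
# Conceptual sketch -- NOT the real a2a_pack.grants implementation.
import base64
import fnmatch
import hashlib
import hmac
import json
import time

SECRET = b"shared-platform-secret"  # assumption: a key both sides hold


def _b64(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).decode()


def mint_grant(audience: str, skill_globs: list[str], ttl_s: int = 300) -> str:
    """Mint a token bound to one audience, a skill filter, and an expiry."""
    payload = json.dumps(
        {"aud": audience, "skills": skill_globs, "exp": int(time.time()) + ttl_s},
        sort_keys=True,
    ).encode()
    sig = hmac.new(SECRET, payload, hashlib.sha256).digest()
    return _b64(payload) + "." + _b64(sig)


def verify_grant(token: str, audience: str) -> dict:
    """Check signature, audience, and expiry; return the claims."""
    p64, s64 = token.split(".")
    payload = base64.urlsafe_b64decode(p64)
    expected = hmac.new(SECRET, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, base64.urlsafe_b64decode(s64)):
        raise ValueError("bad signature")
    claims = json.loads(payload)
    if claims["aud"] != audience:
        raise ValueError("wrong audience")
    if claims["exp"] < time.time():
        raise ValueError("expired")
    return claims


def allows(claims: dict, skill_name: str) -> bool:
    """Glob-filtered skill check, e.g. 'gre*' matches 'greet'."""
    return any(fnmatch.fnmatch(skill_name, pat) for pat in claims["skills"])


token = mint_grant("research-agent", ["gre*"])
claims = verify_grant(token, "research-agent")
print(allows(claims, "greet"))  # → True
```

Binding the grant to an audience and an expiry means a token leaked from one agent can't be replayed against another, and the glob filter scopes which skills the caller may invoke.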

## Self-hosting

The platform pieces (control plane, sandbox runtime, Gitea, ArgoCD,
MinIO, LiteLLM) live at
[gitea.a2acloud.io](https://gitea.a2acloud.io); the SDK is the only
piece published to PyPI. If you want to run the whole stack locally
or in your own cluster, the bootstrap recipe is in the platform
[README](https://gitea.a2acloud.io/gitea_admin/a2a-pack).

## License

MIT — see [LICENSE](LICENSE).
